> I think I figured out what my problem was. I am new to C# and I realized that I probably have to add something like
> Node optionButton = new Node();
> optionButton.GetNode<OptionButton>(nodePath).Select(index);

No, you can put the script anywhere in the tree, but the root can address any node; nodes below the root can't address the root. I'm gonna guess you're addressing the wrong node but think you're addressing another one, that's why it only works when your script is attached to it but fails silently otherwise. Use % unique names in the tree for your node and address it using that.
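If it helps, the unique-name approach being described looks roughly like this (a minimal sketch; SettingsMenu is a made-up class name, and it assumes the node is called SizeOption with "Access as Unique Name" ticked in the editor, same as in the resolution code further down):

C#:
using Godot;

public partial class SettingsMenu : Control
{
    public override void _Ready()
    {
        // A %-path resolves the node marked unique anywhere in the current scene,
        // so this works no matter where this script sits in the tree.
        var sizeOption = GetNode<OptionButton>("%SizeOption");
        sizeOption.Select(0); // pick the first entry
    }
}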
> I thought it would be a swastika but it's just some pajeet letter. Got stuck for a moment on not(), that was clever.

It's the Amogus letter.
That's why I use Vim.
>Using VS Code
>Pc starts running really fucking slow
>Pull up task manager
>Console window host using 70% of my cpu
>Try saving and closing down everything
>Its still using up all my cpu
>Look up solution online
>Saaar you have to right click then click right where it says end task to ending the task saar
>Try ending task
>Screen goes black
>MFW
>Using VS Code
>Pozzing my neghole
// Inside a Node-derived script (Godot 4 C#), so GetNode and DisplayServer are available.
public OptionButton _SizeOption()
{
    return GetNode<OptionButton>("%SizeOption");
}

public Vector2I pozMyViewport(int x, int y)
{
    // Center a window of the given size on the screen.
    var screenSize = DisplayServer.ScreenGetSize();
    int xPoz = (screenSize.X - x) / 2;
    int yPoz = (screenSize.Y - y) / 2;
    return new Vector2I(xPoz, yPoz);
}

int[] xDim = { 640, 1280, 1920 };
int[] yDim = { 360, 720, 1080 };

public void adjustResolution()
{
    var s = _SizeOption().Selected;
    Vector2I newSize = new Vector2I(xDim[s], yDim[s]);
    DisplayServer.WindowSetSize(newSize);
    DisplayServer.WindowSetPosition(pozMyViewport(xDim[s], yDim[s]));
}
Funny enough, I wrote a method called poz my viewport. Typed this on my phone from memory so it might have a few errors.

That's a different sort of bug-chasing, surely.
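For what it's worth, nothing in that snippet actually calls adjustResolution; presumably it's wired to the OptionButton's ItemSelected signal somewhere. A guess at the hookup, not from the original post:

C#:
public override void _Ready()
{
    // Hypothetical wiring: re-apply the resolution whenever a new entry is picked.
    _SizeOption().ItemSelected += _ => adjustResolution();
}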
public partial class Play : Node2D
{
    public partial class Player : Sprite2D
    {
        // All the code for my player node goes here
    }
}
I hate the "every language above machine code is high level" argument. C/C++ code when read it's easy to map the lines to assembler code. I understand compilers will generate what they want when optimising.I really think this to be a misnomer, outside of academic discussions. The machine code is low level, as are thin layers above it, but past incredibly small languages like Forth it becomes a pissing match.
The most important things for a programmer to learn, and which most fail to do, are learning how to learn and learning how to think. A calculator with the ability to store short programs and therefore automate basic tasks would be better than half the shit I see people recommend. Throwing newcomers into the unacceptable hellscape of modern programming isn't the way to do it, unless someone wants to scare them off programming forever.
> Still, I'm happy I learned Javascript first. I think its more accessible to beginners and I'm shocked to see that almost all the shit I learned from writing Javascript translates over to writing in C#.

I think I probably talked in here earlier about how most programming languages are descendants of ALGOL and there is a ton that will carry over from one fairly common language to the next. With JavaScript, to my mind, there are (at least) two very fundamental differences that stand out: weak typing and the traditional rejection of a class-based object system in favor of a prototype-based object system inspired by Self, a language developed mainly by Sun Microsystems as a successor to Smalltalk. In the latter case, maybe someone could school me, but I'm not really convinced that prototypes are vastly better than classes. Some argue the class syntax that's been in JavaScript since the ES6 standard should be abandoned altogether, but I'm fortunately not so involved in unswell JavaScript faggotry currently to evaluate these claims.

What I have a bug up my ass about is the weak typing. This meme mostly portrays the situation well, though 0.1 + 0.2 != 0.3 is just as true in Python because it's inherently difficult to represent decimal numbers in pure binary. On a related note, if you want to handle currency accurately in any programming language, you need to use a dedicated decimal data type or use integer arithmetic on cents / pennies and then convert to dollars or Euros or whatever at the last possible moment. (I even remember solving British math puzzles from well before 1970, when they finally did decimalization, where I had to convert pounds (20 shillings) and shillings (12 pence) to pence to make the algebra work, and the New York Stock Exchange once reported eighths of a dollar into the 80s, which you'll know if you watch Wall Street.)

Python lets you get away with things like summing bool values and dividing by len() to come up with a proportion of which outcomes were True over how many total attempts, but if you look under the hood, it's just magic methods rather than nebulously forcing one type of data to become another. JavaScript and Perl, venerable web programming languages, both made the Faustian pact of making things a tiny bit more convenient temporarily only to fuck things up years later. That's why === (triple equal sign) exists in JavaScript, and not in a good way like it exists in Ruby. It wasn't thought that JavaScript would ever be used for anything other than tasks like simple form validation in the early- to mid-90s and now we are continuing to pay the price of temporary convenience.
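To make the currency point concrete, here's a minimal C# sketch (the class and variable names are invented for illustration): double happily gives you 0.1 + 0.2 != 0.3, while decimal, or plain integer cents, keeps money exact.

C#:
using System;

class CurrencyDemo
{
    static void Main()
    {
        // Binary floating point can't represent 0.1 or 0.2 exactly.
        Console.WriteLine(0.1 + 0.2 == 0.3);    // False
        Console.WriteLine(0.1m + 0.2m == 0.3m); // True: decimal is base-10

        // Or keep everything in integer cents and convert at the last possible moment.
        long priceCents = 1999;
        long quantity = 3;
        long totalCents = priceCents * quantity;
        Console.WriteLine($"{totalCents / 100m:0.00}"); // 59.97
    }
}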
> C is the first language you want to learn, you get to understand how the system works and it also becomes very easy to learn other languages as they all borrow very heavily from C.

Sorry, nah. The value of C and assembly language(s) isn't lost on me, but consider all of the people who want to acquire data science knowledge who are coming from social / behavioral sciences and/or a medical background or even other scientific or mathematical backgrounds. Should they immediately have to deal with memory management, or should they be able to hit the ground running with tools like Python and/or R? It isn't the 80s or 90s anymore and they should be able to dive right into visualizing and modeling data without having to worry about malloc() and free(). Deeper understanding of the Blessed Machine can come later.
> This looks like a lot of fun, but how does typing all of those crazy symbols work?

I've explained that here:

> Someone wrote an APL mode for GNU Emacs, gnu-apl-mode, which I use. The input method uses a period as the prefix key. That may seem inconvenient, but it's not when the entire program fits on a few lines anyway.

Something else nice about how the entire program fits on a few lines occurs to me when I consider alternative implementation strategies. It may happen that an APL program can't reasonably be optimized for some superior implementation, but rewriting the program is fine when it's a single line, isn't it?
> I hate the "every language above machine code is high level" argument.

Why, because it's true? I mentioned Forth as an example, which truly does eschew many abstractions, but most languages would fall into the middling category: low level enough to be a pain in the ass and not high level enough to be worth a fuck.

> When reading C/C++ code it's easy to map the lines to assembler code.

It's certainly easy for one to fool himself into thinking he can do that.

> I understand compilers will generate what they want when optimising.

Yes, like that.

> Having a language with pointers and manual memory management + alignment and so forth is a HUGE difference between a language like Java

Yes, and for almost everything it's a huge mistake.

> C is the first language you want to learn, you get to understand how the system works and it also becomes very easy to learn other languages as they all borrow very heavily from C.

This is lunacy. It's good for tricking oneself into believing he knows how the system works, and it's excellent if someone wants to learn many languages that are all the same besides minor differences.

> they all borrow very heavily from C

This is a lie. Many of the languages I enjoy existed beforehand. Of course, I guess this statement is true if one ignores every instance in which it's incorrect. The will to rewrite history is strong with every cult.

> Honestly, if newcomers get filtered by pointers early it saves them time, since they stop pursuing programming, and it saves software from having a shit dev.

While I agree the skill level for professional programming should be raised, that includes filtering the midwits who think unnecessary hardship to be good. Half the complications people have with pointers in that language come from the unbelievably shitty syntax.

> an APL mode for GNU Emacs

Say no more. I'm sold.
> unnecessary hardship

That's most programming tbf.
> In my experience, it's the mechanics behind pointers (memory addressing, dereferencing, etc.) that trip most students up. It takes a certain mindset to work with, and most people simply aren't well equipped to work within that mindset.

The fundamental concept here is indirection, not pointers. Someone who can understand indirection can understand pointers, unique database keys likely implemented as pointers, and references.

> we're starting to see the deleterious effects of permitting this sort of intellectual laziness where people don't care to learn how things work and what the best practices are by extension

People will write Fortran in any language.

> So I guess my point is: how much of this hardship is truly unnecessary in the long run?

I can't think of a single good reason for someone to learn the C language for these things over a machine code. The machine code is simpler, and really does lack training wheels. Someone who wants to learn about the machine implementation of concepts like indirection should write a simple machine code program, if he wants practical experience at all. I may not have a good perspective here, because I did all of this anyway, but I can easily envision someone just as competent who didn't need it.

> Deeper understanding of the Blessed Machine can come later.

I simply light incense and meditate while looking at the screen until the Omnissiah tells me what to type next.

> The fundamental concept here is indirection, not pointers. Someone who can understand indirection can understand pointers, unique database keys likely implemented as pointers, and references.

Agreed. At the core of my argument is the eternal debate around the Fundamental Theorem of Software Engineering, but centered around education rather than practicality and efficiency. This further generalizes to any sort of conceptual abstraction, not limited to CS.
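For what it's worth, the "it's all indirection" point is easy to show in a few lines of C# (a toy sketch, not from the thread; the names are invented): an object reference, an array index, and a table key are all just different ways of pointing at data you don't hold directly.

C#:
using System;
using System.Collections.Generic;

class IndirectionDemo
{
    class Player { public string Name = "Anon"; }

    static void Main()
    {
        // 1. An object reference: the variable holds "where", not the data itself.
        var a = new Player();
        var b = a;            // two handles, one object
        b.Name = "Cawl";
        Console.WriteLine(a.Name); // Cawl

        // 2. An array index used as a handle, like a pointer you do arithmetic on.
        string[] names = { "Anon", "Cawl", "Null" };
        int handle = 1;
        Console.WriteLine(names[handle]); // Cawl

        // 3. A key into a table, which is how a database "pointer" behaves.
        var byId = new Dictionary<int, string> { [42] = "Cawl" };
        Console.WriteLine(byId[42]); // Cawl
    }
}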
> I can't think of a single good reason for someone to learn the C language for these things over a machine code. The machine code is simpler, and really does lack training wheels. Someone who wants to learn about the machine implementation of concepts like indirection should write a simple machine code program, if he wants practical experience at all. I may not have a good perspective here, because I did all of this anyway, but I can easily envision someone just as competent who didn't need it.

I agree 100%, in principle. x86 is pretty high-level, relatively speaking. The thing is, you and I aren't wired like most people. There are guys like us who naturally gravitate towards the deeper theoretical stuff, and there are guys who simply want to write better, speedier programs, who need a practical framework to apply this conceptual stuff without needing our autistic levels of patience. C (on a Unix system, as the Lord intended) serves as a remarkably solid compromise here.
> Which coding language will get you the most bitches?

None because coding is for stinky nerds with no bitches. You want bitches? Learn to weld.
That's the shit we need to figure out.
> Which coding language will get you the most bitches?

Haskell