Programming thread

JS was able to dominate because you had an enormous number of programmers who knew it thanks to the web, so by supply and demand hiring a JS fag was and is cheaper than hiring anyone else. The world would have been a way better place if Scheme had been the language of the web like it was originally supposed to be, but of course the Java hype needed to fuck that up as well.
 
"Innovations" in languages these days are just about making them as pajeet-proofed as possible. Thus cargo/go get replacing NPM, thus a single source of truth replacing a spec and a variety of compilers and platforms, etc.
While I mostly agree, anyone liking npm should really question if they were dropped as a child. Assuming by npm you mean the Node package manager. I think even the original author tries to distance himself from that dogshit.
At the end of the day, language creators need to understand that what they're doing is essentially giving a programmer a loaded gun. You can cover the gun in warnings and all kinds of devices to try to make sure they use it properly, but you can't really stop someone who is determined to shoot their cock off with it from doing exactly that.
This is why C remains one of my favorites even if I barely use it these days. As nice as Rust can be, with its type system and whatever else, those same features just get in the way. Yes, I want to access raw memory and cast it to whatever. I (think I) know what I am doing. If I make a mistake, nobody's going to blame the compiler. When a program borks your computer or business, nobody blames the language it was written in; they blame the authors of the program.
 
Only (relatively) recently did hardware get good enough to run Lisps.
i will never understand why garbage collectors piss retards off so much
Still, I wish I had proper Lisp classes, that language is really cool.
you can at least learn it yourself! just use a package manager of your choice to install a lisp or scheme implementation and start playing in the repl today
you will be an obnoxious "lisp is the best language ever and every other language is so backwards and stupid" nigger in no time
What happened to local software development? Have corporations at large gotten lazy and decided that having internet access is mandatory?
probably because html+css-based ui seems a lot more accessible and runs on more platforms than major c/c++ widget toolkits
JS was able to dominate because you had an enormous number of programmers who knew it thanks to the web, so by supply and demand hiring a JS fag was and is cheaper than hiring anyone else. The world would have been a way better place if Scheme had been the language of the web like it was originally supposed to be, but of course the Java hype needed to fuck that up as well.
i think a bit of the scheme still rubbed off on javascript and that may be part of why you can never escape it these days
This is why C remains one of my favorites even if I barely use it these days. As nice as Rust can be, with its type system and whatever else, those same features just get in the way. Yes, I want to access raw memory and cast it to whatever. I (think I) know what I am doing. If I make a mistake, nobody's going to blame the compiler. When a program borks your computer or business, nobody blames the language it was written in; they blame the authors of the program.
rustniggers&co. don't realize exactly how based c is: it's partially designed to be used as an IR for other languages
 
i will never understand why garbage collectors piss retards off so much
If you have ever used a Minecraft server with the GUI activated, you will see something like this image:
1741897799791.png

This is the Garbage Collector made manifest. The performance characteristics may be acceptable in absolute terms, but conceptually they are offensive: memory usage increases at a ridiculous rate, then the program must spend a large amount of time (compared to normal operation) clearing the garbage.

and cast it to whatever
Transmute?
 
If you have ever used a Minecraft server with the GUI activated, you will see something like this image:
1741897799791.png

This is the Garbage Collector made manifest. The performance characteristics may be acceptable in absolute terms, but conceptually they are offensive: memory usage increases at a ridiculous rate, then the program must spend a large amount of time (compared to normal operation) clearing the garbage.
memory usage graphs looking like that is how a garbage collected runtime is supposed to look
additionally: if your knowledge of gc comes from the minecraft server gui memory usage graph you probably don't know enough about gc to criticize it accurately
Transmute?
casting is when you tell c to treat some piece of data as a different type than what it was marked as
it makes quite a dramatic appearance in the infamous quake 3 fast inverse square root function
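to make that concrete, here's a minimal sketch (hypothetical, not from the quake source) of the kind of cast being described, reading a float's bits back out as an integer:

C:
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    float f = 1.0f;
    /* tell C to treat the 4 bytes of f as a 32-bit integer;
       note the pointer cast technically violates strict aliasing,
       memcpy is the well-defined way to do this in modern C */
    uint32_t bits = * (uint32_t *) &f;
    printf("%f = 0x%08x\n", f, (unsigned) bits);
    /* prints: 1.000000 = 0x3f800000 */
    return 0;
}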
 
memory usage graphs looking like that is how a garbage collected runtime is supposed to look
Nobody claimed otherwise. By the way, I agree that Garbage Collection is good, I am just presenting the common argument against it.
additionally: if your knowledge of gc comes from the minecraft server gui memory usage graph you probably don't know enough about gc to criticize it accurately
I hope this isn't directed at me. It's simply a fun example that happens to be demonstrative.
casting is when you tell c to treat some piece of data as a different type than what it was marked as
👍"Transmute" is one of two Rust things similar to c(++) casting.
 
it makes quite a dramatic appearance in the infamous quake 3 fast inverse square root function
More specifically, the fancy trick is reading the 32-bit float's bits as an int, halving it with a right shift, subtracting that from a magic number, and reading the result back as a float.

C:
float Q_rsqrt( float number )
{
    long i;
    float x2, y;
    const float threehalfs = 1.5F;

    x2 = number * 0.5F;
    y  = number;
    i  = * ( long * ) &y;                        // evil floating point bit level hacking
    i  = 0x5f3759df - ( i >> 1 );               // what the fuck?
    y  = * ( float * ) &i;
    y  = y * ( threehalfs - ( x2 * y * y ) );   // 1st iteration
//    y  = y * ( threehalfs - ( x2 * y * y ) );   // 2nd iteration, this can be removed

#ifndef Q3_VM
#ifdef __linux__
    assert( !isnan(y) ); // bk010122 - FPE?
#endif
#endif
    return y;
}
source

The magic number is a float approximation of sqrt(2^127) (01011111 00110111 01011001 11011111). Using it in the way described above results in an approximation of the inverse square root, which is then made more accurate using Newton's method.
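For completeness, the Newton step the code is performing: solving $f(y) = \frac{1}{y^2} - x = 0$ with the iteration $y_{n+1} = y_n - f(y_n)/f'(y_n)$ works out to

$$y_{n+1} = y_n \left( \frac{3}{2} - \frac{x}{2} y_n^2 \right)$$

which is exactly the y = y * ( threehalfs - ( x2 * y * y ) ) line, with x2 = x/2 precomputed.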
 
Nobody claimed otherwise. By the way, I agree that Garbage Collection is good, I am just presenting the common argument against it.
better arguments would be to point out the requirement to bundle a gc, or the potential issues of a 3ms gc freeze when your industrial control software needs to do something in 2ms or people die, or the weird issues (and security vulnerabilities!) you can end up having with finalization
credit where it's due: you did point out what the average gc hater says when you ask them why gc is bad and slow
I hope this isn't directed at me. It's simply a fun example that happens to be demonstrative.
it's directed at whoever thinks a sawtooth-shaped memory usage graph is proof that gc is inefficient, which i hope doesn't include you either
👍"Transmute" is one of two Rust things similar to c(++) casting.
of course rustroglodytes have to call it something slightly different for absolutely no reason
 
the potential issues of a 3ms gc freeze
This is the less common (not retarded) argument.
whoever thinks a sawtooth-shaped memory usage graph is proof that gc is inefficient
The aforementioned retards.
call it something slightly different for absolutely no reason
"explicit type conversion (casting) [in rust] follow C conventions generally, except in cases where C has undefined behavior." REF
Transmute is something else that may have been relevant for my original reply regarding "cast[ing raw memory] ... to whatever".
 
i will never understand why garbage collectors piss retards off so much
The average retard thinks that malloc/free is in some sense foundational and therefore every other memory management scheme must necessarily be bloat on top of malloc/free. Never mind that it's one of many schemes and that there are all sorts of variations and tradeoffs even between different mallocs. It doesn't help that most people's exposure to performance issues is video games, which have unusually latency-focused performance requirements that clash with many common GC designs. Meanwhile, latency borders on irrelevant for most programs unless you do something really stupid.
 
The average retard thinks that malloc/free is in some sense foundational and therefore every other memory management scheme must necessarily be bloat on top of malloc/free. Never mind that it's one of many schemes and that there are all sorts of variations and tradeoffs even between different mallocs.
say "compacting collector" and "bump allocator" and maybe "free list" and get a funny confused stare from these memory management experts
It doesn't help that most people's exposure to performance issues is video games, which have unusually latency-focused performance requirements that clash with many common GC designs. Meanwhile, latency borders on irrelevant for most programs unless you do something really stupid.
note: most of the games they bring up have horrible issues that would make them chug even without a gc
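for the curious, a minimal sketch of a bump allocator (hypothetical code, not from any real runtime): allocation is literally just advancing a pointer, which is part of why runtimes with compacting collectors can allocate faster than malloc:

C:
#include <stdio.h>
#include <stddef.h>

/* all of the allocation logic is "advance an offset"; there is
   no per-object free, you reset (or compact) the whole arena at
   once, roughly a compacting GC's allocation fast path */
static _Alignas(16) unsigned char arena[1 << 20];  /* 1 MiB backing store */
static size_t bump = 0;

static void *bump_alloc(size_t size)
{
    size = (size + 15) & ~(size_t)15;  /* keep everything 16-byte aligned */
    if (bump + size > sizeof arena)
        return NULL;                   /* arena exhausted */
    void *p = arena + bump;
    bump += size;
    return p;
}

static void bump_reset(void) { bump = 0; }  /* "free" everything at once */

int main(void)
{
    int *a = bump_alloc(sizeof *a);
    int *b = bump_alloc(sizeof *b);
    if (!a || !b) return 1;
    *a = 1; *b = 2;
    printf("%d %d (used %zu bytes)\n", *a, *b, bump);
    bump_reset();
    return 0;
}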
 
The Java memory management system totally sucks, and I have no clue why anyone would defend it.

Did they accidentally swap in from an alternate timeline where enterprise software is actually good or something?
 
Re: memory usage sawtooth graphs, what do people imagine a memory usage graph should look like? That's the pattern of dynamic memory usage: a bunch of stuff gets allocated, eventually falls out of scope, and is (or should be) cleaned up. But I suppose it's just the magical thinking of retards imagining that freeing memory somehow costs nothing when a human does it and everything when a runtime does.
 
👍"Transmute" is one of two Rust things similar to c(++) casting.
I am not familiar with that part of Rust. To be fair I was exaggerating with my C comment, as I tend to not need to do those things myself.
Nobody claimed otherwise. By the way, I agree that Garbage Collection is good, I am just presenting the common argument against it.
GC is good, for the most part, provided you know what it will be doing. The problem with it is a wider one: lots of people have learnt not to give a fuck about memory and end up writing awful code.

I've had to fix multiple bits of production code where colleagues have just been concatenating ever-increasing strings (in a language where strings are immutable), not realising that every concatenation allocates a whole new string, with the old one hanging around until a GC pass.
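For illustration, the language isn't named above, but in C terms the anti-pattern amounts to this (hypothetical sketch): every "s = s + piece" allocates a whole new buffer and copies both halves, so a loop of n concatenations copies O(n²) bytes and strands every intermediate string as garbage:

C:
#include <stdlib.h>
#include <string.h>

/* what "s = s + piece" does when strings are immutable: allocate
   a brand-new buffer, copy both halves in, abandon the old string */
static char *concat(const char *a, const char *b)
{
    size_t la = strlen(a), lb = strlen(b);
    char *out = malloc(la + lb + 1);
    if (!out) abort();
    memcpy(out, a, la);
    memcpy(out + la, b, lb + 1);  /* +1 carries the terminator over */
    return out;
}

int main(void)
{
    char *s = calloc(1, 1);  /* start with the empty string */
    for (int i = 0; i < 10000; i++) {
        char *next = concat(s, "x");
        free(s);  /* in a GC'd language nobody frees this; the dead
                     strings just pile up until the next collection */
        s = next;
    }
    free(s);
    return 0;
}

The fix is the same in every language: accumulate into a mutable buffer (StringBuilder, ''.join(), a growable array) and build the final string once.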

Memory management is one of the things I like about Rust, but it's also what makes Rust annoying to code in, because of ownership.

If I recall correctly, one of the issues with earlier Minecraft versions was literally that Notch never really thought about the GC.
 
If I recall correctly, one of the issues with earlier Minecraft versions was literally that Notch never really thought about the GC.
People loved to blame the performance on Java and GC back in the day, but as far as I know the actual issue was the renderer, which used the ancient OpenGL 1 and did ridiculous shit like issuing a draw call for every block or so. Once the overhaul of the renderer landed (around 1.8?), performance got noticeably better. I'm not sure how you'd run specifically into GC issues in Minecraft either to be honest, outside of Java pointer soup being taxing in general.
I've had to fix multiple bits of production code where colleagues have just been concatenating ever-increasing strings (in a language where strings are immutable), not realising that every concatenation allocates a whole new string, with the old one hanging around until a GC pass.
Was this Python and its awful reference-counting GC? Most GCs are very good at cleaning up lots of short-lived garbage quickly. Not that I'm defending concatenation loops, those are always wasteful.
 
The problem with it is a wider one: lots of people have learnt not to give a fuck about memory and end up writing awful code.

By far, one of the worst aspects of modern programming is the complacency/ignorance towards performance and resource usage, which will only worsen as "AI" code generation becomes more prevalent. It doesn't help that everything is perpetually in development and nothing can ever be finished (even HTML is a "living standard" now), so there's no target to polish.
 
People loved to blame the performance on Java and GC back in the day, but as far as I know the actual issue was the renderer, which used the ancient OpenGL 1 and did ridiculous shit like issuing a draw call for every block or so. Once the overhaul of the renderer landed (around 1.8?), performance got noticeably better.
True, but then Mojang started using immutable data-object classes for their basic Vector3f, BlockPos, Quaternion, etc. math structures, which were allocated en masse every single frame for every single mathematical calculation. I forget exactly which version this happened in, but I do remember it being a problem.

This is one of the places where C# really does outperform Java: while value types aren't a magical panacea, they can at least be put on the stack, or inlined inside array data, without creating new heap objects for the GC to track.

In Java, a better approach would be to minimize construction of geometry objects through widespread mutability, and perhaps object pooling as well, though the latter has become less necessary in recent years. Not as good as C#'s structs, but until Project Valhalla stops being constantly reinvented and starts being implemented and polished, it's what Java game developers need to resort to.
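For contrast, a sketch of what value semantics buy you, in C since that's the one language already shown in this thread (illustrative, not Mojang's code): the structs live inline in one contiguous allocation, so there are no per-element heap objects and no garbage at all, which is roughly what C# structs, and eventually Valhalla's value classes, give you:

C:
#include <stdio.h>
#include <stdlib.h>

/* value semantics: a million Vec3s are one contiguous allocation,
   unlike a Java Vector3f[], which is an array of pointers to a
   million separately allocated, separately collected objects */
typedef struct { float x, y, z; } Vec3;

int main(void)
{
    size_t n = 1000000;
    Vec3 *vs = malloc(n * sizeof *vs);
    if (!vs) return 1;
    for (size_t i = 0; i < n; i++)
        vs[i] = (Vec3){ (float)i, 0.0f, 0.0f };  /* copied by value, no allocation */
    double sum = 0.0;
    for (size_t i = 0; i < n; i++)
        sum += vs[i].x;  /* sequential, cache-friendly traversal */
    printf("%.0f\n", sum);
    free(vs);
    return 0;
}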
 
People loved to blame the performance on Java and GC back in the day, but as far as I know the actual issue was the renderer, which used the ancient OpenGL 1 and did ridiculous shit like issuing a draw call for every block or so. Once the overhaul of the renderer landed (around 1.8?), performance got noticeably better. I'm not sure how you'd run specifically into GC issues in Minecraft either to be honest, outside of Java pointer soup being taxing in general.

Was this Python and its awful reference counting GC? Most GCs are very good at cleaning up lots of short-lived garbage quickly. Not that I'm defending concatenate loops, those are always wasteful.
Ah, that's interesting. I thought it was all the construction and destruction of objects, but that doesn't track now that I think about it.

This was Python, yes. I thought RC wasn't that bad, but it is the most basic technique. I've been spoilt by the Java GC, so I was shocked by how shit Python's was.
Not as good as C#'s structs, but until Project Valhalla stops being constantly reinvented and starts being implemented and polished, it's what Java game developers need to resort to.
I thought Java was getting structs or some lightweight data types in a more recent version? I've given up paying attention to Valhalla.
 