Programming thread

  • 🐕 I am attempting to get the site running as fast as possible. If you are experiencing slow page load times, please report it.
I'm now sitting in a meeting, hungover, listening to my coworkers drone on about "customer experiences" and "enterprise features" and it's got me considering a career change. Maybe roofing.
Isn't today a holiday? Shouldn't you be off?
 
Jesus Christ. I fucking hate all the web frameworks that try to make everything super easy with all the hard stuff baked in. Not because I don't like easy stuff, but because most developers cannot tell the difference between "easy" and "opaque as fuck".

So at work, one of my coworkers whipped up some web frontend in Python using django or some shit.

I'm having trouble logging into our app, so I'm picking through his code trying to see if maybe it's not connecting to the right database. (This coworker also has a bad habit of doing dumb shit like baking db credentials into docker images or leaving them as literals in source code and all kinds of dumbass stuff.)

So I start picking through his code. I'm doing printf debugging because no matter what anyone says, printf debugging will get your problem found and fixed quicker than a specialized tool 90% of the time.

And because of all the "easy to use" crap, I cannot figure out what bit of code gets called when I do curl -v -X POST --data-binary 'blah blah blah' http://localhost:8000/api/login.

I have done printf debugging with this project successfully on other endpoints, but not the login one because it's hooked up to use some django auth library.

Which is fine, except I can't even find where his code does the "fetch user, verify password" logic on behalf of that django auth library. I just want to go there, do print(f"debug delet this: {... login credentials}"), so I can possibly do the DB query by hand to see where it's fucking up.

I've been fucking around with this all night and I'm getting somewhat suspicious that there is no such "fetch user, verify password" logic in his code, and django is just doing some automagical shit on its own to fetch that. Which means that there is nothing that I can poke at in his code to debug.
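For what it's worth, in stock Django there genuinely is no per-project "fetch user, verify password" code to find: django.contrib.auth.authenticate() dispatches to ModelBackend in django/contrib/auth/backends.py, which runs the auth_user query and password check for you. If you want to run the "verify password" step by hand against a raw DB row, the default hash format is reproducible in plain Python. A minimal sketch, assuming the default pbkdf2_sha256 hasher (helper names here are mine, not Django's):

```python
# Hand-rolled check of a Django-format password hash, so you can test the
# "verify password" step against a raw `SELECT password FROM auth_user ...` row
# yourself. Assumes the default pbkdf2_sha256 hasher; other algorithms (argon2,
# bcrypt) encode differently. Helper names are made up for this sketch.
import base64
import hashlib
import hmac

def make_django_hash(password: str, salt: str = "somesalt", iterations: int = 260000) -> str:
    # Format: pbkdf2_sha256$<iterations>$<salt>$<base64 digest>
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt.encode(), iterations)
    return f"pbkdf2_sha256${iterations}${salt}${base64.b64encode(dk).decode()}"

def check_django_password(password: str, encoded: str) -> bool:
    algorithm, iterations, salt, digest = encoded.split("$", 3)
    if algorithm != "pbkdf2_sha256":
        raise ValueError(f"unhandled hasher: {algorithm}")
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt.encode(), int(iterations))
    return hmac.compare_digest(base64.b64encode(dk).decode(), digest)
```

If the stored hash checks out against the submitted password here, the bug is in the routing or session layer, not the credential check.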

Anyone designing APIs: stop this nonsense. I know why people do this. They think "well, we can have it guess the typical database fields and then provide alternatives as silent optional config fields". Stop it. The linkage between my code and undebuggable, opaque fucking external libraries is not superfluous. In fact, it's absolutely essential. It's essential for a stranger to the code to be able to read the code and get up to speed on what's happening.

And I don't know, maybe a Python/django fan here will come in and correct me (and it's fine if you do) and point out where I should be looking for this.

But that doesn't really address the overall problem of poor API design where they choose to hide essential attributes of the application for convenience.

Like I'm not against sane defaults. Really, I'd say that burdening the programmer with too much complexity is also a mistake in the other direction. No one can be expected to make the technically best decision immediately upon reading the API spec, so having sane defaults is important to be productive. Later you might want to come back and fine tune the settings, but nothing would get done without sane defaults.

I guess with a lot of webshit the programmers designing these libraries just have a garbage sense of design and think "wouldn't it be neat if everything worked magically without having to write any code at all?" If you think that way, why are you even in programming to begin with?

Fuckin pajeets.
I feel that. Trying to debug AI libraries in Python back in the day meant poking through a bunch of PyTorch internals and becoming an expert in various optimizer types. Big fucking headache.

For Python, pdb is pretty decent when printf debugging fails. Just toss a breakpoint in there and step through. The ipython version of pdb (ipdb) adds some nice extra QoL features.
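As a tiny sketch of that workflow (the function and credentials here are invented): breakpoint() is built into Python 3.7+, and setting PYTHONBREAKPOINT=ipdb.set_trace routes it to ipdb without touching the code.

```python
# Toy function with a debugger hook. Uncommented, `breakpoint()` pauses right
# there in pdb (`n` to step, `p user, password` to inspect locals, `c` to
# continue). With PYTHONBREAKPOINT=ipdb.set_trace it opens ipdb instead, and
# PYTHONBREAKPOINT=0 disables it entirely. Names here are made up.
def verify_login(user: str, password: str) -> bool:
    # breakpoint()  # left commented so the function runs unattended
    return user == "admin" and password == "hunter2"
```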

I think this is about optimizing for the examples page to drive adoption. "Shitmongler.js - Simple, Elegant, Opinionated. Do routine tasks in only a single line: shitmongler.magicallyFigureOutWhatTheFuckToDo();" And God help you if what you want to do wasn't considered by the idiot who wrote the thing.
It is my experience that almost all "opinionated" projects are annoying, a configuration nightmare, and often end up causing widespread adoption of bad practices and style. Rust's faggot compiler bitching at me to use snake case and those GitHub bots that issue PRs on Python projects that use tabs instead of 4 spaces are a major part of what's wrong with modern programming and CS.
 
Rust's faggot compiler bitching at me to use snake case and those GitHub bots that issue PRs on Python projects that use tabs instead of 4 spaces are a major part of what's wrong with modern programming and CS.
Enforcing strict compilation settings and a linter came around because of one problem and it's Pajeets. Preventing inscrutable Pajeet code is a win.
 
Enforcing strict compilation settings and a linter came around because of one problem and it's Pajeets. Preventing inscrutable Pajeet code is a win.
So then make an "opinionated" linter like pylint and leave it the fuck alone in the compiler. If GCC complained about me not following the GNU C style, I would track down Stallman, pay my respects, and then promptly punch him in the throat. Rust is perfectly capable of working and compiling without doing all the bullshit formatting stuff the compiler bitches about, so it shouldn't be the compiler's job to police said formatting.
 
I hate the industry's fixation on safety. I hate the way it worms its way into every conversation thanks to people who simply cannot imagine programming in any other context besides doing it professionally and as part of a team where the general technical proficiency is not very high. Maybe it's just me, but it really feels like a cultural shift away from the individualistic, anti-authoritarian philosophy of a lot of boomer and gen-x programmers towards zoomers who were trained to run to a teacher at the first sign of a dispute. If you publicly express that you like working in C, someone will come running over the horizon to frantically warn you that the entire language has been deprecated. If you discuss C++, inevitably a group of dudes in fursuits will show up to explain to you that C++ is dead, and you need to start programming in Rust, because Rust is SAFE and Rust has a committee of benevolent shepherds who love you, and love your code, and would never let you wander astray, or use slurs in your code.

I don't know, it's one thing if someone wants to use a language that makes sure everyone on their team isn't retarded enough to cause a memory leak. On the other hand I feel like the total lack of responsibility is just cooking zoomer brains. One guy got really adamant I was wrong for liking C as a programming language because the standard library in C is not very good. I asked him "so what?" and he just demanded over and over that if a programming language doesn't have xyz features it's completely unusable. I don't know the guy personally so idk what was going through his head, but his entire argument boiled down to just repeating the same five Rust evangelist talking points over and over and then getting mad when I said I didn't care.
 
I hate the industry's fixation on safety. I hate the way it worms its way into every conversation thanks to people who simply cannot imagine programming in any other context besides doing it professionally and as part of a team where the general technical proficiency is not very high. Maybe it's just me, but it really feels like a cultural shift away from the individualistic, anti-authoritarian philosophy of a lot of boomer and gen-x programmers towards zoomers who were trained to run to a teacher at the first sign of a dispute. If you publicly express that you like working in C, someone will come running over the horizon to frantically warn you that entire language has been deprecated. If you discuss C++, inevitably a group of dudes in fursuits will show up to explain to you that C++ is dead, and you need to start programming in Rust, because Rust is SAFE and Rust has a committee of benevolent shepherds who love you, and love your code, and would never let you wander astray, or use slurs in your code.

I don't know, it's one thing if someone wants to use a language that makes sure everyone on their team isn't retarded enough to cause a memory leak. On the other hand I feel like the total lack of responsibility is just cooking zoomer brains. One guy got really adamant I was wrong for liking C as a programming language because the standard library in C is not very good. I asked him "so what?" and he just demanded over and over that if a programming language doesn't have xyz features it's completely unusable. I don't know the guy personally so idk what was going through his head, but his entire argument boiled down to just repeating the same five Rust evangelist talking points over and over and then getting mad when I said I didn't care.
All this language crap is because programmers aren't very good.
You can do things totally safely in C, or Assembler, or any other language. But it's haaaarrrddddd. Far easier to not think about all those things and make the language do it. Then your app is 3x slower so your employer just pays 3x the AWS bill or you tell customers that your game only runs on systems with at least a 12GHz CPU and 1TB RAM.

I feel similarly about code that uses exception handling as control flow. If there's a 10% chance a field might be null, then it's not an exception, it's normal code and you shouldn't be using exception handling to deal with it.

My lawn, onion on my belt, kids these days, etc.
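A sketch of the exception-as-control-flow point (the order/address field is invented for illustration):

```python
# Two ways to handle a field that is None 10% of the time. The try/except
# version treats an ordinary, expected case as an exception; the explicit
# check treats it as what it is: normal control flow.
def label_via_exception(order: dict) -> str:
    try:
        return order["address"].upper()
    except AttributeError:  # fires every time address is None: flow-by-exception
        return "NO ADDRESS ON FILE"

def label_via_check(order: dict) -> str:
    address = order.get("address")
    if address is None:  # the expected case, handled as ordinary logic
        return "NO ADDRESS ON FILE"
    return address.upper()
```

Both return the same answers; the second just says what it means and keeps exceptions for actual surprises.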
 
I hate the industry's fixation on safety. I hate the way it worms its way into every conversation thanks to people who simply cannot imagine programming in any other context besides doing it professionally and as part of a team where the general technical proficiency is not very high. Maybe it's just me, but it really feels like a cultural shift away from the individualistic, anti-authoritarian philosophy of a lot of boomer and gen-x programmers towards zoomers who were trained to run to a teacher at the first sign of a dispute. If you publicly express that you like working in C, someone will come running over the horizon to frantically warn you that entire language has been deprecated. If you discuss C++, inevitably a group of dudes in fursuits will show up to explain to you that C++ is dead, and you need to start programming in Rust, because Rust is SAFE and Rust has a committee of benevolent shepherds who love you, and love your code, and would never let you wander astray, or use slurs in your code.

I don't know, it's one thing if someone wants to use a language that makes sure everyone on their team isn't retarded enough to cause a memory leak. On the other hand I feel like the total lack of responsibility is just cooking zoomer brains. One guy got really adamant I was wrong for liking C as a programming language because the standard library in C is not very good. I asked him "so what?" and he just demanded over and over that if a programming language doesn't have xyz features it's completely unusable. I don't know the guy personally so idk what was going through his head, but his entire argument boiled down to just repeating the same five Rust evangelist talking points over and over and then getting mad when I said I didn't care.
I completely agree. C is flexible and powerful, and I think a lot of zoomers find it intimidating for that reason. Modern languages are adopting a "Think like we do. Why would you even want to do it your way?" philosophy to their design, which I think lends itself well to modern styles of thinking. No one knows how to do anything anymore, so they expect there will be someone else out there who has already figured it out for them. You're right about the general lack of responsibility and independent thinking being encouraged in the newer generations.
People who think C's std library is bad are just butthurt they have to manipulate strings manually. For C, strtok in string.h feels like hax.

To go a bit tinfoil hat for a moment: I think the ubiquity of Apple devices, which are famous for that "Do it our way. We know what you want." mindset, has helped cement this line of thinking in those who have used these devices for a majority of their lives and aren't used to anything else. "Android? That's for poor people."

It's good to let others implement certain things like crypto libs, and I begrudgingly accept that Rust's memory safety is very useful for web services. Even Apple's XNU kernel devs frequently leave pointers dangling. It's unfortunate, but expecting people to do things properly isn't realistic, especially these days. Otherwise, I like doing things the hard way.
 
I hate the industry's fixation on safety. I hate the way it worms its way into every conversation thanks to people who simply cannot imagine programming in any other context besides doing it professionally and as part of a team where the general technical proficiency is not very high. Maybe it's just me, but it really feels like a cultural shift away from the individualistic, anti-authoritarian philosophy of a lot of boomer and gen-x programmers towards zoomers who were trained to run to a teacher at the first sign of a dispute. If you publicly express that you like working in C, someone will come running over the horizon to frantically warn you that entire language has been deprecated. If you discuss C++, inevitably a group of dudes in fursuits will show up to explain to you that C++ is dead, and you need to start programming in Rust, because Rust is SAFE and Rust has a committee of benevolent shepherds who love you, and love your code, and would never let you wander astray, or use slurs in your code.

I don't know, it's one thing if someone wants to use a language that makes sure everyone on their team isn't retarded enough to cause a memory leak. On the other hand I feel like the total lack of responsibility is just cooking zoomer brains. One guy got really adamant I was wrong for liking C as a programming language because the standard library in C is not very good. I asked him "so what?" and he just demanded over and over that if a programming language doesn't have xyz features it's completely unusable. I don't know the guy personally so idk what was going through his head, but his entire argument boiled down to just repeating the same five Rust evangelist talking points over and over and then getting mad when I said I didn't care.
You don't need to be retarded to cause a memory leak.

Normal, intelligent, talented programmers regularly cause memory leaks.

And the guys in the trenches working with the code rarely happen to be the same guys handling hiring. You don't even have any input on hiring even normal programmers who'll still cause memory leaks. You'll have to work with slick, well spoken morons who cause statistically more memory leaks that you'll have to clean up. Your company will probably lose contracts as a result of those memory leaks.

No one working professionally in the field actually criticizes memory safety. Mmmmaybe you get a few who've been promoted to a managerial role but no longer actually write code anymore.

Btw don't let yourself get promoted out of writing code on a daily basis. Sure, accept the higher paycheck. Maybe do managerial stuff most of the time. But make sure you still write code at least some of the time. It keeps your brain from going soft.
It's good to let others implement certain things like crypto libs, and I begrudgingly accept that Rust's memory safety is very useful for web services. Even Apple's XNU kernel devs frequently leave pointers dangling. It's unfortunate, but expecting people to do things properly isn't realistic, especially these days. Otherwise, I like doing things the hard way.
You get so close to the proper answer but veer off at the last moment.

Rolling your own crypto is exactly the kind of hubris that sinks businesses. But if your work is at all essential to the business, then everything is crypto for some reason or another.
 
Rolling your own crypto is exactly the kind of hubris that sinks businesses. But if your work is at all essential to the business, then everything is crypto for some reason or another.
The word "otherwise" in my last sentence is the key here. Even though I once wrote papers about cryptography, I wouldn't be retarded enough to use anything but OpenSSL or something of similar repute. There are certainly some things that people shouldn't do themselves, and crypto is definitely one of them.
 
The word "otherwise" in my last sentence is the key here. Even though I once wrote papers about cryptography, I wouldn't be retarded enough to use anything but OpenSSL or something of similar repute. There are certainly some things that people shouldn't do themselves, and crypto is definitely one of them.
Fair enough.

I've read a lot about crypto, among other things. I really like Matt Might's blog for these kinds of things.

Still, in my day-to-day work, I'm less bothered by programmers who aren't macho enough, and much, much more bothered by programmers who aren't scared by their own potential fallibility enough.

Overconfidence in programmers makes my job much harder than underconfidence.

You should be scared of writing production code in C. Not because you're not a good programmer, but because you know your own limitations. You'll probably pull it off. Good for you. But memory safety isn't there to mock the size of your testicles, it's there in case you're sleepy or hungover or just desperate to leave early on a Friday afternoon, like all human beings.
 
I've read a lot about crypto, among other things. I really like Matt Might's blog for these kinds of things.
Ooh, Scheme; you have my attention. Thanks for the link :)

Still, in my day-to-day work, I'm less bothered by programmers who aren't macho enough, and much, much more bothered by programmers who aren't scared by their own potential fallibility enough.

Overconfidence in programmers makes my job much harder than underconfidence.

You should be scared of writing production code in C. Not because you're not a good programmer, but because you know your own limitations. You'll probably pull it off. Good for you. But memory safety isn't there to mock the size of your testicles, it's there in case you're sleepy or hungover or just desperate to leave early on a Friday afternoon, like all human beings.
Yeah, anything with stuff like text manipulation is better handled by a higher level language. I would sooner shit in my own hand and clap than write something like my chat client in C, trying to finagle a bunch of file descriptors. A majority of the console hacking stuff I occasionally work on involves finding unbounded writes to buffers or controllable size parameters, which Nintendo have historically been fond of using in their OSes.

Super big data in C has performance benefits, but you have to be vigilant. I think we will see non-pickle tensors that exploit bugs in the C/C++ side of PyTorch pretty soon.

The arrogant variant of overconfidence is the worst. It has made for some of the most painful experiences in my entire life back when I was working on a team with a person like that. They're the kind of people you really have to fight the urge to call a retard if you want to keep your job.

Edit: I'm remembering that an instrumental flaw used in PS3 hacking was a flaw in their ECDSA implementation where they used a static param that should have been randomized when generating a new key. That makes obtaining the keys a matter of using elementary algebra.
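That break really is elementary algebra. In ECDSA, each signature is s = k⁻¹(z + r·d) mod n, where k is supposed to be a fresh random nonce per signature. Reuse k and two signatures give two equations in two unknowns (k and the private key d). The curve point arithmetic is irrelevant to the break, so this sketch runs the same equations in plain modular arithmetic; the key, nonce, and messages are toy values.

```python
# Demo of the static-nonce ECDSA failure: two signatures sharing a nonce k
# leak k, and then the private key d, via elementary algebra.
n = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632551  # P-256 group order (prime)
d = 0xC0FFEE  # toy "private key"
k = 0x1337    # the per-signature nonce that should be random, kept static here
r = pow(7, k, n)  # stand-in for (kG).x; any nonzero value works for the algebra

def sign(z: int) -> tuple[int, int]:
    # ECDSA signing equation: s = k^-1 * (z + r*d) mod n
    s = (pow(k, -1, n) * (z + r * d)) % n
    return r, s

z1, z2 = 0xAAAA, 0xBBBB          # two different message hashes
(r1, s1), (r2, s2) = sign(z1), sign(z2)
assert r1 == r2                   # identical r values betray the reused nonce

# s1 - s2 = k^-1 * (z1 - z2)  =>  k = (z1 - z2) / (s1 - s2) mod n
k_rec = ((z1 - z2) * pow(s1 - s2, -1, n)) % n
# s1 * k = z1 + r*d  =>  d = (s1*k - z1) / r mod n
d_rec = ((s1 * k_rec - z1) * pow(r1, -1, n)) % n
assert (k_rec, d_rec) == (k, d)   # nonce and private key fully recovered
```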
 
Is it possible to own a website anonymously in this day and age? Basically, can you register a domain, get webhosting, and get ddos protection without having to give personal information and not having to worried about companies dropping support for your website?
The short answer that you got was "no", but I want to interject a "maybe". The folks saying "no" say this because to the extent that these things exist online, they are vulnerable. However, for all intents and purposes, I run a few websites that leave no trace of my identity. Domain registration and hosting were both done with my real name. DDOS protection is a non-issue unless you're doing something viral or contentious. But neither my hosts nor my registrar leave any trace as to who I am.

If there's no motivation to breach my anonymity, then no one is going to bother, because it would involve tortious interference with my contractual agreements with my host and registrar regarding privacy. But if you're problematic enough, there's not enough law in the world to keep you safe from ne'er-do-wells.

tl;dr: You can be functionally and legally anonymous so long as you don't piss in the wrong person's cornflakes.
 
Ok, speaking of overconfidence, and I know this doesn't have anything to do with memory safety, but still, my coworker pushed out a docker image with his new changes. His code uses some sort of python library that uses libmagic.

His docker image did the proper pip install whatever, but I guess there are no checks for the native library being installed. So when I killed our old image and started up our new one, it immediately crashed because it's missing libmagic. So I'm sitting around waiting for him to build a new image and push it out.

If our government contract hinged on this, I'd make him give the congressman a handy. He's on handjob duty.

Like properly speaking, we should have some kind of jenkins job that would autobuild the new image and check the health endpoint to make sure it's running and blah blah blah. I've had that kind of setup at previous jobs. It was nice. But sometimes you need to cut the non-essentials and fancy build servers and tests often are the first things to get cut.
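A cheap version of that health check can live at process start instead of in CI: probe for the required native libraries before serving anything, so the container dies with one readable log line instead of a mid-request ImportError. A sketch (function names are mine):

```python
# Fail-fast probe for native dependencies like libmagic. ctypes.util.find_library
# asks the system loader whether the shared object exists at all, without
# needing the Python wrapper (python-magic) to be importable. Helper names
# are made up for this sketch.
import ctypes.util
import sys

def missing_native_libs(libs: list[str]) -> list[str]:
    """Return the subset of native libraries the loader cannot locate."""
    return [lib for lib in libs if ctypes.util.find_library(lib) is None]

def startup_check(libs: list[str]) -> None:
    missing = missing_native_libs(libs)
    if missing:
        sys.exit(f"missing native libraries: {missing} (forgot the apt-get install?)")
```

Calling startup_check(["magic"]) first thing in the entrypoint would have turned the crash into an obvious error message.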
Yeah, anything with stuff like text manipulation is better handled by a higher level language. I would sooner shit in my own hand and clap than write something like my chat client in C, trying to finagle a bunch of file descriptors. A majority of the console hacking stuff I occasionally work on involves finding unbounded writes to buffers or controllable size parameters, which Nintendo have historically been fond of using in their OSes.

Super big data in C has performance benefits, but you have to be vigilant. I think we will see non-pickle tensors that exploit bugs in the C/C++ side of PyTorch pretty soon.
Yeah, in the past I've written high intensity code in C and wrapped it and loaded it into scripting languages for the higher level logic. There's definitely a time and a place for C.
The arrogant variant of overconfidence is the worst. It has made for some of the most painful experiences in my entire life back when I was working on a team with a person like that. They're the kind of people you really have to fight the urge to call a retard if you want to keep your job.
I've worked with some pretty terrible pajeets in the past. It can get bad.
 
There's definitely a time and a place for C.
My hot take is that the more you grasp the actual mechanics of what's going on with your code, the more appealing C looks. I experienced this with the oft-bemoaned C string handling in a recent project of mine. Yes, doing things like concat, search, etc. take more effort, but they get ugly in other languages too.

In my case, the program was designed to take a list of files in argv and work with them. But because of how C works, I can just leave all that data in the original argv array. Need string lengths? Iterate across argv subtracting the two pointers. This will break should argv be non-contiguous. This is not guaranteed by spec, but I haven't found a case where it breaks. And this is what it feels like to code in C: you come face to face with the difference between spec and reality. Because the project is hardly mission critical, ship it. When you can test and verify absolutely that this does not break on all the cases that you're responsible for, and you've documented the potential for breakage, and heavens, even added checking code to validate your assumption, is a different solution called for? Of course not. Is it "bad code"? Well, bad for what? Bad for other coders, sure. But it works, it's tight, and it wasn't hard to debug or reason about.

To save that list, memcpy the block to a memory-mapped file and close it. Iterate across the chars of the input to "remove" a common prefix by storing the number of chars in common. I imagine a hand-rolled asm loop would be faster. None of this is complicated. The issue comes when people overcomplicate string handling. The wonderful thing about C is that if you need additional functionality, the libraries could not be better supported.

But my goodness. My final corporate experience before self-employment was working with SharePoint. C#. ASPX. T-SQL. Etc. I handled both front-end and back-end. Working with SharePoint felt like this: I wrote an extension class (forget precise terminology here) to Microsoft's XML handler to get it to produce CAML XML that SharePoint would accept. This is two products from one vendor, albeit the VS team and the Office team look at things very differently. None of these incompatibilities were documented, but nothing worked, and as my job was to figure out why, well. Compared to the documented weirdness where non-contiguous argv might happen? Which would I prefer? I think the framing makes the answer to that obvious.

An earlier C project of mine also has a circumstance where I call COM interfaces. C++ is supposed to do this better, but C makes the underlying data structures clearer, and I get that I'm a weirdo for it, but I do prefer working with COM in C now. There's just less ambiguity.
 
My hot take is that the more you grasp the actual mechanics of what's going on with your code, the more appealing C looks. I experienced this with the oft-bemoaned C string handling in a recent project of mine. Yes, doing things like concat, search, etc. take more effort, but they get ugly in other languages too.

In my case, the program was designed to take a list of files in argv and work with them. But because of how C works, I can just leave all that data in the original argv array. Need string lengths? Iterate across argv subtracting the two pointers. This will break should argv be non-contiguous. This is not guaranteed by spec, but I haven't found a case where it breaks. And this is what it feels like to code in C: you come face to face with the difference between spec and reality. Because the project is hardly mission critical, ship it. When you can test and verify absolutely that this does not break on all the cases that you're responsible for, and you've documented the potential for breakage, and heavens, even added checking code to validate your assumption, is a different solution called for? Of course not. Is it "bad code"? Well, bad for what? Bad for other coders, sure. But it works, it's tight, and it wasn't hard to debug or reason about.

To save that list, memcpy the block to a memory-mapped file and close it. Iterate across the chars of the input to "remove" a common prefix by storing the number of chars in common. I imagine a hand-rolled asm loop would be faster. None of this is complicated. The issue comes when people overcomplicate string handling. The wonderful thing about C is that if you need additional functionality, the libraries could not be better supported.
Yeah argv and file descriptors are a C programmer's best friend (after pointers). I remember back in uni having to write a Scheme compiler in C and making heavy use of strtok_r. Once the AST is made, you're flying with the help of some trampolines.
 
Here we see a sufferer of Microsoft Stockholm Syndrome. Prolonged exposure to Microsoft's broken garbage APIs has warped this man's sense of taste to the point where he sees "calculate C string lengths" and reaches for a hack that only works through a flaky undocumented implementation detail despite the fact that lens[i] = strlen(argv[i]); is easier to read, easier to write, idiomatic, and guaranteed to be correct. And he likes it.

Microsoft: Not Even Once.
 
You don't need to be retarded to cause a memory leak.
Never said you did
Normal, intelligent, talented programmers regularly cause memory leaks.
Of course
And the guys in the trenches working with the code rarely happen to be the same guys handling hiring. You don't even have any input on hiring even normal programmers who'll still cause memory leaks. You'll have to work with slick, well spoken morons who cause statistically more memory leaks that you'll have to clean up. Your company will probably lose contracts as a result of those memory leaks.
Like I said, if it fits your specific project, great. I certainly wouldn't propose using a specific programming language because it brings me joy, I think it's important to use the right tool for the job. That said, I think we're talking about two different things here.
To go a bit tinfoil hat for a moment: I think the ubiquity of Apple devices, which are famous for that "Do it our way. We know what you want." mindset, has helped cement this line of thinking in those who have used these devices for a majority of their lives and aren't used to anything else. "Android? That's for poor people."
Shit I'll go a step further, I think we oversold the idea of academia as this brilliant ecosystem for the intellectual elite to flourish and with it this idea of teachers and researchers being these larger than life fountains of wisdom and expertise. I think there's two important factors, one being that liberals have been pushed in the direction of obedient trust in authority (like with the covid vaccines), the other being the fact that it's socially acceptable for teachers to try to be best friends with their students.

The result is a lot of young people in STEM who have never really been "kicked out of the nest", so to speak. The school provides them with the sense of belonging and socialization they need, and they carry that into adulthood. Then you end up with a bunch of people in STEM who expect their bosses to be involved to the same degree their teachers were, which is fine for children but flat-out patronizing for an adult.

That's why I feel younger people have largely embraced this idea that most STEM subjects are too difficult for anyone to understand without validation from an academic institution. I once had a young person fresh out of college (about four years ago) argue at me with complete confidence that there was no way someone could use vim and GCC in a professional setting, that it was simply impossible to code well without intellisense. I don't think they were parroting anyone else's ideas, I think they were just that confident that if the people they consider authorities don't do it, then it simply never happens.

I think this is why "safe" languages appeal to these people so much. There's a list of things you're "supposed" to do, you memorize it, you implement it, you get into arguments with people who don't follow it, and you have the authority of the gang of ex-Google employees who invented it on your side, like so many other "current thing" opinions. When I was learning programming, you were expected to learn to implement pointers, even if you were never going to use them in your own code. You would get humbled, struggle to make your program work, and by that process develop a respect for the attention to detail it takes to make the tools you use. That attitude seems to be diminishing, with coding bootcamps and programs meant to teach the basics in order to shit out some student who knows how to paint by numbers.

Anyway, it's just a minor gripe. Mostly I just hate having a decent discussion only for some twentysomething to show up and start blathering about how C++ is going to kill my dog.
 