Programming thread

Programming isn't engineering; programming is mathematics, and perfection is expected with mathematics.
I remember saying stuff like this too before I started working on device drivers. An imperfect system that solves the problem well enough is still worthy of respect. Furthermore, the only reason that your perfect programming language can exist is because someone has already abstracted away the sheer fucking pain of working with computers.

If you don't know it, read The Night Watch to learn more about what life is like. I quote:

One time I tried to create a list<map<int>>, and my syntax errors caused the dead to walk among the living. Such things are clearly unfortunate. Thus, I fully support high-level languages in which pointers are hidden and types are strong and the declaration of data structures does not require you to solve a syntactical puzzle generated by a malevolent extraterrestrial species. That being said, if you find yourself drinking a martini and writing programs in garbage-collected, object-oriented Esperanto, be aware that the only reason that the Esperanto runtime works is because there are systems people who have exchanged any hope of losing their virginity for the exciting opportunity to think about hex numbers and their relationships with the operating system, the hardware, and ancient blood rituals that Bjarne Stroustrup performed at Stonehenge.

A systems programmer will know what to do when society breaks down, because the systems programmer already lives in a world without law.
 
I cannot fathom making an account here to post in this thread of all threads. That said, the Farms tech corner is a gem.
I think it's because almost everyone here either has a job, has had a job, or is trying to get a job. There's not a ton of weird NEET manchild shit happening so things stay pretty grounded.
 
I remember saying stuff like this too before I started working on device drivers. An imperfect system that solves the problem well enough is still worthy of respect. Furthermore, the only reason that your perfect programming language can exist is because someone has already abstracted away the sheer fucking pain of working with computers.

If you don't know it, read The Night Watch to learn more about what life is like. I quote:



A systems programmer will know what to do when society breaks down, because the systems programmer already lives in a world without law.
That whole article is gold.
 
Null and its consequences have been a disaster for programming.
NullReferenceException is the bane of my existence. I really like C# in spite of all the troons that have been orbiting it lately, but anytime a null gets involved, you can guarantee it'll lead to some retarded crash that's difficult to troubleshoot because the real error gets buried.
 
Well, writing to Josh does lead to undefined behaviour, so I can see where he gets his name from.
NullReferenceException is the bane of my existence. I really like C# in spite of all the troons that have been orbiting it lately, but anytime a null gets involved, you can guarantee it'll lead to some retarded crash that's difficult to troubleshoot because the real error gets buried.
Ironically, the whole reason null exists at the OS level in the first place is that the first 4096 bytes of the address space are left unmapped, so any access there triggers a page fault and crashes the program. The logic being: if you set an uninitialized pointer/object to 0 (i.e. NULL) and you try to read/write it, it must be because something's not initialized, so the program should die rather than dereference something unexpected. This also means initialization crashes should have their crash address within that 0-4095 range and thus be theoretically easy to debug. Although once there are hundreds of objects to deal with in a language and/or when null becomes its own thing in order to raise exceptions, understandably it's no longer that simple lol.
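Minimal sketch of what that looks like in practice (C++, and technically undefined behaviour as far as the standard cares, but on a typical MMU system you get exactly the behaviour described above):

#include <cstdio>

int main() {
    int* p = nullptr;         // the "set it to 0" convention from above
    std::printf("%d\n", *p);  // page 0 isn't mapped, so this faults immediately
                              // and the reported crash address is 0x0
    return 0;
}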
 
Ironically, the whole reason null exists at the OS level in the first place is that the first 4096 bytes of the address space are left unmapped, so any access there triggers a page fault and crashes the program. The logic being: if you set an uninitialized pointer/object to 0 (i.e. NULL) and you try to read/write it, it must be because something's not initialized, so the program should die rather than dereference something unexpected. This also means initialization crashes should have their crash address within that 0-4095 range and thus be theoretically easy to debug. Although once there are hundreds of objects to deal with in a language and/or when null becomes its own thing in order to raise exceptions, understandably it's no longer that simple lol.

C, and thus NULL, was invented before MMUs were widespread, so while that's a nice side effect, it's not why address 0 was chosen in the first place. If you try to read from NULL on an 8086, for instance, you're just going to read the interrupt table.
 
Although once there's hundreds of objects to deal with in a language and/or when null becomes its own thing in order to raise exceptions understandably it's no longer that simple lol.
IMO exceptions are pretty annoying to use for error handling, as you can't see from just the function definition whether it throws or not. I much prefer getting an Option<T> or Result<T,U> back.
 
C, and thus NULL, was invented before MMUs were widespread, so while that's a nice side effect, it's not why address 0 was chosen in the first place. If you try to read from NULL on an 8086, for instance, you're just going to read the interrupt table.
I forgot about that with older CPUs! Considering 0 means nothing to us humans, yeah, it definitely would have been around back then without MMUs. I did a quick search on the 8086 and correct me if I'm wrong, but addresses 0x00000-0x9FFFF are RAM and therefore writable? Writing to NULL would then overwrite the divide-by-zero vector? Or worse, other vectors if you're writing a stream of bytes to NULL; the bugs older developers dealt with :story:.

Another interesting behavior of "NULL" (or the lack thereof) on an older CPU: I've worked with the NES's 6502 assembly/CPU before, and similar to CPUs with MMUs, it has a "zero page" (bytes 0x00-0xFF). Except this page is always filled with useful variables and function arguments, with bytes 0x00-0x0F being for your function args and the rest for local and global variables, meaning there is no NULL at all (I think). It's been a while since I looked at this, but the reasoning is probably that you can save space and CPU time by fetching an address from an 8-bit immediate instead of a 16-bit one (although the 16-bit method is also available). They also probably do this because the 6502 was a cheap chip with only 3 working 8-bit registers: two index registers (X, Y) and an accumulator (A).

IMO exceptions are pretty annoying to use for error handling, as you can't see from just the function definition whether it throws or not. I much prefer getting an Option<T> or Result<T,U> back.
Unfortunately (or fortunately, depending on your perspective) I haven't had to deal with C#, but I have with Java. Not knowing what exception is thrown and where sounds pretty bad, and catching each one sounds like a nightmare. In my opinion exceptions are fundamentally nice because throwing an error is much more verbose than C's fuck-you message (Segmentation fault lol), but they should be reserved for when you can't recover; otherwise it becomes a trial and error of catching everything. As you said, it's so much nicer just to receive an error code or something, so you don't have to worry about some random function in some library shitting the bed for the rest of the code, especially if it can be safely ignored.
 
Unfortunately (or fortunately, depending on your perspective) I haven't had to deal with C#, but I have with Java. Not knowing what exception is thrown and where sounds pretty bad, and catching each one sounds like a nightmare. In my opinion exceptions are fundamentally nice because throwing an error is much more verbose than C's fuck-you message (Segmentation fault lol), but they should be reserved for when you can't recover; otherwise it becomes a trial and error of catching everything. As you said, it's so much nicer just to receive an error code or something, so you don't have to worry about some random function in some library shitting the bed for the rest of the code, especially if it can be safely ignored.
Yeah, for things that REALLY can't be recovered, exceptions make sense. But a LOT of programmers rely on exceptions as a form of control flow (looking at you, Python). With discriminated/tagged unions ( https://en.wikipedia.org/wiki/Tagged_union https://learn.microsoft.com/en-us/dotnet/fsharp/language-reference/discriminated-unions ) you get much nicer control flow that you can see from the outside. Think of an int.TryParse returning an Option<int> instead of a tuple<bool,int>, or a bool as the return value and the int as an out value, to indicate success/failure. (In the Option case, you literally wouldn't be able to access any int if there isn't one.)
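Something like this, to sketch the tagged-union idea in C++ terms (std::variant standing in for a proper discriminated union; the names are just made up for illustration):

#include <iostream>
#include <string>
#include <variant>

struct DivByZero { std::string what; };          // the "error" arm of the union
using DivResult = std::variant<int, DivByZero>;  // poor man's Result<int, Error>

DivResult checked_div(int a, int b) {
    if (b == 0)
        return DivByZero{"division by zero"};    // failure is just another value
    return a / b;
}

int main() {
    auto r = checked_div(10, 0);
    if (auto* v = std::get_if<int>(&r))          // you only get at the int via the tag
        std::cout << *v << '\n';
    else
        std::cout << "error: " << std::get<DivByZero>(r).what << '\n';
}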

I hope my message is somewhat readable.
 
Speaking of null

Null and its consequences have been a disaster for programming.
It's annoying, but I can't see an alternative for knowing whether an object is defined or not. I also deliberately write to NULL to trigger a segfault in CUDA so I can tell which function did something bad.
Yeah, for things that REALLY can't be recovered, exceptions make sense. But a LOT of programmers rely on exceptions as a form of control flow (looking at you, Python). With discriminated/tagged unions ( https://en.wikipedia.org/wiki/Tagged_union https://learn.microsoft.com/en-us/dotnet/fsharp/language-reference/discriminated-unions ) you get much nicer control flow that you can see from the outside. Think of an int.TryParse returning an Option<int> instead of a tuple<bool,int>, or a bool as the return value and the int as an out value, to indicate success/failure. (In the Option case, you literally wouldn't be able to access any int if there isn't one.)

I hope my message is somewhat readable.
I could justify using exceptions for cases where asking permission is retardedly complicated and just seeing if you manage to do it is far more intuitive (as long as it's not a thing you do constantly).
 
IMO exceptions are pretty annoying to use for error handling, as you can't see from just the function definition whether it throws or not. I much prefer getting an Option<T> or Result<T,U> back.
I think you've missed the point of exceptions slightly: you shouldn't need to know whether a function throws or not. Rust shifts the burden of unwrapping etc. from the compiler onto the programmer, which hardly seems like an improvement. Plus Rust's way of handling genuinely exceptional events seems to be that any function can panic at any time, which always makes me very nervous.

Yeah, for things that REALLY can't be recovered, exceptions make sense. But a LOT of programmers rely on exceptions as a form of control flow (looking at you, Python). With discriminated/tagged unions ( https://en.wikipedia.org/wiki/Tagged_union https://learn.microsoft.com/en-us/dotnet/fsharp/language-reference/discriminated-unions ) you get much nicer control flow that you can see from the outside. Think of an int.TryParse returning an Option<int> instead of a tuple<bool,int>, or a bool as the return value and the int as an out value, to indicate success/failure. (In the Option case, you literally wouldn't be able to access any int if there isn't one.)

I hope my message is somewhat readable.
The main problem with exceptions is that they are slow, due to the retarded way in which they were implemented. I don't think there'd be any problem with using them for control flow if it weren't for that. I agree that parsing numbers shouldn't throw an exception: C++'s stoi is particularly stupid for that, and from_chars is only slightly better. There should be something which takes a std::string_view and returns std::optional<int>, but there isn't.
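It's only a few lines to build the missing piece yourself on top of from_chars, something like this (to_int is just a name I made up, not a standard facility):

#include <charconv>
#include <optional>
#include <string_view>

std::optional<int> to_int(std::string_view s) {
    int value = 0;
    auto [ptr, ec] = std::from_chars(s.data(), s.data() + s.size(), value);
    if (ec != std::errc{} || ptr != s.data() + s.size())
        return std::nullopt;   // also rejects partial parses like "12abc"
    return value;
}

Nothing throws, and the caller has to check the optional before it can touch the value.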
 
I think you've missed the point of exceptions slightly
Naw, I just suck at putting my thoughts into words. Stuff that is expected to fail in a recoverable way shouldn't use exceptions. Like losing a network connection. Or trying to read from an empty stream. Or trying to write to a file which doesn't exist.

While for something like an OOM error, it makes sense to throw an exception.
 
The main problem with exceptions is that they are slow, due to the retarded way in which they were implemented. I don't think there'd be any problem with using them for control flow if it weren't for that. I agree that parsing numbers shouldn't throw an exception: C++'s stoi is particularly stupid for that, and from_chars is only slightly better. There should be something which takes a std::string_view and returns std::optional<int>, but there isn't.
They're "fast" in Python which is really the only language where you should be using exceptions as flow control.

In Java you get a stack trace every time you throw. This is... not fast.
 
I hate when the priority level is inverted relative to the value and the documentation isn't clear on whether it returns the lowest value or the value corresponding to the lowest priority level.
 
It's over 30 years old, and Carmack has been up front that he didn't come up with it. If you trace it back to its roots, it's so great because the guy who came up with it is the main guy behind the IEEE 754 floating point standard, and it takes advantage of how IEEE 754 floats work.

Yes, the technique was discovered in 1986.

But Carmack's programmers properly documented the technique when they added the comment: "what the fuck"
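For anyone who hasn't seen it, the thing being discussed looks roughly like this (rewritten with std::bit_cast, since the original's pointer cast is UB in modern C++; the magic constant is the famous 0x5f3759df):

#include <bit>
#include <cstdint>

float q_rsqrt(float number) {
    float x2 = number * 0.5f;
    std::uint32_t i = std::bit_cast<std::uint32_t>(number);  // evil floating point bit level hacking
    i = 0x5f3759df - (i >> 1);                               // what the fuck?
    float y = std::bit_cast<float>(i);
    y = y * (1.5f - (x2 * y * y));                           // one Newton-Raphson iteration
    return y;
}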
 
IMO exceptions are pretty annoying to use for error handling, as you can't see from just the function definition whether it throws or not. I much prefer getting an Option<T> or Result<T,U> back.
Results are only a clear winner over exceptions if all of your errors can be handled where the function is being called. If you ever find an error being passed back up a call chain or rewrapped across a module boundary, you've found a place where (at least in terms of semantics) try/catch would fit the problem better.
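To sketch what that looks like (C++ with std::optional; the function names are invented for the example): every intermediate layer has to notice the failure and forward it by hand, which is exactly where an exception that unwinds past the layers with nothing useful to say would be less noisy.

#include <map>
#include <optional>
#include <string>

// lowest layer: a lookup that can legitimately fail
std::optional<std::string> get_setting(const std::map<std::string, std::string>& cfg,
                                       const std::string& key) {
    auto it = cfg.find(key);
    if (it == cfg.end()) return std::nullopt;
    return it->second;
}

// middle layer: can't handle the failure, only re-propagate it upwards
std::optional<std::string> get_listen_address(const std::map<std::string, std::string>& cfg) {
    auto host = get_setting(cfg, "host");
    if (!host) return std::nullopt;   // boilerplate forwarding at every level
    auto port = get_setting(cfg, "port");
    if (!port) return std::nullopt;
    return *host + ":" + *port;
}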
 