@UERISIMILITUDO should be proud. Here less than 3 days and he got Null himself to respond to one of his posts. It's almost impressive.
> Programming isn't engineering; programming is mathematics, and perfection is expected with mathematics.
I remember saying stuff like this too before I started working on device drivers. An imperfect system that solves the problem well enough is still worthy of respect. Furthermore, the only reason that your perfect programming language can exist is because someone has already abstracted away the sheer fucking pain of working with computers.
One time I tried to create a list<map<int>>, and my syntax errors caused the dead to walk among the living. Such things are clearly unfortunate. Thus, I fully support high-level languages in which pointers are hidden and types are strong and the declaration of data structures does not require you to solve a syntactical puzzle generated by a malevolent extraterrestrial species. That being said, if you find yourself drinking a martini and writing programs in garbage-collected, object-oriented Esperanto, be aware that the only reason that the Esperanto runtime works is because there are systems people who have exchanged any hope of losing their virginity for the exciting opportunity to think about hex numbers and their relationships with the operating system, the hardware, and ancient blood rituals that Bjarne Stroustrup performed at Stonehenge.
I'm equally impressed that the programming thread turned into the thunderdome for a couple of pages out of nowhere.
> I cannot fathom making an account here to post in this thread of all threads. That said, the Farms tech corner is a gem.
I think it's because almost everyone here either has a job, has had a job, or is trying to get a job. There's not a ton of weird NEET manchild shit happening so things stay pretty grounded.
That whole article is gold.
If you don't know it, read The Night Watch to learn more about what life is like. I quote:
A systems programmer will know what to do when society breaks down, because the systems programmer already lives in a world without law.
> I'm assuming that guy was a returning banned lolcow or something, but I prefer to believe Null just has a seething, raging hatred for APL.
Speaking of null
NullReferenceException is the bane of my existence. I really like C# in spite of all the troons that have been orbiting it lately, but anytime a null gets involved, you can guarantee it'll lead to some retarded crash that's difficult to troubleshoot as the real error was buried.
Well, writing to Josh does lead to undefined behaviour, so I can see where he gets his name from.
Ironically, the whole reason null exists on an OS level in the first place is because the first 4096 bytes of memory space are mapped to always trigger a page fault and crash the program. The logic being: if you set an uninitialized pointer/object to 0 (i.e. NULL) and you try to read/write it, it must be because something's not initialized, so the program should die rather than dereference something unexpected. This also means initialization crashes should have their crash address within that 0-4095 range and thus be theoretically easy to debug. Although once there are hundreds of objects to deal with in a language and/or when null becomes its own thing in order to raise exceptions, understandably it's no longer that simple lol.
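A rough sketch of that debugging trick, assuming a Linux/POSIX target (sigaction, si_addr) and a made-up Widget type; it's just an illustration of why a fault address under 4096 screams "null dereference", not how any real runtime reports it:
```cpp
// Hedged sketch, assuming Linux/POSIX: install a SIGSEGV handler and classify
// the faulting address. If it lands inside the first 4096 bytes, the crash is
// almost certainly a null (or field-of-null) dereference.
#include <csignal>
#include <cstdint>
#include <unistd.h>

static void on_segv(int, siginfo_t* info, void*) {
    const auto addr = reinterpret_cast<std::uintptr_t>(info->si_addr);
    if (addr < 4096) {
        static const char msg[] = "SIGSEGV in page 0: looks like a null-pointer dereference\n";
        write(STDERR_FILENO, msg, sizeof(msg) - 1);  // async-signal-safe output only
    } else {
        static const char msg[] = "SIGSEGV at a non-null address: something else went wrong\n";
        write(STDERR_FILENO, msg, sizeof(msg) - 1);
    }
    _exit(1);  // don't return, or the faulting instruction just re-executes
}

struct Widget { int value; };  // hypothetical object for the demo

int main() {
    struct sigaction sa{};
    sa.sa_flags = SA_SIGINFO;
    sa.sa_sigaction = on_segv;
    sigaction(SIGSEGV, &sa, nullptr);

    Widget* w = nullptr;  // "forgot" to initialize it
    return w->value;      // undefined behaviour; on typical OSes it faults at address 0
}
```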
IMO exceptions are pretty annoying to use for error handling, as you can't see from just the function definition whether it throws or not. I much prefer getting an Option<T> or Result<T,U> back.
> C, and thus NULL, was invented before MMUs were widespread, so while that's a nice side-effect it's not why it was chosen in the first place. If you try to read from NULL on an 8086 for instance, you're just going to read the interrupt table.
I forgot about that with older CPUs! Considering 0 means nothing to us humans, yeah, it definitely would have been around back then without MMUs. I did a quick search on the 8086 and correct me if I'm wrong, but addresses 0x00000-0x9FFFF are RAM and therefore writable? Writing to NULL would then overwrite the divide-by-zero vector? Or worse, other vectors if you're writing a stream of bytes to NULL. The bugs older developers dealt with...
Unfortunately (or fortunately, depending on your perspective) I haven't had to deal with C#, but I have with Java. Not knowing what exception is thrown, and where, sounds pretty bad, and catching each one sounds like a nightmare. In my opinion exceptions are fundamentally nice because throwing an error is much more verbose than C's fuck-you message (Segmentation fault lol). But they should be reserved for when you can't recover; otherwise it becomes the trial and error of catching everything. As you said, it's so much nicer just to receive an error code or something, so you don't have to worry about some random function in some library shitting the bed for the rest of the code, especially if it can be safely ignored.
Yeah, for things that REALLY can't be recovered, exceptions make sense. But a LOT of programmers are relying on exceptions as a form of control flow. (Looking at you, Python.) With discriminated/tagged unions ( https://en.wikipedia.org/wiki/Tagged_union https://learn.microsoft.com/en-us/dotnet/fsharp/language-reference/discriminated-unions ) you get much nicer control flow that you can see from the outside. Think of an int.TryParse returning an Option<int> instead of a tuple<bool,int>, or a bool as the return value and the int as an out value, to indicate success/failure. (In the Option case, you literally wouldn't be able to access any int if there isn't one.)
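Since this thread keeps circling back to C++, here's roughly what that tagged-union style looks like there with std::variant. ParseError and parse_port are invented names for the sake of the example:
```cpp
// Hedged sketch of a discriminated union as a return type: the success and
// failure cases live in one tagged type, so the caller can't touch the int
// unless the parse actually succeeded.
#include <charconv>
#include <iostream>
#include <string_view>
#include <variant>

enum class ParseError { Empty, NotANumber, OutOfRange };
using ParseResult = std::variant<int, ParseError>;

ParseResult parse_port(std::string_view text) {
    if (text.empty()) return ParseError::Empty;
    int value = 0;
    auto [ptr, ec] = std::from_chars(text.data(), text.data() + text.size(), value);
    if (ec == std::errc::invalid_argument || ptr != text.data() + text.size())
        return ParseError::NotANumber;
    if (ec == std::errc::result_out_of_range || value < 0 || value > 65535)
        return ParseError::OutOfRange;
    return value;
}

int main() {
    for (std::string_view s : {"8080", "", "70000", "eighty"}) {
        ParseResult r = parse_port(s);
        if (auto* port = std::get_if<int>(&r))         // only reachable on success
            std::cout << s << " -> port " << *port << '\n';
        else
            std::cout << s << " -> error " << static_cast<int>(std::get<ParseError>(r)) << '\n';
    }
}
```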
It's annoying, but I can't see an alternative to knowing if an object is defined or not. I also use writing to NULL to trigger a segfault in CUDA so I can know which function did something bad.
Null and its consequences have been a disaster for programming.
I could justify using exceptions for cases where asking permission is retardedly complicated and just seeing if you manage to do it is far more intuitive (as long as it's not a thing you do constantly).
I think you've missed the point of exceptions slightly - you shouldn't need to know whether a function throws or not. Rust shifts the burden of unwrapping etc. from the compiler onto the programmer, which hardly seems like an improvement. Plus Rust's way of handling genuinely exceptional events seems to be that any function can panic at any time, which always makes me very nervous.
The main problem with exceptions is that they are slow, due to the retarded way in which they were implemented. I don't think there'd be any problem with using them for control flow if it wasn't for that. I agree that parsing numbers shouldn't throw an exception: C++'s stoi is particularly stupid for that, and from_chars is only slightly better. There should be something which takes a std::string_view and returns std::optional<int>, but there isn't.
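To be fair, that wrapper is about ten lines if you roll it yourself. A sketch (to_int is my own name, not anything standard) built on from_chars:
```cpp
// Hedged sketch: the std::string_view -> std::optional<int> parser the post
// above wishes existed. to_int is a made-up name; it just wraps std::from_chars.
#include <charconv>
#include <iostream>
#include <optional>
#include <string_view>

std::optional<int> to_int(std::string_view text) {
    int value = 0;
    auto [ptr, ec] = std::from_chars(text.data(), text.data() + text.size(), value);
    // Fail on garbage, overflow, or trailing characters; no exceptions involved.
    if (ec != std::errc{} || ptr != text.data() + text.size())
        return std::nullopt;
    return value;
}

int main() {
    for (std::string_view s : {"42", "-7", "2147483648", "12abc", ""}) {
        if (auto n = to_int(s))
            std::cout << '"' << s << "\" -> " << *n << '\n';
        else
            std::cout << '"' << s << "\" -> nullopt\n";
    }
}
```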
Naw, I just suck at putting my thoughts into words. Stuff that is expected to fail in a recoverable way shouldn't use exceptions. Like losing a network connection. Or trying to read from an empty stream. Or trying to write to a file which doesn't exist.
They're "fast" in Python which is really the only language where you should be using exceptions as flow control.The main problem with exceptions is that they are slow, due to the retarded way in which they were implemented. I don't think there'd be any problem with using them for control flow if it wasn't for that. I agree that parsing numbers shouldn't throw an exception: C++'s stoi is particularly stupid for that, and from_chars is only slightly better. There should be something which takes a std::string_view and returns std:: optional<int>, but there isn't.
It's over 30 years old, and Carmack has been up front that he didn't come up with it. If you trace it back to its roots, it's so great because the guy who came up with it is the main guy behind the IEEE 754 floating-point standard, and it takes advantage of how IEEE 754 floats work.
William Kahan - Wikipedia (en.wikipedia.org)
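For anyone who hasn't seen it, the trick being discussed is presumably Quake III's fast inverse square root. A sketch of it in modern C++ (std::bit_cast instead of the original pointer punning), with the famous magic constant:
```cpp
// Hedged sketch of the classic fast inverse square root: reinterpret the
// float's IEEE 754 bits as an integer, use the 0x5f3759df magic constant to
// approximate x^(-1/2), then refine with one Newton-Raphson step.
#include <bit>
#include <cmath>
#include <cstdint>
#include <iostream>

float fast_inv_sqrt(float x) {
    std::uint32_t i = std::bit_cast<std::uint32_t>(x);  // raw IEEE 754 bits
    i = 0x5f3759df - (i >> 1);                          // magic: rough estimate of 1/sqrt(x)
    float y = std::bit_cast<float>(i);
    return y * (1.5f - 0.5f * x * y * y);               // one Newton-Raphson iteration
}

int main() {
    for (float x : {1.0f, 2.0f, 10.0f, 123.456f})
        std::cout << x << ": fast=" << fast_inv_sqrt(x)
                  << " exact=" << 1.0f / std::sqrt(x) << '\n';
}
```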
Results are only a clear winner over exceptions if all of your errors can be handled where the function is being called. If you ever find an error being passed back up a call chain or rewrapped across a module boundary, you've found a place where (at least in terms of semantics) try/catch would fit the problem better.
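Roughly the difference, with made-up layer names (read_config_value / parse_config / load_settings); a sketch of the trade-off, not a recommendation either way:
```cpp
// With optional/Result, every intermediate layer must notice and forward the
// failure; with exceptions, only the bottom and the top of the chain mention it.
#include <iostream>
#include <optional>
#include <stdexcept>
#include <string>

// --- Result-style: each layer re-checks and forwards the empty optional ---
std::optional<std::string> read_config_value() { return std::nullopt; }  // pretend the file is missing

std::optional<int> parse_config() {
    auto raw = read_config_value();
    if (!raw) return std::nullopt;            // forwarding boilerplate
    return static_cast<int>(raw->size());
}

std::optional<int> load_settings() {
    auto parsed = parse_config();
    if (!parsed) return std::nullopt;         // more forwarding boilerplate
    return *parsed * 2;
}

// --- Exception-style: intermediate layers don't mention the error at all ---
std::string read_config_value_throwing() { throw std::runtime_error("config missing"); }
int parse_config_throwing()  { return static_cast<int>(read_config_value_throwing().size()); }
int load_settings_throwing() { return parse_config_throwing() * 2; }

int main() {
    if (!load_settings()) std::cout << "optional chain: no settings\n";
    try { load_settings_throwing(); }
    catch (const std::exception& e) { std::cout << "exception chain: " << e.what() << '\n'; }
}
```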