Programming thread

Okay guys very cool, but whoever's neural network this is, might I suggest not using the A&N board as the training dataset?
we already got a powerleveling troon, why not go deeper?
In his defense, I greatly prefer physical books as well. Just easier to concentrate on and take notes.
eh, you can always print the PDFs... still, it comes down to preference, so roll with it. find the editions that had the notes sections scanned too.
At the time JavaScript was designed, I don't know if anybody envisioned just how monstrous and bloated the scripts would become. I don't think anybody's said that JavaScript is a bad language for its original purpose, but it's a bad language for its modern scope, and we desperately need something with more type safety. And to be clear, there ARE versions of that that exist! But it seems like nobody uses them.
only programmerfags call JS bad. it is kind of bloated though, to the point that it's as much of a pain to learn as CPP for some people, because most of the effort goes into figuring out how the language behaves rather than learning programming concepts and then looking up how the language expresses them. i only learned this shit at the bootcamp, even though my college demanded a CPP crash course. as you can guess, the teacher just made us repeat shit instead of teaching the concepts with pseudocode; when we jumped to CPP he finally tried teaching the concepts and workarounds for newbies. the horror was the casualties: people got extremely frustrated with his approach. he was still the best CPP teacher most of the students had, but it was a bump we all suffered through.

also, what flavors of JS are safer to write? is there documentation for them? even a github main page that tries to explain would work, since programmers are known to be terrible at explaining. right now the course either leaves learners as lost as programmers once left us, or spoon-feeds everything like some unpaid freelance thing. and searching google for safe ways to deal with scripts is unreliable because google books keeps getting in the fucking way.
I think you might be conflating 'real programmer' with 'low level programmer'. The two are unrelated. John's point is that scripting languages are often used as a means to let people code who are not competent to write that code, but who think they can because the language looks fluffy and Englishy.
thanks for ruining the joke. appreciate it, homie.
 
you can link statically typed libraries into dynamically typed languages without too much heartache by creating wrappers. Statically typed languages are favored for large programs because static typing enables vastly greater understandability by making every function carry basic, front-facing guarantees of what it will accept as input. These give you strong assurances about what the function you're calling does and does not do, letting you fluidly write calling code without having to analyze whether what the function is doing is safe with the types that you're passing in (which may not even be clear depending on what level of abstraction you're at in a dynamic language).

With dynamic languages, you have to either blindly trust or manually verify that a function will operate in a sane manner on the types that you're passing in. It creates a yoyo effect of going up and down and up and down and up and down multiple layers of abstraction just to make sure that what you write isn't going to bomb out at runtime. Remember this also might be in the context of 10 or 20 other coders who may not even be competent at their jobs, and you have to use whatever they wrote. Static typing gives no guarantee of correct behavior, but the scope of mistakes you can make is vastly narrower.
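
To make that concrete, here's a minimal C++ sketch (function and types invented for illustration). The signature alone is the front-facing guarantee: you never have to read the body to know what's safe to pass in.

C++:
#include <vector>

// The signature is the contract: takes a vector of doubles, returns a double.
// Callers don't need to inspect the body to know what it accepts.
double mean(const std::vector<double>& xs) {
    if (xs.empty()) return 0.0;
    double sum = 0.0;
    for (double x : xs) sum += x;
    return sum / xs.size();
}

int main() {
    std::vector<double> v{1.0, 2.0, 3.0};
    double m = mean(v);   // fine
    // mean("hello");     // rejected at compile time, not at runtime
    (void)m;
}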

Reiterating: the problems I mention here are not a big deal for small scripts. Flexible code and manual verification are not a major hindrance if the code remains small. As the code expands and (more importantly) becomes deeper, adding features gets rapidly more difficult.


compiling and interpreting are vastly different. Compilers treat generic code as a template for making new code (hence the name in C++). They will actually generate copies of those functions for each type that you end up using. This enables the compiler to do static analysis on that code ahead of time as if the generics were never there in the first place.
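
Roughly what that looks like (toy example, names made up):

C++:
#include <string>

// The template is a recipe; the compiler stamps out a separate,
// fully type-checked copy of it for every T that actually gets used.
template <typename T>
T twice(T x) {
    return x + x;
}

int main() {
    int n = twice(21);                          // instantiates twice<int>
    std::string s = twice(std::string("ab"));   // instantiates twice<std::string>
    // Instantiating twice with a type lacking operator+ fails to compile,
    // exactly as if the generic had never been there.
    (void)n; (void)s;
}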

Interpreters with dynamic languages have no such advantage because nothing is known before runtime. There ARE some interpreters that play out possibilities for what types could be based on what they see in literal declarations, but doing this comprehensively across a program is completely infeasible, as it would require every piece of code to try every permutation of every possible combination of types... and even then it would not be helpful, because the analyzer has no realistic way of knowing what you even plan to pass in from... say... a file or the command line. It would just be flagging random stuff that you never intended to do.

the simplest example of why code analysis fails for dynamic languages is this: say do_a_thing succeeds on numbers and fails on strings (being generous in assuming the analyzer would even know that to begin with).

Python:
var = 0

if some_condition:
    var = "hello"
else:
    var = 5

result = do_a_thing(var)

What does the static analyzer make of do_a_thing? Does it force itself to evaluate both true and false conditions of the branch? Perhaps it could give a warning that the top branch would result in an error. But what about this:

Python:
var    = read_json(root)["var"]
result = do_a_thing(var)
This
dude there's tons of PDFs on the interwebz... i'm currently doing a fullstack bootcamp because why not? it uses react, css, vue and node, and i gotta play with postgres and sqlite too.
found a cool program called VSCode, which is miles better than sublime text, especially with the live server plugin. real cool shit.

found the JS dev.

THE MARKET DEMANDS IT. so i gotta dive through that autism...
how old are you, methuselah? still, you did give a good take for programmers to think on, not the kind of thing you'd find explained on stackoverflow or that any programmer would usually rely on unless he spent a vacation in guantanamo bay... however, isn't JS's whole point integrating the backend with the frontend with as little hassle as possible, while still being dynamic with the results? of course the rest of the lang being more draconic than CPP to deal with is another story, but much of it is eased by learning concepts rather than trying to figure out the lang itself.
that's why COBOL exists, doesn't it? also i'm kinda young in the programming world, but in college we started with pseudocode then jumped into CPP... the horror.
dude i think that's a bit optimistic. the most likely outcome is some dude creating his own lang with support for the one he has a beef with (looking at you, ruby), just to rub his own ego getting a masters/doctorate. sadly academia has too much gay shit going on for people to actively think about something good for everyone. i'd say CPP is the closest, but it's still such a mess to deal with that people just pick their favorite poison. too much infighting and faggotry for that to ever happen: too much work for what is already settled, and too little autism to make adjusting things viable.
The reality that the market demands it is sadly true, hence my utter disgust with the complacent nature of the programming community when it comes to languages like JS at the moment. Don't get me wrong, its utility for small web-based scripting is powerful, so I see the appeal when you want, say, a bit of extra flair on your webpage. But JS uses a lot of shorthand that may be nice for the Python-savvy, yet is terrible at a lot of its uses once you go beyond the scope of minimalistic extensions. Especially when running in client-side web browsers, JS has always had an issue with bogging down the browsing experience in and of itself. Dynamically typed languages are just fucking terrible, because they make a lot of inferences that only reflect the interpreter's assumptions. I prefer statically typed, because at the end of the day, when it comes to debugging, it makes it a lot easier for me to see what is tripping up my code.

When learning Swift, for instance, I had a lot of issues with the dynamic flow of that language. Sure, it was nice when it came to reducing the amount of pseudo-code, but unless you are familiar with the nuances of statically typed languages, you are going to spend half of your time trying to reverse engineer the logic, which in debugging time is fucking astronomical and unnecessary.

What I find with the younger Zoomer, heh heh, learn-to-code fucks is that most of them assume every language is like Python. These people do not tend to realize that every language has a different purpose; they are all designed to do different things. That's why the industry standards are what they are, and worth respecting while you are learning.

This is why, despite how much of a pain in the ass C is in regards to arrays and pointers, it is still a fucking great language that has held its weight over time. Most modern languages borrow something from C as well, so if one wants to learn how these concepts actually work, I think it should be everyone's first language, especially on the debugging front. Languages like Python are nice for academics, because when you are doing large statistical calculations that rely on pure computation, you want something loosely typed and a bit more dynamic, without bogging yourself down in the nuances of, say, porting it to multiple machines. Hence for the common data scientist, a language like that is nice while learning. Languages like R are a lot more powerful on that front as well, but again it's a matter of personal preference and efficiency.

When it comes to discussing the flaws or pros of one language compared to another, one thing that seems to be a universal constant in the development community is that JS tries to do too much. Hence it fails at its intended application and design, and then you have a bunch of fags pumping it up. JS, just like a tranny, is confused about its intended purpose in this world and is in a constant state of identity crisis. It thinks it knows better, but its degenerate and twisted lifestyle based on embracing the mainstream will be its own undoing. Then instead of residing with the 99%, it embraces the 43%. Thank you for coming to my TED Talk nigger.
 
Hardening program I work on. See openssl in compiler info. It's a dependency of another library used. Find library after turning dependencies off; guess 'n check. Find dependency breaks program. Realise I should save problem for later. On branch changing API to be more lazy happy comfy. Type erasure. No inheritance. Cheating with macros. More problems need fixing.

I'm discovering I'm really bad at explaining technical things to normies. All the words are nonsense to most people. Explaining things simply is not automatic if not thought about beforehand. 80-year-old nigger turns computer off during a BIOS update and doesn't know why the computer can't find its boot area. Despite being literate, most people don't read and avoid reading. Nigger would have been told to wait for the computer to finish, but holds the power button down and disrupts the automatic process.

OpenSSL is probably captured by the feds so that they have an easy, low-hanging-fruit target to fuck people with. I can't think of a better reason for a popular internet security library to be such an unmitigated disaster. I bet Heartbleed was an in-house job, if it wasn't just deliberate retardation. reqwest glows in the dark. What the fuck is it with Mozilla naming everything like a Down syndrome sufferer is speaking? They do it with their networking library too. They call it netwerk. Stupid. They also can't spell neko correctly. Wanting to use ureq.

Japan got nuked twice. That must be why they always make a 2 sign with their fingers, to show that they got nuked twice.

Thank you for reading my blog post.
 
This is why, despite how much of a pain in the ass C is in regards to arrays and pointers, it is still a fucking great language that has held its weight over time. Most modern languages borrow something from C as well, so if one wants to learn how these concepts actually work, I think it should be everyone's first language, especially on the debugging front. Languages like Python are nice for academics, because when you are doing large statistical calculations that rely on pure computation, you want something loosely typed and a bit more dynamic, without bogging yourself down in the nuances of, say, porting it to multiple machines.
I think the average academic doing numerical computation in Python is not doing any of the significant compute in Python itself. That's being done by calling out to C and Fortran libraries, and pushing CUDA code to the GPU.

It's weird to deprecate dynamic types and then praise C. Let's not pretend that C, where it's absolutely fucking standard to cast to and from void*, to use undiscriminated unions, and where most of the possible behaviours of variable-argument functions like printf are undefined, isn't really fucking loosely typed. C combines this pervasive loose typing with a commitment not to do any runtime checks, which I guess some masochists find fun, but which I find inexcusably demented for the overwhelming majority of coding I do.

I think people should learn C because that's what the core of their operating systems are written in. It's sad that these are the only operating systems we have to choose from, giving C a lot of undeserved influence on our computing experience, but I've probably already moaned that tune in this thread.
 
It's weird to deprecate dynamic types and then praise C.
Except that C is functionally deprecated for the reasons you list. The difference is that C's ad-hoc "dynamic typing" is not a feature, but rather a workaround for a lack of features. No sane person would declare that (void*) is a superior way to program.

The only reason to use C today is when you need a dead simple, minimalist compiler, a more strict relationship with asm or trivial bindings to other languages.
 
Except that C is functionally deprecated for the reasons you list. The difference is that C's ad-hoc "dynamic typing" is not a feature, but rather a workaround for a lack of features. No sane person would declare that (void*) is a superior way to program.
Thing about C is, the compiler will throw you tons of warnings or errors if you make a function and then call it with arguments of the wrong types, but if you ignore that and go ahead with the program, the machine code won't know any better. It goes ahead and handles the bits the wrong way.

If you have a function that uses void star, you have to cast it to another type (unless you pass it to another function which accepts a void star). The void* type doesn't mean no type; it is itself a specific type you have to cast to when you pass a pointer to the function. Using void* is simply a way of muting the error in the compiler, to be used only in a small number of cases where the programmer is fully aware of the unsafe nature of void* (one example I saw was casting a pointer of any type to a char pointer and writing it to a file as a byte array). Nobody goes around writing their C functions with all void star pointers, and most of the time you don't want to cast variables to other types, for obvious reasons. The truth is, in a certain semantic way you can say C is less dynamically typed than a language like Java, because in Java errors of the wrong type get caught at runtime too. But it is obvious that C uses types very strictly at the compiler level to force the programmer to make sure they don't pass the wrong types to functions, even if the machine code operates on whatever bits it gets.
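
Here's roughly what that byte-array trick looks like (my own sketch, written as C++, but the shape is identical in C):

C++:
#include <cstdio>

struct Point { int x; int y; };

int main() {
    Point pt{3, 4};
    // The sanctioned unsafe trick: view an object of any type as raw bytes.
    // In C you'd write (char *)&pt; C++ makes you spell the cast out loudly.
    const char* bytes = reinterpret_cast<const char*>(&pt);
    std::FILE* f = std::fopen("point.bin", "wb");
    if (!f) return 1;
    std::fwrite(bytes, 1, sizeof pt, f);  // dump the struct as a byte array
    std::fclose(f);
}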
 
Thing about C is, the compiler will throw you tons of warnings or errors if you make a function and then call it with arguments of the wrong types, but if you ignore that and go ahead with the program, the machine code won't know any better. It goes ahead and handles the bits the wrong way.

If you have a function that uses void star, you have to cast it to another type (unless you pass it to another function which accepts a void star). The void* type doesn't mean no type; it is itself a specific type you have to cast to when you pass a pointer to the function. Using void* is simply a way of muting the error in the compiler, to be used only in a small number of cases where the programmer is fully aware of the unsafe nature of void* (one example I saw was casting a pointer of any type to a char pointer and writing it to a file as a byte array). Nobody goes around writing their C functions with all void star pointers, and most of the time you don't want to cast variables to other types, for obvious reasons. The truth is, in a certain semantic way you can say C is less dynamically typed than a language like Java, because in Java errors of the wrong type get caught at runtime too. But it is obvious that C uses types very strictly at the compiler level to force the programmer to make sure they don't pass the wrong types to functions, even if the machine code operates on whatever bits it gets.
We probably got off on the wrong foot even comparing C's (void*) to 'dynamic typing'. It's completely apples and oranges. It's just a notation for a raw memory address, pretty much.

But yeah, that's why I say C is good for when you need a closer relationship to the assembly level. C++ can generate a metric shitton of assembly with very few lines of code. With C, it's hard to be surprised by whatever it generates.
 
Besides, in C you can at least create meaningful abstractions by wrapping whatever pointers and whatnot into structs, thus having both semantic meaning and type checking. In Python...there is no escaping the insanity.
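
For instance (a made-up sketch; written so it compiles as C++, but the idiom is pure C):

C++:
#include <cstdlib>

// Wrap the raw pointer and its length in a struct: callers now deal in
// IntBuffers, and the compiler rejects any other pointer soup.
struct IntBuffer {
    int*        data;
    std::size_t len;
};

IntBuffer int_buffer_make(std::size_t n) {
    IntBuffer b;
    b.data = static_cast<int*>(std::malloc(n * sizeof(int)));  // in C: no cast needed
    b.len  = b.data ? n : 0;
    return b;
}

void int_buffer_free(IntBuffer* b) {
    std::free(b->data);
    b->data = nullptr;
    b->len  = 0;
}

int main() {
    IntBuffer b = int_buffer_make(16);
    // int_buffer_free(&b.data);  // wrong type: won't compile
    int_buffer_free(&b);
}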
 
We probably got off on the wrong foot even comparing C's (void*) to 'dynamic typing'. It's completely apples and oranges. It's just a notation for a raw memory address, pretty much.

But yeah, that's why I say C is good for when you need a closer relationship to the assembly level. C++ can generate a metric shitton of assembly with very few lines of code. With C, it's hard to be surprised by whatever it generates.
I called C loosely typed, which is a conceptually different dimension to the static/dynamic one, but which I mostly take as evidence that terms like "dynamic typing" are bullshit terms with no rigorous mathematical basis (same goes for "dynamic" in general, which is probably why the conversation was such a mess).

One guy who likes serious mathematical theories in his computer science (Bob Harper) once proposed that all languages are statically typed. So even Python is statically typed. It's just that it has only one type for all programs, the "Any value" type. Because everything has this type, it can be made implicit, and so programmers forget it's there at all. But this one type is there, and we can say the language is unityped, especially because it's a pun on "untyped."

A reason this view is more coherent is that Harper knows where type theory takes us, and that mainstream programming languages don't scratch the surface. There are type systems where the length of an array is carried in the static type, eliminating any need for runtime checks. With this in mind, we'd have to conclude that a language like Rust, which can't do such an encoding, must instead resort to "dynamic type checking" on its array accesses, and then conclude that Rust has dynamically typed array sizes.
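
C++ gives a small taste of length-in-the-type with std::array (my sketch; nowhere near full dependent types, but it shows the idea):

C++:
#include <array>
#include <cstddef>

// The length N is part of the static type. std::get<I> is checked at
// compile time, so no runtime bounds check is ever emitted for it.
template <std::size_t N>
int first_plus_last(const std::array<int, N>& a) {
    static_assert(N >= 1, "array must be non-empty");
    return std::get<0>(a) + std::get<N - 1>(a);
}

int main() {
    std::array<int, 3> a{1, 2, 3};
    int r = first_plus_last(a);  // fine: N = 3 is known statically
    // std::get<5>(a);           // compile error: index out of bounds
    (void)r;
}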

But another way to look at it is just to say that some languages have more expressive type systems than others, with the least possible expressivity being languages whose type systems express nothing more than "variables have Any type." When thinking this way, C is closer to those languages because it makes frequent use of "void *" being "pointer to any value."

The only other relevant dimension is then soundness: how often does my type-checker lie to me? Here, C fails hands down. The typechecker will happily and routinely lie to you, and let you watch your program blow up in unspecified ways when you run it. Python, on the other hand, gets a thumbs up by having a "type checker" making the easy but binding promise that a Python expression could be Any value.

The best place, of course, is to have expressive and sound type systems.
 
The only other relevant dimension is then soundness: how often does my type-checker lie to me? Here, C fails hands down. The typechecker will happily and routinely lie to you, and let you watch your program blow up in unspecified ways when you run it.
This is a gross oversimplification. C gives you enough rope to either hang yourself or build abstractions. If you choose not to do the latter, well...
 
One guy who likes serious mathematical theories in his computer science (Bob Harper) once proposed that all languages are statically typed. So even Python is statically typed. It's just that it has only one type for all programs, the "Any value" type. Because everything has this type, it can be made implicit, and so programmers forget it's there at all. But this one type is there, and we can say the language is unityped, especially because it's a pun on "untyped."

[...]

The only other relevant dimension is then soundness: how often does my type-checker lie to me? Here, C fails hands down. The typechecker will happily and routinely lie to you, and let you watch your program blow up in unspecified ways when you run it. Python, on the other hand, gets a thumbs up by having a "type checker" making the easy but binding promise that a Python expression could be Any value.
As my old boss used to say, "Technically correct and entirely unhelpful".
 
does my type-checker lie to me? Here, C fails hands down. The typechecker will happily and routinely lie to you,
Really? In my experience the compiler has always been very specific about passing wrong types to functions, even to the point where it's annoying differentiating between unsigned longs and ints and things like that. It gets to the point where I annoyingly have to make my for loop use an unsigned long instead of an int because that's what I'm comparing it to in an if statement. But once you clear the compiler errors, it proves that you are always passing the correct types to functions, if statements, etc., so what's the problem? As said before, the chip doesn't give a damn whether the bits of a register are a different type; it only cares about whether the compiler made it do an "unsigned add" or "signed add" or something like that. But the compiler makes sure the programmer doesn't mix up these commands (or inserts whichever assembly command is appropriate).
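
Minimal repro of that annoyance (my own toy example):

C++:
#include <cstddef>
#include <vector>

int main() {
    std::vector<int> v{1, 2, 3};
    // for (int i = 0; i < v.size(); ++i)  // warning: signed/unsigned comparison
    for (std::size_t i = 0; i < v.size(); ++i) {  // matching types, no warning
        v[i] += 1;
    }
}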

P.S. Length-specific array types are a good idea, but I mostly see that in languages like Go, which have more options for dealing with arrays or similar structs whose length you don't know at compile time (slices). But of course in C, if you have a variable which is an "array", that is just a pointer to the first item in the array. I think C was deliberately designed that way: you pass the length to the function, or #define the array length at compile time, so you can deal with dynamically-allocated arrays in C's special barebones style. (Think argv and argc in main().)
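
i.e. the barebones style in question (sketch; compiles as C or C++):

Code:
#include <stdio.h>

/* The "array" parameter has already decayed to a plain pointer, so the
   length must travel alongside it, argc/argv style. */
void print_all(const int *arr, size_t len) {
    for (size_t i = 0; i < len; i++)
        printf("%d\n", arr[i]);
}

int main(void) {
    int nums[] = {1, 2, 3, 4};
    /* sizeof works here because nums is still a real array, not a pointer */
    print_all(nums, sizeof nums / sizeof nums[0]);
    return 0;
}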
 
This is a gross oversimplification. C gives you enough rope to either hang yourself or build abstractions. If you choose not to do the latter, well...
Here's qsort:

Code:
void qsort(void *base, size_t nitems, size_t size, int (*compar)(const void *, const void*))

Where's the abstraction saving me from using "void *"? It isn't there, because C can't do generic code without simply bypassing the type system like this. One of the first abstractions I want in my type system is parametric polymorphism so that I can correctly type the above code.

I can, of course, absolutely forgive C for not having it, having been designed before type polymorphism was properly figured out. But let's not pretend that this isn't a massive weakness of C in the face of modern alternatives.
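
For contrast, here's the parametrically polymorphic version of the same idea, sketched with a C++ template (std::sort does this for real):

C++:
#include <algorithm>
#include <vector>

// The element type T flows through the whole signature: container,
// elements and comparator all have to agree, and the compiler checks it.
template <typename T, typename Cmp>
void sort_with(std::vector<T>& v, Cmp less) {
    std::sort(v.begin(), v.end(), less);
}

int main() {
    std::vector<int> xs{3, 1, 2};
    sort_with(xs, [](int a, int b) { return a < b; });  // fine
    // sort_with(xs, [](const char* a, const char* b)
    //               { return a < b; });                 // won't compile
}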

P.S. length-specific array types is a good idea, but I see that in languages like Go which have more options for how to deal with arrays or similar structs you don't know the length of at compile time (slices). But of course in C if you have a variable which is an "array" that is just the pointer to the first item in the array. I think the way C is, was decided to be that way so you pass the length to the function, or #define the array length at compile-time, so you can deal with dynamically-allocated arrays in C's special barebones style. (Think argv and argc in main())
They're not necessarily a good idea. Once you're statically typing for the length of your arrays, you find yourself having to do arbitrary proofs about arithmetic calculations in your program. At this level, type systems become full blown mathematical theorem provers, which comes at a high complexity cost on the part of the programmer.

Some people want to go down that road. I'm very much undecided.
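
To see where the complexity creeps in, even C++'s very limited compile-time arithmetic shows the flavor (my sketch): concatenating two fixed-length arrays forces the lengths through the type system, and every caller has to get the arithmetic provably right.

C++:
#include <array>
#include <cstddef>

// The result length N + M is computed inside the type itself.
template <typename T, std::size_t N, std::size_t M>
std::array<T, N + M> concat(const std::array<T, N>& a,
                            const std::array<T, M>& b) {
    std::array<T, N + M> out{};
    for (std::size_t i = 0; i < N; ++i) out[i] = a[i];
    for (std::size_t i = 0; i < M; ++i) out[N + i] = b[i];
    return out;
}

int main() {
    std::array<int, 2> a{1, 2};
    std::array<int, 3> b{3, 4, 5};
    std::array<int, 5> c = concat(a, b);    // type-checks: 2 + 3 == 5
    // std::array<int, 6> d = concat(a, b); // compile error: 2 + 3 != 6
    (void)c;
}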
 
Here's qsort:

Code:
void qsort(void *base, size_t nitems, size_t size, int (*compar)(const void *, const void*))

Where's the abstraction saving me from using "void *"? It isn't there, because C can't do generic code without simply bypassing the type system like this. One of the first abstractions I want in my type system is parametric polymorphism so that I can correctly type the above code.

I can, of course, absolutely forgive C for not having it, having been designed before type polymorphism was properly figured out. But let's not pretend that this isn't a massive weakness of C in the face of modern alternatives.


They're not necessarily a good idea. Once you're statically typing for the length of your arrays, you find yourself having to do arbitrary proofs about arithmetic calculations in your program. At this level, type systems become full blown mathematical theorem provers, which comes at a high complexity cost on the part of the programmer.

Some people want to go down that road. I'm very much undecided.
Well that is kind of where C++ comes in.

You get almost the speed of C and the more modern options like polymorphism, though it has its own stupid shit. However, I code mostly in C++, outside of the PLC world, which is its own retarded thing, and there I tend to use a lot of C-style options to speed up my code.
 
Here's qsort:

Code:
void qsort(void *base, size_t nitems, size_t size, int (*compar)(const void *, const void*))

Where's the abstraction saving me from using "void *"?
I specifically wrote that C gives you enough rope to build the abstraction yourself. You quote the "generic" qsort() as a counterpoint, which is not a counterpoint precisely because it is not your abstraction. The point is moot.

And no, I don't consider this a copout or language-lawyery pedantic dickweedery on my part. This is exactly what I mean. And I agree with the sentiment that plain C along with its standard library has very weak abstractions. That's the point. Whether one considers it a "weakness" or a "feature" is in the eye of the beholder (or toolchain implementer on a new platform) I guess. FWIW I prefer C++ over C anytime and I'm not keen on defending C, quite the opposite. But let's at least call spade a spade.
Really? In my experience the compiler has always been very specific about passing wrong types to functions, even to the point where it's annoying differentiating between unsigned longs and ints and things like that.
As I understand the point of the discussion, the issue is not integer or floating-point arithmetic but C's very weak typechecking re: pointer casting. For example: calling malloc() returns a void-ptr, which you can store into any other pointer type without an explicit typecast. I consider this a major gaping hole in the type system and something to be extremely aware of at all times. C++ has "rectified" the issue by enforcing stronger type checking on pointer casts: if you want your malloc() result as anything other than a void-ptr, you have to spell out the cast (a static_cast will do). Or just call operator new, which carries the type information.
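
Concretely (tiny sketch):

C++:
#include <cstdlib>

int main() {
    // In C this line compiles with no cast and no diagnostic:
    //     int *p = malloc(10 * sizeof(int));
    // In C++ the conversion from void* must be spelled out:
    int *p = static_cast<int*>(std::malloc(10 * sizeof(int)));
    if (!p) return 1;
    p[0] = 42;
    std::free(p);
    return 0;
}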

And compiler warnings on mixing signed and unsigned arithmetic, comparisons or casts are very often valid, due to the risk of undefined behavior.
 
Are there any real examples of people being productive and not just academic with 'literate programming'? I've read the description of it and it sounds like an absolutely horrid idea for production-level code. Anybody who's tried to use macros in a commercial project to hide stuff will tell you that it looks super elegant and great... until something goes wrong, and then it's impossible to debug.
 
Are there any real examples of people being productive and not just academic with 'literate programming'?
I've never seen anything written in it besides WEB/CWEB itself and maybe one other little project by a Knuth fanboy.
 
Are there any real examples of people being productive and not just academic with 'literate programming'? I've read the description of it and it sounds like an absolutely horrid idea for production-level code. Anybody who's tried to use macros in a commercial project to hide stuff will tell you that it looks super elegant and great... until something goes wrong, and then it's impossible to debug.
Somewhere, sometime, some stupid cunt boss who knows fuck all about computers has forced it upon their team...
 
wow I didn't expect to see Bob Harper here lol

did I post these yet? I probably did but if I ever enter academia I will have a page like this

(attached: three images)
 