Programming thread

I've always guessed that the reason large programs are written in strongly typed languages is because of which libraries are available, and because compiling and distributing tends to favour already prominent strongly typed languages.
You can link statically typed libraries into dynamically typed languages without too much heartache by creating wrappers. Statically typed languages are favored for large programs because static typing enables vastly greater understandability by making every function carry basic, front-facing guarantees about what it will accept as input. These give you strong assurances about what the function you're calling does and does not do, letting you fluidly write calling code without having to analyze whether what the function is doing is safe with the types that you're passing in (which may not even be clear, depending on what level of abstraction you're at in a dynamic language).
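For example, here's a bare-bones sketch of the wrapper idea (my own illustration, not from any particular project): Python's ctypes calling the statically typed C math library directly. It assumes a Unix-like system where libm can be found; real wrappers are obviously more involved.

Python:
import ctypes
import ctypes.util

# Load the system C math library (find_library may return None on unusual setups).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the static signature on the Python side of the wrapper.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # 1.4142135623730951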

With dynamic languages, you have to either blindly trust or manually verify that a function will operate in a sane manner on the types that you're passing in. It creates a yo-yo effect of going up and down and up and down and up and down multiple layers of abstraction just to make sure that what you write isn't going to bomb out at runtime. Remember, this also might be in the context of 10 or 20 other coders who may not even be competent at their jobs, and you have to use whatever they wrote. Static typing gives no guarantee of correct behavior, but the scope of mistakes you can make is vastly narrower.

Reiterating: the problems I mention here are not a big deal for small scripts. Flexible code and manual verification are not a major hindrance if the code remains small. As the code expands and (more importantly) becomes deeper, adding features gets rapidly more difficult.
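(Side note: you can claw some of those front-facing guarantees back in a dynamic language with gradual typing. A minimal sketch, assuming Python 3.9+ and an external checker like mypy:)

Python:
def scale(values: list[float], factor: float) -> list[float]:
    # The signature now states, up front, what the function accepts and returns.
    return [v * factor for v in values]

scale([1.0, 2.0], 3.0)  # fine
scale("hello", 3.0)     # only fails at runtime, but mypy flags it before you ever run it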

If the compiler or interpreter can figure out the typing, why can't the tools that don't work well just use the language's implementation when they need to do type work?
Compiling and interpreting are vastly different. Compilers treat generic code as a template for making new code (hence the name in C++). They will actually generate copies of those functions for each type that you end up using. This enables the compiler to do static analysis on that code ahead of time as if the generics were never there in the first place.

Interpreters with dynamic languages have no such advantage because nothing is known before runtime. There ARE some interpreters that play out possibilities for what the types could be based on what they see in literal declarations, but doing this comprehensively across the program is completely infeasible, as it would require every piece of code to try every permutation of every possible combination of types... and even then it would not be helpful, because the analyzer has no realistic way of knowing what you even plan to pass in from... say... a file or the command line. It would just be flagging random stuff that you never intended to do.

The simplest example of why code analysis fails for dynamic languages is this: say do_a_thing succeeds on numbers and fails on strings (being generous in assuming the analyzer would even know that to begin with).

Python:
var = 0

if some_condition:
    var = "hello"
else:
    var = 5
   
result = do_a_thing(var)

What does the static analyzer make of do_a_thing? Does it force itself to evaluate both true and false conditions of the branch? Perhaps it could give a warning that the top branch would result in an error. But what about this:

Python:
var    = read_json(root)["var"]
result = do_a_thing(var)
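To make that concrete, here's a runnable stand-in for the same scenario (do_a_thing and the JSON input are made up, and json.loads stands in for read_json):

Python:
import json

def do_a_thing(x):
    return x * 2 + 1            # fine for numbers, blows up for strings

root = '{"var": "hello"}'       # pretend this came from a file or the network
var = json.loads(root)["var"]   # no analyzer can know what type lands here
result = do_a_thing(var)        # TypeError, discovered only at runtime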
 
With dynamic languages, you have to either blindly trust or manually verify that a function will operate in a sane manner on the types that you're passing in. It creates a yo-yo effect of going up and down and up and down and up and down multiple layers of abstraction just to make sure that what you write isn't going to bomb out at runtime.

This is a very important point that I am glad you made; out of the great three "silver bullet" lies that have plagued programming over the last two decades (JIT, garbage collection, dynamic languages), dynamic languages are the only one that is an objective failure in every context. Dynamic languages confer none of the benefits they promise.

The notion is that with a dynamic language you trade performance for productivity. The truth is that dynamic languages are a trap for the unwary that push every error (semantic (standard) AND syntactic) into the runtime (nightmare) and generate a much higher base cognitive load at every moment. You have to memorize APIs and types in a much deeper way than anyone does with a static language, variable naming often must use Hungarian-like notation to remain sane, and you have to be far more paranoid in your defensive programming. It's like programming in a static language in notepad with no tooling, except even the compiler can't help you fix common syntactical errors. There is only one type of person for whom a dynamic language is the most productive choice: the one who doesn't know a single static language. Writing programs in dynamic languages is a lose-lose scenario; your program runs like shit and is a nightmare to maintain and scale. But there are tons of libraries and thank god I don't have to type those nasty semicolons after my statements!

The only situation where dynamic languages are even remotely appealing is fast, ad-hoc scripting: perl -e kinds of situations, or some glue for toolchains, but why even bother learning a new language when you could just use the OS scripting language instead? There's absolutely no place for dynamic languages in a future with aggressive type inference in languages like Go, Nim, Crystal... but Ruby, Python, et al. will plague us for the rest of time because the userbase is too big to fail, and the devil's deal of quick results with massive libraries will keep tempting new, naive programmers to go down the path of madness.
 
This is a very important point that I am glad you made; out of the great three "silver bullet" lies that have plagued programming over the last two decades (JIT, garbage collection, dynamic languages), dynamic languages are the only one that is an objective failure in every context. Dynamic languages confer none of the benefits they promise.
Would you say that "visual ease", in other words being easy to understand the function of code (not including comments) at a glance, was one of the advertised benefits of dynamic languages? Another falsehood, because nowadays even if you are writing in old, hard-to-read languages like C++ or Java, you have:
  • Google your question or error and you will find Stack Overflow answers explaining the situation
  • Online documentation is detailed for all but the most obscure libraries
  • Complex visual editors that saved my ass when I had to write some shit in Java, that spoonfeed you all the information you need about function parameters and errors, and even write a lot of the basic stuff like "public static void main(String[] args)" or whatever the fuck it is for you.
And of course you can just go ahead and try a few times and see what the compiler tells you.

Also, I would be interested what you say about the other two silver bullets you mentioned, if you are so inclined.
 
This is a very important point that I am glad you made; out of the great three "silver bullet" lies that have plagued programming over the last two decades (JIT, garbage collection, dynamic languages), dynamic languages are the only one that is an objective failure in every context. Dynamic languages confer none of the benefits they promise.

The notion is that with a dynamic language you trade performance for productivity. The truth is that dynamic languages are a trap for the unwary that push every error (semantic (standard) AND syntactic) into the runtime (nightmare) and generate a much higher base cognitive load at every moment. You have to memorize APIs and types in a much deeper way than anyone does with a static language, variable naming often must use Hungarian-like notation to remain sane, and you have to be far more paranoid in your defensive programming. It's like programming in a static language in notepad with no tooling, except even the compiler can't help you fix common syntactical errors. There is only one type of person for whom a dynamic language is the most productive choice: the one who doesn't know a single static language. Writing programs in dynamic languages is a lose-lose scenario; your program runs like shit and is a nightmare to maintain and scale. But there are tons of libraries and thank god I don't have to type those nasty semicolons after my statements!

The only situation where dynamic languages are even remotely appealing is fast, ad-hoc scripting: perl -e kinds of situations, or some glue for toolchains, but why even bother learning a new language when you could just use the OS scripting language instead? There's absolutely no place for dynamic languages in a future with aggressive type inference in languages like Go, Nim, Crystal... but Ruby, Python, et al. will plague us for the rest of time because the userbase is too big to fail, and the devil's deal of quick results with massive libraries will keep tempting new, naive programmers to go down the path of madness.

I have vast personal experience with this topic and it's why I get sent on such a 'tism spree when people bring up this notion.

I've been on projects that started with statically typed languages, and then the engineers (myself included) got the bright idea of pushing everything into dynamic scripts. Very, very VERY long story short, the end result was:
  • much slower
  • less maintainable
  • eventually harder to even write for because nobody knew what the fuck the lower calls actually did
  • Code that never hard crashed, but was MORE unstable
  • HELLISH documentation trying to specify what calls did and what they would accept
The final 'fix' we tried was integrating code linters that would automatically flag what was acceptable and what wasn't. And at that point... why the hell are you even using a dynamic language anymore? We eventually bit the bullet and just ported EVERYTHING into the backend. My god, what a world of difference.

The initial appeal of moving things into scripts was that we had a lot of visual elements that we had to tweak constantly, and realtime recompilation was a godsend, but eventually that stopped being useful as well and we discovered better ways of tweaking the backend.

Realtime tweaking like this is really the ONLY advantage of interpreted languages, and it's led me to the conclusion that scripts should not be whole programs, they should be parameters for whole programs. They function best as a form of dependency injection. You need a ball to move around on the screen. Write the ball in a static language. Have it ask a dynamic language about its movement routine, if you even need that.
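A toy sketch of what I mean (everything here is made up, and the host is shown in Python only to keep the example short; in practice the host would be the compiled program and the script side would be Lua or similar):

Python:
# The core exposes one narrow entry point; the script only supplies a parameter.
BALL_SCRIPT = """
def movement(t):
    return (t * 2.0, 0.0)       # designers tweak this without rebuilding the core
"""

def load_movement(source):
    env = {}
    exec(source, env)           # hand the script a tiny, well-defined surface
    move = env["movement"]
    move(0.0)                   # validate the contract once, at the boundary
    return move

movement = load_movement(BALL_SCRIPT)
print(movement(1.5))            # (3.0, 0.0)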
 
Would you say that "visual ease", in other words being easy to understand the function of code (not including comments) at a glance, was one of the advertised benefits of dynamic languages?
Absolutely; dynamic languages have always boasted about their superior approachability and legibility, even from people who can't program. Of course this only makes sense if you think of dynamic languages as something closer to a natural language where ambiguity is constant and there is a massive burden on the listener / reader to disambiguate via context. Python is often described as looking nearly like pseudocode; I agree, in that just like pseudocode, it's nearly impenetrable without paragraphs preceding it to give it context.

The success of dynamic languages is simple: they attribute their success to a set of myths, but actually "succeed" due to unrelated factors. Dynamic languages seemed to offer faster productivity than other languages not because of their type systems or syntax; rather, they tended to have vast, expansive standard libraries and made importing and running source code a simple affair. Early on, it seemed the myths were true because so few popular static languages were as easy to set up, and they rarely had comprehensive standard libraries (in the sense of getting high-level tasks accomplished quickly). It's not the language, it's the ecosystem.

The "lines of code" arguments have always been misleading; the only reason most Python programs are terse is because the complexity is hidden by the standard library. Good libraries make you very, very fast. A CLI program's main function written in C++ can be nearly as terse as Python and just as readable (moreso, really), it all depends on how the code is structured. Now time has passed and the situation has changed. Dynamic languages no longer are alone in offering easy, comprehensive standard libraries and package managers. Newer static languages have made compilation as painless as invoking the interpreter. Now you can see the myths exposed for what they are every day. Glance at some Nim source code and then some Python source code and tell me if you even could tell the difference (other than minor syntactical differences). Now benchmark them; then there will be no question which is which.

Also, I would be interested what you say about the other two silver bullets you mentioned, if you are so inclined.
These are pretty complicated subjects, but I'll just offer a few quick, general thoughts here. All three of these lies follow a similar pattern: a "new" paradigm / technique is discovered, and its adherents / zealots always proclaim that the new paradigm will supplant the old. Dynamic languages were talked about as becoming the default position for software development because of their productivity, with static languages contained as a "last resort" for systems and other performance-critical areas. We all know how that turned out.

Here's the summary of GC and JIT: Unlike dynamic languages, both of these actually have viable contexts, and they are interesting and useful ideas. However, they are never going to replace manual memory management or AOT compilation. They are strategies that have useful applications, they are not default "better" alternatives. GC will never outperform good manual memory management, and JIT will never outperform a fresh, properly optimized AOT compiler. GC is the most useful and the most "successful" in that it has become the default position for most new languages, but it is still predicated on some large myths.

Garbage collection was advertised as a panacea for memory problems; sure, it simplifies memory management and there are plenty of contexts where it is useful, but it doesn't solve the hard problems. GC is always promoted as being not only safe, but theoretically more performant than manual memory strategies. Go read about D, C#, or any language that cares about performance and you will always find articles about how GC really isn't the bottleneck you think it is, and how it can outperform manual memory management in some theoretical edge case about fragmentation involving naive manual implementations. Guess what? It never lives up to this hype. GC is always less performant, and there has never been a runtime model that does what they claim it will do.

Furthermore, GC doesn't sanitize memory as much as everyone says it does. Yes, it obviates ownership collisions and stops double-free scenarios, but it does not stop memory leaks. There is the true memory leak that happens in non-GC scenarios (the address is no longer reachable); there is the logical (soft) memory leak that happens when you accidentally keep an object alive (usually this means a reference count > 0). The latter happens all the time in modern software, because people using GC languages are told they don't need to worry about memory. To avoid memory leaks you still have to be incredibly diligent with your references, event subscriptions, etc. GC is not a magical fix that makes memory problems go away. Leaks are still there, programmer responsibility is still there (just often ignored), and the performance is always worse. Use it when you need to; don't when you don't. (Ultimate irony: a true memory leak is detectable with tooling; a soft memory leak is not!)
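A tiny toy example of the 'soft' leak I mean (my own illustration, in Python since that's what we've been picking on all thread): the object is dead as far as the programmer's intent goes, but something still references it, so the collector can never free it.

Python:
import weakref

class EventBus:
    def __init__(self):
        self.subscribers = []          # strong references keep every handler alive

    def subscribe(self, handler):
        self.subscribers.append(handler)

class Window:
    def __init__(self, bus):
        bus.subscribe(self.on_event)   # the bound method holds a reference to self

    def on_event(self, *args):
        pass

bus = EventBus()
w = Window(bus)
probe = weakref.ref(w)

del w                                  # the programmer assumes the window is gone
print(probe() is not None)             # True: the bus keeps it alive indefinitely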

I'll end this since it's getting too long; it's the same story with JIT. Notions of programs that are years old inheriting future optimizations, theoretical arguments that JIT could outperform AOT; again, not true. In edge cases, sure. All of these runtime model ideas have a cost that always will exceed or negate the performance benefits from the model itself. A fresh AOT compilation is always going to win. JIT has a purpose and place, but it's not going to end AOT.
Realtime tweaking like this is really the ONLY advantage of interpreted languages, and it's led me to the conclusion that scripts should not be whole programs, they should be parameters for whole programs.
Yes, and luckily all of the more contemporary languages have put great effort into reducing compile times. In the future I expect rational developers to ditch dynamic languages entirely. Using nim c -r / crystal run on small source files as if they were scripts is very usable; systems like NimScript are another attractive option. For the times when you are truly stuck with bad compile times in the C/C++/Rust world, a top layer with a scripting language like Lua / ChaiScript with entry points into the core application can give you some nice flexibility (just keep the scripts short!).
 
Absolutely; dynamic languages have always boasted about their superior approachability and legibility, even from people who can't program
To quote John Carmack: "you don't want people who 'aren't really programmers' working on your code. You will suffer for it."

Python is often described as looking nearly like pseudocode; I agree, in that just like pseudocode, it's nearly impenetrable without paragraphs preceding it to give it context.
This is probably another thing giving Python a big boost in colleges. All of my algorithms professors made us use Python. The reasoning was pretty simple: we were most interested in pure logic/pseudocode. The thing about algorithms in the math-y sense is that they don't operate on types. They operate on objects with stated properties. As long as the object has those properties, the code will work. For that, Python IS a good language. It's a great academic exercise.
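For example, a textbook-style routine like this (my own toy example) only cares that the elements can be scored and compared, not what type they are:

Python:
def argmax(items, key):
    # Works on any iterable whose elements `key` can score with comparable results.
    best, best_score = None, None
    for item in items:
        score = key(item)
        if best_score is None or score > best_score:
            best, best_score = item, score
    return best

print(argmax([3, -7, 5, 1], key=abs))        # -7
print(argmax(["ab", "abcd", "a"], key=len))  # abcd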

In production code, however...

 
The "lines of code" arguments have always been misleading; the only reason most Python programs are terse is because the complexity is hidden by the standard library. Good libraries make you very, very fast.
I agree, and this means Python, Java, and even some compiled Java-like languages like Go produce programs even more bloated than if you wrote exactly what you needed in C (no matter how long that would take)... and that means slower speeds for your chip to load the program, go through each instruction, etc. Never mind that the file sizes would be very different, because I'm talking about three different architectures here: interpreted script, JIT, and compiled executable. All three of these languages carry way more overhead, especially Java and Python. Things like having way more methods on standard classes like strings and lists. I don't know how Python developers made lists that you can put stuff of multiple types in (at least shell scripts make all variables strings and all programs take strings as input lmao), but you bet your ass that will take up way more space than an array with compile-time length.
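As a rough illustration of the storage difference (my own numbers from CPython; they vary by platform): a Python list stores a pointer per element to a boxed object, while a packed array stores the raw values.

Python:
import sys
from array import array

py_list = list(range(1000))            # 1000 pointers to boxed int objects
packed  = array("q", range(1000))      # 1000 raw 64-bit signed ints

print(sys.getsizeof(py_list))          # ~8 KB just for the pointer table
print(sys.getsizeof(packed))           # ~8 KB total for the actual values
print(sum(sys.getsizeof(i) for i in py_list))  # roughly 28 KB more for the boxed ints themselves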
I'll end this since it's getting too long; it's the same story with JIT. Notions of programs that are years old inheriting future optimizations, theoretical arguments that JIT could outperform AOT; again, not true. In edge cases, sure. All of these runtime model ideas have a cost that always will exceed or negate the performance benefits from the model itself. A fresh AOT compilation is always going to win. JIT has a purpose and place, but it's not going to end AOT.
This is why I like Go. IMO it is very similar to Java in that it has lots of standard libraries ready for you to use, as well as standard type methods and garbage collection, so it keeps the "busywork" coding down (to its detriment, as you guys pointed out), but the runtime is included in the binary, so you don't need to run the program in a Java runtime, which I always thought was stupid and useless, especially because for commercial use you have to license the Java runtime. (And I could never wrap my head around the Java class structure; I always made the IDE write that stuff.) For me it's a good fit for those high-level programming jobs that I think most people would use Python for.
 
This is why I like Go. IMO it is very similar to Java in that it has lots of standard libraries ready for you to use, as well as standard type methods and garbage collection, so it keeps the "busywork" coding down (to its detriment, as you guys pointed out), but the runtime is included in the binary, so you don't need to run the program in a Java runtime, which I always thought was stupid and useless, especially because for commercial use you have to license the Java runtime. (And I could never wrap my head around the Java class structure; I always made the IDE write that stuff.) For me it's a good fit for those high-level programming jobs that I think most people would use Python for.
GC is only a detriment if Go is intending to compete with the fastest systems languages; otherwise it seems to fit perfectly with Go's design goals. In that way it's a great example of where GC is a useful design choice when deciding between pure performance and programmer efficiency. Go has an elegant simplicity that makes it a great candidate for any project where cutting-edge performance isn't a concern (it's not like its performance is bad; it's much faster than the majority of popular languages out there). I think you're right, Go is a perfect counterpoint to Python and its ilk; it is very easy and fast to write simple, performant programs, but it also scales much better when you want to take things further.
 
I don't know how Python developers made lists that you can put stuff of multiple types in
Ruby:
[false, ", ruby i", 5, " the b", 3, "st languag", 3, " evar!"].join
=> "false, ruby i5 the b3st languag3 evar!"
Python:
my_list = ["And it is ", True, " that Python ", 1, "s a cl", [], 5, 3, " second."]
"".join([str(y) for y in my_list])
=> 'And it is True that Python 1s a cl[]53 second.'

If loving hetero lists is wrong, then I don't wanna be right.
 
Ruby:
[false, ", ruby i", 5, " the b", 3, "st languag", 3, " evar!"].join
=> "false, ruby i5 the b3st languag3 evar!"
Python:
my_list = ["And it is ", True, " that Python ", 1, "s a cl", [], 5, 3, " second."]
"".join([str(y) for y in my_list])
=> 'And it is True that Python 1s a cl[]53 second.'

If loving hetero lists is wrong, then I don't wanna be right.
Seriously though, what can you accomplish with a hetero list that cannot also be done (less bug-prone, too) with generic interfaces or with a struct/object-like implementation?
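To put it concretely in Python terms (my own toy example), the struct-like version gives every slot a name and a type that tools can actually check:

Python:
from dataclasses import dataclass

# Heterogeneous-list version: the meaning of each slot lives only in your head.
row = ["temperature", 21.5, True]

# Struct-like version: self-describing, and a type checker can verify every field.
@dataclass
class Measurement:
    label: str
    value: float
    valid: bool

m = Measurement(label="temperature", value=21.5, valid=True)
print(m.label, m.value, m.valid)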
 
This conversation is a total mess. It's conflating typed, untyped, interpreted, AOT compiled and all sorts.
Pedantry is, unfortunately, rather effective at shutting down general programming discussions. It's always possible to point to some edge case (C++ has duck typing in templates, so it's not purely static!) or plead a very specific case of overloaded jargon. This is an informal discussion assessing some qualities that characterize popular dynamic languages (it also spawned a few tangential, unrelated topics). I'd be much more interested if you or anyone else could argue for merits of dynamic languages that we might have overlooked, rather than shifting to language-lawyer battles or arguments over semantics.

You can find languages that have any combination of these.
And so what? The argument was never whether or not the ideas / concepts existed and were implemented in varying degrees, but rather the usefulness or merit of the ideas. What's next, the perennial "revelation" that Lisp has been doing "everything" these other modern languages have been doing since 1958?
 
Anybody else start teaching themselves programming this week?
Dude, there's tons of PDFs on the interwebz... I'm currently doing a fullstack bootcamp because why not? It uses React, CSS, Vue and Node; gotta play with Postgres and SQLite too.
Found a cool program called VS Code, which is miles better than Sublime Text, especially with the Live Server plugin. Real cool shit.
nice legs tranny
found the JS dev.
Is it just me, or is JavaScript like the programmer's equivalent of a botched tranny surgery? I have to say, out of all the languages I've learned, this has to be one of the worst I've encountered. What do you guys think about JS?
THE MARKET DEMANDS IT. So I gotta dive through that autism...
I have vast personal experience with this topic and it's why I get sent on such a 'tism spree when people bring up this notion.
I've been on projects that started with statically typed languages, and then the engineers (myself included) got the bright idea of pushing everything into dynamic scripts. Very, very VERY long story short, the end result was:
  • much slower
  • less maintainable
  • eventually harder to even write for because nobody knew what the fuck the lower calls actually did
  • Code that never hard crashed, but was MORE unstable
  • HELLISH documentation trying to specify what calls did and what they would accept
The final 'fix' we tried was integrating code linters that would automatically flag what was acceptable and what wasn't. And at that point... why the hell are you even using a dynamic language anymore? We eventually bit the bullet and just ported EVERYTHING into the backend. My god, what a world of difference.
The initial appeal of moving things into scripts was that we had a lot of visual elements that we had to tweak constantly, and realtime recompilation was a godsend, but eventually that stopped being useful as well and we discovered better ways of tweaking the backend.
Realtime tweaking like this is really the ONLY advantage of interpreted languages, and it's led me to the conclusion that scripts should not be whole programs, they should be parameters for whole programs. They function best as a form of dependency injection. You need a ball to move around on the screen. Write the ball in a static language. Have it ask a dynamic language about its movement routine, if you even need that.
How old are you, Methuselah? Still, you gave a good take for programmers to think on, not the kind of thing you'd find explained on Stack Overflow, or that any programmer would usually rely on unless he spent a vacation in Guantanamo Bay... However, isn't JS's whole point integrating the backend with the frontend with as little hassle as possible while still being dynamic with the results? Of course, the rest of the lang being more draconian than C++ to deal with is another story, but a lot of that is eased by learning the concepts rather than trying to figure out the lang itself.
To quote John Carmack: "you don't want people who 'aren't really programmers' working on your code. You will suffer for it."
That's why COBOL exists, doesn't it? Also, I'm kinda young in the programming world, but in college we started with pseudocode and then jumped into C++... the horror.
Yes, and luckily all of the more contemporary languages have put great effort into reducing compile times. In the future I expect rational developers to ditch dynamic languages entirely. Using nim c -r / crystal run on small source files as if they were scripts is very usable; systems like NimScript are another attractive option. For the times when you are truly stuck with bad compile times in the C/C++/Rust world, a top layer with a scripting language like Lua / ChaiScript with entry points into the core application can give you some nice flexibility (just keep the scripts short!).
Dude, I think that's a bit optimistic. The most likely thing to happen is some dude trying to create his own lang to spite the one he has a beef with (looking at you, Ruby), for the ego rub of getting a master's/doctorate. Sadly, academia has too much gay shit going on for people to actively think about something good for everyone. Shit, I'd say C++ is the closest, but it's still such a mess to deal with that people often pick their favorite poison. Too much infighting and faggotry hardly allows that to happen; too much work for what is already settled and too little autism to make adjusting things viable.
 
Dude, there's tons of PDFs on the interwebz... I'm currently doing a fullstack bootcamp because why not? It uses React, CSS, Vue and Node; gotta play with Postgres and SQLite too.
Found a cool program called VS Code, which is miles better than Sublime Text, especially with the Live Server plugin. Real cool shit.

found the JS dev.

THE MARKET DEMANDS IT. So I gotta dive through that autism...
How old are you, Methuselah? Still, you gave a good take for programmers to think on, not the kind of thing you'd find explained on Stack Overflow, or that any programmer would usually rely on unless he spent a vacation in Guantanamo Bay... However, isn't JS's whole point integrating the backend with the frontend with as little hassle as possible while still being dynamic with the results? Of course, the rest of the lang being more draconian than C++ to deal with is another story, but a lot of that is eased by learning the concepts rather than trying to figure out the lang itself.
That's why COBOL exists, doesn't it? Also, I'm kinda young in the programming world, but in college we started with pseudocode and then jumped into C++... the horror.
Dude, I think that's a bit optimistic. The most likely thing to happen is some dude trying to create his own lang to spite the one he has a beef with (looking at you, Ruby), for the ego rub of getting a master's/doctorate. Sadly, academia has too much gay shit going on for people to actively think about something good for everyone. Shit, I'd say C++ is the closest, but it's still such a mess to deal with that people often pick their favorite poison. Too much infighting and faggotry hardly allows that to happen; too much work for what is already settled and too little autism to make adjusting things viable.
Okay guys very cool, but whoever's neural network this is, might I suggest not using the A&N board as the training dataset?
 
Dude, there's tons of PDFs on the interwebz...
In his defense, I greatly prefer physical books as well. Just easier to concentrate on and take notes.

However, isn't JS's whole point integrating the backend with the frontend with as little hassle as possible while still being dynamic with the results? Of course, the rest of the lang being more draconian than C++ to deal with is another story, but a lot of that is eased by learning the concepts rather than trying to figure out the lang itself.
At the time JavaScript was designed, I don't know if anybody envisioned just how monstrous and bloated the scripts would become. I don't think anybody's said that JavaScript is a bad language for its original purpose, but it's a bad language for its modern scope, and we desperately need something with more type safety. And to be clear, there ARE versions of that that exist! But it seems like nobody uses them.

That's why COBOL exists, doesn't it? Also, I'm kinda young in the programming world, but in college we started with pseudocode and then jumped into C++... the horror.
I think you might be conflating 'real programmer' with 'low-level programmer'. The two are unrelated. John's point is that scripting languages are often used as a means to let people code who are not competent to write that code, but who think they can because the language looks fluffy and English-y.
 