Programming thread

I stand by it. Before people get foundational knowledge, they'll have no idea what kind of stack they actually need, and they'll have a much harder time learning other languages first. Jumping into the deep end first or starting with something very high level that abstracts away all of the lessons they need to learn isn't setting them up for success.

I don't consider Rust to be the deep end because of the compiler. It's baby's first systems language. Very powerful tool for a learner
 

Personally, if I could do it over again I would have started learning lower level languages sooner, but I think what really matters is that somebody just starts learning something.

I always tell people who are just starting out to pick a language and stick with it until they have a solid grasp of it, then move onto something else.

That's the most important thing for somebody new at coding, IMO: pick one language and stick to it. You'll learn a lot of shit along the way anyway, and then you can move onto other languages once you have a solid grasp of the first one you picked.
 
lower level languages
I really think this to be a misnomer, outside of academic discussions. The machine code is low level, as are thin layers above it, but past incredibly small languages like Forth it becomes a pissing match.
This abstraction is cringe, but this abstraction is based and redpilled.
The most important things for a programmer to learn, and which most fail to do, are to learn how to learn and to learn how to think. A calculator with the ability to store short programs and therefore automate basic tasks would be better than half the shit I see people recommend. Throwing newcomers into the unacceptable hellscape of modern programming isn't the way to do it, unless someone wants to scare them off programming forever.
 
A calculator with the ability to store short programs and therefore automate basic tasks would be better than half the shit I see people recommend.
Another good one is making a cash register that calculates exact change. I made something like that, then tried making a game where you counterfeit money and buy stuff and it calculates the exact change, and learned a lot in the process. Made it easier to go on to more complicated things and start learning stuff that will make me money.
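For anyone who wants to try the same exercise, a bare-bones sketch of the change-calculating part might look something like this (the denominations and cent values are just an example):
Code:
# Greedy change-making: fine for standard coin systems like US denominations.
DENOMINATIONS = [2000, 1000, 500, 100, 25, 10, 5, 1]  # values in cents

def make_change(price_cents, paid_cents):
    """Return {denomination: count} for the change owed."""
    remaining = paid_cents - price_cents
    if remaining < 0:
        raise ValueError("customer did not pay enough")
    change = {}
    for coin in DENOMINATIONS:
        count, remaining = divmod(remaining, coin)
        if count:
            change[coin] = count
    return change

print(make_change(1337, 2000))  # {500: 1, 100: 1, 25: 2, 10: 1, 1: 3}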

Hell, I set up Godot to start learning C# :story: I'm not even planning on being a game developer, I just think it's a fun way to learn.
 
  • Like
Reactions: UERISIMILITUDO
I meant programming an old HP calculator or something like that.
 
  • Like
Reactions: AbendCode12
I really think this to be a misnomer, outside of academic discussions. The machine code is low level, as are thin layers above it, but past incredibly small languages like Forth it becomes a pissing match.
There are some things you simply can not do in languages which don't have features like memory address pointers. Brainfuck might be 'lower level' than C, but who cares?
 
  • Agree
Reactions: UERISIMILITUDO
There are some things you simply can not do in languages which don't have features like memory address pointers.
This is usually only used to refer to desirable qualities, but there are some things one simply can't do in languages which have features like manual address manipulation, such as be free from all memory errors. Lisp features what are usually called references, which have all useful features of raw addresses and lack disadvantages like use after free, buffer overflow, and similar nonsense some computers had eliminated in their hardware before the C language existed.
Brainfuck might be 'lower level' than C, but who cares?
Brainfuck is the perfect example of simplicity fetishism. It's easy to implement and easy to learn. Why wouldn't anyone want to use it? Lift this argument up a little, and we get the common arguments for the C language.
 
Linux is a first-class .NET citizen (at least on paper)
Very much on paper. I tried out C# using .NET Core and remember being impressed at how easy it was to translate instructions for Visual Studio into editing the underlying textual build files with Vim but as soon as it came to LINQ in the tutorial material it was all ogre. Maybe things have changed since then but that was also my experience with PowerShell Core on Linux.
I agree it's not a bad language to start out with, but I question if starting out with a language so entangled in an ecosystem like .NET could lead to some headaches. There are a lot of reasons to shit on Python as a first language, but to give credit where it's due, any retard and their dog can install Python on just about any machine and run a script. Perhaps starting with a more generalized (for lack of a better term) language may be beneficial in the long run for someone starting out.
I have no idea why anyone would shit on Python as a first language, or at least why anyone would shit hard.

To start with, I don't think anyone should be forced to start with having to manage memory because it isn't the 80s or early 90s.

Also, beyond semantic whitespace, most of the constructs you learn in Python will carry over to other widely used languages in some form or another. It is in the ALGOL family of languages, like most others. When you see attempts at language-neutral pseudocode implementations of algorithms in books or articles that kind of look like ALGOL (or perhaps more accurately fellow ALGOL descendant Pascal), they will readily carry over into Python. Python supports imperative, object-oriented and functional features. The object system might not be the best and was a clear afterthought in Python's design but most of the concepts will carry over to other OO languages, at least once certain warts are removed. The functional features will not be adequate for anyone who insists on Haskell and the like for everything (though oddly enough list comprehensions in Python come from Haskell) but that crowd frankly is not worth listening to. lambdas kind of suck but otherwise there is a decent core of FP features in Python.
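Roughly what I mean by the constructs carrying over, as a sketch of my own: the same toy task written imperatively, with a class, and with a comprehension plus a lambda, all in plain Python.
Code:
# The same "square the even numbers" task in three styles Python supports.
nums = [1, 2, 3, 4, 5, 6]

# Imperative: loop and accumulate, like the ALGOL/Pascal-ish pseudocode in textbooks.
squares = []
for n in nums:
    if n % 2 == 0:
        squares.append(n * n)

# Object-oriented: wrap the behaviour in a class.
class Squarer:
    def __init__(self, values):
        self.values = values
    def even_squares(self):
        return [v * v for v in self.values if v % 2 == 0]

# Functional-ish: a comprehension-free version with lambda, filter and map.
squares_fp = list(map(lambda v: v * v, filter(lambda v: v % 2 == 0, nums)))

assert squares == Squarer(nums).even_squares() == squares_fp == [4, 16, 36]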

Python can also be used as a beginner's language and then used indefinitely afterwards. (Arguably the same if not more in some cases applies to JavaScript but JS is a lot more mired in complete fucking garbage.) I can't give direct recommendations on where to learn Python as a beginner anymore because that was in high school and the resources I used are now forgotten about and/or completely obsolete. I remember implementing factorial(n) recursively years ago as a simple exercise and now I am using Python to learn time series forecasting. (After that material I understand it gets even easier.) The basic Python data science ecosystem is a bit clunkier than tidyverse in R but R is maybe a worse first language than even JavaScript. Plus you can use Python (effectively) for so much more than R. I honestly can't think of a better first language in le current year 2024.
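(For reference, the factorial exercise is about as small as a first recursive program gets:)
Code:
def factorial(n):
    """Classic beginner exercise: n! defined recursively."""
    if n <= 1:          # base case stops the recursion
        return 1
    return n * factorial(n - 1)

print(factorial(5))  # 120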
 
Python encourages this insanity of using 5 billion libraries per project where what the programmer wanted to write is then reduced to twenty lines of code that branch out into all the dependencies. Not only does that teach an aspiring programmer absolutely nothing, it's also a genuinely shit way to do things since you are beholden to external libraries that change all the time and often have bizarre interactions, side effects and bugs, and you end up literally not understanding how to do anything or how anything works. Granted, this is not an intrinsic python problem per se, but using python like that is strongly encouraged by any beginner material you can read on python. Then you end up with a bunch of people that are "python experts" and call themselves "experienced programmers" but run for the hills the first time they encounter a problem a prefab library/framework/engine can't fix (or are asked to do anything in any language that does not have exactly the same rich ecosystem). Copy-pasting some lines together from Stack Overflow/ChatGPT or Ctrl-F'ing an API reference isn't programming. Also load bearing whitespace is completely mental.

If somebody really understands the logic that goes into writing a computer program, I'd argue the language does not matter at all. To get there mentally is the problem, and obviously many people just do not, which is interestingly demonstrated by things like that godot drama. People care so much about specific engines and frameworks because they don't have the mental model of how these things actually work and mesh with systems and basically just get to their goals by memorizing very specific abstractions. It's the difference between understanding a language fluently in a way that you can appreciate language-specific puns and humor and knowing how to order a coffee by repeating a specific set of sounds. Both will get you somewhere but one is a lot more useful than the other. (The "language" here is not Python, Rust, or C either.) A proper programmer should not really have to worry about language. Outside of a few things where I was forced to, I never used python, but of course I can read and write python programs. That's where you want somebody to get as a programmer and I personally don't think python is a good language to get a person there because it encourages bad habits.

I learned programming in a time where it was easy for one person with a book to understand the entire system they are working with down to CPU specific quirks. It was an amazing way to understand computers, logic and to learn the pros and cons of different programming languages and even programming philosophies. (It started my torrid love affair with Forth which is a language I always returned to up to today - I also recommended "Thinking Forth" in this very thread - not even to learn Forth, but to learn to apply logic to problem solving. How many of you do conditional code with arithmetic only, not using IF constructs?) I don't really know how you could replicate that experience today. Even the simplest $5 MCUs are much more complex on a hardware level and have a lot more periphery and complex CPUs with complex instruction sets than these computers ever had.
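(For anyone wondering what "conditional code with arithmetic only" even means, here's one minimal Python rendering of the trick; a sketch of the idea, not how you'd necessarily write it for real:)
Code:
# Selecting between two values without an IF: multiply by the condition.
def clamp_to_zero(x):
    # In Python, (x > 0) is a bool that behaves as 1 or 0 in arithmetic,
    # so this returns x when x > 0 and 0 otherwise -- no branch needed.
    return x * (x > 0)

print(clamp_to_zero(7), clamp_to_zero(-3))  # 7 0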
 
Python encourages this insanity of using 5 billion libraries per project where what the programmer wanted to write is then reduced to twenty lines of code that branch out into all the dependencies.
In the case of Python, I find this is very subfield-dependent. The scientific computing part of the Python world (think Numpy/Scipy) has a much stronger tendency towards megalibs, and the stdlib is pretty extensive. In contrast, Rust and JavaScript have library sprawl all over, no matter where you look.
 
Tbh. I suspect There is no such thing as a good coding language, I think computers are Andy Ditch levels of retarded and I'm only doing coding so I can work from home because I fell for the "work blue collar and be le heckin based tough guy" meme like a retard and realized you actually just get treated like a fucking nigger and your boss will literally skim money off your paycheck.
 
I like to say that if programming were cooking, people wouldn't have yet learned not to shit where they eat. None of the languages I use are perfect, although APL comes close, but some languages are undeniably better than others. Very importantly, I treat my chosen languages as the floor underneath my feet, and refuse to use anything beneath them. Common Lisp, APL, and Ada aren't perfect, but the ways they solve problems differently make me a better programmer even when I don't use them, and I use their designs to judge other languages and ways of operating. I'll give an example I like to think about.

So, we're familiar with memory indirection. Commonly, this is implemented in the instruction, but the Lisp Machines allowed an address to be indirect. So, a memory access could cause another memory access, and so on. They used this to make it easier to move around memory for garbage collection, and later compacted it when it became a problem. What happens when the memory indirection is recursive? They had an amusing optimization: The machine would follow the indirections eight times or so before running an algorithm to check that it weren't in an endless loop; they did this in the microcode. This prevented the common case from slowing down the machine, and prevented the machine from hanging because of a memory error.
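In rough Python pseudo-terms, the follow-a-few-then-check idea looks something like this; my own sketch of the behaviour as described, not the actual microcode, with "@42" standing in for an indirect cell:
Code:
FAST_FOLLOWS = 8  # follow this many indirections before suspecting a loop

def resolve(memory, address):
    """Follow indirect cells until a real value is reached."""
    seen = set()
    hops = 0
    while isinstance(memory[address], str) and memory[address].startswith("@"):
        address = int(memory[address][1:])   # "@42" means "look at cell 42"
        hops += 1
        if hops >= FAST_FOLLOWS:             # only now pay for cycle detection
            if address in seen:
                raise RuntimeError("circular indirection")
            seen.add(address)
    return memory[address]

mem = {0: "@1", 1: "@2", 2: 99}
print(resolve(mem, 0))  # 99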

Now, compare this to the UNIX attitude. The filesystem has symbolic links, and those can describe neverending relationships. Here's what they did:
Too many symbolic links were encountered in translating the pathname, or the O_NOFOLLOW flag was specified and the target is a symbolic link.
Why "too many" and not something else here? Many UNIXen have a hardcoded limit to the amount of symbolic links that are allowed, before the implementation bails out. Those fuckers only solved half the problem, didn't they? Once someone is aware of a proper way to do things, like not shitting where he eats, it becomes horrifying to see this dumbassery everywhere, and impossible not to notice it.
 
Now, compare this to the UNIX attitude. The filesystem has symbolic links, and those can describe neverending relationships. Here's what they did:
https://man.openbsd.org/open#ELOOP Why "too many" and not something else here? Many UNIXen have a hardcoded limit to the amount of symbolic links that are allowed, before the implementation bails out. Those fuckers only solved half the problem, didn't they? Once someone is aware of a proper way to do things, like not shitting where he eats, it becomes horrifying to see this dumbassery everywhere, and impossible not to notice it.
POSIX: https://pubs.opengroup.org/onlinepubs/009695399/basedefs/limits.h.html
{SYMLOOP_MAX}
Maximum number of symbolic links that can be reliably traversed in the resolution of a pathname in the absence of a loop.
Minimum Acceptable Value: {_POSIX_SYMLOOP_MAX}
...
{_POSIX_SYMLOOP_MAX}
The number of symbolic links that can be traversed in the resolution of a pathname in the absence of a loop.
Value: 8
So also 8 as it happens, but potentially more depending on the implementation.
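You can watch the limit from userland, too; on a Unix-ish box something like this trips the same error the man page is describing (the actual resolution and counting happens in the kernel):
Code:
import errno, os, tempfile

# Create a symlink that points at itself, then try to open it.
with tempfile.TemporaryDirectory() as d:
    loop = os.path.join(d, "loop")
    os.symlink(loop, loop)          # loop -> loop
    try:
        open(loop)
    except OSError as e:
        # On Linux/BSD this is ELOOP: "Too many levels of symbolic links"
        print(e.errno == errno.ELOOP, e.strerror)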
It's not clear what point you're trying to make here.
 
Not only does that teach an aspiring programmer absolutely nothing, it's also a genuinely shit way to do things since you are beholden to external libraries that change all the time and often have bizarre interactions, side effects and bugs and you end up literally not understanding how to do anything or how anything works.
Python has a debugger and you can go through your code and library code step by step and examine all the interactions and edit variables on the fly. It's the best starter language because it's easy to understand how babby is formed. It's easy to install (on Windows, and on Linux you already have it).
mango_chapter_1_bro.png
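For the newbies: dropping into the debugger is literally one line. Something like this pauses in pdb right where you put it, and from there you can step, print and edit variables:
Code:
def buggy_total(prices):
    total = 0
    for p in prices:
        breakpoint()   # opens pdb here: 'n' steps, 'p total' prints, 'c' continues
        total += p
    return total

buggy_total([3, 5, 8])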

Python libraries are very good for boxing the shit you don't (yet) know but want/need to work with, understanding of which must be built on a stronk foundation the beginner programmer doesn't have.

At the end of the first programming lesson in my life, I walked out wide-eyed and said, "Mom, mom, check this out, the largest value longint can hold is TWO BILLION ONE HUNDRED FORTY SEVEN MILLION FOUR HUNDRED EIGHTY THREE THOUSAND SIX HUNDRED FORTY-SEVEN". When the second lesson started, the teacher said, "OK, this is awkward, but why did none of you say anything? You're in Intro to Computer Use and I gave you Intro to Programming! Let's start over. In 1833, Charles Babbage..." When we finally got to programming, it took me (the class, really) a while to understand the concept of variables, because the teacher couldn't understand how stupid we were.
"Our first program. Let's weite a program that calculates the sum of two integer numbers, let's say 3 and 5."
<...>
"Here's something funny," I said, "in my program to calculate 5+3, it doesn't always output 8, it outputs the sum of the numbers I type in!"

Now people have a vague understanding what a browser does when they go to a website. Python's `requests` ("for humans") allows them to automate the same vague shit in their code, it's a new thing they could not do but now can. A guess-the-number game is one of babby's first programs, and it doesn't take much to be able to play guess-the-number with a telegram bot without knowing async, event loops, and other Satanic doctrine.
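That's the whole pitch of `requests`: the vague thing the browser does fits in a couple of lines.
Code:
import requests  # third-party: pip install requests

resp = requests.get("https://example.com")
print(resp.status_code)   # 200 if the site answered
print(resp.text[:80])     # first bit of the HTML the "browser" fetched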

I'm not telling people to be pajeets, copypasting found/generated code into their projects because someone on the internets says it does what they want. (DO NOT DO THIS, THIS WAY LIES LOLCOW GIRL CODER BRAINROT.) I'm not telling them to be javascript developers, importing a lolbrary for every little thing. But beginner programmers should not be discouraged from developing inside or outside of a box and relying on libraries the contents of which they don't understand. Everyone abstracts away something, unless they're God or Terry Davis. "The bird is okay even though he doesn’t understand the world. You’re that bird looking at the monitor, and you’re thinking to yourself, ‘I can figure this out.’"

The deal with this thread is that a lot of you here are programming geniuses, and you think, why, if I were a kid again, and maybe if an adult taught me programming, I would've liked to start like so. But this is the Farms, and aspiring programmers on here -- 18+ and never programmed -- are definitely not child geniuses.
 
Python encourages this insanity of using 5 billion libraries per project where what the programmer wanted to write is then reduced to twenty lines of code that branch out into all the dependencies. Not only does that teach an aspiring programmer absolutely nothing, it's also a genuinely shit way to do things since you are beholden to external libraries that change all the time and often have bizarre interactions, side effects and bugs, and you end up literally not understanding how to do anything or how anything works. Granted, this is not an intrinsic python problem per se, but using python like that is strongly encouraged by any beginner material you can read on python. Then you end up with a bunch of people that are "python experts" and call themselves "experienced programmers" but run for the hills the first time they encounter a problem a prefab library/framework/engine can't fix (or are asked to do anything in any language that does not have exactly the same rich ecosystem). Copy-pasting some lines together from Stack Overflow/ChatGPT or Ctrl-F'ing an API reference isn't programming.
I'm not entirely sure where this is coming from but I suspect it has to do with web development and particularly the microframework Flask, which will end up having tons of dependencies like other microframeworks. I haven't done any web development with Python yet however and don't know exactly how egregious it gets. Is there anything like is-odd / is-even and left-pad in Python?

In contrast, here are the current dependencies in the venv I created for the Python time series forecasting book:
Code:
asttokens                  fonttools-4.54.1.dist-info  matplotlib_inline-0.1.7.dist-info  patsy-0.5.6.dist-info            pure_eval                              scikit_learn.libs              threadpoolctl.py
asttokens-2.4.1.dist-info  IPython                     mpl_toolkits                       pexpect                          pure_eval-0.2.3.dist-info              scipy                          tqdm
contourpy                  ipython-8.28.0.dist-info    numpy                              pexpect-4.9.0.dist-info          __pycache__                            scipy-1.14.1.dist-info         tqdm-4.66.5.dist-info
contourpy-1.3.0.dist-info  jedi                        numpy-2.1.2.dist-info              PIL                              pygments                               scipy.libs                     traitlets
cycler                     jedi-0.19.1.dist-info       numpy.libs                         pillow-10.4.0.dist-info          pygments-2.18.0.dist-info              six-1.16.0.dist-info           traitlets-5.14.3.dist-info
cycler-0.12.1.dist-info    joblib                      packaging                          pillow.libs                      pylab.py                               six.py                         tzdata
dateutil                   joblib-1.4.2.dist-info      packaging-24.1.dist-info           pip                              pyparsing                              sklearn                        tzdata-2024.2.dist-info
decorator-5.1.1.dist-info  kiwisolver                  pandas                             pip-24.2.dist-info               pyparsing-3.1.4.dist-info              stack_data                     wcwidth
decorator.py               kiwisolver-1.4.7.dist-info  pandas-2.2.3.dist-info             prompt_toolkit                   python_dateutil-2.9.0.post0.dist-info  stack_data-0.6.3.dist-info     wcwidth-0.2.13.dist-info
executing                  matplotlib                  parso                              prompt_toolkit-3.0.48.dist-info  pytz                                   statsmodels
executing-2.1.0.dist-info  matplotlib-3.9.2.dist-info  parso-0.8.4.dist-info              ptyprocess                       pytz-2024.2.dist-info                  statsmodels-0.14.4.dist-info
fontTools                  matplotlib_inline           patsy                              ptyprocess-0.7.0.dist-info       scikit_learn-1.5.2.dist-info           threadpoolctl-3.5.0.dist-info
There are quite a few of them and I don't recognize all of them, but what I do recognize is very non-trivial. pexpect is a Python implementation of Expect, the Tcl framework for automating command-line processes that was written in 1990, before JavaScript came and ruined everything and spoiled programmers. dateutil, pytz and tzdata have to do with time; anything to do with time and space in this world is riddled with gotchas. And of course writing all of numpy, pandas, matplotlib, scipy, scikit-learn and statsmodels required huge amounts of knowledge and experience over years, even decades. Here is a quote from the philosopher and mathematician Alfred North Whitehead that is so important in programming:
It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.
Also load bearing whitespace is completely mental.
I don't see why. It just makes you do what you should be doing anyway, plus you don't need to use brackets anymore.
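Compare for yourself: the indentation you would write anyway simply is the block structure.
Code:
def describe(n):
    if n % 2 == 0:          # the indented lines ARE the if-block; no braces to match
        kind = "even"
    else:
        kind = "odd"
    return f"{n} is {kind}"

print(describe(4))  # 4 is even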
If somebody really understands the logic that goes into writing a computer program, I'd argue the language does not matter at all.
To a certain extent I agree but some constructs or ways of tackling a given problem are pretty esoteric or specialized and then language does matter. Many years ago I had a go at Prolog, which was made to carry out first-order logic inference specifically. It is Turing-complete and, technically, you can use it for anything. (That is what you call the Turing tarpit.) Using it for anything other than its original intended purpose was an exercise in masochism for me, and that seems to be a very common experience with Prolog. I can see it being used as a DSL in the same way that SQL is used for databases but not for any other purpose. There are also other concepts like continuations that have much wider applications but are extremely brain-melting.
I learned programming in a time where it was easy for one person with a book to understand the entire system they are working with down to CPU specific quirks. It was an amazing way to understand computers, logic and to learn the pros and cons of different programming languages and even programming philosophies.
I was there for the tail end of that and I agree with some of the overall sentiments here but I also would never want to go back to using Borland Turbo C.
(It started my torrid love affair with Forth which is a language I always returned to up to today - I also recommended "Thinking Forth" in this very thread - not even to learn Forth, but to learn to apply logic to problem solving. How many of you do conditional code with arithmetic only, not using IF constructs?)
The first part of the nand2tetris course on Coursera made us implement a stack-based VM which I understand is similar to Forth, though it's been so long. I think the nand2tetris idea is overall a very good one but both the book itself and the course material struck me as a bit rushed.
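A toy flavour of that kind of stack machine fits in a screenful of Python. This is my own sketch of the general idea, nowhere near the actual nand2tetris VM spec:
Code:
def run(program):
    """Interpret a tiny postfix/stack program: numbers push, words operate."""
    stack = []
    ops = {
        "add": lambda a, b: a + b,
        "sub": lambda a, b: a - b,
        "mul": lambda a, b: a * b,
    }
    for token in program.split():
        if token in ops:
            b, a = stack.pop(), stack.pop()   # note the operand order
            stack.append(ops[token](a, b))
        elif token == "dup":
            stack.append(stack[-1])
        else:
            stack.append(int(token))
    return stack

print(run("2 3 add 4 mul"))   # (2 + 3) * 4 -> [20]
print(run("5 dup mul"))       # [25]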
I don't really know how you could replicate that experience today. Even the simplest $5 MCUs are much more complex on a hardware level and have a lot more periphery and complex CPUs with complex instruction sets than these computers ever had.
Your best bet for the most simple widely used CPU architecture to work with today is probably MIPS and then after that ARM isn't quite as complicated as x86/x64.
I fell for the "work blue collar and be le heckin based tough guy" meme like a retard and realized you actually just get treated like a fucking nigger and your boss will literally skim money off your paycheck.
That's wage theft.
Python has a debugger and you can go through your code and library code step by step and examine all the interactions and edit variables on the fly.
That's one more reminder I need to start using pdb instead of print() statements. Last time I used an actual debugger was in JavaScript. Seriously.
Now people have a vague understanding what a browser does when they go to a website. Python's `requests` ("for humans") allows them to automate the same vague shit in their code
That's analogous to what Expect does. Years ago I used Perl's version of Expect to invoke passwd from a Gtk GUI without the huge security risk of the password ending up in the process list as plaintext.
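pexpect lets you do the same dance from Python. Roughly like this; an untested sketch, the exact prompt wording varies by distro/PAM, and obviously only point it at your own account:
Code:
import pexpect  # third-party: pip install pexpect

def change_password(old_pw, new_pw):
    """Drive the interactive passwd(1) prompts without putting secrets in argv."""
    child = pexpect.spawn("passwd")
    child.expect("[Cc]urrent.*password:")   # prompt text is system-dependent
    child.sendline(old_pw)
    child.expect("[Nn]ew.*password:")
    child.sendline(new_pw)
    child.expect("[Rr]etype.*password:")
    child.sendline(new_pw)
    child.expect(pexpect.EOF)
    return child.before.decode()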
A guess-the-number game is one of babby's first programs, and it doesn't take much to be able to play guess-the-number with a telegram bot without knowing async, event loops, and other Satanic doctrine.
It is the doctrine of the Blessed Machine!
40k-adeptus-mechanicus-chad-understands-the-weakness-of-his-flesh.jpg
 
This is terrible advice.
It could be worse. I remember my dad trying to teach me JCL and RPG/400 when I was a little boy because he thought that would be a good way to learn how to do shit with computers.
I learned programming in a time where it was easy for one person with a book to understand the entire system they are working with down to CPU specific quirks. It was an amazing way to understand computers, logic and to learn the pros and cons of different programming languages and even programming philosophies. (It started my torrid love affair with Forth which is a language I always returned to up to today - I also recommended "Thinking Forth" in this very thread - not even to learn Forth, but to learn to apply logic to problem solving. How many of you do conditional code with arithmetic only, not using IF constructs?) I don't really know how you could replicate that experience today. Even the simplest $5 MCUs are much more complex on a hardware level and have a lot more periphery and complex CPUs with complex instruction sets than these computers ever had.
Hello fellow Forth programmer. Leo Brodie's books are fucking amazing, and I also think that they're slept on way too much. The two Brodie Forth books and Loeliger's Threaded Interpretive Languages really sort of opened up a whole new world when I started programming. As far as simple CPUs go, I think you can still get eZ80s, which kind of get you most of the way to a really simple computing experience, but you're right that even the 8-bit micros like PICs have got the whole fucking kitchen sink in terms of peripherals and shit that needs to be considered (and it's not like there's an easy way to build a general purpose interactive computer with them anyways). Maybe the answer is to build something with reimplementations of the old school shit on softcores on an FPGA, and let people learn on that?
Python can also be used as a beginner's language and then used indefinitely afterwards. (Arguably the same if not more in some cases applies to JavaScript but JS is a lot more mired in complete fucking garbage.)
Python the language is not a terrible starting language (though the ecosystem was and is increasingly total fucking AIDs), but I have come to really resent this line of thought, because as AmpleApricots noted, you wind up with people that not only can't really think outside the box when things go wrong (and they always do), but also are instilled with this false sense of security and ideas about how the computer works. A number of years back, before Docker was a thing, I was in charge of a system that basically allowed several thousand engineers to concurrently run Python "scripts" (which were full blown applications) to do things across a large number of datacenters; basically I was the janny that maintained the runtime environment and distributed job manager, as well as helping end users understand why things went wrong. One of the things that really impressed itself upon me is how many guns, clubs and knives Python presents to the user as being "totally safe bro", that if you understood any better, you'd know are pants on head fucking retarded. An example: users would do shit like spin up a ton of threads in a program, and then go invoke multiprocessing all while utilizing shit loads of bad third party libraries, and wonder why things shit themselves or just got mysteriously hung (and none of them understood the consequences of mixing bare ass fork(2) with threads and resources like mutexes and file descriptors, because "the language handles it bro"). I just saw way too much code, even from "senior engineers", that you would have thought Stevie Wonder had banged out, and people who were just content to never learn any better because it sort of worked enough and "made shit easy," even though we pissed away absurd numbers of engineering hours just to keep things running.
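For what it's worth, the usual way to dodge that particular fork-plus-threads footgun is to not fork a threaded process at all. A minimal sketch of the safer configuration:
Code:
import multiprocessing as mp

def work(x):
    return x * x

if __name__ == "__main__":
    # "spawn" starts workers as fresh interpreters instead of fork()ing the
    # current (possibly multi-threaded) process, so locks and file descriptors
    # held by other threads aren't silently duplicated into the children.
    mp.set_start_method("spawn")
    with mp.Pool(4) as pool:
        print(pool.map(work, range(8)))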

Python just encourages a whole lot of brain rot, both in terms of absolute shit performance and really loosey-goosey coding, as well as hiding away too many dangerous details that you can't ever really actually safely ignore. It's been the language of choice for Dunning-Krugerands for at least as long as I've been paying attention. It's also increasingly taking over from Java for the "wastes absurd amounts of memory award", but people want to use it for everything, when it really really blows for anything larger than small projects with good disciplined developers. I say this as someone whose paycheck largely comes from writing and cleaning up other people's Python. We really should push people towards learning and using other tools when they're better suited (which is often).

I'm not entirely sure where this is coming from but I suspect it has to do with web development and particularly the microframework Flask, which will end up having tons of dependencies like other microframeworks.

I'm going to sperg some more: Armin is a mini-Lennart and a retarded faggot. He basically took one look at bottle, got mad when Marcel Hellkamp wouldn't introduce Armin's shitass WSGI library werkzeug, and decided to reimplement bottle (and called it Flask) with all of the good shit removed and whole lot of really terrible shit turned to 11 (like module level app creation by abusing the import system and extremely stupid and prolific globals). I've never seen a Pallets project that wasn't a smug pile of the stinkiest and poorest horseshit known to man, and it's a travesty that they've seen so much uptake.
 