Programming thread

This is why I think the concept of "learning a language" is ultimately pointless and why a lot of students struggle with programming courses. Congratulations you made a Python program that asks your name and repeats it to you. That's boring as fuck, who cares.

If I was designing a programming course it'd just be head first into using Python to analyze real world data sets. How many people diagnosed with type 2 diabetes require a below-the-knee amputation? Of those, what's the mean time between diagnosis and amputation? Does the patient's age at the time of diagnosis influence this? etc.

I'm of the mindset that if you give people real problems to solve they're going to be much more interested and they'll accidentally learn to program along the way.
You vastly overestimate people's ability to problem-solve in real time. I've encountered people who had problems understanding that putting one line of code above another line of code means that it will transform some data in memory before the next line does. And they aren't necessarily commutative.

If you can get them to do "input name -> print name," you can at least try to impress upon them that putting those things the other way around wouldn't have worked.
 
Are normies just incapable of understanding?
Yes. This is why CS1 is typically a weed out course in most universities - people who enroll will either clear it with flying colors or completely flunk out, very little in-between.

People who are programmers underestimate how difficult something as simple as variable assignment is conceptually if you don't have the right abstract thinking skills. Kinda like The Breakfast Question...

"Okay, x = 10, print x, so the screen will show 10, right?"
"Right!"
"Alright, but what if x wasn't 10?"
"But x is 10"
 
I'm of the mindset that if you give people real problems to solve they're going to be much more interested and they'll accidentally learn to program along the way.
Giving people actual projects to do is definitely a good thing, but only if you have enough existing understanding to undertake it.
It might also end up with you just combining a bunch of stuff from stackoverflow into something that works without ever getting an actual understanding of what you were doing.

Plus the task's problem needs to be something that can be understood by all the people taking the class. You don't want to gatekeep the course by forcing people to have a background in queer feminist anarchist theory. So it needs to be generic.
If I was forced to make a website frontend in the course I would've roped immediately.

That's why I don't think the task is that bad. It's extremely basic but still gives you a chance to actually make something real that you can run. If someone is excited about the subject, that should be exciting enough.

If you can get them to do "input name -> print name," you can at least try to impress upon them that putting those things the other way around wouldn't have worked.
That's why, IMO, procedural/imperative languages are what should be taught first, because that is the most general and simplest paradigm. Plus it is not too abstract in that it aligns with how a computer actually works and will help with understanding computers better. BASIC honestly seems like it would have been one of the better languages to learn with.

People who are programmers underestimate how difficult something as simple as variable assignment is conceptually if you don't have the right abstract thinking skills.
I don't think teachers place enough importance on the abstract/pure concepts, like state, and on the idea that what you're doing is writing a list of steps that solves the problem by changing that state.
 
But by the same token, a proper discussion of std::cout will include namespaces, the overloading of the << operator, what << means when it's not overloaded, whatever the fuck std::cout is in the first place (a global object??), how std::endl flushes the buffer...
I just mean understanding it semantically, as like A << B puts B into A and it just kind of "works", whereas printf("my number %g", B) has stumbling blocks like whoops! B was actually typed as an int and now there's a garbage value in the console why's that happen? With a type safe function like the stream operator the student probably won't encounter any of the rough edges early on, so they won't even think to ask any of these questions they're unprepared for.
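
Roughly the kind of trap I mean, as a minimal sketch (the names here are made up, not from any actual course material):
Code:
#include <cstdio>
#include <iostream>

int main() {
    int b = 42;

    // Type-safe: operator<< is overloaded per type, so the compiler
    // picks the right output routine and this "just works".
    std::cout << "my number " << b << "\n";

    // Not type-safe: %g promises a double, but b is an int.
    // Formally undefined behavior; in practice it tends to print garbage,
    // which is exactly the "why'd that happen?" moment for a beginner.
    std::printf("my number %g\n", b);
}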

I'm looking forward to std::print in C++23, which should make everybody happy.
Yeah that is definitely a huge improvement. Would be cool if C++ eventually got C# style string interpolation literals e.g. $"Array value at {i} = {myArray[i]}". C# has actually come up with a kind of esoteric api that allows interpolated string literals to avoid heap allocations, I expect C++ could work out something similar as well.
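
For anyone who hasn't seen it, the C++23 version looks roughly like this (myArray and i are just placeholder names, mirroring the C# example above):
Code:
#include <cstddef>
#include <print>   // C++23
#include <vector>

int main() {
    std::vector<int> myArray{10, 20, 30};
    for (std::size_t i = 0; i < myArray.size(); ++i) {
        // Same spirit as C#'s $"Array value at {i} = {myArray[i]}",
        // just with positional {} placeholders instead of inline expressions.
        std::print("Array value at {} = {}\n", i, myArray[i]);
    }
}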

If I was designing a programming course it'd just be head first into using Python to analyze real world data sets. How many people diagnosed with type 2 diabetes require a below-the-knee amputation? Of those, what's the mean time between diagnosis and amputation? Does the patient's age at the time of diagnosis influence this? etc.
Sure, but you have to actually be able to use Python before that's even possible. Yes, once you've learned a language you'll probably have enough of a foundation that you can just go straight into more practical applications in another language, but unfortunately you do need the boring principles first.

I'm of the mindset that if you give people real problems to solve they're going to be much more interested and they'll accidentally learn to program along the way.
In a similar vein though, I think Unity can be pretty good as a practical learning tool. Very visual, WYSIWYG, interactive. Helps keep the learner engaged. That said, they really should have a grasp on C# fundamentals (or other programming experience) before they touch Unity. I've seen a lot of Unity noobs get into weird cargo cult practices because they don't actually really know how the language works.

EDIT:
If only there were some way for the computer to tell you you did something wrong.
Well aware, but that only serves to highlight the issue with the non-type-safe API. It's something you have to think about in C; it's something that just works in C++. Again, not saying the API isn't perfectly usable, just that it's obtuse for beginners.
 
I just mean understanding it semantically, as like A << B puts B into A and it just kind of "works", whereas printf("my number %g", B) has stumbling blocks like whoops! B was actually typed as an int and now there's a garbage value in the console why's that happen?
If only there were some way for the computer to tell you you did something wrong.
Code:
moo.c:7:20: warning: format ‘%g’ expects argument of type ‘double’, but argument 2 has type ‘int’ [-Wformat=]
    7 | printf("my number %g\n",i);
      |                   ~^    ~
      |                    |    |
      |                    |    int
      |                    double
      |                   %d
 
This is why I think the concept of "learning a language" is ultimately pointless and why a lot of students struggle with programming courses.
This is a very important, yet rarely discussed, step in becoming a true programmer. The discussion of which language is "best", which language you "should learn", etc. absolutely dominates discussion among novice programmers and it's quite harmful. I spent so many years trying to figure out which language was best, learning that language, hating it, giving up, goto 10. All the Dunning-Kruger nerds will tell you to learn C or C++ and every other language is garbage for casuals, which scares away all but the most autistic would-be coders.

Rule 1 of programming should be "the best language is the one you can tolerate long enough to make something functional with". Worrying about optimization or CPU cycles or whatever is rule, like, 70.
 
which language you "should learn",
I get the impression that people are subconsciously comparing them to natural languages. I'm sure if you asked them directly, they'd say "oh, obviously not, it's just a tool". But they really treat it like they're learning German or French.
 
I get the impression that people are subconsciously comparing them to natural languages. I'm sure if you asked them directly, they'd say "oh, obviously not, it's just a tool". But they really treat it like they're learning German or French.
I dunno, if you're around tradesmen enough you'll hear the same conversations about actual tools enough times to make you want to become a hermit. Everyone's got their favorite brand, and it's obviously the best one because it's got 1% more torque than this other brand or its chuck tightens 5% faster or whatever the fuck. Often they don't even know why theirs is the best and every other brand is terrible. It's just a "feeling". A lot of programming language loyalty feels very much the same.

I do agree that people act like learning a programming language is a huge, life-changing deal though. As if you only have enough space in your brain to learn one or two, so you'd better make your choices count. In reality it takes about a week or two to learn a language enough to apply it on a practical level, because 90% of them are mutually intelligible save for some syntactical quirks. I've learned and unlearned numerous languages. They only take up space in your memory while you use them, both in your computer and in your brain.
 
I do agree that people act like learning a programming language is a huge, life-changing deal though. As if you only have enough space in your brain to learn one or two, so you'd better make your choices count. In reality it takes about a week or two to learn a language enough to apply it on a practical level, because 90% of them are mutually intelligible save for some syntactical quirks. I've learned and unlearned numerous languages. They only take up space in your memory while you use them, both in your computer and in your brain.
I started out with BASIC and Assembler. My first real language was C and then Perl. Today my personal stuff gets written in Python or C. But for work I don't much care, I hate them all equally and all the common ones are enough alike that, as you said, it doesn't much matter. Sure it may take me longer to figure out the 37-deep chain of libraries that is causing some dev's Golang to fail, but it's all the same premise.

I see the same thing in Linux distributions and those are even more similar than programming languages. "We can't use Ubuntu, it's totally different from RedHat"... um... "dnf: command not found" oh, right, Ubuntu, "apt install foo", and I hate all of them more or less equally too.
 
Except I'd have stronger feelings about my screwdriver if it called me a racist for preferring "master" to "main". Or if the hardware store made me promise to respect trans rights before selling me a socket wrench.
>"Do you swear to fight for marginalized folx like transbians and black lives?"
>click "agree"
>start an ESG++ project
>while (true) {
>print("nigger")
>print("tranny")
>}
>run


Revolt against the modern world.
 
Yes. This is why CS1 is typically a weed out course in most universities - people who enroll will either clear it with flying colors or completely flunk out, very little in-between.

People who are programmers underestimate how difficult something as simple as variable assignment is conceptually if you don't have the right abstract thinking skills. Kinda like The Breakfast Question...

"Okay, x = 10, print x, so the screen will show 10, right?"
"Right!"
"Alright, but what if x wasn't 10?"
"But x is 10"
Had your students taken algebra yet? The idea of variables as 'placeholders' should have been something they already learned by then. If this is a 12 year old, OK, might take nudging to get it. By high school the idea should have already been learned. I'm a complete retard and got the connection that programming can look/behave like an equation without even having to think about it.
 
Had your students taken algebra yet? The idea of variables as 'placeholders' should have been something they already learned by then. If this is a 12 year old, OK, might take nudging to get it. By high school the idea should have already been learned. I'm a complete retard and got the connection that programming can look/behave like an equation without even having to think about it.
The problem with that is then they see something like "x = x + 1" and their brain breaks because "x can't be two different things." You have to deprogram them of what they actually learned about variables in algebra.
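
What finally seems to land is spelling out that = means "work out the right-hand side using the current values, then overwrite the thing on the left" (tiny sketch, nothing course-specific about it):
Code:
#include <iostream>

int main() {
    int x = 10;              // x starts out holding 10
    x = x + 1;               // read the current x (10), add 1, overwrite x with 11
    std::cout << x << "\n";  // prints 11
    // In algebra "x = x + 1" has no solution; here it's just a step that changes state.
}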
 
The problem with that is then they see something like "x = x + 1" and their brain breaks because "x can't be two different things." You have to deprogram them of what they actually learned about variables in algebra.
You know what, I hadn't even considered that. For whatever reason (probably because I am a retard) accepting that data can sort of travel through a program, being transformed in different ways at different times, was never a brain break moment.
 
@pursuitsnail Most people cannot do algebra, or at least not without great difficulty. They wander through it in school but don't understand it, even though we use algebra in day-to-day life. The concept that the variable is a store of value that can be manipulated, and isn't its literal self, frustrates them. I think symbolic logic is a special skill that people either have or they don't. It's also a skill that doesn't necessarily imply other skills/intelligence.

@Kosher Salt Well, that's more of an unfortunate result of the notation we use: the equals sign isn't a comparison operator (==), it represents a completely different concept (defining or overwriting). In a math proof it would be something like "Suppose x=y." I've never seen someone write a proof where a variable is overwritten with a function of itself, there'd be no real use for such a thing, but you do see people write "Suppose x=" multiple times in a proof for different steps/cases.
 
Well, that's more of an unfortunate result of the notation we use: the equals sign isn't a comparison operator (==), it represents a completely different concept (defining or overwriting). In a math proof it would be something like "Suppose x=y." I've never seen someone write a proof where a variable is overwritten with a function of itself, there'd be no real use for such a thing, but you do see people write "Suppose x=" multiple times in a proof for different steps/cases.
Overwriting is just fundamentally different from anything that exists in algebra. The whole point in algebra is that once you've determined the value of x, you can rely on x being that value later, and overwriting breaks this.

"Suppose x=" is generally a hypothetical where the purpose is to determine that x cannot equal the supposed value (proof by contradiction). This is almost like overwriting, but it's really just a hypothetical recursion into an imaginary world where x is different. You explore that world just long enough to find that it's broken, and then you pop back into reality where x isn't that, never was, and never could be.
 
Overwriting is just fundamentally different from anything that exists in algebra. The whole point in algebra is that once you've determined the value of x, you can rely on x being that value later, and overwriting breaks this.

"Suppose x=" is generally a hypothetical where the purpose is to determine that x cannot equal the supposed value (proof by contradiction). This is almost like overwriting, but it's really just a hypothetical recursion into an imaginary world where x is different. You explore that world just long enough to find that it's broken, and then you pop back into reality where x isn't that, never was, and never could be.
It's not just used in contradiction, suppose is the beginning of pretty much any proof. Suppose this variable is within this set, or has this property. Then this would happen.

That too is something a lot of people struggle with, they reason off of example (I can do this 1000 times in a row and it works, so it must be true) and don't get that it doesn't, by the standards of math, count. I had a particularly frustrating time trying to teach logic to a classmate, there'd be a proposition, he'd find one example of the proposition, and then declare "true." (Dude was supposed to be a philosophy major, so it's hard to explain how he'd fuck up so badly and so consistently.)
 
It's not just used in contradiction, suppose is the beginning of pretty much any proof. Suppose this variable is within this set, or has this property. Then this would happen.
But that's not like overwriting at all, because you're just supposing x is something and then proving it is that. Supposing that x is something and proving it isn't is a little bit more abstract.

Really though that level of abstraction is the stuff that a lot of people are going to struggle with.
I had a particularly frustrating time trying to teach logic to a classmate, there'd be a proposition, he'd find one example of the proposition, and then declare "true." (Dude was supposed to be a philosophy major, so it's hard to explain how he'd fuck up so badly and so consistently.)
Philosophy 101 literally is basic logic, lol. So many people just can't think logically.
 