Programming thread

Unless you have a specific reason to use 64-bit (like needing a fuckhuge assload of memory), you should really write for 32-bit. There are a lot of useful tools that haven't 'caught up' with 64-bit yet, despite it being around forever.
There's all kinds of 32 bit software I use that crashes or burns my CPU simply because it runs out of ram or it can't make any use of my cores/threads. You can chalk that up to just poor optimization, but what I know is that 32 bit Skyrim makes my CPU burn and 64 bit Skyrim doesn't.
 
There's all kinds of 32 bit software I use that crashes or burns my CPU simply because it runs out of ram or it can't make any use of my cores/threads.
there's no limitation on how many threads a 32 bit program can use. Are you writing something that needs more than 4 GB of ram at once?
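If it helps, here's a toy Python sketch (completely made up, nothing to do with any real program) that prints the interpreter's pointer width and spins up a handful of threads. It behaves the same under a 32-bit or a 64-bit build; the only thing 32-bit really costs you is address space (roughly 2-4 GB per process, depending on the OS), not threads or cores:

    # Toy demo: thread count has nothing to do with pointer width.
    import struct
    import threading

    def worker(n):
        # trivial busy work so each thread actually does something
        total = sum(range(n))
        print(threading.current_thread().name, "summed to", total)

    # prints 32 on a 32-bit build, 64 on a 64-bit build
    print("pointer width:", struct.calcsize("P") * 8, "bits")

    threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(8)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()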
 
there's no limitation on how many threads a 32 bit program can use. Are you writing something that needs more than 4 GB of ram at once?
Not at the moment, but I want to know how it's done for posterity when I do.

Also I was not aware that 32 bit software could use threads. I always assumed that threading and 64 bit were a package deal.
 
I absolutely hate Java due to its verbosity and the way it locks you into doing things in one very specific way.
Java is probably the only language I've used that feels like it was designed by bureaucrats.
Incidentally, these weaknesses are often praised as features that would actually help develop good practices: low visibility of control flow. To elaborate: statements are separated by newlines instead of a clear marker like a semicolon; code blocks are delimited by indentation (*); there's no dedicated entry function like main() in C-like languages;
Could you elaborate on this? I don't like python's syntactically significant whitespace because I find it easier to forget to indent as opposed to using a semicolon or curly brace, but I've never thought python's flow control was unclear. New lines for new statements and mandatory levels of indentation for control flow seem intuitive for new programmers. Additionally, python's lack of mandatory main seems like it would make it easy to explain the purpose of functions. I'm not asserting that I'm right, since I'm sure your experience in teaching provides a good foundation for these opinions; I just want to know why my gut feelings here are incorrect.
You probably know the reaction people have: "ew, why so many parens everywhere".
I'd say an equally off-putting feature of Scheme is the use of Polish (prefix) notation. Most people recoil when they see something like (* (+ 1 2) (+ 3 4)), which is just (1 + 2) * (3 + 4) in infix.

Referring back to the previous discussion on IDEs, I think people overestimate the extent to which languages like Java are impossible to write without them. If you want to see a language that is genuinely impossible to use without its IDE, check out Pharo. It's based on Smalltalk, incredibly difficult to write without its coupled IDE, equally hard to deploy, and bootstrapping a new image for it is irritating.
 
Could you elaborate on this? I don't like python's syntactically significant whitespace because I find it easier to forget to indent as opposed to using a semicolon or curly brace, but I've never thought python's flow control was unclear. New lines for new statements and mandatory levels of indentation for control flow seem intuitive for new programmers.
It's hard to explain in general; you probably have to see it to understand it. I have one example on hand (more on that below), but for now let's just say that no, indentation in general turns out not to be intuitive to newbies. It was a good try and I applaud the experiment, but the results are in and it's a failure.

By the way, look at how many (most?) newbies structure indentation in non-Python languages. It's random and atrocious. Now put these people into Python programming, where the interpreter stops them from doing this. Will this teach them proper code structure? That's the idea, but it turns out many (most?) of them just do random stuff until the code passes the interpreter.

The example mentioned: suppose you have a loop with its block of code underneath. It doesn't matter what the loop does, but there's a line in the middle of the block that doesn't need to be in the loop (it's enough to run it once after the loop finishes). I tell the student to move this line out of the loop, and what he or she does is unindent the line by one level, breaking the loop body into two blocks. This of course makes the interpreter balk, and I still have to explain that this isn't what "take it out of the loop" means.
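Roughly what that looks like, for anyone who hasn't watched it happen live (a toy reconstruction in Python, not anyone's actual assignment):

    prices = [3, 5, 7]
    log = []
    total = 0

    # Original: the print in the middle only needs to happen once, after the loop.
    for price in prices:
        total += price
        print("running total:", total)   # <- "move this out of the loop"
        log.append(price)

    # What the student does: unindents just that one line, which splits the
    # loop body in two, and the next still-indented line trips the parser:
    #
    #     for price in prices:
    #         total += price
    #     print("running total:", total)
    #         log.append(price)          # IndentationError: unexpected indent
    #
    # What "take it out of the loop" actually meant:
    total = 0
    log = []
    for price in prices:
        total += price
        log.append(price)
    print("running total:", total)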

This and other issues are compounded when you get into loop nesting. And let me restate my past position on learning Python - I strongly believed it was The Way To Go™, but reality proved me painfully wrong.

Additionally, python's lack of mandatory main seems like it would make it easy to explain the purpose of functions.
I don't see this angle, care to elaborate?

The problem I have with the lack of a distinct main() shows up exactly when I want to introduce functions. Suddenly there's nonlocality and scoping involved, whereas before almost everything was global. And inevitably you'll get global statements intertwined with function definitions willy-nilly.
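A made-up but typical shape of the scripts you end up looking at once functions appear (not any particular student's code):

    # Top-level statements and definitions interleaved, no main() to anchor anything.
    name = "Sam"

    def greet():
        # reads the global defined above; works only because of file ordering
        print("hello,", name)

    greet()

    count = 3          # more top-level state shows up mid-file

    def repeat_greet():
        for _ in range(count):
            greet()

    repeat_greet()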
 
I had to use scheme at university for a semester. My conclusion at the end of it was "If this is what they call a 'functional language', I want nothing to do with any of it".
I hear from many programmers that they were traumatized by Scheme at school. Care to elaborate? I came to it of my own volition and never got bad vibes from it.
 
I hear from many programmers that they were traumatized by Scheme at school. Care to elaborate? I came to it of my own volition and never got bad vibes from it.
It's been a few years so I don't remember it perfectly, but I remember thinking at every step "wow, this is just artificially difficult". The operator order sucks. The (((parens))) legitimately fuck with my eyes in ways that give me a headache. In fact my own professor couldn't write it correctly during lectures because the parens messed with his eyes too. Maintaining state is unintuitive compared to every other language (yes, I know that's kind of the point). The entire "flow" of the language reads like fragmented nonsense and it's hard to make it "englishy", which is very important in a real production environment.

I guess if I had to put it in one sentence, the language seems to be built for the purpose of being easy to build an interpreter for, not for a human to use.
 
I don't see this angle, care to elaborate?

The problem I have with the lack of a distinct main() shows up exactly when I want to introduce functions. Suddenly there's nonlocality and scoping involved, whereas before almost everything was global. And inevitably you'll get global statements intertwined with function definitions willy-nilly.
My thought process was that, if you first introduce functions as "pieces of code you want to reuse in the same program" (amending that later in the Python course to explain that you can use functions from other files via imports), students will immediately see the visual difference between the "reusable" code and the "regular" code, because the former is under a function header and the latter isn't. Also, if you cover scoping with loops before introducing functions, students might be able to intuit that functions also have their own scope, because both are indented. Contrast this with explaining to students why public static void main is different from static void foo.

However, based on the other parts of your post, it seems I couldn't be more wrong. Must have been hard to resist the urge to smash the students' faces into their screens until they really start reading and understanding the interpreter errors.
 
Contrast this with explaining to students why public static void main is different from static void foo.
To be fair, no one explains that to 1st-year students... or at least shouldn't. My teacher handwaved all the boilerplate code in C++ in college and said "you'll learn about this later, for now just know you have to type this", and I did the same when I was teaching: I'd assign take-home problems with the boilerplate shit already filled out. Even more so now, since there are a bunch of tools and online interpreters (CoderPad, HackerRank) that'll give you starting boilerplate for problems.

I think all this talk about the "best beginner language to learn" is autistic. No matter what, it's all going to look weird, and if a student drops out because they flip the fuck out at seeing public static void main, then they're probably not built for programming anyway. Every language has its weird quirks, and the downsides won't be immediately apparent if all you're doing is hello world or an is-it-prime check. The main goals with a first programming language should be: 1) is it popular enough that simple answers can be found quickly online, 2) is it modern and powerful enough that the student can start building cool stuff with it as early as possible, 3) is it standard enough to teach conditions, state, scoping, loops, functions, and classes.
 
It's been a few years so I don't remember it perfectly, but I remember thinking at every step "wow, this is just artificially difficult". The operator order sucks. The (((parens))) legitimately fuck with my eyes in ways that give me a headache. In fact my own professor couldn't write it correctly during lectures because the parens messed with his eyes too. Maintaining state is unintuitive compared to every other language (yes, I know that's kind of the point). The entire "flow" of the language reads like fragmented nonsense and it's hard to make it "englishy", which is very important in a real production environment.

I guess if I had to put it in one sentence, the language seems to be built for the purpose of being easy to build an interpreter for, not for a human to use.
Were you using Racket? Was it your first programming course? Scheme is an educational tool but it's incomplete without something like the Racket IDE.
The professor stumbling over the parens is pretty terrible, I'll give you that.
I ask about prior experience because I don't think anyone with zero experience will care about maintaining state in other ways. "just carry it with you in another variable bro" is fine for them.
Regarding the flow and "englishy-ness", I have a hard time seeing your point. Can't you say it always flows "upwards"? As for its natural-language properties, an example would be nice. There's also a case against making a language "natural". Just look at SQL.
I will say car and cdr are stupid and lisps should abandon them yesterday.
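For anyone wondering what "carry it with you in another variable" looks like in practice, here's a toy sketch (in Python for neutrality, since not everyone reads Scheme): the running value rides along as an argument instead of being mutated in place.

    # FP-style: the state (the running total) is passed along as an argument.
    def total(prices, acc=0):
        if not prices:
            return acc
        return total(prices[1:], acc + prices[0])

    # The mutate-a-variable-in-a-loop version of the same thing.
    def total_loop(prices):
        acc = 0
        for p in prices:
            acc += p
        return acc

    print(total([3, 5, 7]), total_loop([3, 5, 7]))   # 15 15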
 
Is it the situation, or are they all irrevocably brain-damaged by squaring the circle of how a brain works to begin with? I've found functional programming is harder to get adopted among more experienced programmers. They will fight it tooth and nail.
In FP-land, who cares about stuff like indices, mutability, or other annoyances?
Could be I'm the weird one.

Those aren't annoyances, they're implementation details, and I don't mean that in the dismissive way that fart huffing abstraction retards mean it. If you want to write fast software, you need to be able to strip away the abstractions that obscure what's actually going on. The more abstractions you can chuck, the better. There is no alternative because "sufficiently smart compilers" do not exist, and possibly never will.

You're not weird, but you are a wannabe mathematician who can't get over the fact that computers aren't the kind of mathematical object that you wish they were. FP weenie shit has its place, but it's not at all the same kind of discipline as what actual programmers do, and I'm tired of you retards pretending that you're part of our conversation and acting smug about it. You're like an architect bitching at a machinist for using dangerous tools when the architect just has other people do the heavy lifting for him: All you're doing is pissing off the machinist because you're too self-centered to see that your field isn't the fucking messiah.

Cross-discipline pollination is fine, but you fuckers take your evangelism too fucking far.
 
Those aren't annoyances, they're implementation details, and I don't mean that in the dismissive way that fart huffing abstraction retards mean it. If you want to write fast software, you need to be able to strip away the abstractions that obscure what's actually going on. The more abstractions you can chuck, the better. There is no alternative because "sufficiently smart compilers" do not exist, and possibly never will.

You're not weird, but you are a wannabe mathematician who can't get over the fact that computers aren't the kind of mathematical object that you wish they were. FP weenie shit has its place, but it's not at all the same kind of discipline as what actual programmers do, and I'm tired of you retards pretending that you're part of our conversation and acting smug about it. You're like an architect bitching at a machinist for using dangerous tools when the architect just has other people do the heavy lifting for him: All you're doing is pissing off the machinist because you're too self-centered to see that your field isn't the fucking messiah.

Cross-discipline pollination is fine, but you fuckers take your evangelism too fucking far.
Who were the first computer programmers?
The discipline of electron pushing and register twiddling is very far from even programming in C. x86 assembly is far from what's going on inside the processor, because it's an out-of-order machine that translates every instruction to microcode anyway. You have no control over the cache or the memory controller. Unless you're programming microcontrollers for a living, your programming is already far from the metal anyway.
Abstraction has its place, and I am now of the opinion I want to work at the highest level of abstraction acceptable within performance requirements. Anything else is masturbation. Take game programming as an arbitrary example - assuming I want to hit 60 fps and I managed it with plenty of high level abstractions and mostly functional code, who cares? Why does it make you angry?
 
computers aren't the kind of mathematical object that you wish they were
Everyone knows that computers are actually single-tape, single-head Universal Turing Machines.
This may sound inconvenient, but I have a veritable army of very smart grad students working on achieving performance comparable or superior to the most popular languages of today. Plus they get course credit for this (and publications if they're lucky), so everyone wins.
 
Who were the first computer programmers?
The discipline of electron pushing and register twiddling is very far from even programming in C. x86 assembly is far from what's going on inside the processor, because it's an out-of-order machine that translates every instruction to microcode anyway. You have no control over the cache or the memory controller. Unless you're programming microcontrollers for a living, your programming is already far from the metal anyway.
Abstraction has its place, and I am now of the opinion I want to work at the highest level of abstraction acceptable within performance requirements. Anything else is masturbation. Take game programming as an arbitrary example - assuming I want to hit 60 fps and I managed it with plenty of high level abstractions and mostly functional code, who cares? Why does it make you angry?

I'm not saying abstraction isn't useful or necessary. I'm saying I detest your "let's program at the highest level possible" attitude when so often that just means wasteful academic masturbation for no benefit whatsoever. I'm angry that people like you keep pushing this degenerate meme and diluting the sensible voices. You are pollution.

If you manage to get your game running at a decent speed using shitty programming techniques, cool. Now imagine how much more you could have done with that computing time if you hadn't wasted it by pretending your dogma made your life easier. Maybe you did everything you wanted to do, or maybe you made compromises because you have no idea how to write performant code. Your example doesn't particularly interest me because I'm more interested in the general phenomenon of FP weenies and OOPsies retards spreading dogma and fear to the younger generation and ensuring they won't be able to write performant software either.

Everyone knows that computers are actually single-tape, single-head Universal Turing Machines.
This may sound inconvenient, but I have a veritable army of very smart grad students working on achieving performance comparable or superior to the most popular languages of today. Plus they get course credit for this (and publications if they're lucky), so everyone wins.

It's not inconvenient to me at all because I think "the most popular languages of today" are all slow pieces of shit.
 
Why is it that Python is seemingly always favoured over Ruby? Is it simply critical mass or is there some advantage to Python that I've been missing all this time? Except for Rails and web design (the niggercattle of coding) it seems like Ruby's pretty well ignored.
 
Why is it that Python is seemingly always favoured over Ruby? Is it simply critical mass or is there some advantage to Python that I've been missing all this time? Except for Rails and web design (the niggercattle of coding) it seems like Ruby's pretty well ignored.
Who would want to write Perl-style code to get Python functionality?
 
Why is it that Python is seemingly always favoured over Ruby? Is it simply critical mass or is there some advantage to Python that I've been missing all this time? Except for Rails and web design (the niggercattle of coding) it seems like Ruby's pretty well ignored.
Python is the number one preferred language for data science. That has driven much of its popularity in the past 10 years. Python and RoR are different beasts for different situations. You can build a webserver in Python via Flask or whatever, but fuck you for trying if you do, because you should have just picked something better suited to the job. Not saying you can't with Python, but have fun finding a Python contractor to support it: I can throw a rock and hit 3 ASP.NET MVC or Java Spring pajeets.
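To be clear, it's not that it's hard - a Flask hello-world is about this much (a minimal sketch, fine for a demo or an internal tool) - it's that you'll be the one stuck maintaining it:

    # Minimal Flask app: fine as a toy, but you own it forever.
    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def index():
        return "hello from python"

    if __name__ == "__main__":
        app.run(port=8080)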

The best use of RoR I've seen is leveraging its scaffolding for quick prototyping. When you're with a client and 5 minutes later you have something working that they can type into, their eyes light up and they suddenly know what they do and don't want.
 
Why is it that Python is seemingly always favoured over Ruby? Is it simply critical mass or is there some advantage to Python that I've been missing all this time? Except for Rails and web design (the niggercattle of coding) it seems like Ruby's pretty well ignored.

Ideally Python should be pretty well ignored, too.
 