Programming

Certain people will tell you that variable names shouldn't be "verbose". I say the hell with that! Programming "efficiency" (development time, not execution efficiency) isn't as important as cleanly readable code, in my opinion. These are the same people who think the five minutes a year you save using vim is worth it.

There's nothing I hate more than when I go into someone's code and see shit like "tmp" or "buf" or fucking single-letter variables like "a, x, p". Certain coding standards actually enforce this (looking square at you, Linux kernel naming conventions). Your code should be easy to read: writing out "buffer" isn't that hard, and it saves me the time it takes to figure out whether "buf" is a buffer or an abbreviation someone thought was clever.
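
A contrived before/after, just to illustrate (names invented):

Code:
/* Which of these would you rather debug at 3 AM? */
int rd(int s, char *b, int n);

int read_packet(int socket_fd, char *buffer, int buffer_length);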
 
Certain people will tell you that variable names shouldn't be "verbose". I say the hell with that! Programming "efficiency" (development time, not execution efficiency) isn't as important as cleanly readable code, in my opinion. These are the same people who think the five minutes a year you save using vim is worth it.

There's nothing I hate more than when I go into someone's code and see shit like "tmp" or "buf" or fucking single-letter variables like "a, x, p". Certain coding standards actually enforce this (looking square at you, Linux kernel naming conventions). Your code should be easy to read: writing out "buffer" isn't that hard, and it saves me the time it takes to figure out whether "buf" is a buffer or an abbreviation someone thought was clever.
Oh, absolutely.

Honestly, I strive to reduce the amount of commenting I do to zero by putting the burden of describing what my code does on the code itself. I can't do that 100%, of course, but it's still a huge improvement over just commenting the hell out of everything. An inaccurate comment does more harm than ten accurate comments do good, and comments will gradually become inaccurate unless maintained.
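
A tiny made-up example of the kind of rot I mean:

Code:
/* The comment went stale when the unit changed; nobody updated it. */
int t = 5000;  /* timeout in seconds */

/* Self-describing: there is no comment to rot. */
int timeout_milliseconds = 5000;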
 
Oh, absolutely.

Honestly, I strive to reduce the amount of commenting I do to zero by putting the burden of describing what my code does on the code itself. I can't do that 100%, of course, but it's still a huge improvement over just commenting the hell out of everything. An inaccurate comment does more harm than ten accurate comments do good, and comments will gradually become inaccurate unless maintained.
Definitely. I've looked back at scripts I've written in the past and realised that, because my commentary was written early on, some of it is so out of date as to be misleading. That irks me, but as long as the variables and functions (like GetUserName) are nice and clear, it's not such an issue.
 
I hate pointers so much-- why are they so counterintuitive?

Also, what are our thoughts on the qubit and the remodeling of computers with quantum rules instead of the Boolean bit of "1" and "0"?
 
I hate pointers so much-- why are they so counterintuitive?

Pointers (especially two- and three-stars out) can be rather unintuitive and even once you become experienced with them there are some cases where you need to be careful and think about what you're doing. Once you understand pointers, you begin to wonder if there's anything that can be done to make them more intuitive while preserving the same level of functionality. Java does away with pointers, instead using references, but you sacrifice things like pointer arithmetic.

I think the syntax can be improved upon. The C family has a variety of ways you can define a pointer. You can do:

"int *n": which is a way of saying "define a pointer named N, and make it of int type".

Or you can do:

"int* n": which is a way of saying "define an int pointer named N".

These mean the same thing and have the same practical effect. I prefer the second type, which works in most but not all cases (function pointers will still need to be defined with the asterisk on the identifier to my knowledge). The reason I prefer it is because it is clearer that you are defining an int pointer, not an int. However, almost all C style guides I've seen use the first notation, and I'm sure there's some reason why that someone can explain to me.
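
To make the two spellings concrete, a minimal sketch (the multi-declaration gotcha in the comments is, I gather, the usual argument for the first style):

Code:
#include <stdio.h>

int main(void) {
    int value = 42;

    int *a = &value;  /* "define a pointer named a, and make it of int type" */
    int* b = &value;  /* "define an int pointer named b" -- same meaning */

    /* The usual case for the first style: in "int* c, d;" only c is a
       pointer; d is a plain int, because the asterisk binds to the name. */

    /* Function pointers keep the asterisk on the identifier either way. */
    int (*print)(const char *, ...) = printf;
    print("%d %d\n", *a, *b);

    return 0;
}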
 
There's nothing I hate more than when I go into someone's code and see shit like "tmp" or "buf" or fucking single-letter variables like "a, x, p". Certain coding standards actually enforce this (looking square at you, Linux kernel naming conventions). Your code should be easy to read: writing out "buffer" isn't that hard, and it saves me the time it takes to figure out whether "buf" is a buffer or an abbreviation someone thought was clever.
I've had a few jobs where I've been brought in to update code that clients outsourced to India or Russia and no one else could make sense of. In one case they had an entire CMS coded by some Russian where every single variable was $a1, $a2, $b1 and shit like that. They would have saved a ton of money had they gone with a competent programmer in the first place instead of saving a couple of bucks on initial development and being forced to bring in the firm I was working with to figure out what the hell was going on.
 
Also, what are our thoughts on the qubit and the remodeling of computers with quantum rules instead of the Boolean bit of "1" and "0"?
I suspect that quantum computing will require a retarded amount of energy, so it'll probably be a long time before it's practical.

The biggest thing on my mind about quantum computing is that we're going to need new encryption algorithms. That'll suck. See, even if quantum computing isn't really affordable for your average person, all of our encryption algorithms will need to be written as if it is affordable. That means we're going to need to have encryption strong enough to defeat quantum brute forcing, but we're not going to get many of the benefits of quantum computing. I don't know, maybe it'll make web services cheaper?
Pointers (especially two- and three-stars out) can be rather unintuitive and even once you become experienced with them there are some cases where you need to be careful and think about what you're doing. Once you understand pointers, you begin to wonder if there's anything that can be done to make them more intuitive while preserving the same level of functionality.
Yeah, nah, I don't think pointers can really be improved on and still be pointers.
Java does away with pointers, instead using references, but you sacrifice things like pointer arithmetic.
Eh, I don't know if that's a huge loss. Like, the most common use of pointer arithmetic I can think of is multi-dimensional arrays. And you can do that in managed languages by just using an integer offset and an ordinary array, just like you'd do with a pointer.

Or if you want to mess with individual bytes, pointer arithmetic might be useful there. But you can get the same effect with other, perhaps more verbose techniques. Like, when I mess with individual bytes, that's a tiny fraction of my program. If the safe version is slightly more complicated but in exchange I get type safety everywhere else, I consider it a worthy tradeoff.
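
Rough sketch of what I mean, in C for comparison; a managed language keeps the first loop verbatim and simply has no spelling for the second:

Code:
#include <stdio.h>

#define ROWS 3
#define COLS 4

int main(void) {
    int grid[ROWS * COLS];

    /* Integer-offset indexing: works identically on a plain array in
       Java or C#, no pointers required. */
    for (int row = 0; row < ROWS; row++)
        for (int col = 0; col < COLS; col++)
            grid[row * COLS + col] = row * 10 + col;

    /* The pointer-arithmetic spelling of the same read. */
    int *p = grid;
    printf("%d\n", *(p + 2 * COLS + 3));  /* 23, same as grid[2 * COLS + 3] */

    return 0;
}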
I think the syntax can be improved upon. The C family has a variety of ways you can define a pointer. You can do:

"int *n": which is a way of saying "define a pointer named N, and make it of int type".

Or you can do:

"int* n": which is a way of saying "define an int pointer named N".

These mean the same thing and have the same practical effect. I prefer the second type, which works in most but not all cases (function pointers will still need to be defined with the asterisk on the identifier to my knowledge). The reason I prefer it is because it is clearer that you are defining an int pointer, not an int. However, almost all C style guides I've seen use the first notation, and I'm sure there's some reason why that someone can explain to me.
This has driven me nuts forever. I have never been convinced by explanations for the former.

With variables, I want to be able to ask myself "what type is n?". And looking at the second example, the type of n is readily apparent. It's an "int*".

I think the argument for the alternative is that they're asking "what type is *n?". But personally I don't buy that argument because there are two syntaxes to access pointer data in C (* and []), and I'll freely switch between the two, depending on the situation.
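
For the record, the interchangeability is by definition; a minimal sketch:

Code:
#include <stdio.h>

int main(void) {
    int data[] = { 10, 20, 30 };
    int *p = data;

    /* p[i] is defined as *(p + i), so all four expressions read the same slot. */
    printf("%d %d %d %d\n", p[2], *(p + 2), data[2], *(data + 2));

    return 0;
}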
I've had a few jobs where I've been brought in to update code that clients outsourced to India or Russia and no one else could make sense of. In one case they had an entire CMS coded by some Russian where every single variable was $a1, $a2, $b1 and shit like that. They would have saved a ton of money had they gone with a competent programmer in the first place instead of saving a couple of bucks on initial development and being forced to bring in the firm I was working with to figure out what the hell was going on.
My company is learning that lesson. We hired Indian programmers to do our mobile app and they're braindead and incompetent. At best they can string together some UI elements, an HTTP library, and a JSON parser. Anything more complex than that is beyond them. For example, we need to aggressively cache data because we're dealing with shitty third-world internet connections. Explaining the caching setup I'm providing is extremely difficult. The language barrier doesn't help.

I think they tried to give me documentation once in MS Word.

Like, my manager (went to school for economics or some shit) is getting frustrated to the point where he's learning to program so he and I can replace the Indians when our contract is up.



So, I was thinking about this...

Is programming really a "deep thoughts" kind of thread?

Like, the reason why I'm asking is because programming is a very practical topic. If you get too theoretical, you just start rambling about spergy programming philosophy that no one can relate to. But if you get too practical, then you're basically writing a vocational course. (And that's boring in its own way.)

So, I was thinking, would anyone be interested in maybe writing bots for this: http://theaigames.com/competitions/warlight-ai-challenge/rules

Nothing formal, necessarily. I don't know if they're still running the contest either. But when I found it, I was intrigued by the idea. I don't know, maybe we could demonstrate/discuss programming techniques, while still having some practical demonstrations of them in the form of warlight bots.

Right now I'm just fucking around implementing the protocol for my own bot, for my own experimentation. Someone wrote a very simple engine/viewer that lets you watch two bots play.

(By the way, warlight the game is actually pretty cool. I was never into board games as a kid, and Risk confused the hell out of me. But after playing warlight, I'm realizing that there was a really interesting element to Risk that I can now appreciate as an adult.)
 
I've spent my holiday break fucking around in 65816 assembly, which is an obscure 6502-derived instruction set used primarily in the SNES. I've never really gone in-depth with assembly language programming, but if you have a great macro assembler like WLA DX, it can be really elegant, and I kind of dig the level of fine-grained control you have over the application you're developing. Certainly a lost art in the era of rapid application development.

This is worth a read.
 
I've spent my holiday break fucking around in 65816 assembly, which is an obscure 6502-derived instruction set used primarily in the SNES. I've never really gone in-depth with assembly language programming, but if you have a great macro assembler like WLA DX, it can be really elegant, and I kind of dig the level of fine-grained control you have over the application you're developing. Certainly a lost art in the era of rapid application development.

This is worth a read.


I toyed around with some 6502 last summer. I want to make an Atari 2600 game someday.

This is a great resource I used: https://skilldrick.github.io/easy6502/. The "Snake 6502" game is a great learning resource.

I'm mostly a Java programmer, but I've dabbled in about a dozen languages.

Anyone else here like LISP? I've been writing programs in Clojure recently and I'm falling in love with the functional style. I've never seen programs so beautiful. Lazy evaluation feels like reaching out and grabbing infinity.

LISP makes even something as boring as an exponentiation function look pretty (at least it would if the spoiler mechanism here lets me preserve tabs...):

(defn pow "Raises x to n." [x n] (reduce * (repeat n x)))
 
I toyed around with some 6502 last summer. I want to make an Atari 2600 game someday.

This is a great resource I used: https://skilldrick.github.io/easy6502/. The "Snake 6502" game is a great learning resource.

The 6502 is a really neat old architecture. It is also the only processor I ever had even a rudimentary grasp of assembly on, a skill that has long since disappeared, sadly.
 
I can program in a few dialects of BASIC. I suppose if this were 1986, it'd be a very in-demand skill.
 
I toyed around with some 6502 last summer. I want to make an Atari 2600 game someday.

This is a great resource I used: https://skilldrick.github.io/easy6502/. The "Snake 6502" game is a great learning resource.

I'm mostly a Java programmer, but I've dabbled in about a dozen languages.

Anyone else here like LISP? I've been writing programs in Clojure recently and I'm falling in love with the functional style. I've never seen programs so beautiful. Lazy evaluation feels like reaching out and grabbing infinity.

LISP makes even something as boring as an exponentiation function look pretty (at least it would if the spoiler mechanism here lets me preserve tabs...):

(defn pow "Raises x to n." [x n] (reduce * (repeat n x)))
Lisp is a terrible language for anything but AI research and pure math.
 
Anyone else here like LISP? I've been writing programs in Clojure recently and I'm falling in love with the functional style. I've never seen programs so beautiful. Lazy evaluation feels like reaching out and grabbing infinity.
Lisp is good stuff. I haven't done anything in Clojure myself (I prefer Scheme), but if I ever had to do anything on Android, I'd use Clojure.

Lisp's pioneered some very interesting stuff. Everyone knows about the functional aspects, but there's a lot more.

For example, CLOS* is the most elegant OOP system I've ever worked with. Ruby's Smalltalk-esque setup was pretty neat, but still left something to be desired.

This is a good read on a typical CLOS-style OOP system. I tried to write my own once, and it was a pretty illuminating experience.

* Well, I've never actually worked in CLOS itself. Just CLOS-inspired OOP systems.
LISP makes even something as boring as an exponentiation function look pretty (at least it would if the spoiler mechanism here lets me preserve tabs...):
There's a code tag.
 
(defn pow "Raises x to n." [x n] (reduce * (repeat n x)))
Lol, check out Python's version.
Code:
def raise_to(base, power):  # "raise" itself is a reserved word in Python, hence the name
    return base ** power
This right here is proof that LISP is overcomplicated. I could explain what the Python code is doing to anyone, but you need specialist training/education to understand the LISP.
 
Lol, check out Python's version.
Code:
def raise_to(base, power):  # "raise" itself is a reserved word in Python, hence the name
    return base ** power
This right here is proof that LISP is overcomplicated. I could explain what the Python code is doing to anyone, but you need specialist training/education to understand the LISP.
@0xDEADBEEF is making a general statement about the usefulness of composable functions. They're standard features in most high level languages.

For the most part, lisp programs largely take the same form as comparable programs in pretty much any popular language. When there are differences, it's because the lisp program eliminates extraneous boilerplate.

For example, Python didn't always have closures. Before they added closures, you had to write an extra class to emulate them. So you'd have a bunch of extra "classes" that only served to pretend to be functions. NameAgeGetter or DistanceScanner. Google tells me (and someone can correct me on this if I'm wrong) that they added lexical closures in Python 2.1. After they did that, *poof*, all those extra classes just vanish.
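
Same story in C, for what it's worth: without closures, the captured state has to travel in a hand-rolled struct next to a plain function (hypothetical names):

Code:
#include <stdio.h>

/* The boilerplate a closure would erase: captured state as an explicit struct... */
struct adder {
    int amount;
};

/* ...plus a function that carts the struct around. */
int adder_call(const struct adder *self, int x) {
    return x + self->amount;
}

int main(void) {
    struct adder add_five = { 5 };
    printf("%d\n", adder_call(&add_five, 37));  /* 42 */
    return 0;
}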
 
Oh my god are you kidding me, Lisp is overcomplicated? Because it lets you have more than one statement in a lambda?
Someone's never used Racket before.
A lot of extremely complex things are written in Lisp, I'll give you that. It's such an expressive language with so many powerful features; usually it's used to experiment with things, like garbage collectors, classes, continuations, and language design. Microsoft uses Lisp extensively to experiment with designing languages and virtual machines. The .NET garbage collector had a test version written in Lisp. You don't see a lot of complex Python code, considering it's essentially a toy language. In fact, the only mainstream Python code I know of is just build systems.

Anyway, Splendid Meat Sticks, you should consider trying Lisp before speaking ill of a language that's been thriving for over 60 years with more dialects than there are people living in Spain. Don't worry, if my 13-year-old self figured it out, I'm sure you can, too. Like I said, try out Racket. Maybe not everyone will agree with me, but it's what people usually recommend to beginners. Don't get all sour grapes because you're intimidated.

However, having said all that, I really prefer MLs. It's all the fun of functional programming with syntax similar to Python's, but absolutely nowhere near as restricted as Python. Anyone here use any MLs? I got into F# recently and now I pretty much refuse to use anything else (Though I know I need to pick up OCaml since F# is basically a stripped-down bastardized OCaml. But I just love .NET so much.)
 
Oh my god are you kidding me, Lisp is overcomplicated? Because it lets you have more than one statement in a lambda?
Someone's never used Racket before.
A lot of extremely complex things are written in Lisp, I'll give you that. It's such an expressive language with so many powerful features; usually it's used to experiment with things, like garbage collectors, classes, continuations, and language design. Microsoft uses Lisp extensively to experiment with designing languages and virtual machines. The .NET garbage collector had a test version written in Lisp. You don't see a lot of complex Python code, considering it's essentially a toy language. In fact, the only mainstream Python code I know of is just build systems.

Anyway, Splendid Meat Sticks, you should consider trying Lisp before speaking ill of a language that's been thriving for over 60 years with more dialects than there are people living in Spain. Don't worry, if my 13-year-old self figured it out, I'm sure you can, too. Like I said, try out Racket. Maybe not everyone will agree with me, but it's what people usually recommend to beginners. Don't get all sour grapes because you're intimidated.

However, having said all that, I really prefer MLs. It's all the fun of functional programming with syntax similar to Python's, but absolutely nowhere near as restricted as Python. Anyone here use any MLs? I got into F# recently and now I pretty much refuse to use anything else (Though I know I need to pick up OCaml since F# is basically a stripped-down bastardized OCaml. But I just love .NET so much.)
Bro, I've tried Racket before. It's a piece of shit too. At least it lets you use square brackets instead of parentheses, which cuts down on keystrokes and makes it easier to count parentheses.
Also, if you think language age == quality, I'm sure that you think ALGOL is a great language, right? I personally taught myself COBOL, and it's a great language, but only in its specific domain, finance. I'm not saying that LISP is shit, but it's bad outside of a few select domains.
 
No, Algol stole concepts from Lisp, as, well, pretty much every language ever has. And further, Algol is dead. Lisp is still alive, in fact it's the oldest living programming language family, which is what I'm saying.
If you don't like all the parentheses, fine. I don't like them either; like I said, I prefer MLs. But you obviously have no idea what you can do with Lisp. For example, Crash Bandicoot was written in Lisp. There's nothing "complex" or "specific" about Lisp. It's literally Lambda Calculus, the core of all computing logic. You can make vanilla variables and functions, and you can make the world's most featureful and complex object system in it. Because it's pretty much the only programmable programming language, you can do whatever you want in it. That's why making Domain-Specific Languages is such a hot topic in the Lisp world.
 
No, Algol stole concepts from Lisp, as, well, pretty much every language ever has. And further, Algol is dead. Lisp is still alive, in fact it's the oldest living programming language family, which is what I'm saying.
If you don't like all the parentheses, fine. I don't like them either; like I said, I prefer MLs. But you obviously have no idea what you can do with Lisp. For example, Crash Bandicoot was written in Lisp. There's nothing "complex" or "specific" about Lisp. It's literally Lambda Calculus, the core of all computing logic. You can make vanilla variables and functions, and you can make the world's most featureful and complex object system in it. Because it's pretty much the only programmable programming language, you can do whatever you want in it. That's why making Domain-Specific Languages is such a hot topic in the Lisp world.
Algol and lisp came out essentially around the same time, and the C family is more properly called the "Algol family". Being first by a couple of months or whatever isn't really a big deal. And if we want to talk about appropriate domains for usage or how widespread a given language family's usage is, then Algol family >>> Lisp family.
And I know what you can do with lisp. Lisp is a Turing-complete language; you can do anything with it. The question is one of compatibility, maintainability, efficiency, etc. And on those metrics, lisp fails pretty damn hard.
 
Ah, yeah, you got me there. The INDUSTRY doesn't like Lisp a whole lot; they prefer to make their own languages and sell them to companies. Compatibility is the big issue, since there's very little of it. I mean, Racket has like everything you could ever want, but I guess Java just has more libraries, and at the end of the day that's all most people care about.

But, you said Lisp was complicated. Are you trying to think of other arguments to cover up not properly learning it? I still think you should go get a PDF of Structure and Interpretation of Computer Programs and give it a good read. It's free, you know; you can just look it up. It's a great book, it even teaches you how to make a compiler.
 