Programming thread

I've been coding for years but I still don't understand pointers. I mean, I understand what they are, but I have a hard time coming up with a rationale for using them. All of the newer languages from the past couple of decades seem to be getting rid of them, or at least transforming them into something different, so maybe I'm on the right track.
There is, in practice, only one good reason to use pointers, and that is so you don't have to allocate memory twice.

Let's say I have a linked list called myList and a function called myFunc. myList, being a list, takes up a whole lot of space. When I pass myList as an argument to myFunc, it allocates a second linked list's worth of memory with the same value as myList so myFunc can work with it without messing with the original variable. One list, twice the space. A pointer would have taken up only a single word of space and accomplished the same thing in most cases.
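
Something like this, just to illustrate (made-up names, C++ sketch; I'm using a struct with its data inline, since that's where pass-by-value really bites):

#include <cstddef>

struct MyList {
    int items[100000];      // a big chunk of data living inside the struct (~400 KB)
    std::size_t length;
};

// Pass by value: the whole ~400 KB struct gets copied just for this call.
int firstByValue(MyList copy) { return copy.items[0]; }

// Pass by pointer: only the address (one machine word) crosses the call,
// and the function works on the original without a duplicate.
int firstByPointer(const MyList* original) { return original->items[0]; }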

Avoiding this is vital on old-ass systems with 64 kilobytes of RAM, but these days there is zero reason to worry about micro-optimizations like that unless you're a systems programmer or are writing kernels or libraries or something else extremely low level. Odds are nearly 100% that while you were worrying about a few extra kilobytes of memory being reserved, you missed something literally thousands of times worse.

Modern programming languages either did away with or strongly discourage the use of pointers for a reason. They make code hard to read, hard to write, and offer almost no benefit to all but a tiny minority of programmers. If you really, truly need them, you should be coding in C++, if not C.
 
There is, in practice, only one good reason to use pointers, and that is so you don't have to allocate memory twice.

Let's say I have a linked list called myList and a function called myFunc. myList, being a list, takes up a whole lot of space. When I pass myList as an argument to myFunc, it allocates a second linked list's worth of memory with the same value as myList so myFunc can work with it without messing with the original variable. One list, twice the space. A pointer would have taken up only a single word of space and accomplished the same thing in most cases.

Avoiding this is vital on old-ass systems with 64 kilobytes of RAM, but these days there is zero reason to worry about micro-optimizations like that unless you're a systems programmer or are writing kernels or libraries or something else extremely low level. Odds are nearly 100% that while you were worrying about a few extra kilobytes of memory being reserved, you missed something literally thousands of times worse.

Modern programming languages either did away with or strongly discourage the use of pointers for a reason. They make code hard to read, hard to write, and offer almost no benefit to all but a tiny minority of programmers. If you really, truly need them, you should be coding in C++, if not C.
You mostly see pointers in system libraries nowadays. Another reason you missed is that pointers can effectively be used to manage data shared between concurrent threads or processes. You can have multiple threads read from and write to the same section of memory using a pointer to that section.
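
Toy sketch of what that looks like (C++ with std::thread; names are made up, and real code would think harder about the locking granularity):

#include <thread>
#include <mutex>
#include <cstdio>

static int buffer[1000];          // one block of memory, allocated once
static std::mutex buffer_lock;

// Each thread gets a pointer into the same buffer instead of its own copy.
void fill(int* start, int count, int value) {
    std::lock_guard<std::mutex> guard(buffer_lock);
    for (int i = 0; i < count; ++i) start[i] = value;
}

int main() {
    std::thread a(fill, buffer, 500, 1);
    std::thread b(fill, buffer + 500, 500, 2);
    a.join();
    b.join();
    std::printf("%d %d\n", buffer[0], buffer[999]);   // prints "1 2"
}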
 
I don't really understand why people consider pointers to be difficult. Most languages have some kind of notion of a reference type or an object handle (which is just a reference type with more steps). The only extra thing that comes with pointers is being able to modify them and perform arithmetic on them. If you just literally don't fuck with your pointers, then you can avoid like 99% of the 'issues' people have with them.

Now dynamic memory allocation on the other hand, that's the real 'shoot yourself in the foot' shit...
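
To put that in code (tiny C++ sketch, names invented): used as a plain handle, a pointer behaves exactly like any reference type, and the actual footgun is the allocation, not the pointer.

#include <cstdio>

struct Config { int retries; int timeout_ms; };

// Pointer used purely as a handle: read through it, never touch the pointer
// itself, no arithmetic. Functionally this is just a reference type.
void printConfig(const Config* cfg) {
    std::printf("retries=%d timeout=%dms\n", cfg->retries, cfg->timeout_ms);
}

int main() {
    Config cfg{3, 250};
    printConfig(&cfg);

    // The 'shoot yourself in the foot' part lives here: forget the delete[]
    // and you leak, delete[] twice and you crash. The handle above was never
    // the problem.
    int* scratch = new int[1024];
    delete[] scratch;
    return 0;
}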
 
I finished a Ruby book and it took me five minutes to realize that Ruby is an almost useless language lol
Now I'm studying Python and I'm loving it
I was thinking of studying PHP too, but I don't really like front-end
What made you think Ruby is useless?

Also PHP is mostly used for back-end these days. It predates the backend/frontend distinction (coming from the days of dynamically-generated webpages), but most of what it does is firmly in the backend part of the equation.
 
The fact that everything Ruby can do as a back-end language, other back-end languages can do in a simpler and even better way
Yeah, but that's basically every language. For the most part, what you're going to be writing as a backend dev is going to depend on what framework you use, so rather than the language it comes down more to whether you prefer Django, Flask, Rails, Laravel, etc.

Like for me, I don't really love Java that much. But Spring MVC is my preferred way to write backend web services so I put up with Java.
 
At my job today one of my coworkers pushed broken code, which clearly failed several tests, at the end of the day and just left it like that. Now nobody on the team, myself included, can deploy their work until this fucking tard fixes whatever it is he broke. Just how much of a fuck up do you have to be to hold up everyone else's progress like that? It's not hard to: write code, test it locally, see that it's broken, and not push that broken shit out.
How do you deal with tech retards?
 
Pointers are an essential concept, IMO, and languages without pointers are like typing with mittens. They're also an extremely *simple* concept, and I can't wrap my head around why some flavor-of-the-week template class is somehow less mental overhead than calling delete[] in the destructor or at the end of the function where you allocated with new.

One important reason to use pointers is if your arrays are very large. If you're holding onto some 3D EM field or fluid flow field, a 1000x1000x1000 float32 array is 4GB. Pass that by value, and suddenly you're doing a copy operation on gigabytes of RAM, and you could easily run out. (An acquaintance of mine was telling me about some *avionics* bug that was caused by running out of memory (on a very memory limited system), because some software fanatic won a holy war and forbade pointers. Enough pass-by-values, and your microcontroller blows up, then presumably your jet engine. IIRC the old way was to allocate a few arrays upfront at startup, and every function referred to them with pointers. The tiny stack was never blown with local variables. The memory layout was some deterministic thing.)
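
The shape of the fix looks something like this (rough C++ sketch with my own made-up names, obviously not the actual avionics code): allocate the big field once, pair the new[] with a delete[] in the destructor, and hand everything else a pointer to it.

#include <cstddef>

// One big field allocated once on the heap; everything else sees it through
// a pointer instead of copying gigabytes per call.
class Field3D {
public:
    Field3D(std::size_t nx, std::size_t ny, std::size_t nz)
        : n_(nx * ny * nz), data_(new float[nx * ny * nz]()) {}
    ~Field3D() { delete[] data_; }                 // pairs with the new[] above
    Field3D(const Field3D&) = delete;              // forbid accidental deep copies
    Field3D& operator=(const Field3D&) = delete;

    float* data() const { return data_; }
    std::size_t size() const { return n_; }

private:
    std::size_t n_;
    float* data_;
};

float sum(const Field3D* field) {                  // one word passed, not 4GB
    float total = 0.0f;
    for (std::size_t i = 0; i < field->size(); ++i) total += field->data()[i];
    return total;
}

int main() {
    Field3D field(1000, 1000, 1000);               // the 4GB float32 array
    return sum(&field) > 0.0f ? 0 : 1;             // shrink the dims if you actually run this
}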
 
Pointers are an essential concept, IMO, and languages without pointers are like typing with mittens. They're also an extremely *simple* concept, and I can't wrap my head around why some flavor-of-the-week template class is somehow less mental overhead than calling delete[] in the destructor or at the end of the function where you allocated with new.

One important reason to use pointers is if your arrays are very large. If you're holding onto some 3D EM field or fluid flow field, a 1000x1000x1000 float32 array is 4GB. Pass that by value, and suddenly you're doing a copy operation on gigabytes of RAM, and you could easily run out. (An acquaintance of mine was telling me about some *avionics* bug that was caused by running out of memory (on a very memory limited system), because some software fanatic won a holy war and forbade pointers. Enough pass-by-values, and your microcontroller blows up, then presumably your jet engine.)
You are 100% right, but the problem is that most programmers come in with the mindset that any technical problem can be overcome with more software libraries rather than by actually understanding the nature of the problem at hand. Why learn about pointers and how to use them correctly when you can install some library from GitHub that does it for you? Why make sure your code runs well when we have tons and tons of computational power? Not to say you should start everything from scratch, but it really goes to show the shortcomings in the actual ability most programmers have when they can't understand fundamental ideas in programming. Hell, I've worked with people that used optimization software but didn't even know the fundamental theory behind it, just how to set the problem up. I was actually kind of appalled when I found this out and quickly left that lab. I can't trust people that refuse to learn the basics of something they work with on a fundamental level. I get not being overly thorough, but that's infinitely better than being a fucking codemonkey.
 
Not to say you should start everything from scratch, but it really goes to show the shortcomings in the actual ability most programmers have when they can't understand fundamental ideas in programming.
It would be good these days if we made programmers do stuff on microcontrollers and similar before letting them have infinite CPU and memory.

It's always good fun writing a 4 channel PWM input routine that decodes the duty cycles and ships the results out over I2C on a single core ATTiny. I'm just glad I no longer have to bit-bang the serial output and make sure the timing is correct by hand.

Then you get sick of that and move the whole thing to a Pi Pico and you have TWO cores so you can decouple I/O and computation routines and send data over USB to the host instead of I2C.... luxury.
 
At my job today one of my coworkers pushed broken code, which clearly failed several tests, at the end of the day and just left it like that. Now nobody on the team, myself included, can deploy their work until this fucking tard fixes whatever it is he broke. Just how much of a fuck up do you have to be to hold up everyone else's progress like that? It's not hard to: write code, test it locally, see that it's broken, and not push that broken shit out.
How do you deal with tech retards?
If you're high enough on the totem pole you can try hazing and/or humiliation. Force the offender to wear a dunce cap all day or wrap his desk in tin foil. Get everyone involved, it's a team-building exercise!
 
At my job today one of my coworkers pushed broken code, which clearly failed several tests, at the end of the day and just left it like that. Now nobody on the team, myself included, can deploy their work until this fucking tard fixes whatever it is he broke. Just how much of a fuck up do you have to be to hold up everyone else's progress like that? It's not hard to: write code, test it locally, see that it's broken, and not push that broken shit out.
How do you deal with tech retards?
Roll back his push and let him fix it while you all get your own work done. Pushing code that breaks the project should not be acceptable. If it doesn't pass the tests, you don't push the code. Period.
 
It would be good these days if we made programmers do stuff on microcontrollers and similar before letting them have infinite CPU and memory.
Probably don't need to go that far, but it would be nice to force them all to code on shitboxes for a while.

Yeah it turns out your Electron apps run like cheeks on old hardware, Microsoft
 
Pointers are an essential concept, IMO, and languages without pointers are like typing with mittens.
Languages "without pointers" are usually just languages where most things are implicitly passed by reference. If anything, the issue is that a lot of them won't let you introduce pass by value at will. The only language I've seen where 4GB arrays would be passed by value is the sick joke that is newLISP.
An acquaintance of mine was telling me about some *avionics* bug that was caused by running out of memory (on a very memory limited system), because some software fanatic won a holy war and forbade pointers. Enough pass-by-values, and your microcontroller blows up, then presumably your jet engine.
This is extra nasty because if you forbid or severely limit recursion (as you probably should when dealing with microcontrollers anyway), it's nearly trivial to calculate how much stack space a program needs in advance.
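
Toy example of what I mean (C++, with ballpark frame sizes pulled out of thin air; in practice you'd let the toolchain report them, e.g. GCC's -fstack-usage):

#include <cstdint>

static int32_t samples[8];

static int32_t filter(int32_t x) {      // frame: ~16 bytes (locals + return address)
    return (x * 3) / 4;
}

static int32_t read_sensor(int idx) {   // frame: ~32 bytes, calls filter()
    int32_t raw = samples[idx];
    return filter(raw);
}

int main() {                            // frame: ~48 bytes, calls read_sensor()
    // No recursion and no indirect calls, so the deepest chain is fixed:
    // main -> read_sensor -> filter, worst case ~48 + 32 + 16 = 96 bytes,
    // known before the program ever runs.
    return read_sensor(0) & 1;
}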
 
At my job today one of my coworkers pushed broken code, which clearly failed several tests, at the end of the day and just left it like that. Now nobody on the team, myself included, can deploy their work until this fucking tard fixes whatever it is he broke. Just how much of a fuck up do you have to be to hold up everyone else's progress like that? It's not hard to: write code, test it locally, see that it's broken, and not push that broken shit out.
How do you deal with tech retards?
Don't you use code reviews and continuous integration?
 
Don't you use code reviews and continuous integration?
We do, but this guy has permissions to push directly to the master branch, which is what fucked everything up. If he'd pushed the broken code to a feature branch only he uses, so that the higher-ups could see it and review what's wrong, that would've been okay.
 
We do, but this guy has permissions to push directly to the master branch, which is what fucked everything up. If he'd pushed the broken code to a feature branch only he uses, so that the higher-ups could see it and review what's wrong, that would've been okay.
That's fucked up, almost like deliberate sabotage. Is he a senior or something like that?
 
That's fucked up, almost like deliberate sabotage. Is he a senior or something like that?
He isn't a senior, but he's been on that project for so long that he's expected to know when it's appropriate to push directly to master. It isn't deliberate, he's just retarded and/or lazy, and I hear the boss (the actual senior) getting more and more fed up with him in our meetings.
 
The fact that everything Ruby can do as a back-end language, other back-end languages can do in a simpler and even better way

This is not true, and even if it were, it wouldn't affect how useful a language is. Rails is still one of the most used frameworks today. PHP is not frontend; it is also much more complicated and can lead to a lot of antipatterns unless you are experienced enough to set up a repo with good practices. Your choice of language and framework very rarely affects how good or bad your codebase is.
Books on languages suck because they get out of date very quickly.
 