Programming thread

I see you're enlightened. C pairs great with those Unix-y text-in, text-out tools. It's almost like they're meant for each other, as if C was designed alongside Unix. Lisp fags seethe when you tell them this kind of code transformation is more powerful than Lisp macros.
How is it more powerful? At the end of the day you care about representing the AST. Lisp lets you just write code that creates the AST. Using text just means you don't write the AST directly, only text that has to be reinterpreted as an AST later on.
The only way I could see it being more powerful is if you were to forgo structured programming.
 
I love Flex and Bison. Surprisingly handy tools, especially Flex.
I like what autoconf is trying to do, but I hate the tool itself. I gave up on trying to learn it; as far as I can tell it's 40 years of rot: tools generating code for tools that generate code for other tools based on 3 other tools generating code, so that we may finally generate the code that generates the code. Do correct me if I'm wrong here.
Flex/Bison are compiler-compilers: you write syntax/grammar parsers with them to process text inputs, usually via regex patterns (Flex) and grammar rules (Bison). I played with them writing a printf format parser and a dice roll command parser once but got bored after a while. BNF is how you describe your syntax/grammar trees: https://en.wikipedia.org/wiki/Backus–Naur_form?useskin=vector
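As a rough illustration, a dice roll command like 3d6+2 could be described with a BNF along these lines (a hypothetical sketch, not from any real project):

<roll>     ::= <count> "d" <sides> | <count> "d" <sides> <modifier>
<modifier> ::= "+" <integer> | "-" <integer>
<count>    ::= <integer>
<sides>    ::= <integer>
<integer>  ::= <digit> | <digit> <integer>
<digit>    ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"

Flex would tokenize the integers and the "d"/"+"/"-" characters, and Bison would take grammar rules shaped like the above and build the parse from them.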

autoconf/automake/libtool is unfortunately still the sanest build tool combination, especially if you care about cross compiling. Just setting --host=<triplet> will handle everything you need.
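A minimal sketch of what that looks like (the triplet and prefix here are just examples):

./configure --host=arm-linux-gnueabihf --prefix=/usr
make

configure then picks up arm-linux-gnueabihf-gcc, arm-linux-gnueabihf-ar and friends from PATH and runs its feature tests with them.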
I hate cmake with a passion because it's so opaque: nobody knows why a failure happens when it sources other .cmake files seemingly at random. With auto*, at least it's still shell script, so you can see what it was trying to do and infer whether the developer was being retarded about platform specifics.

Sure you can "cross compile" with cmake too, but you'll need to write your own toolchain file and guess which variables the developer might have used, because cmake won't tell you. I'd rather just use Makefiles directly. Most cmake developers being retarded means they'll never follow any convention.
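For reference, a minimal sketch of such a toolchain file (the triplet is just an example; the per-project cache variables are the part you end up guessing):

# arm-toolchain.cmake, passed as cmake -DCMAKE_TOOLCHAIN_FILE=arm-toolchain.cmake
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR arm)
set(CMAKE_C_COMPILER arm-linux-gnueabihf-gcc)
set(CMAKE_CXX_COMPILER arm-linux-gnueabihf-g++)
# stop find_package/find_library from picking up host libraries
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)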
 
How is it more powerful? At the end of the day you care about representing the AST. Lisp lets you just write code that creates the AST. Using text just means you don't write the AST directly, only text that has to be reinterpreted as an AST later on.
The only way I could see it being more powerful is if you were to forgo structured programming.
global program transformations and lisp macros have slightly different uses: a global transformation can do all sorts of crazy restructuring but needs to be half a compiler, and the lisp macro just matches on some bit of ast and turns it into different ast (usually in a bid to avoid boilerplate or enhance performance)

also this guy has some balls rolling up on kiwifarms r/lispcirclejerk thread and saying c has better metaprogramming facilities, does he want to be kidnapped by angry lisp shills and dragged to the garbage collector mines?
anyway even if you need the enhanced power of a global program transformation there is nothing stopping you from doing it on a lisp source instead of a c source. it might even be easier because sexprs are easier to parse than c source code. you won't even need to write a global transformer half the time though, because lisp has real macros
 
gcc/clang compiler flag that says "turn on the unused variable warning and treat it like an error"
basically your code won't compile if there are any unused variables (fucking annoying especially when you wanna prototype something)
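For reference, the flag in question is -Werror=unused-variable, and this is the kind of thing it turns into a hard error (a minimal sketch):

/* builds with plain gcc, fails with: gcc -Werror=unused-variable main.c */
#include <stdio.h>

int main(void)
{
    int scratch = 42;  /* declared but never used: the warning becomes an error */
    puts("prototyping...");
    return 0;
}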
These compiler rules exist because too many developers are incapable of cleaning up their spaghetti code messes, and it kills productivity to have to sort through garbage code that may or may not even be in use.

Bottom line is you have bad coding habits. If you are making messes like this, even when prototyping, you need to stop and take a step back. You are not ready to start writing code yet, you need to sketch out your ideas on paper first. Or at bare minimum, comment out the unused code while you flail around.

The equivalent example is cooking in the kitchen. It is far better to clean up after yourself immediately as you go than to leave a huge mess and clean up later. The mess can get in the way while you're cooking, especially for others.

Sorry to sound like such a dick, but I used to think like you did. It makes me wonder how many people in this thread code for a living and how many do it as a passion or hobby. If you code alone, many aspects of programming languages will seem retarded or useless.
 
These compiler rules exist because too many developers are incapable of cleaning up their spaghetti code messes, and it kills productivity to have to sort through garbage code that may or may not even be in use.

Bottom line is you have bad coding habits. If you are making messes like this, even when prototyping, you need to stop and take a step back. You are not ready to start writing code yet, you need to sketch out your ideas on paper first. Or at bare minimum, comment out the unused code while you flail around.

The equivalent example is cooking in the kitchen. It is far better to clean up after yourself immediately as you go than to leave a huge mess and clean up later. The mess can get in the way while you're cooking, especially for others.

Sorry to sound like such a dick, but I used to think like you did. It makes me wonder how many people in this thread code for a living and how many do it as a passion or hobby. If you code alone, many aspects of programming languages will seem retarded or useless.
yeah but im not a jeet,
my workflow is to code a part of it, check if that part works, then code more and check that
it's really incompatible with -Werror=unused-variable

and if the actual implementation doesn't require a variable then i remove that variable, as im not a javascript codemonkey and i know what memory is
 
These compiler rules exist because too many developers are incapable of cleaning up their spaghetti code messes, and it kills productivity to have to sort through garbage code that may or may not even be in use.

Bottom line is you have bad coding habits. If you are making messes like this, even when prototyping, you need to stop and take a step back. You are not ready to start writing code yet, you need to sketch out your ideas on paper first. Or at bare minimum, comment out the unused code while you flail around.

The equivalent example is cooking in the kitchen. It is far better to clean up after yourself immediately as you go than to leave a huge mess and clean up later. The mess can get in the way while you're cooking, especially for others.

Sorry to sound like such a dick, but I used to think like you did. It makes me wonder how many people in this thread code for a living and how many do it as a passion or hobby. If you code alone, many aspects of programming languages will seem retarded or useless.
shut the fuck up nigger, some people just like to write a function prototype and then progressively flesh it out, compiling it as they go to check how it's coming along. if the compiler is constantly sperging out at you, it can hinder actually getting work done. if you're really that anal about cleanliness, just install a git hook that ensures the project works with -Werror and also remember the most important rule of enforcing code quality: a determined pajeet can write horrible code no matter how fascistic your linter is
obviously nobody should leave unused variables lying around, but having a variable be unused for a second during a non-vcs-checked test run is perfectly fine and the fags who think it isn't are gay and retarded
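a rough sketch of that git hook idea, as a pre-push hook (assumes a Makefile that respects CFLAGS; make it executable and adjust to taste):

#!/bin/sh
# .git/hooks/pre-push -- refuse the push unless a clean -Werror build succeeds
make clean >/dev/null
if ! make CFLAGS="-Wall -Wextra -Werror"; then
    echo "push rejected: -Werror build failed, clean up your warnings first" >&2
    exit 1
fi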
and if the actual implementation doesn't require a variable then i remove that variable, as im not a javascript codemonkey and i know what memory is
variables in a function don't really influence memory enough to matter unless you're using recursive functions to accumulate stuff on the stack
 
yeah but im not a jeet,
my workflow is to code a part of it, check if that part works, then code more and check that
it's really incompatible with -Werror=unused-variable
If you can't comment out your unused code before compiling you may as well be a pajeet. There is nothing about your process that is incompatible with that compile rule. Moreover, creating unused variables is taking up memory for no reason. You may not see how that matters, but those are bad coding habits. This is how you get memory leaks in a language where you manage memory.
and if the actual implementation doesn't require a variable then i remove that variable, as im not a javascript codemonkey and i know what memory is
This is the flailing around I referred to. You don't actually know what you are doing when you start writing your code. If you wrote your concept in pseudocode beforehand, you could work out which variables you need to use before you start coding.

Every time we write code, we risk adding bugs. Even tiny little things like missing a comma or a line ending that is easily fixed. It takes extra time to go back and fix/troubleshoot these micro bugs. If you're thinking out your solution by writing code, you are slowing yourself down. Writing code should be the last step, not the first.
 
variables in a function don't really influence memory enough to matter unless you're using recursive functions to accumulate stuff on the stack
yeah i keep forgetting how powerful computers are nowadays, but im pretty sure there's a game studio out there where a jeet added an array of 256 4 MB structures
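for scale, that's 256 × 4 MB ≈ 1 GB from a single declaration, i.e. something like this hypothetical sketch:

struct blob { unsigned char bytes[4u * 1024 * 1024]; };  /* 4 MiB each */
static struct blob cache[256];                           /* ~1 GiB for the whole array */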
 
If you can't comment out your unused code before compiling you may as well be a pajeet. There is nothing about your process that is incompatible with that compile rule. Moreover, creating unused variables is taking up memory for no reason. You may not see how that matters, but those are bad coding habits. This is how you get memory leaks in a language where you manage memory.
ah yes let me comment out and uncomment things so i never have a moment where i don't use a variable
not to mention the fact that any half decent compiler (and a lot of interpreters) will optimize out unused variables anyway
i'm just saying you should put that rule in CI instead of in the compiler and let people turn it off if they want to do so while working, since any unused variables will be caught either way when they try to push their code
you might want to have it on all the time and that's perfectly fine, but not everybody thinks and works in exactly the same way you do, and nobody should have their compiler chastising them and refusing to compile their code because they have a slightly inconsistent temporary thing lying around
This is the flailing around I referred to. You don't actually know what you are doing when you start writing your code. If you wrote your concept in pseudocode beforehand, you could work out which variables you need to use before you start coding.
not everybody does everything in exactly the same way as you
Every time we write code, we risk adding bugs. Even tiny little things like missing a comma or a line ending that is easily fixed. It takes extra time to go back and fix/troubleshoot these micro bugs. If you're thinking out your solution by writing code, you are slowing yourself down. Writing code should be the last step, not the first.
how is writing it in pseudocode first better? what happens if you have an unused variable in your pseudocode?
 
If you don't write code like I do you're basically a retard if you think about it.
Maybe you should stop being a retard and do it like me.
It's more that if you code in a way that is so frowned upon that your peers need to force your behavior to be rejected by the compiler, you're basically a retard.

Since they're not getting the hint, that is a direct message that leaving unused variables is such a shitty practice that it is not even acceptable as valid code anymore.

That coding style fucking sucks for everyone on the team except the Elmer Fudd doing it. It wastes his time, it wastes my time fixing his mistakes, and it's unpleasant to read and sort through later. It would be nice if we could just explain why their style is frowned upon and they would fix the problem, but every developer thinks he is a special gifted mind that is simply misunderstood and knows better. So now we have to write compiler rules to force these developers to consider other people.

not everybody does everything in exactly the same way as you
Just because not everyone follows best practices doesn't invalidate the best practice. People build houses out of mud and sticks without architecture and it works as a house, but it isn't comparable in quality to a home made of brick according to modern building codes. Programming is no different. That rule exists because your peers have had so many problems with people who code like you that they feel the need to take the option away from you entirely.
ah yes let me comment out and uncomment things so i never have a moment where i don't use a variable
not to mention the fact that any half decent compiler (and a lot of interpreters) will optimize out unused variables anyway
You're right, it makes much more sense to write code before you even know what you're doing and before you know you'll even have a use for it.
not to mention the fact that any half decent compiler (and a lot of interpreters) will optimize out unused variables anyway
Your peers don't optimize the unused variables out of their sight. They still have to read them, then eventually realize they're not used after sorting through what the code does. So it costs everyone else but you time and effort instead. That is why this rule exists: your way causes unnecessary problems, and time has shown that you fail to clean up after yourself.
i'm just saying you should put that rule in CI instead of in the compiler and let people turn it off if they want to do so while working, since any unused variables will be caught either way when they try to push their code
Yes, at the time and compute cost of the build failing partway through because you forgot to clean up after yourself. On top of the fact that you are now developing with different compile rules than your CI process. Also you still have to go back and do the work of removing your unused variables, so you actually didn't save any time or effort.

At no point should you be able to compile code locally that will fail a CI build because the compile rules are different. This is how you get "works on my machine" excuses.

I've sperged enough about this, I'll take my tophats and trash cans now.
 
If you can't comment out your unused code before compiling you may as well be a pajeet. There is nothing about your process that is incompatible with that compile rule. Moreover, creating unused variables is taking up memory for no reason. You may not see how that matters, but those are bad coding habits. This is how you get memory leaks in a language where you manage memory.

This is the flailing around I referred to. You don't actually know what you are doing when you start writing your code. If you wrote your concept in pseudocode beforehand, you could work out which variables you need to use before you start coding.

Every time we write code, we risk adding bugs. Even tiny little things like missing a comma or a line ending that is easily fixed. It takes extra time to go back and fix/troubleshoot these micro bugs. If you're thinking out your solution by writing code, you are slowing yourself down. Writing code should be the last step, not the first.
in my entirely unprofessional opinion the middle step of pseudo code was just a thing they taught people in highschool that didnt know programming principles to get acclimated... i can safely say that i havent written any meaningful amount of pseudo code since the required highschool projects and i had at least assumed that the professionals were doing the same... i think the most ive ever done is list out features in a notepad/stickynote/whiteboard when collaborating with friends at a hackathon or something. Im not going to knock the practice, im sure someone somewhere finds it useful, but not me. maybe thats ignorant, maybe thats because i started with high-level languages so i dont care about memory leaks, but i feel like regardless of the leaks rapid prototyping is a proven method of writing programs. I want to believe your slow methodical principles have merit but i just dont see it.

Whats more, do you not run into the issue of your pseudo-code also not working and needing to be tinkered with? what happens when your pseudo-coded solution fails to translate well? what if the pseudo-coded solution doesnt run at all when translated? May we tinker then? or should we go back to the pseudo-code and come back once we think we have a solution that works? the more i go through the steps the less sense this all seems to make.
 
blah blah blah blah blah blah
have you ever stopped to consider that maybe some people use actual code in lieu of pseudocode? or do you never put unused variables in your pseudocode either? i'm not arguing that unused variables are an ok thing to leave in the code even, i'm just saying that people should be able to force the compiler to ignore an unused variable while they are trying to work on something. when they go to commit their code, they should by all means build with all the warnings on and make sure it doesn't have any issues.

also i personally never leave unused variables around when i'm working, but i still think nanny compilers are fucking gay
 
The clear answer is yet again a compromise: Encourage good practices through default features/behaviors and allow for those with the know-how to disable such features easily.

No matter who we are, we can't force others to get in line and behave how we want. It's a losing battle.
That said, I am beyond frustrated with basic human ignorance, stupidity, and laziness. But I wouldn't dare be so arrogant as to claim I'm the first man in the history of our species to have a solution for it.
 
not really, you just need to be able to allocate memory. Since infinite-memory computers aren't readily available on this plane of existence, most lisp-like languages scan the heap for unreachable objects and return them to the available memory, but you could mix and match many other techniques like memory regions, borrow checker bullshit (buzzword ready!), reference counting, an unholy procedure to free your conses manually (cnile's choice!), or the noble technique called "bump allocate until you run out of memory or the program finishes and you can release the entire heap as garbage" (which i think would actually be a pretty good idea in certain niche cases)
there are probably 7 more of them that i missed because my knowledge of memory allocation techniques is rather limited

i think for lispy languages there are three main ways to go:
  1. have the programmer insert some extra information about how that memory is being used (slightly tedious)
  2. "memory is infinite as long as i don't cons the last byte of it" (toy compilers and certain extremely niche cases)
  3. hypothetical super-advanced sufficiently smart compiler that can infer all the possible memory lifetimes in your program and insert deallocation commands in the perfect places (i will be excited to use this compiler to compile gnu hurd 1.0 in 45 years when they are both finally done)
personally i think option 1 is probably the most sane as far as having a working implementation before everybody here dies of old age, and is also much more practical than option 2, which only lets you write programs that last for a few fractions of a second of cpu time
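a minimal sketch of the reference-counting option in C, since that's the thread's lingua franca (names made up, car left unmanaged to keep it short):

#include <stdlib.h>

typedef struct cons {
    int refcount;
    void *car;
    struct cons *cdr;
} cons;

cons *rc_cons(void *car, cons *cdr)
{
    cons *c = malloc(sizeof *c);
    if (!c) return NULL;
    c->refcount = 1;          /* the caller holds the first reference */
    c->car = car;
    c->cdr = cdr;
    if (cdr) cdr->refcount++; /* the new cell also holds a reference to its tail */
    return c;
}

void rc_release(cons *c)
{
    while (c && --c->refcount == 0) {  /* free the cell, then drop its reference to the tail */
        cons *next = c->cdr;
        free(c);
        c = next;
    }
}

the tedium is exactly this bookkeeping: every temporary reference has to be released by hand or you leak, and release one too many and you get a use-after-free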
Decades ago, Paul Graham posted a little discussion of how ITA Software, the company behind the old airfare site Orbitz, used Lisp to implement their software. They had some interesting technical details about memory issues and performance. Some of them sound pretty hamfisted, sorta like you mentioned: pre-allocating a bunch of cons cells and handing them out with cons!.

I love reading about little technical stories like that, because it really highlights how flexible a Lisp dialect can be. It's really a family of languages you can apply to a wide variety of situations.

Another set of stories I like are Andy Gavin of Naughty Dog / Crash Bandicoot fame. If I remember correctly, he wrote a series of custom Lisp dialects for the Playstation and essentially just hacked everything straight to the hardware.

I also think there's something to be said about a language designer or implementer designing the language he wants to use as opposed to designing a language for lesser morons who they believe are below them.
 
Decades ago, Paul Graham posted a little discussion of how ITA Software, the company behind the old airfare site Orbitz used Lisp to implement their software. They had some interesting technical details about memory issues and performance. Some of them sound pretty hamfisted, sorta like you mentioned pre-allocating a bunch of cons cells and handing them out with cons! .
another big part of this post is just how much you can save by working with the operating system, like how they use mmap instead of using the usual functions to read things into dynamically allocated structures and creating tons of garbage
they end up working around the gc as a side effect and end up reaping the benefits (and drawbacks) of it
i'm sure the modern gcs and really fast computers of today make some of these optimizations less important, but it's still based as fuck to allocate one huge block of memory up front and hand pieces of it out with a free pointer, which is actually faster than the fastest malloc and pretty memory-safe if your program is made a certain way
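a minimal sketch of that big-block-plus-free-pointer idea (POSIX-ish mmap; the names and the alignment choice are mine, not from the post):

#include <stddef.h>
#include <sys/mman.h>

static unsigned char *heap, *free_ptr, *heap_end;

int arena_init(size_t size)
{
    heap = mmap(NULL, size, PROT_READ | PROT_WRITE,
                MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (heap == MAP_FAILED)
        return -1;
    free_ptr = heap;
    heap_end = heap + size;
    return 0;
}

void *arena_alloc(size_t n)
{
    n = (n + 15) & ~(size_t)15;   /* keep everything 16-byte aligned */
    if (n > (size_t)(heap_end - free_ptr))
        return NULL;              /* out of arena: no free list, just fail (or grow) */
    void *p = free_ptr;
    free_ptr += n;
    return p;
}

void arena_release(void)
{
    munmap(heap, (size_t)(heap_end - heap)); /* "collect" the whole heap in one call */
}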
I love reading about little technical stories like that, because it really highlights how flexible a Lisp dialect can be. It's really a family of languages you can apply to a wide variety of situations.
lisp isn't really even a programming language it's more like a fundamental truth of the universe that can be made into programming languages
 