Programming thread

I was indeed under the impression that you should always default to functions and only switch to macros upon noticing DRY code smells.
it's sort of like void*; if you use it in the right places you can get around a bunch of stupid bullshit, but if you abuse it you will make incomprehensible spaghetti
lisp macros give you the power to add an additional feature to the language so you must carefully judge whether you really need the increased complexity
I have yet to find a problem that could be solved by macros that could not be solved by simpler means, usually a closure, albeit only in languages with first class functions.

Is there really a scenario in which macros are the best solution?
stuff like cond is sometimes a macro that compiles to if
generally the only way to create new control structures is with macros

they also work pretty well for weird things like forcing stuff to inline and creating 11 slightly different versions of the same definition
you could also make a macro that takes some shit like (select (from 'users) (where (eq 'username some-variable))) and it compiles to properly prepared sql boilerplate or whatever you need it to do
you can really use them for whatever you want but the most basic use is to make keywords that stand for large chunks of boilerplate

there is also the infamous loop macro...
a big problem with macros is that most of the really good macros are already defined so you never end up thinking about them
 
I have yet to find a problem that could be solved by macros that could not be solved by simpler means, usually a closure, albeit only in languages with first class functions.

Is there really a scenario in which macros are the best solution?
Far and away, my number one use of macros is for initializing static adjoint metadata arrays for enums.

C:
#define ENUM_META_HEADER(ET) const ET value; const char* symbol; const char* name
#define ENUM_META_INIT(EV, N, ...) [EV] = {.value=EV, .symbol=#EV, .name=N __VA_OPT__(,) __VA_ARGS__}

typedef enum some_t {
    SOME_FIRST = 0,
    SOME_SECOND,
    // blah blah blah...
    NUM_SOME_T
} some_t;

typedef struct some_meta_t {
    ENUM_META_HEADER(some_t);
    int some_val;
} some_meta_t;

static const some_meta_t SOME_META[NUM_SOME_T] = {
    ENUM_META_INIT(SOME_FIRST, "First", .some_val=1),
    ENUM_META_INIT(SOME_SECOND, "Second", .some_val=2),
};

I vary the implementation from project to project, but it's nice as hell for debugging and I prefer it to switch statements because the access is branchless.
 
From one Cnile to another, they were having a Lispy chat. Quite disjoint from the C preprocessor one. Cool macro tho.
In hindsight I should have been clued in by the word closure being in the post
 
In hindsight I should have been clued in by the word closure being in the post
also it was in a very lisp-related conversation too
it's still neat to see applications of c's preprocessor text substitution rule facility
"Closures are the poor man's OOP and OOP is the poor man's closure"
and with the power of macros you can make a poor man's language feature into a real one pretty quick, at the cost of complicating the syntax somewhat
 
it's still neat to see applications of c's preprocessor text substitution rule facility
The C preprocessor is still pretty scuffed compared to Lisp macros, right? Refer to page 248 (of the PDF) / page 212 (of the book):
One of the broad themes of the book is a hate boner for the C / Unix approach of handling things in terms of text rather than with higher levels of abstraction like you might see on a Lisp machine. Well, I've tried Pharo (a modern implementation of Smalltalk), which is mostly divorced from the Unix (and Windows and Mac) world of text (a virtual image running on its own VM). Maybe I was just reared on a bad system that became too popular for historical reasons, but I really fucking missed humble grep (and other Unix tools) when I was in there.
 
does this have some sort of really weird footgun inside
My brother in Christ, I don't think this code can even be aimed at anywhere other than the foot. But who knows? Maybe there's some brilliant use case. I'd expect a paragraph of justification in a comment somewhere if I saw that shit in a project.

The C preprocessor is still pretty scuffed compared to Lisp macros, right?
Yes. The preprocessor executes after tokenization, so the best it can do is manipulate tokens. Depending on the Lisp macro system, the difference ranges from mild to vast, up to the kind of full AST metaprogramming Lisps are known for.
 
The C preprocessor is still pretty scuffed compared to Lisp macros, right?
lisp macros are little programs that work directly on lisp abstract syntax trees that happen to have a structure almost identical to printed lisp code
you can easily use the full power of lisp in a macro definition as you would in real code, which is usually just used for generating unique symbols that won't clobber identifiers inside the substituted macro. the sky is the limit however and there are a lot of macros out there that do some really crazy stuff, and you can even write macros that define macros (iirc) so you can metaprogram your metaprogramming
the c preprocessor isn't really quite as powerful unless you want to write 10000 macros full of cleverly disguised footguns
One of the broad themes of the book is a hate boner for the C / Unix approach of handling things in terms of text rather than with higher levels of abstraction like you might see on a Lisp machine. Well, I've tried Pharo (a modern implementation of Smalltalk), which is mostly divorced from the Unix (and Windows and Mac) world of text (a virtual image running on its own VM). Maybe I was just reared on a bad system that became too popular for historical reasons, but I really fucking missed humble grep when I was in there.
honestly a based system would have many ways to represent the same shit and it would have the optional capability to transparently do very lossy conversions in increasingly ridiculous ways so text tools can always get text
imagine if you tried to run grep on a directory and the shell could ask you if you wanted to run an ocr program on the images it found inside so grep can treat them like text
My brother in Christ, I don't think this code can even be aimed at anywhere other than the foot. But who knows? Maybe there's some brilliant use case. I'd expect a paragraph of justification in a comment somewhere if I saw that shit in a project.
c is as flexible as lisp if you are willing to severely endanger both of your legs
Yes. The preprocessor executes after tokenization, so the best it can do is manipulate tokens.
it also doesn't have stuff like gensym and the syntax can get very painful very quickly
 
My brother in Christ, I don't think this code can even be aimed at anywhere other than the foot.
Also from the Unix Haters' Handbook:
(I know more recent standards have improved the situation somewhat)
 
Could you describe what your system would look like in a bit more detail?
instead of only being able to open files as a stream of bytes (the unix way) you could tell the operating system that you want this file opened as text/plain or something and it does <handwave for running the right utility programs> and then gives the program a byte stream that is guaranteed to be regular plain text
then you don't need any libraries to convert formats and you don't need to set up complicated converter pipelines if a program supports only images in tga or something

replace mime type with whatever method of typing files you want and replace the byte stream of guaranteed stuff with high-level types that are already marshaled in or whatever
 
instead of only being able to open files as a stream of bytes (the unix way) you could tell the operating system that you want this file opened as text/plain or something and it does <handwave for running the right utility programs> and then gives the program a byte stream that is guaranteed to be regular plain text
then you don't need any libraries to convert formats and you don't need to set up complicated converter pipelines if a program supports only images in tga or something
I'm not super familiar with systems programming, though I did write a fair bit of C in middle to high school and learned pointers pretty well (for example). If there are no libraries here, would it be a matter for the kernel or are we talking about no libraries in the sense of, at the level of libc as opposed to libpng or any manner of other, higher-level libraries?
 
I'm not super familiar with systems programming, though I did write a fair bit of C in middle to high school and learned pointers pretty well (for example). If there are no libraries here, would it be a matter for the kernel or are we talking about no libraries in the sense of, at the level of libc as opposed to libpng or any manner of other, higher-level libraries?
more like "you can and should have libraries but you don't need libavif to support avif because something outside your program somehow turns them into stuff your program knows how to work with"
 