Programming thread

  • 🐕 I am attempting to get the site running as fast as possible. If you are experiencing slow page load times, please report it.
the original* is better and this one is slightly less funny even though it shows off more of the shit you can do with raw html
it also goes on and on and has a bit of a non sequitur about assembly in there because the llm that wrote it was off its meds

*he knows it's better and he even linked to it: https://motherfuckingwebsite.com/

It's beyond irony at this point. What is the point of putting this bullshit on your insignificant website promoting vanilla HTML?
 
I want to get in on the functional/stateful slapfight as well as the web framework slapfight but I have nothing to really add other than my pet stack which is kind of autistic and probably an easy way to get picked out of a crowd :(

I just wish the web platform supported scripts other than Javascript.
I looked and literally every alternative is just a variation of javascript or typescript :story:
 
>Been working on my coding skills for over a year now
>Actually have a pretty decent grasp of html and javascript
>Writing a README.md file before I upload a tool I made to github
>Realized I am really shit at remembering all the stupid fucking buzz words to explain my code to internet autists in tech speak :stress:

"THIS TOOL TAKES A NUMBER AND DOES THE THING THEN RETURNS A JSON WITH BITCOIN BLOCKCHAIN DATA"

I found a more useful way to implement AI. I'll just have it fucking write the README file for me so I don't have to bother myself with the bullshit word salad that tech people like to conjure out of their fucking soylent addled brains.
 
what the fuck are you smoking just type what the program is supposed to do in english and how to compile and use it
if you don't know what your own tool does enough to explain it yourself you're probably at least 6 times as retarded as any tech speak autist
also if you can don't upload anything to github because github is a gigantic fucking meme perpetuated by pajeet inertia
 

I know what my tool does and I explained it in detail when I wrote the readme just now. I didn't even use AI. I was just venting because I realize that I keep forgetting some of the technical vocabulary and that shit's kind of frustrating.
 
some of it can be retarded but it's there for a good reason, since it can start getting hard to understand when you say, e.g., "this is the background service thingy that does the stuff to the temporary storage file so all the blocks are more lined up and closer together" so instead you have "cache file defragmenter daemon"
 
I've been using gitignore.io for a while to have consistent ignore files, but a recurring issue I've been running up against is having to manually add things, then needing to remember to re-add them whenever I regenerate the file. You can use subdirectory .gitignore files, and that does help, but there are instances where that is insufficient. To that end, I skidded up a bash script that seems to work pretty well.
You just have an array for the templates, and then a heredoc for your custom shit that gets appended.

Bash:
#!/usr/bin/env bash
set -e -o pipefail

# Regenerate the repo's .gitignore from gitignore.io templates,
# then append the custom rules kept in the heredoc at the bottom.

GIT_ROOT=$(git rev-parse --show-toplevel)
IGNORE_FILE="$GIT_ROOT/.gitignore"

templates=(
  # add your gitignore.io templates here
)

# Join the template names with commas for the API path.
joined=$(IFS=,; echo "${templates[*]}")

url="https://gitignore.io/api/$joined"

# Build in a temp file first so a failed download can't clobber the real file.
tmpfile=$(mktemp "$GIT_ROOT/.gitignore.tmp.XXXXXX")

echo "Using url: $url"

# -L follows gitignore.io's redirect, -f fails on HTTP errors
# instead of silently saving the error page.
if ! curl -fLs "$url" > "$tmpfile"; then
  echo "Failed to download gitignore.io template"
  rm -f "$tmpfile"
  exit 1
fi

# Custom ignores that gitignore.io doesn't know about get appended here.
cat <<EOF >> "$tmpfile"

# === Custom ignores ===

# Put your own stuff here

EOF

mv "$tmpfile" "$IGNORE_FILE"
echo ".gitignore regenerated."

My use case personally was adding GCC preprocessor output files, the .i ones.
I feel pretty happy with this; it's very consistent, and it's just a single command to regenerate the thing.
Feel free to use it at your convenience. I'm releasing it under the "Ctrl+C, Ctrl+V" License.
 
if you can don't upload anything to github
I don't like the Microsoft Github regime, but they've failed to piss me off yet. It's a good artifact for giving prospective clients a notion of my approaches. I guess that's my chief reason for being there, just the standardness of it. What else do you suggest, especially in the context of prospective portfolio use?
 
it's good enough from the "i want to make money by writing programs" standpoint but it's extremely horrible from the "gnu/halal, or as i have recently come to refer to it, gnu+halal" standpoint
if you want a "standard" git forge that isn't github, codeberg has a very similar look and feel to github, is becoming somewhat mainstream, and therefore is the most likely to suit you
this sperging is mostly intended for freetard niggers and anybody else who hates microsoft with every fiber of their being, and if you don't give a fuck you can just continue using github
 
Lotta clowning in this thread, but if you actually wanna see what C++23 implementation status looks like, see GCC: https://gcc.gnu.org/projects/cxx-status.html

C++26: a bit over halfway implemented
C++23: 90% done?
C++20: two module-related features out of ~50 not done
C++17: one feature not implemented
C++14 and earlier: 100% compliant

Clang looks about the same, with a few different foibles: https://clang.llvm.org/cxx_status.html

If you don't mind limiting yourself to a compiler (or two, for most of the features), there's a lot of options.
The glorious benefit of multiple competing implementations: You can't use the new shit for a decade.
Rust solves this btw.
The C in C++ and C# stands for Cuck.
Cuck Sharp and Cuck Plus Plus, both derived from the Cuck programming language, where your compiler cucks you by letting israel hack your programs.
I had one fun case where I was doing some fancy stuff in an interrupt routine such that the interrupt could actually break the main flow of the program. But the main flow was toggling pins with low latency and the ISR also needed low latency so I had to go in and carefully disable interrupts just for the smallest part of the pin toggle so the pin wouldn't end in the wrong state if the ISR triggered, but it was a couple lines of assembler.
Rust solves this btw.
I always like people giving a Lisp a try; even if it's not super widespread, I still consider it successful.
For me, Lisp has been a critical component in unveiling the inherent beauty of homoiconic code. Most of my language designs since have incorporated some core homoiconic component specifically for functional programming, which the rest of the language is modelled on and which is usable on its own, with all the sugar built around that core. For Lisp it's the (linked) list, but I have found you can do it with most structures; the pure function is the natural one to go to, since lists are really just functions modelled on the heap rather than on the call stack. Weird things begin to happen when you make your core structure out of matrices and turn every linear arithmetic operation into SIMD matrix operations, and designing a nice syntax for that is hard as hell.
You have to write a Javascript wrapper around any web apis you want to call from FFI, which I think is honestly worse than just coding in Javascript to begin with. *sigh*
JS is just the transport layer between WASM binaries and the browser API. It's best to abstract it away now so that when JS is eventually eliminated, you can just implement whatever mechanism replaces it directly.
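For what it's worth, this is roughly what that looks like from the Rust side with wasm-bindgen and web-sys (my assumed toolchain for the sketch, nobody above said they use it): the JS glue gets generated at build time and your code never touches it.

Rust:
// Sketch only, assuming the wasm-bindgen + web-sys crates (with the Window, Document,
// Element, HtmlElement and Node features enabled in Cargo.toml) and a wasm-pack style build.
use wasm_bindgen::prelude::*;

#[wasm_bindgen]
pub fn greet(name: &str) -> Result<(), JsValue> {
    // web_sys wraps the browser API; the generated JS is the actual "transport layer".
    let window = web_sys::window().ok_or_else(|| JsValue::from_str("no window"))?;
    let document = window.document().ok_or_else(|| JsValue::from_str("no document"))?;
    let body = document.body().ok_or_else(|| JsValue::from_str("no body"))?;

    let p = document.create_element("p")?;
    p.set_text_content(Some(&format!("Hello, {name}!")));
    body.append_child(&p)?;
    Ok(())
}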
 
The glorious benefit of multiple competing implementations: You can't use the new shit for a decade.
Rust solves this btw.
yeah and c++ had it solved when bjarne's compiler for it was the only implementation
iirc rust recently got a real standard and gccrs is being worked on so :optimistic:
every language will eventually start to solidify and fragment and it's just starting for rust
Rust solves this btw.
this kind of shit would probably be in 3 layers of unsafe blocks so :optimistic::optimistic::optimistic:
when JS is eventually eliminated
:story:
 
this kind of shit would probably be in 3 layers of unsafe blocks so
In C it's all unsafe. With Rust, unsafe operations can only be done in an unsafe context; that means unsafe things can't happen outside of one, and if you try, the compiler helpfully refuses to compile that code. Also, unsafe doesn't disable lifetime checking, so you get that benefit too.
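To make that concrete, a quick sketch (mine, not code from this thread) showing both halves of the claim: the raw-pointer dereference only compiles inside an unsafe block, and the borrow checker keeps running inside that block.

Rust:
// Illustration only.
fn main() {
    let mut x = 42;
    let v = vec![1, 2, 3];
    let first = &v[0];

    let p = &mut x as *mut i32;
    // let y = *p;        // rejected outside an unsafe block: error[E0133]
    let y = unsafe {
        // drop(v);       // still rejected even in here: `v` is borrowed by `first`,
        //                // so the borrow checker is clearly not switched off by `unsafe`
        *p                // allowed: the unsafe surface is explicit and greppable
    };
    println!("y = {y}, first = {first}");
}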
It VILL happen!
 
In C it's all unsafe. With Rust, unsafe operations can only be done in an unsafe context; that means unsafe things can't happen outside of one, and if you try, the compiler helpfully refuses to compile that code. Also, unsafe doesn't disable lifetime checking, so you get that benefit too.
yes but this specific problem you said would be solved by rust is in fact not trivially solved by using rust, as far as i understand it
i admire some of rust's philosophy about not always trusting the programmer to never cause ub, but this feels somewhat deep in the land of register bashing shit where not even the smartest compiler can help
It VILL happen!
incredibly :optimistic:
the dom is built around js
removing js would break compatibility with every website that uses it (most of them) and the web standards people don't like doing things like breaking most websites
and i don't really see a huge push to get rid of js (wasm is more meant to be a way to port your c, c++, and c+=hipster programs to the web)
 
this feels somewhat deep in the land of register bashing shit where not even the smartest compiler can help
Let's look at the quote:
I had one fun case where I was doing some fancy stuff in an interrupt routine such that the interrupt could actually break the main flow of the program. But the main flow was toggling pins with low latency and the ISR also needed low latency
The main code flow is literally interrupted during some slow operation, which doesn't inherently involve a memory safety problem or an issue with invalid state occurring due to asynchronous operations.
so I had to go in and carefully disable interrupts just for the smallest part of the pin toggle so the pin wouldn't end in the wrong state if the ISR triggered, but it was a couple lines of assembler.
Revelation that the interrupt would in fact result in an invalid state in the main code flow, if the interrupt happened during the slow operation.
Specifically, it sounds like the pin or pins being toggled in the high-latency situation were a shared resource between the interrupt routine and the main code flow: some kind of data race between the ISR and the main flow, due to a lack of synchronisation between the two. Those few lines of assembly, which disable interrupts during the sensitive operation on the shared data, are basically the same as an atomic thread fence in a process on a managed OS. The point is to synchronise access between the two independent routines.
Rust fixes this by refusing to compile a program that attempts to access pins that an interrupt routine has higher precedence to access: you are forced to obtain exclusive access to the pins by 'fencing off' the sensitive routine from interrupts firing, or more likely by disabling the specific offending interrupt during that 'fenced off' portion of code in your main flow.
There are already hardware abstraction layers that provide such interfaces for you, for pretty much every mainstream microcontroller you can think of.
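A sketch of what that 'fencing off' pattern tends to look like, using the critical-section crate (my pick for illustration; the HALs wrap the same idea) with a made-up Pin type, and the crate's std feature so it runs on a desktop instead of a microcontroller.

Rust:
// Illustration only: state shared between the main flow and an interrupt handler,
// reachable solely inside a critical section. Assumes critical-section = { version = "1",
// features = ["std"] } in Cargo.toml; on real hardware the HAL supplies the
// critical-section implementation instead.
use core::cell::RefCell;
use critical_section::Mutex;

// Made-up stand-in for a GPIO pin, not a real HAL type.
struct Pin {
    high: bool,
}

impl Pin {
    fn toggle(&mut self) {
        self.high = !self.high;
    }
}

// The pin can only be borrowed while interrupts are "fenced off".
static SHARED_PIN: Mutex<RefCell<Pin>> = Mutex::new(RefCell::new(Pin { high: false }));

fn main() {
    // The moral equivalent of "disable interrupts just for the smallest part of
    // the pin toggle": nothing else can observe the pin mid-update in here.
    critical_section::with(|cs| {
        SHARED_PIN.borrow(cs).borrow_mut().toggle();
    });
}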
 
Rust fixes this by refusing to compile a program that attempts to access pins that an interrupt routine has higher precedence to access: you are forced to obtain exclusive access to the pins by 'fencing off' the sensitive routine from interrupts firing, or more likely by disabling the specific offending interrupt during that 'fenced off' portion of code in your main flow.
There are already hardware abstraction layers that provide such interfaces for you, for pretty much every mainstream microcontroller you can think of.
The interrupt never touched the pins; it simply triggered on an external event saying "do something different". When the interrupt finished, the return address was changed to a new path involving different pins, until the next interrupt came along and changed the pin mapping again. So the interrupt couldn't fire until the pins were stable and the previous output was done. Obviously the usual process would be "do the pin stuff, check the flag", but checking the flag added too much latency, so it was better to let the interrupt redirect the main code.

Obviously, with today's microcontroller speeds you can waste all the time you want running high level languages, but at the time I couldn't afford the 10x slowdown.

Wasn't nearly as much fun as the code that interleaved bit banged 9600 bps serial with ADC conversions and math processing. That one was 100% assembler with cycles carefully counted.
 
The interrupt never touched the pins; it simply triggered on an external event saying "do something different". When the interrupt finished, the return address was changed to a new path involving different pins, until the next interrupt came along and changed the pin mapping again. So the interrupt couldn't fire until the pins were stable and the previous output was done. Obviously the usual process would be "do the pin stuff, check the flag", but checking the flag added too much latency, so it was better to let the interrupt redirect the main code.
That sounds like it would lead to a lot of hard-to-debug problems, seeing as you're rewriting the return address on the call stack in a way that violates a subroutine's expectation of returning to where it was called from. How did you manage the new fake call site's code assuming a valid stack frame but getting the real caller's stack frame? Just no local variables or something?
Return-oriented programming is a pretty crazy technique to employ for dynamically changing your code flow, although I have seen something similar with a dedicated call-stack data structure and jumps to addresses popped from that stack (it was for some bytecode VM optimisation stuff) rather than reusing the CPU's call stack, so it's not unheard of, but it's still pretty crazy. Was your system so constrained that you couldn't maintain a zeroed next-routine pointer, specifically overwritten for new code flows by the interrupt routine, which would then be checked after returning to the real call site?
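To be clear about what I mean, here's a toy version of that flag approach (entirely hypothetical, obviously not your firmware), runnable on a host: the ISR only records which routine should run next, and the main flow consults it once the pins are stable.

Rust:
// Hypothetical illustration; fake_isr() stands in for the real interrupt handler,
// and the flag check at the end is exactly the latency you said you couldn't afford.
use core::sync::atomic::{AtomicUsize, Ordering};

// Which pin-mapping routine to run next; written by the ISR, read by the main flow.
static NEXT_ROUTINE: AtomicUsize = AtomicUsize::new(0);

fn routine_a() {
    println!("pin mapping A");
}

fn routine_b() {
    println!("pin mapping B");
}

// Stand-in for the interrupt handler: all it does is record the new code path.
fn fake_isr() {
    NEXT_ROUTINE.store(1, Ordering::Relaxed);
}

fn main() {
    routine_a(); // sensitive, low-latency pin work with the current mapping
    fake_isr();  // external event arrives somewhere in here
    // Only once the outputs are stable does the main flow consult the flag.
    match NEXT_ROUTINE.load(Ordering::Relaxed) {
        0 => routine_a(),
        _ => routine_b(),
    }
}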
Obviously, with today's microcontroller speeds you can waste all the time you want running high level languages, but at the time I couldn't afford the 10x slowdown.
Hah, rust does all that work up front in the compiler, so you still get optimised code for your target ISA, but on your dev box, you get all kinds of nice static analysis at compile time, which I think is 100% worth the computational tradeoff. It is kind of a curse that CPUs keep getting faster and faster coz it disincentivises writing good, fast code. It is nice to see normies hating on corporate bloatware more and more, lots of opportunities to let them know of the systemic rot in the software development world.

Hopefully, some day shitty software developers will be harassed in the street for their crimes, frame it in the lib agenda as a crime on nature for wasting resources or something, add in the vibe code wasteful AI element and holy shit you could probably get greta to uncle ted copilot servers.
 