Open Source Software Community - it's about ethics in Code of Conducts

Yeah, there are definite advantages to subscription models, but this seems to go beyond subscription services. Like running Docker containers locally, or using Terraform to build out your layout, as opposed to even automatically provisioning VMs via API, let alone doing manual or semi-automatic deployments. But with building it all in code and pressing play and magically it's there? Just seems to take away some of the understanding. This might be from a trend of computer literacy decreasing due to the use of phones and tablets over actual desktop machines as well. Maybe I'm just seeing specters where there aren't any.
The key you might be missing is that in my experience - both in normal American enterprise and in bay area startups - developers do not in any way want to know about the infrastructure they sit on. They want data stream come in, process, data stream go out. To most of them the more they can abstract the better. Many of them consider it a waste of their brain space to know any of it.
 
But with building it all in code and pressing play and magically it's there? Just seems to take away some of the understanding. This might be from a trend of computer literacy decreasing due to the use of phones and tablets over actual desktop machines as well. Maybe I'm just seeing specters where there aren't any.
You are just seeing spectres.
 
Look up how the Itanic actually worked in real life and try to tell anyone it is a smart way to design a processor.
well x86 processors are a really dumb way to build a processor and yet fast x86 processors are made every day
itanium was a complete shitshow for sure but i'm not entirely sure that it rules out vliw as a serious processor design
perhaps it could work best in controlled scenarios where programmers expect the weirdness, with some sort of risc core in use for random shit. gpus seem to be pretty popular despite being an easily crippled variety of simd processor
HINT: There is a VERY good reason why the world went with X86-64 over IA-64.
i feel like it's at least 60% momentum tbh (even to this day there are a lot of programs that only work on x86, either due to terrible design or having an extremely platform-specific piece of code such as a jit compiler)
sucking really bad at running x86 programs + everybody has tons of performance-critical x86 programs = pain

No, he says it isn't. It's the Indians who think it is.
saar deploy docker container plls let me see kubernetes and devop good sir thank you come again
The key you might be missing is that in my experience - both in normal American enterprise and in bay area startups - developers do not in any way want to know about the infrastructure they sit on. They want data stream come in, process, data stream go out. To most of them the more they can abstract the better. Many of them consider it a waste of their brain space to know any of it.
and this is why technology is so fucking shit these days. abstractions are never to be fully trusted at face value, you need to know where they can leak and fuck you up
 
well x86 processors are a really dumb way to build a processor and yet fast x86 processors are made every day
itanium was a complete shitshow for sure but i'm not entirely sure that it rules out vliw as a serious processor design
perhaps it could work best in controlled scenarios where programmers expect the weirdness, with some sort of risc core in use for random shit. gpus seem to be pretty popular despite being an easily crippled variety of simd processor

i feel like it's at least 60% momentum tbh (even to this day there are a lot of programs that only work on x86, either due to terrible design or having an extremely platform-specific piece of code such as a jit compiler)
sucking really bad at running x86 programs + everybody has tons of performance-critical x86 programs = pain
X86 IS shitty, but the whole concept of compiler-led compute is absolutely absurd and as practically functional as communism, which I guess is why a Stallmanite like you thinks it could possibly work.

The Itanium had performance somewhere around 40% of what X86 could do at the same clocks, and even worse compared to X86-64, because there is simply no possible way for a compiler to know what the conditions would be in a system at runtime for the code it is compiling.
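The runtime-information argument can be made concrete with a toy model (all numbers are made up for illustration, nothing like a real pipeline): a statically scheduled VLIW-style machine has to assume a fixed load latency at compile time and stalls whenever a load misses, while an out-of-order core can keep retiring independent work from its window until the data arrives.

```python
# Toy model: why static (compile-time) scheduling loses when latencies
# are only known at runtime. All cycle counts are hypothetical.

def static_schedule(latencies, slots_per_load=2):
    """'VLIW-style' schedule: the compiler assumed each load takes
    `slots_per_load` cycles and padded with that much independent work.
    Any extra latency (a cache miss) becomes a full stall."""
    cycles = 0
    for actual in latencies:
        cycles += max(actual, slots_per_load)
    return cycles

def dynamic_schedule(latencies, independent_work=6):
    """Out-of-order core: a missed load overlaps with up to
    `independent_work` cycles of other instructions from the window."""
    cycles = 0
    for actual in latencies:
        cycles += max(actual - independent_work, 0) + 1  # 1 cycle to issue
    return cycles

# one load in four misses the cache (hypothetical: hit=2, miss=20 cycles)
lats = [2, 2, 2, 20] * 25
print(static_schedule(lats))   # stalls on every miss
print(dynamic_schedule(lats))  # hides most of the miss latency
```

The gap in the toy model comes entirely from information (actual latencies) that exists only at runtime, which is the Itanium problem in miniature.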
 
the whole concept of compiler-led compute is absolutely absurd
don't they already do a million things that make people's shitty c code actually run fast on modern processors?

oh well today i learned that compilers do absolutely nothing to optimize for specific microarchitectural quirks
i used to think they did crazy shit like reorder the fuck out of instructions to make things run smoothly on out-of-order superscalar processors, but apparently not!
and there also definitely aren't techniques like profile-guided optimization that allow the compiler to be sufficiently smart through the power of cheating

i'd actually like to hear your idea of the ideal architecture that sacrifices all compatibility for pure performance. would it make memory caching explicit or something?
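to make the "power of cheating" concrete, here's a toy sketch of profile-guided optimization (the dispatch code and the event mix are entirely made up): run once with counters, then "recompile" a dispatcher that tests the hot branch first.

```python
from collections import Counter

# Toy profile-guided optimization: instrument a run, then reorder the
# dispatch so the hottest predicate is checked first.

def profile(events, handlers):
    """Instrumented run: count how often each handler fires."""
    counts = Counter()
    for e in events:
        for name, (pred, _fn) in handlers.items():
            if pred(e):
                counts[name] += 1
                break
    return counts

def optimized_dispatcher(handlers, counts):
    """'Recompile' using the profile: hot predicates tested first."""
    order = sorted(handlers, key=lambda n: -counts[n])
    chain = [handlers[n] for n in order]
    def dispatch(e):
        for pred, fn in chain:
            if pred(e):
                return fn(e)
    return dispatch

handlers = {
    "rare":   (lambda e: e < 0,  lambda e: "rare"),
    "common": (lambda e: e >= 0, lambda e: "common"),
}
events = [1] * 99 + [-1]
counts = profile(events, handlers)
dispatch = optimized_dispatcher(handlers, counts)
print(counts["common"], dispatch(5))
```

real PGO does the same trick at the level of branch layout, inlining, and register allocation, but the principle is identical: steal runtime information and bake it back in at compile time.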
 
don't they already do a million things that make people's shitty c code actually run fast on modern processors?

oh well today i learned that compilers do absolutely nothing to optimize for specific microarchitectural quirks
i used to think they did crazy shit like reorder the fuck out of instructions to make things run smoothly on out-of-order superscalar processors, but apparently not!
and there also definitely aren't techniques like profile-guided optimization that allow the compiler to be sufficiently smart through the power of cheating

i'd actually like to hear your idea of the ideal architecture that sacrifices all compatibility for pure performance. would it make memory caching explicit or something?
No need to even give you details if you can't grasp that a compiler-led compute setup is never going to be as efficient as a processor that determines instruction priorities at runtime. It is not rocket science, but to you it all must be the work of the gods or pixie magic.
 
The key you might be missing is that in my experience - both in normal American enterprise and in bay area startups - developers do not in any way want to know about the infrastructure they sit on. They want data stream come in, process, data stream go out. To most of them the more they can abstract the better. Many of them consider it a waste of their brain space to know any of it.
No, I completely understand that and I've seen it first hand. Which is a problem. Why would you build a house without knowing that the foundation is laid correctly? I don't expect any dev or application maintainer to have as deep of an understanding of the Linux system they're on as I do, but I expect them to understand some damn fundamentals. How can you build something without understanding the layers that came before you?
 
No, I completely understand that and I've seen it first hand. Which is a problem. Why would you build a house without knowing that the foundation is laid correctly? I don't expect any dev or application maintainer to have as deep of an understanding of the Linux system they're on as I do, but I expect them to understand some damn fundamentals. How can you build something without understanding the layers that came before you?
To follow your construction analogy, I know a drywall taper. He doesn't hang drywall, he doesn't do framing, he doesn't do finishing. He goes in, trusts that the drywall hangers were pros who did the job right, and he does his job, knowing the finishers will trust he did his right.

The more focused you are on your specialty, the better you do it, as long as you can trust the layer before and after are done right. He doesn't need to know how the framers did their thing, the wall being up and having the drywall hung on it is all he needs to know to do the taping.
 
There is potential in giving the programmer and compiler access to the microcode.
I think it would still be used for the most part as if it were just normal instructions, but there would be the possibility to really juice the processor when it really matters.
you know how modern processors have this trend where they have specialized performance and power efficiency cores? maybe this should be extended to having one processor architecture for executing mediocre instruction sequences and another architecture for executing properly optimized programs
of course on modern machines people typically scratch this itch with compute shaders on the gpu, but perhaps there are a few cases where hand-written assembly on the low-level processor might actually be a lot better (maybe faster, or maybe a shitload more power efficient) than using a gpu or the regular out-of-order cpu

How can you build something without understanding the layers that came before you?
by the time-honored jeet engineering techniques: cargo cult, shotgun debugging, adding another layer of abstraction, and "it's not broken if it works"
when you change things you don't understand and hit recompile over and over again, it's actually fairly easy to stumble into something that barely works
as long as you can trust the layer before and after are done right
very optimistic to assume that this is ever the case in computing
also a proper analogy for some webdevs i've seen would be the drywall taper saying "waat is drywall please saar. sir?" because he doesn't even fully understand what it is he's putting the tape on
 
No, I completely understand that and I've seen it first hand. Which is a problem. Why would you build a house without knowing that the foundation is laid correctly? I don't expect any dev or application maintainer to have as deep of an understanding of the Linux system they're on as I do, but I expect them to understand some damn fundamentals. How can you build something without understanding the layers that came before you?
We build entire technologies (like Docker) to make deployments agnostic of as much of the irrelevant details as possible. That's how.
 

Since this is the "talk about compilers and autistic programming languages" thread, this seems fitting.
is this a video about the classic "trusting trust" backdoor?
iirc gnu guix sort of fixes it (they have a chain of bootstrap compilers from a tiny hex assembler all the way up to a small c compiler written in scheme that eventually bootstraps tcc and eventually bootstraps ancient versions of gcc which then bootstrap modern gcc which can compile everything else) but of course it has to run on a kernel and that kernel might have been backdoored by the malicious compiler to detect when it is running a compiler that is loading the compiler source code so it can insert the backdoor by putting something funny in at the vfs
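the attack itself fits in a few lines. here's a toy model (a "compiler" is just a function from source text to a marked string, everything else is made up): the backdoored compiler recognizes when it is compiling the compiler and reinserts itself, so even pristine, fully audited compiler source produces a dirty binary.

```python
# Toy model of Thompson's "trusting trust" attack. The function below
# stands in for a backdoored compiler *binary*; the strings stand in
# for source and compiled output.

CLEAN_COMPILER_SOURCE = "compiler v1"

def clean_compile(source):
    """What an honest compiler binary would produce."""
    return f"BIN[{source}]"

def backdoored_compile(source):
    out = clean_compile(source)
    if source == CLEAN_COMPILER_SOURCE:   # compiling the compiler itself?
        out += "+BACKDOOR"                # propagate into the new binary
    if "login" in source:                 # compiling the real target?
        out += "+EVIL_LOGIN"
    return out

# auditing CLEAN_COMPILER_SOURCE finds nothing, yet rebuilding the
# compiler with the dirty binary keeps the backdoor alive:
print(backdoored_compile(CLEAN_COMPILER_SOURCE))
print(backdoored_compile("login program"))
```

which is exactly why source audits alone don't help: the malice lives only in the binary, and the binary regenerates it on every rebuild.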
 
is this a video about the classic "trusting trust" backdoor?
iirc gnu guix sort of fixes it (they have a chain of bootstrap compilers from a tiny hex assembler all the way up to a small c compiler written in scheme that eventually bootstraps tcc and eventually bootstraps ancient versions of gcc which then bootstrap modern gcc which can compile everything else) but of course it has to run on a kernel and that kernel might have been backdoored by the malicious compiler to detect when it is running a compiler that is loading the compiler source code so it can insert the backdoor by putting something funny in at the vfs
This kind of backdoor was described by Ken Thompson in "Reflections on Trusting Trust" (1984). It is a well-known and fully understood issue.
This is what reproducible builds solve.

Having reproducible builds prevents such attacks.
 
This is what reproducible builds solve.
reproducible builds ameliorate the possibility of compiler backdooring, but they don't solve it
your backdoored compiler binary will always reproducibly insert the backdoor into its source, after all
Having reproducible builds prevents such attacks.
but only if you know where the binaries come from and bootstrap things
guix's bootstrap uses reproducible builds too, but those binaries are themselves bootstrapped

really i think reproducible builds should just be a basic matter of computing hygiene. you should not even post code that fucks up reproducibility and determinism, especially for building. it's just completely fucking disgusting pajeet shit
 
reproducible builds ameliorate the possibility of compiler backdooring, but they don't solve it
It does solve it. By using reproducibility you implicitly also get validation.
And validation could be as simple as "at the end, the resulting binary shall have this sha1 hash". Or it could be "the binary should be identical to the one I built some time previously and then formally audited and vetted to not have been tampered with."

your backdoored compiler binary will always reproducibly insert the backdoor into its source, after all
Yes. But the validation step would detect that the malicious compiler inserted something it shouldn't have.
Reproducibility without validation is pointless.
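That validation step can be sketched in a few lines (the "build" function here is a deterministic stand-in; achieving actual bit-for-bit reproducibility is the hard part): hash the artifact and compare it to a trusted reference digest, so anything the toolchain smuggled in shows up as a mismatch.

```python
import hashlib

# Sketch of reproducible-build validation: independent parties build
# the same source deterministically and compare digests.

def build(source):
    # stand-in for a deterministic build: same input -> identical bytes
    return ("compiled:" + source).encode()

def digest(artifact):
    return hashlib.sha256(artifact).hexdigest()

# vetted reference build, audited once and recorded
TRUSTED = digest(build("hello.c v1"))

# my independent rebuild matches the trusted digest
assert digest(build("hello.c v1")) == TRUSTED

# a build where something extra slipped in does not
tampered = build("hello.c v1") + b"+BACKDOOR"
print(digest(tampered) == TRUSTED)
```

Of course this only pushes trust onto wherever TRUSTED came from, which is the point being argued back and forth here.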
 
Yes. But the validation step would detect that the malicious compiler inserted something it shouldn't have.
Reproducibility without validation is pointless.
of course most of the time that validation is going to be checking the sha256 hash, and if that hash was created by compiling the compiler with a backdoored compiler, then you just have a reproducible backdoored compiler...
which is why reproducibility is really just step 1 and step 2 is knowing exactly where all of your binaries come from
checking the resulting binary is quite secure if you trust all of your tools, but if you don't trust your tools you will need something more powerful than mere reproducible builds
"the binary should be identical to the one I built some time previously and then formally audited and vetted to not have been tampered with."
reducing the size of the trusted binaries you need to build the system is crucial. formally auditing a 15mb compiler toolchain binary is probably quite hard if i had to guess
since it's way easier to find sus things in source than in a binary, guix reproducibly builds a long chain of compilers so that the only binary you have to audit is the 357-byte hex assembler

reproducibility is just an important ingredient for knowing what the fuck is going on, and for letting other people independently reproduce your shit and confirm everything is exactly the same. cryptographic message digests never lie, except for mathematical coincidences that are practically impossible
 
The key you might be missing is that in my experience - both in normal American enterprise and in bay area startups - developers do not in any way want to know about the infrastructure they sit on. They want data stream come in, process, data stream go out. To most of them the more they can abstract the better. Many of them consider it a waste of their brain space to know any of it.
This has been my experience with some developers and they constantly come to me for help. What makes it worse is they introduce unnecessary complexities or security risks due to their ignorance. The god complexes some of these people have isn't at all warranted.
One of the teams here is baking a default GCP service account JSON key with editor access into their Cloud Run code because they don't understand that GCP's SDK will use the service account attached to the Cloud Run instance. They pushed back and said they didn't have "time in the sprint or train" to make the fix when it was literally just removing two lines. They won't read documentation and are now starting to heavily use Copilot to do their work, which may actually be a good thing given these people are intern-tier in their coding skills and have been for years.
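The behavior they refused to rely on can be sketched as a toy version of the Application Default Credentials lookup order (this is an illustration of the idea, not Google's actual SDK code; the function and return strings are made up): an explicitly configured key file wins, otherwise the client falls back to whatever identity the runtime attaches.

```python
import os

# Toy sketch of Application Default Credentials-style lookup
# (illustration only, not the real google-auth implementation).

def default_credentials(metadata_identity):
    """metadata_identity stands in for the service account the platform
    (e.g. a Cloud Run instance) attaches automatically via metadata."""
    key_path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if key_path:                       # baked-in JSON key: avoid this
        return f"key-file:{key_path}"
    return f"attached:{metadata_identity}"

# with no key baked in, the attached service account is used
os.environ.pop("GOOGLE_APPLICATION_CREDENTIALS", None)
print(default_credentials("runtime-sa@project.iam"))
```

Which is why "remove the two lines" was the whole fix: delete the key and the lookup falls through to the attached account on its own.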
But this feels like the wrong thread for all this.
I do wonder if the tranny devs know anything about the systems their stuff runs on. They seem to flock to whatever the others claim is hip an new and just dump it. There are also the fake retro trannies like femboy.hu(before he DFE'd).

I did see mail.fedora now uses the tranny anubis anti-ddos software.
 
They won't read documentation
how the fuck do these people even have jobs programming
maybe take it to the "tales of the competency crisis" thread if you haven't already this would fit there
But this feels like the wrong thread for all this.
somebody needs to start the "proprietary / closed source / commercial software and its developers" thread i guess
 