Programming thread

that's why we have a large need for language-agnostic package management that works on all systems and can do project-based management, multiple versions, and everything else
also windows and macos not having package managers is a skill issue on their part, but at least they can install one
Perhaps this provides a hint as to the reason for language-specific package managers.
doesn't apt use a lot of perl?
I don't think so.
(well, technically, they are different packages, but the end result is the same)
🗿
For example: semver, a single number, and git describe output (the "-n" suffix is the version of the package itself, not the software).
Then you must build the package manager's infrastructure (e.g. listing all versions of a package in increasing order, to pick one example) around this.
Perhaps you could implement sorting for only the few most popular formats, but then you are making subjective assertions about which formats get first-class support.
You may consider this an acceptable tradeoff, but it is a tradeoff.
It is the job of the packager (which may or may not be the software developer) to package the software.
So if nobody cares to regularly package Library X for OS Y, the user has to manually download it?
Why should I care about the opinions of a package manager on a system that I don't use?
You need building (which needs packages) to be perfectly reproducible on all platforms at all times, unconditionally.
This is within the purview of a language developer.
if I were forced to use language-specific package managers then this would be rather inconvenient to say the least.
I agree, this is a significant problem, but it is also a small minority of software.
Generally I agree with the rest of what you've said.

desktop VPN apps that take up 500 MB of storage space (glares at Mullvad)
The worst thing about Electron devs is that there is an infinitely superior alternative (WebView) that does exactly what they want, but it requires 20 minutes of additional one-time effort, so they decline to even consider it.
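For the curious, a minimal sketch of the WebView route using the pywebview package (assumes `pip install pywebview`; the app name and URL are placeholders):
```python
import webview  # pywebview wraps the OS-native WebView widget

# One window pointed at the app's UI; no bundled Chromium required,
# which is where Electron's hundreds of megabytes go.
webview.create_window("My VPN App", "https://example.com/ui")
webview.start()
```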
he thinks you should just install the 140 MB static binary package built from god knows what on some sketchy github ci server
A build script that refers to multiple package managers. Frankly, not much better than what you said.
what do you think of pkg-config? it's my favorite language-specific package manager
Unsafe. RIIR.
Did you forget the 3 transwomen to balance the polycule?
 
Perhaps this provides a hint as to the reason for language-specific package managers.
sorry i don't exactly get it
I don't think so.
says so on fucking wikipedia
Then you must build the package manager's infrastructure (e.g. listing all versions of a package in increasing order, to pick one example) around this.
Perhaps you could implement sorting for only the few most popular formats, but then you are making subjective assertions about which formats get first-class support.
most good system package managers have implemented versioning flawlessly for every piece of software users can install on the desktop and all of their libraries
we all know that there hasn't been a super strict c version numbering standard handed down from on high, so please stop acting like version numbers are a representative issue
So if nobody cares to regularly package Library X for OS Y, the user has to manually download it?
it probably means either library x is incredibly obscure and there is only one of them and it isn't too big of a deal;
os y is extremely retarded and the user should install os z's weird package manager that can install packages in ~/.os_z_packages, and use that;
or os y doesn't have the library but the user repositories do
You need building (which needs packages) to be perfectly reproducible on all platforms at all times, unconditionally.
This is within the purview of a language developer.
this is what the debian maintainers are supposed to do and packaging a new version of every package when library #21239 updates instead of being able to update that package in the apt repository is extremely gay
language developers (known as "compiler writers" to white people) only need to be concerned with loading libraries from the switches that were put in the makefile (or ninja file, if you're feeling fancy and modern) and turning it into some lower-level form of computational expression
where these libraries come from is left as an exercise to the programmers, packagers, and users and should be solvable many ways
and reproducibility can simply be achieved by sticking to one package management solution but it can be chosen on a per-project basis
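as a sketch of that separation (assumes pkg-config, cc, and the sqlite3 dev package are installed; sqlite3 is just a stand-in dependency):
```python
import subprocess

# Ask pkg-config where the library lives, then hand the switches to the
# compiler. The compiler never needs to know which package manager
# (or vendored build) put sqlite3 there.
cflags = subprocess.check_output(
    ["pkg-config", "--cflags", "sqlite3"], text=True).split()
libs = subprocess.check_output(
    ["pkg-config", "--libs", "sqlite3"], text=True).split()
subprocess.run(["cc", *cflags, "-o", "main", "main.c", *libs], check=True)
```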
Unsafe. RIIR.
yeah there are a few alternate implementations, including ones in memory-safe languages (your favorite) like perl
 
but at least it's not curl | bash like soydevs love doing
Are you somehow checking your tar archives after downloading, before you extract them? If not, then that behaviour is functionally equivalent to curl | tar. Is there some "check that my tar archive doesn't nuke my system" step that is common practice that I'm unaware of? I mean, the bash pipe is weird, especially when people say to do it with sudo, but there you can just read the script and see. Is there something similar for tar? This just seems like a weird take.
 
Then you must build the package manager's infrastructure (e.g. listing all versions of a package in increasing order, to pick one example) around this.
You suggested that language-specific package managers were required to handle different (language-specific) version schemes, so I gave you an example of each scheme that you mentioned working perfectly fine in a single package manager.
Most people don't need every single version of a given package to be available, the latest version of each major release is almost always enough. If you absolutely require a specific minor version that is not the latest then something has gone wrong somewhere and you should fix that instead of just grabbing the old version from an archive. Arch keeps an archive of every package from the last 5 years and if the software lives in a git repo then there is the entire history there too.
Perhaps you could implement sorting for only the few most popular formats, but then you are making subjective assertions about which formats get first-class support.
You can use whatever versioning system that you like; sorting is generally the same unless you do something very strange (all three of the version numbers that I gave can be sorted using the same simple algorithm). Though I don't see a use case that is not covered by semver (or something compatible), a single number (often the date), or git describe (the gibberish in the first example). Do you have an example that you might share?
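For illustration, a naive sketch of that algorithm (the git-describe hash stripping is an assumption about the format, and semver pre-release tags would need extra care):
```python
import re

def version_key(version: str):
    """Compare runs of digits numerically. Works for semver ("1.2.3"),
    a single number or date ("20240101"), and git-describe output
    ("v1.2.3-14-gdeadbee") once the commit hash suffix is stripped,
    since the hash carries no ordering information. Only meaningful
    within a single scheme, of course."""
    version = re.sub(r"-g[0-9a-f]+$", "", version)
    return tuple(int(n) for n in re.findall(r"\d+", version))

print(sorted(["1.10.0", "1.2.3", "1.2.10", "2.0"], key=version_key))
# ['1.2.3', '1.2.10', '1.10.0', '2.0']
```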
So if nobody cares to regularly package Library X for OS Y, the user has to manually download it?
Yes, or package/maintain it themselves. The few times that I have needed something that wasn't in the official repositories, it was in the AUR. If I ever come across something that isn't packaged (or write something that is worth being packaged) then I will learn how to do so, at least for Arch/pacman.
You need building (which needs packages) to be perfectly reproducible on all platforms at all times, unconditionally.
This is within the purview of a language developer.
Do I? To what extent?

If that were the case, then I would include the external libraries within my project. After all, what if some external package is removed and no longer available? (e.g. left-pad) Or tampered with in some way? (do you specify hashes along with version numbers?) None of this is language-specific.
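Something like this sketch, say (the URL and digest are made-up placeholders):
```python
import hashlib
import urllib.request

# Hypothetical pinned dependency: both the URL and the digest below are
# placeholders; the real digest would be recorded at pin time.
URL = "https://example.org/pkgs/somelib-1.2.3.tar.gz"
PINNED_SHA256 = "0" * 64

data = urllib.request.urlopen(URL).read()
if hashlib.sha256(data).hexdigest() != PINNED_SHA256:
    raise RuntimeError("somelib-1.2.3 does not match its pinned hash")
```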
I agree, this is a significant problem, but it is also a small minority of software.
It's common enough that it ought to be considered, though I have not seen quite the mess that I described, yet. A lot of Python stuff is just wrappers around C libraries (e.g. sqlite, opengl/sdl, crypto stuff) which are also used by C++ programs. You might also have noticed that rewriting C libraries in Rust is quite trendy.

If you want language-specific package managers then you should also specify some common API amongst them to handle dependencies for multi-language projects (or not support them), or you could just defer to the system package manager...


I think it's clear that we won't agree, and I do not wish to further contribute to shitting up the thread with a pointless argument, so I will say one last thing.

The system that you prefer (from what I understand) can exist underneath that which I prefer, the same is not true for the reverse. Just because I do not wish to use "your" system, that does not mean that I am forcing "my" system onto you. You have yet to present an actual issue, or adequately address any criticism.

Enjoy the rest of your day.
 
  • Thunk-Provoking
Reactions: Marvin
Are you somehow checking your tar archives after downloading, before you extract them? If not, then that behaviour is functionally equivalent to curl | tar. Is there some "check that my tar archive doesn't nuke my system" step that is common practice that I'm unaware of? I mean, the bash pipe is weird, especially when people say to do it with sudo, but there you can just read the script and see. Is there something similar for tar? This just seems like a weird take.
it might only have been a problem in the past, and modern tar implementations warn about it, but certain tar files can have absolute paths
absolute paths like /usr/bin/something_important, and it extracts a copy of the bee movie script over it or something, and then that's a successful denial of service attack
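if you're paranoid, a quick sketch of a pre-extraction check (the archive name is a placeholder; python 3.12's tarfile extraction filters do similar checks natively):
```python
import tarfile

def check_members(path: str) -> None:
    """Refuse archives whose member paths could escape the target dir."""
    with tarfile.open(path) as tar:
        for member in tar.getmembers():
            name = member.name
            if name.startswith("/") or ".." in name.split("/"):
                raise ValueError(f"suspicious member path: {name!r}")

check_members("download.tar.gz")  # hypothetical archive
# on python >= 3.12, extractall(dir, filter="data") rejects these too
```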
Most people don't need every single version of a given package to be available, the latest version of each major release is almost always enough. If you absolutely require a specific minor version that is not the latest then something has gone wrong somewhere and you should fix that instead of just grabbing the old version from an archive.
you keep forgetting that this guy uses languages where everybody thinks completely breaking all compatibility from 2.2.6 to 2.3.0 is a normal thing to do
Yes, or package/maintain it themselves. The few times that I have needed something that wasn't in the official repositories, it was in the AUR. If I ever come across something that isn't packaged (or write something that is worth being packaged) then I will learn how to do so, at least for Arch/pacman.
he wants to pull in 14 different libraries that were made yesterday by random people on github, each of which does the same thing
packaging 1,000 dependencies gets boring quick and the developers need some self-serve garbage where they can dump their horrible code
you know how some programs are described as "big balls of mud"? rust's ecosystem is one giant swamp, full of more mud than anybody could easily comprehend
 
Yeah that was it. I had remembered something about the fancy pipe detection stuff, but I wasn't sure. Kudos.
Apropos of this discussion from last month, I found a particularly egregious example in the Gitlab local install instructions:
[attachment: sudo_bash.webp]
Sure, pipe the whole script to sudo bash. Great idea! What could go wrong?
 
Apropos of this discussion from last month, I found a particularly egregious example in the Gitlab local install instructions:
[attachment 7118798: sudo_bash.webp]
Sure, pipe the whole script to sudo bash. Great idea! What could go wrong?
we all know people who want to host software forges are too stupid to add repositories to their package managers so they can just use that handy script instead of running scary commands that append to text files
i don't even understand why they do this
 
  • Agree
Reactions: y a t s
Gotta invoke GNU tar with --absolute-names (-P if you want to obfuscate) for this to be exploitable
it can also clobber shit in your current directory but what degenerate unpacks a tarball directly in their important files directory
I didn't think of this angle. Cheers.
iirc it was one of the curl|sudo bash-type retard moves before curl was around to pipe into bash
this is some pretty deep lore though and modern tar implementations will not randomly overwrite important files unless you specifically tell them to
prior to that, it was smart to list the tarball contents before unpacking, to catch the malicious login binary
 
  • Like
Reactions: Marvin
There is no difference, security-wise, between running untrusted executables without verifying their source (and their correspondence to that source) and piping things to bash, as either way malicious code is going to execute. And since, the way unix is designed, there is no real difference between sudo bash and bash, you might as well pipe it into sudo bash. The only effective way of protecting oneself against the unwanted consequences of running untrusted executables on unix is containers (preferably white man containers, i.e. unshare wrappers, not garbage like docker). The only reason not to pipe things into bash is when running setup scripts that aren't malicious, just written by an idiot.
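A minimal sketch of such a wrapper (assumes util-linux unshare(1); whether the flags work depends on kernel config, and this is nowhere near a hardened sandbox):
```python
import subprocess

def run_sandboxed(cmd: list[str]) -> int:
    """Run cmd in fresh user/mount/pid/net namespaces via unshare(1).
    No seccomp and no filesystem isolation beyond the mount namespace,
    so treat it as a speed bump, not a wall."""
    return subprocess.run(
        ["unshare", "--map-root-user", "--mount", "--pid", "--fork",
         "--net", *cmd],
    ).returncode

run_sandboxed(["sh", "-c", "id && ls /"])  # example invocation
```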
 
The only effective way of protecting oneself against the unwanted consequences of running untrusted executables on unix is containers (preferably white man containers, i.e. unshare wrappers, not garbage like docker).
no all the white people were taught the only secure sandboxing method in poc oppression school
i remember my training very clearly:
  1. buy a computer for running the untrusted code
  2. take it to a building with no other computers
  3. run the code
  4. completely erase all physical evidence of the existence of the computer
important note: any peripherals you use with the untrusted computer are untrusted too and need to be treated in the same way as the computer

unshare is way better than d*cker i'll give you that
[attachment 7122093]
Why does every python program ever made have eleventy bajillion dependencies?
the python devs never added these things to the standard library (or the standard "all the things the standard library doesn't have" package)
 
And since, the way unix is designed, there is no real difference between sudo bash and bash, you might as well pipe it into sudo bash.
This isn't true whatsoever. Consider the implications of running every single command as root vs just one or two commands as root.

This feels like saying that the way the universe is designed there is no real difference between a .22 caliber and .50 caliber, so you might as well shoot yourself in the foot with the .50.
 
This isn't true whatsoever. Consider the implications of running every single command as root vs just one or two commands as root.

This feels like saying that the way the universe is designed there is no real difference between a .22 caliber and .50 caliber, so you might as well shoot yourself in the foot with the .50.
1st, the only thing that lack of sudo stops you from doing is fucking with other users, and since it is not the 70s anymore, basically all unix systems, even the server ones, are single-user. 2nd, there are thousands of ways of elevating to root which you, unless you are a total schizo, are not going to check for; for example, you can echo a shell wrapper into bashrc and get root from that. The only scenario in which sudo will protect you is when running it on a throwaway, but at that point just run it in a container.
 
1st, the only thing that lack of sudo stops you from doing is fucking with other users, and since it is not the 70s anymore, basically all unix systems, even the server ones, are single-user. 2nd, there are thousands of ways of elevating to root which you, unless you are a total schizo, are not going to check for; for example, you can echo a shell wrapper into bashrc and get root from that. The only scenario in which sudo will protect you is when running it on a throwaway, but at that point just run it in a container.
question: are you supposed to not lock your doors ever because locks are useless and criminals can just break in through the window?
 
question: are you supposed to not lock your doors ever because locks are useless and criminals can just break in through the window?
If the only lock that you have is as strong as a paper straw and the vast majority of your wealth is outside, you might as well save yourself some time. Or move the wealth inside and buy a proper lock, i.e. run it inside a container.
 
  • Autistic
Reactions: Wol and y a t s
i think general practice for servers is to have a special account for the server process that doesn't even have a shell and anything that touches /usr has to be either a package manager or something equally trusted
Pretty typically on a multi-purpose server box, you have a user per process.
an sql user for the database, a web user for the http server, etc.
 