Open Source Software Community - it's about ethics in Codes of Conduct

Meanwhile arch is a troon infested distro that is so known for its faggot femboy furry troon problem that the toxicity acts as a shield to prevent straight men from entering their vicinity.
What an absolutely piss poor example, the Arch wiki is so well documented that even non-arch users often end up using it for how terse and useful it is. Just today, I went to it to see how to set up some XMPP-related software. Maybe you should try actually reading the manual next time.
half of your ESL rant is you bitching about Arch users, maybe don't use Arch if you think its user base is too toxic? Alternatively, improve your English skills so people don't have to wade through your non-stop ESL fails you keep shitting out when they try to help you.

If you want a professional distro with proper support, you can always just pay for RHEL and get the support you want without having to wade through the community. There's nothing wrong with paying for a valuable service, and if you think support is valuable, then just pay for it.






It's almost as if hundreds of millions of people telling you something is retarded and you say no and it's a product for commercial use not a research tool. You're the retard. And you've got an allergy to money. I would say get a job but having income would probably kill you.
im literally employed making open source software, idk why you're being a nigger about your lack of ability to read or effectively communicate with others.

Maybe you're an unpleasant person and that's why no one wants to help you.
Because an OS with good documentation would see its documentation visited more than its forums
 
Well, there's your problem: you're using NoVidya. Try AMD, it actually works fine with the open source driver (like 90% of the time, probably not for some very new cards). Dunno about CUDA since I don't game or do any computational stuff, but it probably works (AMD has OpenCL, and HIP or ROCm according to Wikipedia).

I've had so many problems with Nvidia on Linux that I just instinctively rip them out and put in an AMD (or just use the integrated graphics, which is more than enough for what I need). The open source nouveau driver is just dogshit: it doesn't even enable frequency scaling or power management on the card right, fans spin at full speed, no acceleration, lagging and stuttering even on a basic desktop. Then I tried the proprietary driver, and it sometimes compiles and installs but then just gives you a black screen after you reboot, so you have to chroot in and reinstall or uninstall it. That happened one too many times before I said fuck it and swore off Nvidia.
But I'm not a gamer and don't do computational stuff so I usually don't even have a gpu, just use the integrated graphics.
I honestly disagree on Nvidia being for gaming even on Windows, never mind Linux, where the drivers are shit.

Nvidia for me is mostly for AI training. AMD cards to this day can't do certain things, and the things they can do often need more VRAM than on Nvidia. If someone just games on Nvidia, though, they're wasting money: Nvidia costs an arm and a leg compared to AMD and is only worth it if your job requires the card. If you're not working with AI, VFX, video editing, or something on a server, why the fuck aren't you just using AMD? And honestly, with how most AAA games are dogshit, just get one of those G-series APUs from AMD if you're only gaming; they handle games just fine. Funnily enough, Linux is technically still better for AI training and performs better for gaming on Nvidia despite having worse drivers.

That's ultimately something people forget. Nvidia is great when you need a gorillion tensor cores doing AI and CUDA, and RT cores for VFX. If you're just gaming you don't even need a GPU, because current APUs can handle AAA up until the point AAA stopped being good, plus any indie games actually worth playing. Either go AMD + Nvidia Tesla for an AI server rack, Intel + Nvidia RTX for homebrew AI and VFX, or just get an AMD APU for like 200 bucks (honestly the laptop ones might be better than the desktop ones these days, since AMD keeps updating its laptop APUs but has let the desktop lineup sit).

Or I should say: a toy shouldn't cost thousands of dollars. If you're not making money off of something that costs as much as any PC worth even slotting a 50-series card into, you should sue your teachers.
 
You wouldn't believe how many times being this aggressive about documentation has saved me and my team.
Do you go so far as to take the OpenBSD approach to documentation: if the documentation and the actual behavior of the program/library/component/system/etc diverge, that is a bug and either the documentation or component needs to be corrected?
 
Post dotfiles and stop humoring the /g/ rapefugee Jay Niggerwin rapebaby saar

Do you go so far as to take the OpenBSD approach to documentation: if the documentation and the actual behavior of the program/library/component/system/etc diverge, that is a bug and either the documentation or component needs to be corrected?
That should be the standard in any software with more than one user; the literate programming approach of writing both human legible code and robust documentation, sometimes as commentary within the code itself if sufficiently complicated or large, seems like the correct way to go about writing good documentation. If there's an error between documentation and function, that's the literal definition of a bug, no?
 
When I am writing software I generally write the program while keeping a notebook open (or an Obsidian.md note) in front of me. While I write the program I describe my ideas and thoughts in the notebook, and then afterwards I write the documentation in markdown and publish in on my own documentation site. I keep the notebook around and link the notebook to the documentation so that in the future I know what I was thinking and trying to accomplish.

The notebook is a purely "append only" document, so that I have all the things that didn't work also noted, so that in future endeavors I do not repeat the same mistakes. It generally works very well and it has saved me in quite a few situations. I also like the fact that it slightly slows me down, so that I force myself to think about the problem for a bit longer.
 
the logical continuation of static linking.
Right, well, in the sense that it's a workaround for a program written so that it only works with a specific set of library versions. It could be because the OS you're running it on has older or newer library versions than the ones the program was written against, and you need a newer or older version of the program than the one released for your OS's library versions.
Or it could be because the program stopped being maintained a long time ago so it depends on old libraries.
Or it could be that the program is written with such strict library dependencies that it can only work on a specific version of a specific distro.
But static linking has more uses and reasons for its use than that, even on systems where everything uses the same library versions globally. Can't remember the exact reasons, but in some distros it's possible to statically link everything.
It's also hard to tell if cat has truly completed or if it's entirely written only to cache.
That's easy: just run sync, and it will wait for the buffer to be flushed to disk before exiting. You could also monitor /proc/vmstat: nr_dirty should be small and nr_writeback should be 0; if nr_dirty is large or nr_writeback is nonzero, it hasn't finished flushing yet.
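A minimal sketch of that check (the target path is a placeholder; a real run would write to the USB device or its mount point):

```shell
# Simulate a large copy into a scratch directory (stand-in for the USB stick)
TARGET=$(mktemp -d)
dd if=/dev/zero of="$TARGET/img" bs=1M count=8 status=none

# sync returns only once dirty pages have been written back to storage
sync

# After the flush, nr_writeback should be 0 and nr_dirty small
grep -E '^nr_(dirty|writeback) ' /proc/vmstat
```

On a slow USB stick the sync call is where you'd actually sit waiting; the command prompt coming back is your signal that the data really left the page cache.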
As for using cat to copy the image directly onto the USB drive, the reason it fails is that shitty USB sticks just stop responding after chugging along at 2MB/s for 15 minutes if you don't control write/sync sequences.
I'd consider that a hardware defect and not really a fault of how the OS sends commands to it.
When I am writing software I generally write the program while keeping a notebook open (or an Obsidian.md note) in front of me. While I write the program I describe my ideas and thoughts in the notebook, and then afterwards I write the documentation in markdown and publish in on my own documentation site. I keep the notebook around and link the notebook to the documentation so that in the future I know what I was thinking and trying to accomplish.

The notebook is a purely "append only" document, so that I have all the things that didn't work also noted, so that in future endeavors I do not repeat the same mistakes. It generally works very well and it has saved me in quite a few situations. I also like the fact that it slightly slows me down, so that I force myself to think about the problem for a bit longer.
Note taking is fine... but there is one thing I don't like that's become a trend.

I hate the trend of making documentation online-only. Like not even shipping the documentation in the package, but just pointing you to a docs website. Thanks, now I need a fully working web browser and internet to even use the software. Then I have to find the right revision of the docs for the particular version that I have, because it may not be the latest version.

The best place to put documentation is the manpage. Texinfo is acceptable; many GNU programs use it, and there's a TUI reader for it that's usually part of the base package set of most distros. But I don't like HTML: I may be able to open it in w3m or lynx, but those usually aren't installed, and they're large. An entire browser just to read docs? RST or Markdown I also don't like, since you need a dedicated reader or plugin for them too. And the docs have to be installed either into the system's manpage or texinfo locations, or into /usr/share/doc.

It's easy to bundle docs with your package. I know how Gentoo's ebuilds work: they scan the tarball for a doc or docs directory and copy it into /usr/share/doc, with maybe some filtering based on file type and compression of some file types. You can also easily tell the ebuild exactly how to install the docs if you don't like the defaults, telling it where in the tarball the manpages and texinfo docs are, and it'll install them in the right places.
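As a sketch of how little that takes, here is the relevant fragment of a hypothetical ebuild (the file names are made up; DOCS, einstalldocs, doman, and doinfo are standard Portage helpers):

```shell
# Fragment of a hypothetical foo-1.0.ebuild, not a complete ebuild
EAPI=8

# Files/dirs from the tarball to install into /usr/share/doc/${PF}
DOCS=( README.md docs/ )

src_install() {
    default              # runs einstalldocs, which installs ${DOCS[@]}
    doman man/foo.1      # manpage goes into the right section directory
    doinfo doc/foo.info  # texinfo docs go into the info tree
}
```

If you omit all of this, the default src_install phase still installs the usual suspects (README, ChangeLog, etc.) on its own.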
 
I'd consider that a hardware defect and not really a fault of how the OS sends commands to it.
There are many thousands of lines of code in the Linux kernel devoted to making sure things still work in the face of hardware defects. Linux calls them “quirks” instead of defects.

At time of writing, the file devoted to dealing with PCI hardware defects is 6382 lines: https://github.com/torvalds/linux/blob/master/drivers/pci/quirks.c

It’s a software engineer’s job to make the computer work if at all possible. Blaming hardware bugs is a pussy move.
 
I hate the trend of making documentation online-only.
I think this is totally fair, and if I ever released software to the public, shipping the docs with it is probably the way I would go. Having a copy of the documentation included with the installation in some form just seems like the logical, right way to do things. Thankfully the only end users of my software are three people in the cubicles across the hall, or my friends when I subject them to the programming equivalent of a shitpost.
 
There are many thousands of lines of code in the Linux kernel devoted to making sure things still work in the face of hardware defects. Linux calls them “quirks” instead of defects.

At time of writing, the file devoted to dealing with PCI hardware defects is 6382 lines: https://github.com/torvalds/linux/blob/master/drivers/pci/quirks.c

It’s a software engineer’s job to make the computer work if at all possible. Blaming hardware bugs is a pussy move.
Keeping up with defective flash storage on the market is also out of scope for the kernel. It can't possibly track every counterfeit or defective flash drive out there, or even reliably identify one; if it conforms to the USB mass storage standard, the kernel will present it as a block device. If the thing then decides to break and slow down, or even stop responding altogether, the kernel couldn't have predicted that, nor can it fix it, if the issue is in the USB device itself.
Quirks are for hardware that doesn't conform to the standard but that people have figured out how to make work anyway, so the kernel does some nonstandard things to drive it. That doesn't apply here. As far as the UMS standard is concerned, there should be no difference in the device's behavior under a different OS like Windows: it should still conform to the UMS spec, and Windows will use its built-in standard driver for it, not some proprietary driver from the manufacturer that knows how to make it work. The thing will still break in Windows if you copy stuff to it the same way. Now, maybe the firmware was written to only work with the command pattern Windows sends it, and since Linux sends a different command pattern it breaks; but that just means it doesn't conform to the standard, and is broken.
Software can't fix a hardware problem, and kernel maintainers can't possibly keep track of every counterfeit and defective peripheral a user could obtain anywhere in the world.
 
Do you go so far as to take the OpenBSD approach to documentation: if the documentation and the actual behavior of the program/library/component/system/etc diverge, that is a bug and either the documentation or component needs to be corrected?
Obviously so; documentation and behavior should always align. I don't know of a system where this isn't the case.
 
I don't know of a system where this isn't the case.
Aligning the software's behavior with the documentation is what we should all strive for; in reality, the documentation will always be written in a haphazard way and lacking. The source code may be the documentation, so it should be written in a simple and understandable manner, with as many comments as needed and not a single one more. This does not mean we should stop writing documentation.

Just like in the real world, the fact that there is no salvation should not prevent you from being a decent human.
 
I would love to show you the world of scientific lab equipment.
I don't think I would like to see that world :(
in reality, the documentation will always be written in a haphazard way and lacking
that's only really true when nobody is paid to do it; the second there's a financial incentive to write documentation (or you have a sperg who likes it for whatever reason) the problem goes away. If anything, the fact that FOSS people don't make it easier to document their tools is doing them a disservice: there are plenty of people who like hacking around with tools, but don't necessarily know their internals well, who could do a lot of good for documentation. One of the best examples of this, IMO, is the Arch wiki; there's lots of high-quality documentation there made by tool users rather than developers, just because contributing is easy.

When are we getting the manpage rust rewrite that actually makes writing manuals for software projects piss easy so we can have the quality of the arch wiki on every project?
The source code may be the documentation, so it should be written in a simple and understandable manner, with as many comments as needed and not a single one more
While usually nice in theory, I think "simplicity only" is really limiting when it comes to optimizations and very complex systems for niche use-cases. It's better to have good documentation that explains a complex system than to neuter it. That said, code comments are pretty much always a must, even for the banal (as I've previously argued).
 
When are we getting the manpage rust rewrite that actually makes writing manuals for software projects piss easy so we can have the quality of the arch wiki on every project?
Tell me you're joking. If you're serious you're insane.
man-db is perfect for what it does; it works, and it's written almost entirely in C. Why the hell would you want to mess with it, let alone replace it with something in Rust, which is the biggest cancer infecting stuff right now? I'm not talking about the language itself; I'm talking about Cargo being unsafe, a giant attack vector, and a pain in the ass. And for stuff that's already written in C, there is no benefit to rewriting it in Rust.
You can write manpages in stuff like markdown already, there are converters that convert it to roff that man-db can display.
You have converters that convert the --help output from a program into a manpage automatically, to which you can add as much additional stuff as you want.
If just a single large manpage isn't enough (there's stuff that has giant manpages and I'm fine with that, it's not hard to search over them) you can make texinfo documentation that has sections, subsections, links, TOC, glossaries. The info viewer isn't hard to figure out either.
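For what it's worth, the roff man format those converters target isn't scary to write by hand either. A minimal sketch (tool name and contents made up; pandoc's `-t man` output and help2man both produce this format):

```shell
# Write a minimal manpage in the roff man macros by hand
cat > foo.1 <<'EOF'
.TH FOO 1 "January 2024" "foo 1.0" "User Commands"
.SH NAME
foo \- frobnicate the input
.SH SYNOPSIS
.B foo
.RI [ options ] " file"
.SH DESCRIPTION
Reads a file and frobnicates it.
EOF

# Preview without installing anything:  man ./foo.1   (or: man -l foo.1)
```

The .TH header and .SH sections are all man-db needs to render, search, and index the page like any system manpage.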
Ah, I see the confusion now. I’m talking to someone who’s never shipped an electronic device to market before.
I said it's not the kernel's job to fix hardware problems in some no-name junk USB flash drives that may even be counterfeit. If you're manufacturing a device that's supposed to conform to a standard and you mess up the firmware so that it isn't compliant, it's not the kernel's job to work around it. Maybe, if there's enough interest because there are millions of identical broken devices floating around, a workaround could go in, but not for something as common as a flash drive; if you get a broken one, just go to the store and buy a new one, they're everywhere.
Kernel developers are not hardware device manufacturers. If the kernel and the hardware were made by the same company that promised they worked together, and they didn't, it would obviously make sense for them to fix their screwup in the kernel once it's too late to fix the firmware. But that's not the scenario here.
 
Tell me you're joking. If you're serious you're insane.
about it being a rust re-write? that was absolutely a joke, that would be the biggest piece of utter dogshit by the mere fact that the rust community has zero capabilities at making software and would shit out 1 trillion special snowflake dependencies for what shouldn't be a difficult to compile tool. Not to mention it'd be abandonware once the tgirl maintainer gets the next estrogen shot and starts stimming out to some new tool rewrite.

You can write manpages in stuff like markdown already, there are converters that convert it to roff that man-db can display.
This is more so what I'm talking about: it's a bit esoteric to make and install your own manpages because man cannot directly import Markdown, and thus newer software unfortunately ships without them. Yes, there are converters, but the fact that a superior documentation format isn't natively supported does cripple documentation efforts.

Furthermore, manual pages usually only explain what a tool's parameters do and maybe provide some information on what it's used for. Say what you will about GNU info, but info pages usually give you an encyclopedia's worth of information on how to use a tool: tutorials, its functionality, caveats, when it's proper to use it, and a general overview of the program as a whole.

I'd like to see a day where projects basically just ship entire wikis as documentation (not in the sense of needing a web-browser, but in the sense of being highly detailed). It'd be especially nice if it was very easy for non-developers to contribute to such a documentation system as well.

Basically my ideal is some kind of mix between man and info that natively supports markdown and has some retard-proof GUI for editing documentation so your average user can help out a project more easily.

Having all of that with the nice UI of old reliable man (or some other GUI tool if normies are scared of terminals) would really do a lot for making normal people capable of owning their computer. People who aren't hopeless, just not familiar with computers, could easily crack open a high-quality manual on any given project and get the hang of a tool with relative ease. Not to say manpages don't have their value; it's just that manpages are by far not the end-all-be-all of software documentation. That, and texinfo as a format kinda sucks to write in.
 
Woe upon those who have to drudge through Microsoft documentation and the official Windows support forum.
Windows World:
>go onto windows help forum
>"I upgraded my computer from 10 to 11 and now it starts melting whenever I turn it on."
>reply from "top microsoft specialist": "Hello sir, I am Guadeep Shitstreet. Thank you for posting on the Windows Help Forum and welcome to the community. I understand your computer is melting whenever you turn it on. To make your computer not melt when you turn it on, try opening Windows PowerShell(tm) and running sfc /scannow, then rebooting the computer."
>doesn't work

Linux World:
>go onto kiwi farms
>"Arch fucking sucks. I can't get it to show me porn."
>reply from mentally ill schizophrenic: "type nigger in the terminal you dumb fuck retard"
>it works
 