Open Source Software Community - it's about ethics in Codes of Conduct

Computing after the collapse:
1. Enjoy thinking about software and planning
2. Scavenge a bunch of nonfunctional hardware, excited to try to frankenstein something working from the various parts
3. Realize you have access to none of the hardware documentation
4. Realize that even before the collapse you didn't have access to the hardware documentation
5. Try to power through with sheer trial and error and your multimeter that has definitely not run out of batteries by now
6. Destroy any hardware that was working when you connect the wrong pins

From my limited experience trying to work with undocumented PCBs as a layman, it sounds extremely annoying and time-consuming. But maybe those with an electrical engineering background or extensive experience with the circuits in question could actually accomplish something.

In the mid-term, I feel like the most essential capabilities would be replacing the parts of computers that typically fail, like storage, power delivery, capacitors, etc.

Personally I like the idea of creating mechanical computers. Ideally they could be created with commonly-available materials, and would have a rather natural interface to mechanical energy sources instead of needing generators. Sure they'd be slow compared to electric computers, but could still probably compute a block of chacha20 much, much faster than I can.

Although for simple calculations I don't know how they'd compare to the abacus and slide rule.
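For a sense of the arithmetic involved: one ChaCha20 block is 10 double rounds, i.e. 80 quarter-rounds, and each quarter-round does four 32-bit additions, four XORs, and four rotations. A minimal Python sketch of a single quarter-round, checked against the test vector in RFC 8439 section 2.1.1:

```python
def rotl32(x, n):
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def quarter_round(a, b, c, d):
    """One ChaCha20 quarter-round: 4 adds, 4 XORs, 4 rotations on 32-bit words."""
    a = (a + b) & 0xFFFFFFFF; d = rotl32(d ^ a, 16)
    c = (c + d) & 0xFFFFFFFF; b = rotl32(b ^ c, 12)
    a = (a + b) & 0xFFFFFFFF; d = rotl32(d ^ a, 8)
    c = (c + d) & 0xFFFFFFFF; b = rotl32(b ^ c, 7)
    return a, b, c, d

# Test vector from RFC 8439 section 2.1.1
print([hex(w) for w in quarter_round(0x11111111, 0x01020304, 0x9B8D6F43, 0x01234567)])
# -> ['0xea2a92f4', '0xcb1cf8ce', '0x4581472e', '0x5881c4bb']
```

A full block runs 80 of these (960 word operations) plus 16 final additions, so a mechanical machine would need on the order of a thousand reliable 32-bit operations per block; that, not the logic, is the real obstacle.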
 
Personally I like the idea of creating mechanical computers. Ideally they could be created with commonly-available materials, and would have a rather natural interface to mechanical energy sources instead of needing generators. Sure they'd be slow compared to electric computers, but could still probably compute a block of chacha20 much, much faster than I can.

Although for simple calculations I don't know how they'd compare to the abacus and slide rule.
Honestly, the idea is cool but I think there's a reason they weren't really made until relatively high-precision metallurgy could let you build shit like Leibniz's mechanical calculator or Babbage's machines or whatever. Anything the average Joe needs can be done with an abacus and a slide rule, and anything more advanced is technologically out of his reach post collapse.

Post-collapse computing sorts tend to forget that any seemingly independent computing device will cease to be useful VERY fast once the society-wide infrastructure it depends on is gone.
 
everyone in the open source community is a pedo or Troon.
Everyone in tech is a troon.

Is what I think you mean. In open source the individual devs are just a lot more visible. With proprietary software you have no idea how many troon hands touched your software.

But the entirety of tech has been infiltrated. In large part because the kind of person that codes seems to suffer from the exact kind of autism that makes people likely to troon out.
 
Seriously, this is why these arguments never really go anywhere. Without a definite set of constraints on what shit hitting the fan consists of (i.e. did the nukes fly but there's someone hiding out who's got the capital to rebuild civilization, or are you completely on your own and civilization has been levelled, or is this some kind of apocalyptic plague that took out humanity but left everything useful behind -- or what?) you just wind up mentally masturbating.
Generally things will be pretty, pretty bad. Pretty much all current hardware is designed to break after a couple years and relies on very specific software. Heck, even if you have a home server or Raspberry Pi running Linux, it'll be mostly useless if you didn't predownload every possible tool you'll need, and if you're writing code, god forbid you're using Rust or something where you have to pull in a bunch of dependencies from the nonexistent internet. Any devices people have will have limited function - say you have a home server with a massive media library: if it's running Plex, it's immediately useless unless you made some changes before the internet went down.

Nobody will be able to cobble together computers from smaller bits, heck they can't even go to RadioShack and build something with a breadboard and a '70s- or '80s-level IC. Places that do have a circuit foundry AND access to the raw resources to make the chips will be rare and immediately put under the control of the closest government, so the average person will be screwed and will basically be making computers from first principles, such as mechanical gearing systems, basic analog radios, and maybe eventually people will figure out vacuum tubes again.
 
Looks like the FreeBSD forum has been hacked: https://forums.freebsd.org/threads/forum-outage.102193/
Hi guys, sorry that the FreeBSD Forums were offline for a couple of hours.

We were hit by an exploit against a slightly outdated XenForo version that we were still running.

The same exploit hit quite a number of XenForo installations today, including linux.org.

The FreeBSD Forums showed a defacement page for a couple of minutes before it was detected by the admins and then skillfully removed, after which the XenForo software was updated.

In the meantime the FreeBSD organization decided to take our DNS record offline, in case we were possibly spreading malware, which did not appear to be the case.

After some investigation, the defacement was labeled a low-hanging-fruit type of script kiddie attack that only scratched a little bit of the surface of our installation. Nothing on the actual server was in any way touched, altered, or otherwise compromised, including databases and credentials.

Resulting discussions and forensics took a couple of hours to complete, after which the DNS record was reinstated and the FreeBSD Forums were reachable once again.

We apologize for the inconvenience, and we will be slightly more diligent in keeping up with our forum software versions.

If you are still, in any way, shape or form, concerned about your credentials, feel entirely free to change your password and, more importantly, to turn on Two Factor Authentication on your account. Also note that we now support PassKeys!

Here's a screenshot of the hack:

[attached screenshot: 1774920511133.png]


Not sure what version of XenForo the forum is running, but just in case @Null. linux.org also got hit: https://www.linux.org/threads/whoops-a-xenforo-xss-vulnerability-bit-us.64521/
 
Despite the nuisance, it's kinda nice to see silly skids doing these kinds of harmless prank hacks every once in a while. Not sure why, just feels cozy somehow.
it honestly reminds me of the cozy wild west days when you had script kiddies just defacing websites for fun.
 
Can someone explain to me what's the deal with so many people praising ripgrep for being fast? In which case has grep ever been so slow that it mattered in any meaningful way? In which context exactly are people grepping for something that needs to be done so fast that it needs a rewrite of a tool that I never found a problem with?
 
But even during a nuclear fallout, you will still be able to find a lot of x86-64 ewaste to run your computing needs.
That's going to depend heavily on where that waste was located at the time the bombs dropped. If it was buried in a bunker, well isolated from an electromagnetic pulse from bombs airbursting overhead then sure, but for everything else... no, those rigs are now paperweights.
No, this is precisely what I'm saying. In this situation it's likely that the most useful thing is something that lets you do a bit of math without error and figure out exactly how much wood you need to put up your hut. Anything more complicated than that is really just pretending.
Well, if that's your bar, then go pick up a sliderule or an abacus, because those are far more reliable and something you can produce yourself.
Nobody will be able to cobble together computers from smaller bits, heck they can't even go to RadioShack and build something with a breadboard and a '70s- or '80s-level IC. Places that do have a circuit foundry AND access to the raw resources to make the chips will be rare and immediately put under the control of the closest government, so the average person will be screwed and will basically be making computers from first principles, such as mechanical gearing systems, basic analog radios, and maybe eventually people will figure out vacuum tubes again.
This is making a lot of assumptions without first defining what the scenario is.
 
Can someone explain to me what's the deal with so many people praising ripgrep for being fast? In which case has grep ever been so slow that it mattered in any meaningful way? In which context exactly are people grepping for something that needs to be done so fast that it needs a rewrite of a tool that I never found a problem with?
FWIW, I noticed a huge speedup between grep and rg when grepping through lots of massive log files. Like to the extent that rg was usable and grep was not. Give it a try!
 
At what file size does this begin to matter?

Man, if vanilla grep is fast enough for you, then keep using it. It’s fine. Should you ever find some task taking more time than you’re comfortable with, then investigate the alternatives.

Personally I don’t care about the speed so much as the fact it automatically follows .gitignore rules so when I search my codebase, I can find what I’m looking for only in the shitty code I wrote rather than the shitty third-party code pulled in as dependencies that I usually don’t care about.
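For anyone curious what that buys you in practice, here's roughly the manual grep equivalent of what rg does out of the box (all paths here are made up for the demo; assumes GNU grep):

```shell
# Toy repo: one file we wrote, one vendored dependency we want ignored.
mkdir -p demo/src demo/node_modules && cd demo
printf 'node_modules/\n' > .gitignore
echo 'search_me' > src/app.js
echo 'search_me' > node_modules/dep.js

# Inside a git repo, rg reads .gitignore and skips node_modules/ on its own:
#   rg -l search_me
# With plain grep you have to repeat the exclusions by hand every time:
grep -rl --exclude-dir=node_modules 'search_me' .
```

With a real project's worth of ignored build artifacts and dependencies, spelling those exclusions out per invocation gets old fast, which is the actual selling point more than raw speed.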
 
At what file size does this begin to matter?
Spoken like someone who has never run into production servers set up by idiots without log rotation. I have found 100GB+ log files while trying to troubleshoot issues important enough that I couldn't just delete them, set up log rotation, and wait for the issue to crop up again, so I had to deal with them as-is. There's also a text reader you can use... I think it's gvim? Been a while. scrub
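For reference, a minimal logrotate stanza that keeps an app log from ever getting anywhere near that size (the app name and path are hypothetical):

```
# /etc/logrotate.d/myapp  (hypothetical app name and path)
/var/log/myapp/*.log {
    daily
    rotate 14
    maxsize 500M
    compress
    delaycompress
    missingok
    notifempty
}
```

The `maxsize` directive forces a rotation as soon as the file crosses the threshold even between scheduled daily runs, which is exactly the failure mode above.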
 
Guys, I can't decide whether this is an elaborate troll: https://malus.sh.
Malus is a cleanroom-as-a-service platform. You upload your dependency manifest (package.json, requirements.txt, Cargo.toml, whatever you use) and our AI systems independently recreate every package in your software bill of materials from scratch.
One set of AI agents analyzes only public documentation: README files, API specifications, type definitions. They produce a detailed specification that contains no code. A completely separate set of AI agents, which have never communicated with the first set, never seen the original source, never so much as glanced at a Git repository, implements the specification from scratch.
The benefits are immediate and quantifiable:
  • Zero supply chain risk. Every line of code is generated by our robots. No compromised maintainer accounts. No geopolitical payloads. No Christmas ham emergencies.
  • Zero license compliance overhead. No AGPL contamination vectors. No attribution clauses. No CLA administration. Your legal team can finally work on something else.
  • Zero dependency on strangers. Your software stack depends on MalusCorp, a company with a support contract, an SLA, and a mailing address. We are, unlike the maintainer of left-pad, contractually obligated to care.
  • 100% CVE-free at time of delivery. Freshly generated code, untouched by human hands or known vulnerability databases. Your compliance dashboard goes from red to green overnight.
 