Open Source Software Community - it's about ethics in Codes of Conduct

This guy makes it sound like force pushing is some nefarious thing to hide evidence, but if Graf Zahl actually intended to push this stuff to his own branch and only accidentally pushed to master instead, then that isn't really noteworthy. I force push on my own branches all the time to keep the history clean for reviews. He could also just be trying to undo the accidental push.
 
if you use microsoft windows in 2025 despite knowing better, you just deserve everything that happens to you at this point
People studying or working under these conditions:
may beg to differ

This one is about the AI feature in OneDrive from Microsoft. They only let you disable it 3 times a year. It scans all images uploaded to OneDrive with facial recognition software, and you can't just turn it on and off as many times as you please.
One trick to sidestep the AI scans and settings entirely is to use Cryptomator. It's like VeraCrypt but optimized for cloud storage, since it encrypts and stores each file individually.
Photos can't be natively viewed on OneDrive Web, but it should be sufficient to throw a wrench into Microslop's efforts. That is, if admins also use it.
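For anyone curious, the per-file idea boils down to something like this. Just a rough sketch in Python, with the cryptography library's Fernet standing in for real crypto; this is not Cryptomator's actual Vault format, and the index/name scheme is made up:

```python
# Rough sketch of per-file encryption (NOT Cryptomator's actual format):
# every file gets encrypted on its own and stored under a random name,
# so the provider only sees opaque blobs and their sizes.
import json
import secrets
from pathlib import Path
from cryptography.fernet import Fernet  # pip install cryptography

def encrypt_folder(src: Path, vault: Path, key: bytes) -> None:
    f = Fernet(key)
    vault.mkdir(parents=True, exist_ok=True)
    index = {}  # maps real relative paths to random blob names
    for path in src.rglob("*"):
        if not path.is_file():
            continue
        blob_name = secrets.token_hex(16)
        (vault / blob_name).write_bytes(f.encrypt(path.read_bytes()))
        index[str(path.relative_to(src))] = blob_name
    # the index is encrypted too, so file names stay hidden
    (vault / "index").write_bytes(f.encrypt(json.dumps(index).encode()))

if __name__ == "__main__":
    key = Fernet.generate_key()  # keep this somewhere safe, never inside the vault
    encrypt_folder(Path("Documents"), Path("vault"), key)
```

The point is that the Provider only ever sees random blob names and sizes, never readable Content or file names.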
 
This one is about the AI feature in OneDrive from Microsoft. They only let you disable it 3 times a year.
How can any sane person read past this and still think "yes, I still should use this service."

You know there's a point where the simple truth is that it's your own fault if you stick with stuff like this because it's more "comfortable" or whatever. Behave like cattle, don't be surprised you get treated like cattle, IMO. A good example is email. I pay for email, about a buck a month from a privacy-respecting local provider, which is more than reasonable (I'd pay more tbh). When I hear about the newest humiliation ritual/rectal exam you need to do for a gmail account or whatever, I'm at a point where I just say "You know what, it's your own goddamn fault. Pay for services."

E: Also what the guy above me said, always encrypt things you upload to the cloud for storage, even if you pay for it. There's no way of knowing where exactly stuff is stored and who has access to it.
 
There's no way of knowing where exactly stuff is stored and who has access to it.
With (proper) encryption, it's a little less important. Sure, a Cloud Provider can scrape the Metadata, but the Content itself is nearly impossible to read without decrypting it. Similar to how Null uses Signal, whose CEO is hostile to the farms, yet they can't make out anything from Null's traffic.
it's your own fault if you stick with stuff like this because it's more "comfortable" or whatever
That would mean millions of people ditching their jobs or education. Workers, who have no say, aren't happy about migrating to Goyslop365.
They have to choose between getting redeemed and losing their "comfortable" job.
"You know what, it's your own godamn fault. Pay for services."
It has less to do with payment and price and more to do with who influences their employers and governments. Millions are spent by C-Suites that act as Niggercattle to shove Slop365 everywhere, thanks to Microsoft (and other Brands) infiltrating Institutions to turn them into their Ad-platforms.
It's hardly a personal choice anymore.
 
you can actually do a lot with metadata so i would prefer encrypting everything in a huge block before putting it on goydrive
That may be a better option, but depending on how big the stored files are, it would be cumbersome to continuously download and upload a huge container just to edit a small portion of it. Cloud Storage is sometimes larger than the user's local storage, which is why this project exists.
File sizes and the number of files (but not the file structure) may additionally be exposed compared to encrypted containers, but it should be enough to subvert most needful acts.
A stop-gap until better options, ones you have more control over, come along.
 
That may be a better option, but depending on how big the stored files are, it would be cumbersome to continuously download and upload a huge container just to edit a small portion of it. Cloud Storage is sometimes larger than the user's local storage, which is why this project exists.
File sizes and the number of files (but not the file structure) may additionally be exposed compared to encrypted containers, but it should be enough to subvert most needful acts.
A stop-gap until better options, ones you have more control over, come along.
granted! it is better than giving them literally everything
 
That may be a better option, but depending on how big the stored files are, it would be cumbersome to continuously download and upload a huge container just to edit a small portion of it. Cloud Storage is sometimes larger than the user's local storage, which is why this project exists.
File sizes and the number of files (but not the file structure) may additionally be exposed compared to encrypted containers, but it should be enough to subvert most needful acts.
A stop-gap until better options, ones you have more control over, come along.
What about some sort of segmented encryption scheme? Where instead of one big encrypted blob there's a series of blobs of the exact same size, and the decryption utility pulls the folder structure and filenames from the first blob and downloads only the blobs that contain the requested files?
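Roughly what I have in mind, as a sketch (Python, with Fernet from the cryptography library standing in for real crypto; the blob layout and names are made up, and reading from local disk stands in for downloading):

```python
# Minimal sketch of the "index blob + equal-sized data blobs" idea.
# File contents get packed into one stream, cut into fixed-size chunks,
# and each chunk is encrypted separately; the index maps names to offsets.
import json
from pathlib import Path
from cryptography.fernet import Fernet

BLOB_SIZE = 4 * 1024 * 1024  # 4 MiB of plaintext per blob

def pack(src_dir: Path, out_dir: Path, key: bytes) -> None:
    f = Fernet(key)
    out_dir.mkdir(parents=True, exist_ok=True)
    data = bytearray()
    index = {}  # relative path -> (offset, length) in the concatenated stream
    for path in sorted(src_dir.rglob("*")):
        if path.is_file():
            blob = path.read_bytes()
            index[str(path.relative_to(src_dir))] = (len(data), len(blob))
            data.extend(blob)
    data.extend(b"\0" * (-len(data) % BLOB_SIZE))  # pad so every blob is the same size
    for i in range(0, len(data), BLOB_SIZE):
        chunk = bytes(data[i:i + BLOB_SIZE])
        (out_dir / f"{i // BLOB_SIZE:08d}.blob").write_bytes(f.encrypt(chunk))
    (out_dir / "00index.blob").write_bytes(f.encrypt(json.dumps(index).encode()))

def read_file(out_dir: Path, key: bytes, name: str) -> bytes:
    f = Fernet(key)
    index = json.loads(f.decrypt((out_dir / "00index.blob").read_bytes()))
    offset, length = index[name]
    first, last = offset // BLOB_SIZE, (offset + length - 1) // BLOB_SIZE
    data = b"".join(  # only fetch the blobs that actually contain the file
        f.decrypt((out_dir / f"{i:08d}.blob").read_bytes())
        for i in range(first, last + 1)
    )
    start = offset - first * BLOB_SIZE
    return data[start:start + length]
```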
 
What about some sort of segmented encryption scheme? Where instead of one big encrypted blob there's a series of blobs of the exact same size, and the decryption utility pulls the folder structure and filenames from the first blob and downloads only the blobs that contain the requested files?
this would indeed be a smart way to do it
you would have to contend with a handful of weird issues though when you get up to inserting and deleting many differently-sized files from all those blocks
it would essentially be like building a small file system
 
this would indeed be a smart way to do it
you would have to contend with a handful of weird issues though when you get up to inserting and deleting many differently-sized files from all those blocks
it would essentially be like building a small file system
You see this pretty frequently in the video game space; doing it for other applications isn't unthinkable.
 
What about some sort of segmented encryption scheme? Where instead of one big encrypted blob there's a series of blobs of the exact same size, and the decryption utility pulls the folder structure and filenames from the first blob and downloads only the blobs that contain the requested files?
I haven't found a popular implementation for it, but that may be better against Goysoft. But what happens when one container gets corrupted or deleted? How much can be recovered?
When one unlocks a "Vault" in Cryptomator, there is the option to decrypt a single file, even if it's outside the "Vault". Recovery is more likely to succeed if only some individual files are corrupted, rather than a whole Container.
In one release where they removed file-padding, citing performance issues, they also mentioned how they want to balance between usability and security.
 
But what happens when one container gets corrupted or deleted?
ideally that would not happen because, you know, the cloud
How much can be recovered?
it will probably clobber that file and several other files, all depending on how the program maps file pieces to blocks

In one release where they removed file-padding, citing performance issues, they also mentioned how they want to balance between usability and security.
you know, storing files individually without names and padding them to 16kb or something would probably be decently secure
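something along these lines, maybe (rough sketch, fernet from python's cryptography lib standing in for real crypto, and the length prefix is just one made-up way to undo the padding):

```python
# sketch: pad each file up to the next 16 KiB boundary before encrypting,
# so the provider only learns a coarse size bucket, not the exact byte count
import struct
from cryptography.fernet import Fernet

PAD_TO = 16 * 1024

def pad_and_encrypt(data: bytes, f: Fernet) -> bytes:
    framed = struct.pack(">Q", len(data)) + data   # remember the real length
    framed += b"\0" * (-len(framed) % PAD_TO)      # pad to a 16 KiB multiple
    return f.encrypt(framed)

def decrypt_and_unpad(blob: bytes, f: Fernet) -> bytes:
    framed = f.decrypt(blob)
    (length,) = struct.unpack(">Q", framed[:8])
    return framed[8:8 + length]
```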
 
I haven't found a popular implementation for it, but that may be better against Goysoft. But what happens when one container gets corrupted or deleted? How much can be recovered?
When one unlocks a "Vault" in Cryptomator, there is the option to decrypt a single file, even if it's outside the "Vault". Recovery is more likely to succeed if only some individual files are corrupted, rather than a whole Container.
In one release where they removed file-padding, citing performance issues, they also mentioned how they want to balance between usability and security.
I think a good solution would be to have each container be a series of folders that fit under a file size limit, with individual files that are over the limit being split. The pool would probably be a folder of containers with randomly generated file names; if a container gets lost or corrupted, then the files or folders in it get marked as damaged, unless you add some sort of RAID-like recovery system. When a file gets updated, the container gets replaced; if the container's contents grow too much, it automatically gets split.
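Something like this for the splitting and damage-flagging part (rough sketch; the container format, names and hash choice are made up, and encryption is left out to keep it short):

```python
# Sketch of "split big files into containers, random names, flag damage".
# A real tool would encrypt each piece and store the manifest encrypted too.
import hashlib
import secrets
from pathlib import Path

LIMIT = 8 * 1024 * 1024  # max bytes per container

def split_into_containers(path: Path, pool: Path) -> dict:
    pool.mkdir(parents=True, exist_ok=True)
    manifest = {"file": path.name, "pieces": []}
    data = path.read_bytes()
    for i in range(0, len(data), LIMIT):   # anything over the limit gets split
        piece = data[i:i + LIMIT]
        name = secrets.token_hex(16)       # randomly generated container name
        (pool / name).write_bytes(piece)
        manifest["pieces"].append({"name": name,
                                   "sha256": hashlib.sha256(piece).hexdigest()})
    return manifest

def reassemble(manifest: dict, pool: Path) -> bytes:
    out = bytearray()
    for piece in manifest["pieces"]:
        blob_path = pool / piece["name"]
        if not blob_path.exists():         # lost container -> mark as damaged
            raise IOError(f"container {piece['name']} is missing")
        blob = blob_path.read_bytes()
        if hashlib.sha256(blob).hexdigest() != piece["sha256"]:
            raise IOError(f"container {piece['name']} is corrupted")
        out.extend(blob)
    return bytes(out)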
 
The amount of effort you battered housewives will expend to make your hubby beat you a little less hard on Fridays instead of just fucking leaving already continues to astound me, no matter how many years pass.
 
ideally that would not happen because, you know, the cloud
More H1-Bs will make the impossible, possible.
storing files individually without names and padding them to 16kb or something would probably be decently secure
That's what would cause more performance issues as more files are stored. Having files split may cause issues with file syncs if not synced properly.
I think a good solution would be to have each container be a series of folders that fit under a file size limit, with individual files that are over the limit being split. The pool would probably be a folder of containers with randomly generated file names; if a container gets lost or corrupted, then the files or folders in it get marked as damaged, unless you add some sort of RAID-like recovery system. When a file gets updated, the container gets replaced; if the container's contents grow too much, it automatically gets split.
And presumably, with more logic required, the performance cost might increase (?)

The amount of effort you battered housewives will expend to make your hubby beat you a little less hard on Fridays instead of just fucking leaving already continues to astound me, no matter how many years pass.
Don't mind us, enjoying Reddit but without the censors. I'm also waiting for Rust to reappear as a topic in this thread.
 
My quick and dirty solution for storing encrypted backups on the cloud is using Disk Destroyer to allocate a file of a given size and format it as a LUKS container. Then I can unlock and mount the file any time I need to store or access important files.
Not the most flexible way to do it, but it requires the fewest external dependencies.
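For reference, the whole dance is roughly this, driven from Python (the file name, size and mount point are placeholders, it needs root, and you should double-check your distro's cryptsetup defaults before trusting it with anything important):

```python
# Rough sketch of the dd + LUKS-container-in-a-file workflow described above.
# Paths, sizes and the mapper name are placeholders; run as root.
import subprocess

IMG = "vault.img"        # the file that gets synced to the cloud
NAME = "cloudvault"      # device-mapper name -> /dev/mapper/cloudvault
MNT = "/mnt/cloudvault"  # mount point, must already exist

def run(*cmd: str) -> None:
    subprocess.run(cmd, check=True)

def create() -> None:
    run("dd", "if=/dev/urandom", f"of={IMG}", "bs=1M", "count=1024")  # 1 GiB container
    run("cryptsetup", "luksFormat", IMG)    # prompts for a passphrase
    run("cryptsetup", "open", IMG, NAME)    # sets up a loop device automatically
    run("mkfs.ext4", f"/dev/mapper/{NAME}")
    run("cryptsetup", "close", NAME)

def mount() -> None:
    run("cryptsetup", "open", IMG, NAME)    # unlock before use
    run("mount", f"/dev/mapper/{NAME}", MNT)

def unmount() -> None:
    run("umount", MNT)
    run("cryptsetup", "close", NAME)        # lock again; vault.img is safe to sync
```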
 
The amount of effort you battered housewives will expend to make your hubby beat you a little less hard on Fridays instead of just fucking leaving already continues to astound me, no matter how many years pass.
windows users in 2025 are The Raped™
More H1-Bs will make the impossible, possible.
trve which is why you should just not bother and put your files on random usb thumb drives instead
Reddit but without the censors
talking about kiwi farms this way will make people mad sometimes because it's inconveniently true in several ways

My quick and dirty solution for storing encrypted backups on the cloud is using Disk Destroyer to allocate a file of a given size and format it as a LUKS container. Then I can unlock and mount the file any time I need to store or access important files.
Not the most flexible way to do it, but it requires the fewest external dependencies.
yeah when you have a decent operating system with loopback devices, the tools are all already there
of course good luck trying to use it from fucking windows (which poses some problems if you're midway through trying to unrape yourself)
 
What about some sort of segmented encryption scheme? Where instead of one big encrypted blob there's a series of blobs of the exact same size, and the decryption utility pulls the folder structure and filenames from the first blob and downloads only the blobs that contain the requested files?
This is exactly how Borgbackup works. But it doesn't support Wangblows lol.
 
That would mean millions of people ditching their jobs or education. Workers, who have no say, aren't happy about migrating to Goyslop365.
They have to choose between getting redeemed and losing their "comfortable" job
At least when I was still going through school, teachers had a computer at work and likely another at home. I think it would be very easy for someone to only use Microsoft for their teaching and keep personal things on another computer. It's not like that would be a hard thing for even a normy to do.

I'm sure if they really, really wanted to, they could even run Linux and access Microsoft 365 just through the web. The best option is to have both a work computer and a personal computer, and keep personal data separate.


On cloud storage solutions.

I will never not see all of the cloud services as a niggercattle solution to storage, short of a cloud server you are running yourself. Storing your data on someone else's computer was a scam to get money from people from the beginning. It still is, but it's sold to people as convenience. I have never used them, and somehow I'm able to access my data just fine. The only time it almost makes sense is for some online service, and even there it's especially a scam to make money (AWS). I think even the "cloud" name exists so people don't think about what it really is: sending all your data to someone else's computer.
 