Open Source Software Community - it's about ethics in Code of Conducts

It's marketed as something to make your computer unhackable and the data onboard impossible to retrieve by unauthorized third parties. In practice, it's just another way for Silicon Valley and Davos to push you further towards owning nothing.
I'm more worried about big tech and da bigga gobmint than I am about some elite haxxor, who is more likely to spear-phish me than anything else.

The safety aspect is true, just not for consumers - it's for them.
 
I think you already know (but anyway) that Lunduke made a 120 IQ decision and paywalled his new YouTube videos. Well, according to Socialblade, he instantly became irrelevant and started to lose subs.

And I thought that Jews are smart...
He does this regularly when he doesn't have anything worthwhile to post.

The Lunduke Cycle : Get a decent story with a good take on it > Complain about a mob (Nazi's, trannies, Big Tech, whatever) targeting him > Needing to lock his shit down to stop all the attacks and risk to his family, but the Patreon is right over there.

There's a reason why he has been doing this shit for over a decade, worked on a magazine, and still has an anemic view count on everything he does compared to his subscriber numbers. He's not a nobody, and he's not dumb; he's just horrifically incompetent at being an internet personality.
 
What's even the point of "trusted computing"?
Conceptually, your computer acting in some capacity as an "independent" agent not designed to be controlled by any party (of course, in practice this means "not controlled by the end user" - wouldn't surprise me at all if current implementers had backdoor access). The original use for this was doing things like storing disk encryption keys and verifying the integrity of the bootloader before allowing them to be fetched. In this way you could have full-disk encryption and still be able to start up your computer without having to enter a password. In theory this was supposed to protect against "evil maid" attacks where the "end user" at the moment is someone who has gained unauthorized physical access to your computer. In practice it caused problems for people trying to run anything that wasn't Windows IIRC. Notably, to my knowledge, it does nothing to defend against software vulnerabilities - running malware can access any encrypted device that has already been unlocked and that it has the necessary permissions to access.
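The "verify the bootloader before releasing the disk keys" flow can be sketched in miniature. This is a toy model in pure Python, no real TPM involved: the extend/seal operations mimic how TPM PCRs accumulate boot measurements, but every name and value here is illustrative.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM-style "extend": new PCR = H(old PCR || H(measurement)).
    # The chain can only be appended to, never rewound.
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def measure_boot(components):
    pcr = bytes(32)  # PCRs start zeroed at power-on
    for blob in components:
        pcr = extend(pcr, blob)
    return pcr

# "Seal" the disk key against the PCR value of a known-good boot chain.
good_chain = [b"firmware v1", b"bootloader v2", b"kernel v5"]
expected_pcr = measure_boot(good_chain)

def unseal(disk_key, boot_chain):
    # A real TPM releases the key only if current PCRs match the sealed policy.
    return disk_key if measure_boot(boot_chain) == expected_pcr else None

key = b"supersecret-disk-key"
assert unseal(key, good_chain) == key   # untampered boot: key released, no password typed
evil_chain = [b"firmware v1", b"evil bootloader", b"kernel v5"]
assert unseal(key, evil_chain) is None  # evil maid swapped the bootloader: no key
```

Note the sketch also shows the limitation mentioned above: once `unseal` has handed the key out, any malware running afterward with sufficient permissions sees the decrypted data just fine.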

More recently the concept is being expanded to things like "hardware attestation", which IIUC is basically a mode of running software in which the computer will verify using built-in cryptographic keys whether only "Authorized Software" is running. Basically a way for the computer to cryptographically verify to third parties that the user isn't in control of the computer as it runs the specified software. Of course, this isn't particularly useful for handling anything sensitive or important, since the user can always just open up their computer and dump the memory, eavesdrop on the system buses, etc, so my understanding is it's going to be used for things like anticheat and verifying that the end user is indeed seeing the intended advertisements.
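The attestation handshake described above can also be sketched as a toy model. Real schemes use asymmetric keys burned into the chip; an HMAC with a shared key stands in here so the sketch is self-contained, and all the state names are made up.

```python
import hmac, hashlib, secrets

DEVICE_KEY = secrets.token_bytes(32)  # pretend this never leaves the chip

def quote(nonce: bytes, pcr: bytes) -> bytes:
    # Device signs (nonce || measured state); the fresh nonce stops
    # replaying a quote captured from an earlier, unmodified boot.
    return hmac.new(DEVICE_KEY, nonce + pcr, hashlib.sha256).digest()

def verify(nonce: bytes, claimed_pcr: bytes, sig: bytes, approved: set) -> bool:
    # The third party checks the signature AND that the reported state
    # is on its "Authorized Software" allowlist.
    expected = hmac.new(DEVICE_KEY, nonce + claimed_pcr, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected) and claimed_pcr in approved

approved_states = {b"official-game-build"}
nonce = secrets.token_bytes(16)
assert verify(nonce, b"official-game-build",
              quote(nonce, b"official-game-build"), approved_states)
assert not verify(nonce, b"patched-build",
                  quote(nonce, b"patched-build"), approved_states)
```

The allowlist check is the whole point: the device can prove it is running exactly the software the third party approves of, which is useful to them and not to you.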

The short answer is that right now there isn't much legitimate point except maybe to save typing in a password on startup. The idea doesn't map very well to general-purpose computers for personal use, since it's pretty much never in the user's interest to not control their own computing.

To give an example of a legitimate-in-theory use, suppose you have a computer that does nothing but accept data and sign it with the current time, confirming that "I have seen this no later than <time>". This is a real type of server that gets used all the time as part of the Time-Stamp Protocol (RFC 3161). All it needs is a clock, a key, and a network interface. If a general-purpose computer does this, the signature is only worth as much as the integrity of its operator, its physical security, etc. If a special "trusted computing" computer is used, it's also backed by the guarantee that forging a timestamp would require physically compromising the computer at a very low level. How difficult that is would depend on the measures taken by the computer's designer to protect the hardware keys and the clock. In general, the more special-purpose the computer, the more measures can be taken, which is why for general-purpose consumer computers the protection isn't nearly as strong as it could be.
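The timestamp box above really is just clock + key + signature. A toy version, with an HMAC standing in for the real asymmetric signature a production TSA would use (the key name is made up):

```python
import hashlib, hmac, time

TSA_KEY = b"key-that-never-leaves-the-trusted-box"  # illustrative

def timestamp(data, now=None):
    # "I have seen this no later than <time>": bind hash(data) to a time.
    t = int(time.time() if now is None else now)
    h = hashlib.sha256(data).hexdigest()
    sig = hmac.new(TSA_KEY, f"{h}|{t}".encode(), hashlib.sha256).hexdigest()
    return {"hash": h, "time": t, "sig": sig}

def check(data, token):
    h = hashlib.sha256(data).hexdigest()
    expected = hmac.new(TSA_KEY, f"{h}|{token['time']}".encode(),
                        hashlib.sha256).hexdigest()
    return token["hash"] == h and hmac.compare_digest(token["sig"], expected)

tok = timestamp(b"contract.pdf contents", now=1_700_000_000)
assert check(b"contract.pdf contents", tok)   # genuine token verifies
tok["time"] -= 3600                           # try to backdate it
assert not check(b"contract.pdf contents", tok)  # signature no longer matches
```

Forging a backdated token means getting at `TSA_KEY` or the clock, which is exactly what the tamper-resistant hardware is supposed to make expensive.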
 
And again, creating a falsified creation record requires that every device upstream of it is compromised. If a single person with an independent, uncompromised device shows up to record the creation of this device - which should be a public event - it can't be compromised without it being made known.
No, you only need a false creation record if deception and tampering are impossible, because then compromise must happen at creation time and nobody can lie about it. Otherwise, you just convince someone to vouch for you in error or compromise your device at a later point. The "impossible" part is vital for your scheme, you can't weaken it to "kinda hard" later and reuse the old results.
You don't need to have perfect accuracy - deleting the key material on false positives is always an option.
This renders the device unusable because it can no longer generate trustworthy keys after a tampering alert. False positives will be common too because a false negative spells doom for everything that actually relies on the integrity of this trusted computing scheme. It's expensive, frail and requires moon technologies left as an exercise for the reader - hence nerd wank.
You proceeded to go on about how akshyually trusted computing is always bad, because if you put a proprietary black box (a "mandatory mystery chip") in consumer hardware (a "general purpose computer"), it can be used to exert control over the user (so the user "doesn't control it anymore"). To cap it all off, you conclude by directly comparing trusted computing to Microsoft.

Should I be exasperated by you immediately proving me correct, or reassured?
Well excuse me for not being too excited about technology whose existing instances are exclusively about enabling glownigger and copyrast garbage, and whose legit applications amount to a few extra days of time in case of physical compromise. Have you considered thermite instead?
 
the ability to lock down your system to only run kernels signed by your key that you have hidden behind full disk encryption is nice
of course corponiggers are going to be dicks so they want to sell you shitty hardware that only runs stuff they sign
a good solution is to never touch that garbage and buy shit that isn't fucking retarded

if we went down the darkest timeline and all desktop pcs after 2026 were only allowed jeetdows or approved versions of some shitty debian derivative, that would be the most horrible shit ever
on the other hand, this is far from the only way the hardware you buy fucks you over. if you want to avoid all forms of bullshit you would need to pick up glassblowing and build a computer yourself out of handmade vacuum tubes
 
where the "end user" at the moment is someone who has gained unauthorized physical access to your computer.
If that happens you've got much bigger problems.
the ability to lock down your system to only run kernels signed by your key that you have hidden behind full disk encryption
Again, what is the purpose?
I see this a lot in the fawss space - insane security measures for a home PC. What's the point? What's the point of having disk encryption on my home PC? It's not like it protects against malware, since the disk is open while using the PC, so it only protects your sensitive data if someone gained unauthorized access to your physical disk. But at that point, I reiterate, you have much bigger problems than your disk.
 
Again, what is the purpose?
I see this a lot in the fawss space - insane security measures for a home PC. What's the point? What's the point of having disk encryption on my home PC? It's not like it protects against malware, since the disk is open while using the PC, so it only protects your sensitive data if someone gained unauthorized access to your physical disk. But at that point, I reiterate, you have much bigger problems than your disk.
just having the option is not a bad thing
also people sometimes have laptops that they lug around in public and shit
somebody swiping one of these computers for a second and installing a rootkit for a deep and powerful compromise is not a terribly far fetched idea, and people having the tools to defend against it isn't bad
you are correct in that there are absolutely bigger problems when you can't maintain physical security against malicious actors, but at least an evil-maid-proofed pc can help soften the blow of certain shit the malicious actor could do

like all things security, these things depend on your threat model. many people are fine with regular efi loading whatever from an unencrypted disk but you should also be able to have the trusted booting and disk encryption if you think it would protect you somehow and don't mind the extra effort setting it up
 
just having the option is not a bad thing
This is the crux of the matter that shills itt and faggots fail to see. Trusted computing is a venue for castrating your freedom even more. It is not enough that 95% of all laptops in use are infested with the glowie silicone-level spyware that is the Intel Management Engine / AMD Secure Processor, now they want to add hardware attestation that will 1000000% be used to garden wall everyone even more. FUCK your glownigger restrictions and FUCK your "USE CASE? DURRR WHAT IS THE USECASE FOR SECURITY & PRIVACY!?!?!?????? NO ONE CARES BROOOO HURR UR NOT IMPORTANT ENOUGH FOR THE GLOWIES DURRR" literal nigger monkey 40 iq take. Everyone should have absolute sovereignty over their system the moment they purchase it, and they should be able to modify it as they see fit from the silicone up. If I want to FDE everything, I should have the right to do so. If I want to install Linux for Niggers, I should be able. If I want to flash custom firmware, I should absolutely have the freedom to. The ONLY restrictions to such things should be the user's own skill and will. Imagine being such a cucked faggot that you actively shill for increased enslavement. What the fuck ever happened to "zero trust"?
 
No, you only need a false creation record if deception and tampering are impossible, because then compromise must happen at creation time and nobody can lie about it. Otherwise, you just convince someone to vouch for you in error or compromise your device at a later point. The "impossible" part is vital for your scheme, you can't weaken it to "kinda hard" later and reuse the old results.
Perhaps you should expound somewhat on how exactly you're defining "impossible" here. Certainly physical impossibility isn't it: a sufficiently fast and accurate sensor could penetrate the material of the device at the speed of light and land at the key storage faster than any signal to delete could propagate, but that doesn't seem like a particularly reasonable scenario. By comparison, all asymmetric cryptography can, by design, be broken with sufficient time (there is only a computational barrier, not a shortage of information), and likewise for any cryptographic hash function collisions necessarily exist. Both of these can in theory be found instantly if the attacker simply "gets lucky". How unlikely or slow does success need to be before it's "impossible"? Given the physical nature of attempts, it seems like the usual "1 in 2^128" rate for information security might be a bit excessive? Would "1 in 2^32" be reasonable, or too common?
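To put rough numbers on the "how unlikely before it's impossible" question: assume (my assumption, purely for illustration) that mounting one physical tampering attempt takes about a second, and each attempt independently succeeds with probability p. Unlike software brute force at billions of guesses per second, physical attempts are slow, which changes where the "impossible" threshold sits.

```python
# Expected time until the attacker's first success, for a per-attempt
# success probability p, at a given physical attempt rate (assumed).
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_expected_success(p, attempts_per_second=1.0):
    # Geometric distribution: expected number of attempts until success = 1/p.
    return (1.0 / p) / attempts_per_second / SECONDS_PER_YEAR

print(years_to_expected_success(2**-32))   # ~136 years: "kinda hard"
print(years_to_expected_success(2**-128))  # ~1.1e31 years: practically "impossible"
```

So 1 in 2^32 per physical attempt buys decades, not the heat death of the universe; whether that counts as "impossible" depends entirely on how long the record needs to stay trustworthy.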

I'm legitimately uncertain how far you could go in securing these devices, and I'll admit that drives some innate nerd curiosity. But just like with cryptography, it's easy to build something you can't break - building something others can't is another matter entirely, and to my thinking necessarily requires a back-and-forth of attack, review, and improvement.

This renders the device unusable because it can no longer generate trustworthy keys after a tampering alert. False positives will be common too because a false negative spells doom for everything that actually relies on the integrity of this trusted computing scheme. It's expensive, frail and requires moon technologies left as an exercise for the reader - hence nerd wank.
It does indeed render the device unusable. How common false positives are depends on how the user treats it and what conditions need to be treated as failsafe-worthy. I imagine devices used in the field would have an elevated rate of false positives, but fixed deployments should have pretty consistent conditions.

No argument on the expensive and - to an extent - frail points. I don't yet know of any moon technologies it requires, just careful application of existing technologies.

Well excuse me for not being too excited
It's okay, I really can't complain when I started out by saying I should've seen it coming.
 
Explains why games like Battlefield 6 are enforcing Secure Boot, so those unsigned cheater drivers can't be loaded.
I have said this before and will say it again: Secure boot is a scam to make your computer more like a phone so they can control everything.

Don't like Google SafetyNet handing your hardware-backed UUID to every app? Sorry, no banking for you! (Or food, for that matter; I remember reading the mcdicks app checked SafetyNet at some point. The botnet wants you to starve.) And because you need to use an approved ROM, you get all the ads/(p)re-installed slop/etc. Samsung thinks you will tolerate.

This state of affairs where you only run corpo-approved software that can reliably ID and force ads (and other manipulation) down your throat is great for google/samsung/apple/etc. and the government. It is why they fought bootloader unlocking so hard even though a teeny tiny percentage of people run custom roms. And you best believe if you couldn't unlock your bootloader so that possibility of custom roms wasn't there it would be WAY worse.

But the corpos and the government are not happy with PCs as they are. They would prefer they be more like phones. I can, at this very moment, run free software, connect to TOR, and post mean words on the internet to an obscure fruit site without being identified. This is unacceptable (see: the UK) to the powers that be. So what is the solution to this problem? Secure boot.

Much like any attack on people's rights, this will be started with ostensibly good intentions that supposedly target a disliked group. We must think of the children and let the government spy on everything so they can catch pedos. If only we were to accept secure boot, then there would be no more cheaters. Attaching it as a requirement to Battlefield slop is just step one to try and get the slightly-more-technical normies acquainted/comfortable with it.

Of course, much in the same way the government doesn't use their surveillance to catch pedos, secure boot will not actually help with cheaters:
Battlefield 6 already has a huge cheating problem and it hasn't even launched yet.

The end goal is hardware attestation to access the internet. They will say they're not killing free software because ubuntu will be allowed. Your choices will be Apple, Microsoft Saars, or Jeremy Bitcha and everything you do on your PC monitored with some law that makes the recent UK online safety bill look great in comparison. If you're buying these games or making software that uses secure boot you're stupid, evil, or both.
 
What's the point of having disk encryption on my home PC?
I use it all the time, because at least on Linux it is super easy to setup.
There is always the risk that your devices get stolen or end up where they should not be, or that you yourself do something you shouldn't.

We have all seen the channels where people buy second-hand computers on Craigslist and similar sites and then go on to laugh at all the files with personal information they find on the disks.

Encrypted disks, especially when they are super easy to set up, prevent that from happening if you dispose of a disk incorrectly or it gets stolen.


I even do this all the time for USB sticks. "cryptsetup luksFormat" is all it takes, and when I want to erase the device, it is just a matter of forgetting the passphrase or running cryptsetup again.
I do this every time I use USB memory sticks; doing it every time prevents you from forgetting to do it on the rare occasions where it actually matters.
 
just having the option is not a bad thing
also people sometimes have laptops that they lug around in public and shit
somebody swiping one of these computers for a second and installing a rootkit for a deep and powerful compromise is not a terribly far fetched idea, and people having the tools to defend against it isn't bad
you are correct in that there are absolutely bigger problems when you can't maintain physical security against malicious actors, but at least an evil-maid-proofed pc can help soften the blow of certain shit the malicious actor could do
You're a fucking retard and the "evil maid attack" is retard bait for bike-shedding retards such as yourself. We get it, you can understand "bad person touch computer" (and not a whole lot else). Bully for you.

Unfortunately there is a much easier attack for someone with physical access that completely defeats secure boot: a malicious USB cable. Not only does it not require installing any software at all (which would have to be customized to your target), making it much faster, but it also has tons of additional useful features like wifi access, key stealing, input injection, etc.
 
Everyone should have absolute sovereignty over their system the moment they purchase it, and they should be able to modify it as they see fit from the silicone up. If I want to FDE everything, I should have the right to do so. If I want to install Linux for Niggers, I should be able. If I want to flash custom firmware, I should absolutely have the freedom to. The ONLY restrictions to such things should be the user's own skill and will. Imagine being such a cucked faggot that you actively shill for increased enslavement. What the fuck ever happened to "zero trust"?
yes i agree with this
this is why computers you buy should come with all the trusted computing shit you want, just disabled by default
when you want to lock down your computer you program the otp memory deep in your processor and motherboard to only load a specific signed kernel
Unfortunately there is a much easier attack for someone with physical access that completely defeats secure boot: a malicious USB cable. Not only does it not require installing any software at all (which would have to be customized to your target), making it much faster, but it also has tons of additional useful features like wifi access, key stealing, input injection, etc.
"bro protecting yourself against evil maid is worthless because <other attack>"
it is crucial for computer users to use multiple layers of security to protect themselves against various attacks, and it has always been that way
i'd imagine if you were serious about this specific class of attacks you would be fairly cautious about unlocking your drive if you notice that your computer has strange new hardware attached to it
 
Crossposting. Android is of course not free software, but I believe this is relevant to this thread.
 
Just don't run random programs LOL
Generally valid advice for end users, but not so much for organizations that have to worry about direct attacks from corporate- and state-level threat actors trying to conduct espionage or straight sabotage (although for them the biggest threat vector is STILL Karen McRunseveryattachment over in HR getting spearphished).
 