Brianna Wu / John Walker Flynt - "Biggest Victim of Gamergate," Failed Game Developer, Failed Congressional Candidate

Also, wasn't he somehow busted sending himself threats?
He posted "harassing" himself on the Steam forum for R60 and forgot to change accounts.

He also claimed to have received threats and complained that the sheriff wouldn't do anything and the sheriff publicly confirmed John had never even contacted them.

I don't think he's ever been outright caught sending himself threats, although for some reason all the threats he claimed to have received had an oddly familiar writing style.
 
You should go on air with your throbbing cock hanging out.

 
John will surely be discarding all of his Apple products if this plan goes ahead as stated.



Apple wants to protect children online. But could there be unintended consequences?

Apple is taking steps to combat child sex abuse materials (CSAM), including implementing technology that will detect known CSAM uploaded to iCloud; an iMessage feature that will alert parents if their child sends or receives an image with nudity; and a block if someone tries to search for CSAM-related terms on Siri or Search. The changes, which Apple says will be released in the US later this year, were first leaked via a tweet thread by a Johns Hopkins University cryptography professor who heard about them from a colleague. Apple has since confirmed the reports.

The scanning technology Apple is implementing — called NeuralHash — doesn’t compare images the way human eyes do. Instead, it creates a string of numbers and letters — called a “hash” — for each image and then checks it against its database of known CSAM, according to reporting by TechCrunch. And because editing an image typically changes its hash, Apple has added additional layers of scanning called “threshold secret sharing” so that “visually similar” images are also detected. The technology, Avast Global Head of Security Jeff Williams says, sounds similar to the Photo DNA project that was developed during his time at Microsoft.

“With threshold secret sharing, there’s some kind of scoring in place for an image,” Williams says. “Say two images are completely identical — that’s a 100 percent match. But if someone alters it — say they change it by cropping — then maybe it’s only a 70 percent match. Apple is saying that if it’s above a certain percentage score, they’ll move on to the next review phase.”
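A rough sketch of the kind of threshold scoring Williams describes, in Python. To be clear, this is not Apple’s actual NeuralHash (a neural perceptual hash whose internals aren’t public): a simple “average hash” stands in for the perceptual hash, a bit-similarity score stands in for Apple’s matching, and the 0.90 cutoff is an invented value purely for illustration.

```python
# Toy illustration of threshold-based image-hash matching.
# NOT Apple's NeuralHash: an "average hash" and a Hamming-similarity
# cutoff stand in for the real (unpublished) scoring mechanism.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints) into a bit
    string: each bit records whether a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def similarity(hash_a, hash_b):
    """Fraction of matching bits between two equal-length hashes."""
    matches = sum(a == b for a, b in zip(hash_a, hash_b))
    return matches / len(hash_a)

THRESHOLD = 0.90  # hypothetical cutoff for "visually similar"

def flag_for_review(upload_hash, known_hashes, threshold=THRESHOLD):
    """True if the uploaded image's hash scores above the threshold
    against any hash in the known database."""
    return any(similarity(upload_hash, k) >= threshold for k in known_hashes)
```

An identical image scores 1.0 against its own hash; a crop or re-encode shifts some bits and lowers the score, which is why a threshold rather than exact equality is used.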

The potential unintended consequences of NeuralHash

NeuralHash processes the data on the user’s device, before it’s uploaded to iCloud. That method, Avast Chief Privacy Officer Shane McNamee says, is the right one.

“If you are going to check something on someone’s device, a really good way of doing that is to not pull that data off their phone and onto your servers,” McNamee says. “It’s really minimizing the data you’re sending. So, from a privacy perspective, that’s technically a very pro-privacy way to go.”

While combating CSAM is extremely important, privacy and security experts are concerned about the possible unintended consequences of this technology. McNamee questions whether companies should scan people’s devices at all.

“Now that this is possible to have access, authorities will push for more access,” he says. “It’s like we’re peeking over your shoulder, but we’re wearing sunglasses and saying the sunglasses can only see bad things. And you have this little snooper on the device that’s just reading everything and checking it, not sending it to Apple unless you’re doing something wrong. That’s the problem — the definition of ‘doing something wrong’ could be broadened.”

Brianna Wu — a computer programmer, video game creator, online advocate, and Executive Director of Rebellion PAC who describes herself as “an Apple fan” — points out that the US government could theoretically create legislation giving it permission to use this technology without the general public ever knowing. There are “far less checks and balances” on behind-the-scenes deals made between the US government and tech companies in the name of national security than the general public may believe.

“This would allow agencies to spy on our phones to find, say, pictures that the Pentagon says compromise national security or belong to terrorists,” Wu tells Avast. “And if you look at the specifics of Edward Snowden’s revelations, it’s clear that our national security agencies may stick to certain rules in the US, but outside there are no rules at all. I feel very confident this technology could be used to spy on people in other countries.”

According to Williams, the same concern was raised when Photo DNA was publicly licensed in 2014.

“As soon as Microsoft brought it out for CSAM, they heard ‘Why not use it for terrorism?’” he says. “Pragmatically, there’s no reason why they couldn’t. It’s a neutral technology that doesn’t require the image to be CSAM-related in order to work.”

However, Williams also points out that NeuralHash could actually weaken the push for encryption backdoors, a common request from law enforcement in recent years. By taking both the CSAM and the process of detecting it off the cloud, Williams says, Apple has set itself up to build a platform that is end-to-end encrypted, “removing law enforcement’s excuse to go in [our devices].”

“I commend Apple for taking steps in this regard, both to protect individual privacy and to support law enforcement in a manner that does not require backdoor encryption,” he adds.

Stalkerware and danger to LGBTQIA+ kids

The second big change is that Apple will allow parents to implement a program on their children’s iMessages that would blur any images with nudity. It will also alert parents if the child chooses to view the image or send nude images themselves. While Wu says she “can live with the iCloud part” of these new changes, she feels that the scanning messages part leads down “a deeply Orwellian road” and she “would beg Apple to reconsider.”

“The thought that any time you transmit a nude photo, your parents might be alerted to that? Obviously there are good use cases, like if a predator is involved,” Wu says. “But I can’t help thinking of the kid who’s going to be busted starting a romance with a school mate.”

Wu points to the fact that the majority of US teens are sexually active before the age of 18 — and that “sexting” is not uncommon among teenagers. This technology, then, potentially infringes on teens’ right to sexual autonomy. It could also open the children up to charges of distributing child pornography if a parent reports the image — or the parents themselves, if they share it with the other parents involved.

But even more concerning to Wu is the possibility that this technology could “out” LGBTQIA+ kids to their parents, potentially placing them in both psychological and physical danger. Drawing on her own experience as a queer kid in Mississippi in the ‘90s, she worries that “Silicon Valley forgets what the rest of America is like.”

“When I was trying to figure out that I was queer, I would actually take my bike and go to the University of Southern Mississippi library just to read books and find more information,” Wu says. “I needed that space to figure out who I was. And, thank God, there are laws for libraries that mean it’s very hard for parents to find that information. This is taking away the privacy a child might have had 20 to 30 years ago in a library.”

She also points out that this tool could theoretically be used as stalkerware in intimate partner violence, especially against less tech-savvy people who might not recognize that a child profile had been set up on their phone, or know how to remove it.

“I love my husband dearly, but he knows nothing about technology,” Wu says. “I could install this on his phone tomorrow and he’d have no idea.”

“Or let’s say a woman wants a divorce from her husband," she continues. "He grabs her phone, activates this, sees she’s sexting a new partner, and uses the images against her in revenge porn. I can promise you that will happen with this technology.”

Apple has hung their hat on the consumer right to privacy — and they’ve made significant moves to prove their commitment, from giving users the right to opt out of tracking to letting users hide their email addresses. But while all three experts consulted here agree that NeuralHash appears to protect user privacy to the best of Apple’s technological ability, the potential unintended consequences of the iMessage alerts likely outweigh the potential benefits.

“Name a product from Apple, I’ve got it,” Wu says. “I’m all-in on the Apple ecosystem because of privacy. I root for them to succeed, but this is by far the worst plan I’ve seen them put into effect. I think they’re going down a wrong path and it’s extremely concerning."
 
"She also points out that this tool could theoretically be used as stalkerware in intimate partner violence" - Wu lifted that directly from another article.

Plus "In order to receive the warnings about sexually explicit images on their children’s devices, parents will have to enroll their child’s phone. Kids over 13 can unenroll, meaning parents of teenagers won’t get notifications."
Why kids must be able to send nude pictures of themselves to strangers is a mystery. Why boys must be able to send dicks pics to grown men is not as much of a mystery.

The elephant in the room that Wu would never bring up.
“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’ ” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”
 

Yeah, that's where this fucks us all over. If someone were to make a law that all computing devices had to scan for anti-government activity, then it'd be a no-brainer for every tech company to just pull out of that market. But if they've already built and deployed the technology, then they're already on the ride and there's no getting off.

But sure, trannies most affected.
 
Just FUCK OFF you dang dirty MEN, I fucking hate the penis grrrrrrrrr

Hehe, he's STILL butthurt about getting corrected on the technicals

John is using the 'oh, this is for the non-engineer plebbies' excuse he's used before


2021-08-07 06_15_32-Window.png

[The attachment is about John confusing location data with contact data.] I notice the guy calling John out now has protected tweets... could be anything, but maybe the Wu crew went after him -- yeah John, people can't understand the difference between their location and their contact list, only 'engineers' understand the difference between where you are and who you know

In his panic to save face, I wonder if John gets that HE is the one deciding the audience is too stupid for accurate information (and funny that John never PRE-announces he's using analogies or sweeping generalizations)
 
I'm with Wu here. Despite all the claims that it is wevy wevy restricted and accurate, this is a giant surveillance hole (not that Apple does not glow already anyway). Next it will be used to check for terrorist propaganda, then for revenge porn, then for dangerous assault hate memes, then for remaining political opponents. That is how shit always goes. Miguel de Icaza, despite his programming cred, is a known big tech bootlicker who turned GNOME into complete shit and smuggled Microsoft's .NET into Linux.
 
So you'll definitely be closing your social media accounts ASAP then, of course.

I know he’s using the gun thing as a metaphor here, but… they sell automatic weapons to people because the demand exists and they want to make money, John. That’s definitely not the “better question” here. It’s a very easy question.
 