Brianna Wu / John Walker Flynt - "Biggest Victim of Gamergate," Failed Game Developer, Failed Congressional Candidate

You spent Thanksgiving on the couch eating microwave popcorn, John.

polio.jpg


It could be worse, at least he's not an unemployed idiot constantly bragging about his sports car online.

car.jpg
 
So John's saying this woman got polio in 1968? There was widespread vaccination by then. Oh, and paralysis only occurs in 1 out of every 1,000 cases. Also, in 1968 there were all of 57 cases in North America. In fact, between 1961 and 1968 there were only 773 cases in the US. You've got a better chance of getting hit by lightning than meeting someone who had polio.

Lie better, John.
 

This was probably one of the millions of Mississippi trannies John knew who all committed suicide before dying in Iraq from having conversion therapy forced on them by their bigoted parents.
 
I had a nightmare last night. John was disguised as Santa and Frank was disguised as his Chinese helper. They came knocking on doors in the middle of the night to hand out flyers that said "I hate Christmas, vote Brianna if you do too." John looked like the Cringe in my nightmare. He was green and slimy and all he wanted for Christmas was to be a real woman with a real stinkditch.
 
That brings up an interesting, real conundrum, doesn't it?


"They said that one way to solve this problem is to do a better job of teaching social sciences such as ethics and gender studies to computer science students."

Facial Recognition Software Regularly Misgenders Trans People

Human-computer interfaces are almost never built with transgender people in mind, and they continue to reinforce existing biases.

Facial recognition software is a billion dollar industry, with Microsoft, Apple, Amazon, and Facebook developing systems, some of which have been sold to governments and private companies. Those systems are a nightmare for various reasons—some systems have, for example, been shown to misidentify black people in criminal databases while others have been unable to see black faces at all.

The problems can be severe for transgender and nonbinary people because most facial recognition software is programmed to sort people into two groups—male or female. Because these systems aren’t designed with transgender and gender nonconforming people in mind, something as common as catching a flight can become a complicated nightmare. It’s a problem that will only get worse as the TSA moves to a full biometric system at all airports and facial recognition technology spreads.
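Concretely, the two-bin design the article is describing amounts to a classifier head with exactly two outputs. The sketch below is hypothetical (invented weights, dimensions, and names, not any vendor's actual pipeline), but it shows the structural point: argmax over two classes means every face gets forced into one bin or the other, with no way to abstain or answer anything else.

```python
# Hypothetical two-class AGR head; weights and sizes are invented for illustration.
import numpy as np

LABELS = ["male", "female"]            # the only categories the model knows
rng = np.random.default_rng(0)
W = rng.normal(size=(128, 2))          # stand-in for trained classifier weights

def classify_gender(face_embedding: np.ndarray) -> str:
    """Map any 128-d face embedding onto one of exactly two labels."""
    logits = face_embedding @ W                 # linear head over the embedding
    return LABELS[int(np.argmax(logits))]       # argmax: every input lands in a bin

# Even an ambiguous or out-of-distribution input still gets a binary answer:
print(classify_gender(rng.normal(size=128)))    # "male" or "female", never "neither"
```

The failure mode the article describes lives in that final argmax: there is no output the system can produce that corresponds to "neither" or "decline to say."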


These biases programmed into facial recognition software mean that transgender and gender nonconforming people may not be able to use facial recognition advancements that are at least nominally intended to make people's lives easier, and, perhaps more importantly, may be unfairly targeted, discriminated against, misgendered, or otherwise misidentified by the creeping surveillance state.

Os Keyes, a “genderfucky nightmare goth PhD student,” studies the intersection of human-computer interaction and social science at the University of Washington’s Department of Human Centered Design & Engineering. To find out why automatic gender recognition (AGR) is so ubiquitous, Keyes looked at the past 30 years of facial recognition research.

They studied 58 separate research papers to see how those researchers handled gender. It wasn’t good. Keyes found that researchers followed a binary model of gender more than 90 percent of the time, viewed gender as immutable more than 70 percent of the time, and—in research focused specifically on gender—viewed it as a purely physiological construct more than 80 percent of the time.

“Such a model fundamentally erases transgender people, excluding their concerns, needs and existences from both design and research,” Keyes wrote in The Misgendering Machines, a research paper they published in November. “The consequence has been a tremendous underrepresentation of transgender people in the literature, recreating discrimination found in the wider world. AGR research fundamentally ignores the existence of transgender people, with dangerous results.”


“I couldn't help but be personally, as well as professionally annoyed by the approach that the field took to gender—of assuming these two very monolithic and universal categories of gendered experience,” Keyes told me over the phone. “Pretty much every paper I read did it.”

"We’re talking about the extension of trans erasure"

The bias against trans and nonbinary people was everywhere, from research to suggested applications of the technology. It seemed hardcoded. A 2015 research paper on AGR from the National Institute of Standards and Technology (NIST), the oldest federally funded science lab in America, suggested people could use facial recognition software to sound an alarm around women’s bathrooms if men got too close. “An operator may be alerted when a male is detected in view,” the paper suggested.
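Taken literally, the application the NIST report sketches is nothing more than an alert hooked onto that binary label. A hypothetical illustration (function and callback names invented here, no real system quoted):

```python
# Hypothetical sketch of the "alert an operator when a male is detected" use case
# the NIST report describes; names and wiring are invented for illustration.
from typing import Callable

def watch_restroom_camera(predicted_gender: str,
                          alert_operator: Callable[[str], None]) -> None:
    # The entire "access control" decision reduces to one of two labels,
    # so anyone the model mislabels trips the alarm.
    if predicted_gender == "male":
        alert_operator("A male is detected in view")

# Usage: feed in whatever the two-class classifier emitted for the current frame.
watch_restroom_camera("male", alert_operator=print)
```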

“Precisely why this technology is necessary for bathroom access control is not clear: most AGR papers do not dedicate any time to discussing the purported problem this technology is a solution to,” Keyes wrote in their paper. “The only clue comes from the NIST report which states that: ‘the cost of falsely classifying a male as a female...could result in allowing suspicious or threatening activity to be conducted,’ a statement disturbingly similar to the claims and justifications made by advocates of anti-trans ‘bathroom bills.’”

Problems and prejudices like that cropped up again and again in Keyes’ research. “Three of [the research papers] focused on trans people. Zero of them focused on non-binary trans people, in the entire 30 year history of the field,” Keyes told me.


Machines aren’t value neutral; they act as they’re programmed. “We’re talking about the extension of trans erasure,” Keyes said. “That has immediate consequences. The more stuff you build a particular way of thinking into, the harder it is to unpick that way of thinking.”

Technology is a feedback loop—the values we build into our machines are then taught to anyone who uses them. “So when we build a particular set of values into new spaces and new systems, not only are we making them exclusive spaces and systems and making it harder to have a world that is more inclusive overall, we're also communicating to people who try and enter—‘this is how gender works, these are the categories that you can live in, this is how your gender is determined,’” Keyes explained. “Any conflict or dissonance you have with that is your problem because this is a faceless machine.”

As facial recognition technology spreads, problems will arise for anyone who doesn’t fit the “norm” the technology was designed to recognize. This is already a problem. In 2018, MIT researchers Joy Buolamwini and Timnit Gebru published research showing that the AI behind facial recognition software was overwhelmingly trained on white faces, which led to an increased number of false positives for any other shade of skin. “A false positive carries different weights, depending on who the subject is,” Keyes explained. When traditionally marginalized groups interact with law enforcement, there’s a disproportionate chance they’ll end up dead, hurt, or in jail.
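The disparity Buolamwini and Gebru documented only becomes visible when error rates are broken out per group instead of reported as one aggregate number. A toy sketch of that kind of breakdown, using invented records rather than their actual benchmark data:

```python
# Toy per-group false positive audit; the records below are invented to show
# the calculation, not real benchmark results.
from collections import defaultdict

# (group, ground_truth_is_match, system_said_match)
records = [
    ("group_A", False, False), ("group_A", False, False), ("group_A", False, True),
    ("group_B", False, True),  ("group_B", False, True),  ("group_B", False, False),
]

false_positives = defaultdict(int)
non_matches = defaultdict(int)
for group, is_match, said_match in records:
    if not is_match:                  # only non-matching pairs can be false positives
        non_matches[group] += 1
        if said_match:
            false_positives[group] += 1

for group in sorted(non_matches):
    rate = false_positives[group] / non_matches[group]
    print(f"{group}: false positive rate {rate:.0%}")
# A single aggregate accuracy figure would hide that group_B is wrongly flagged
# twice as often as group_A in this made-up sample.
```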


Keyes doesn’t see a need for any kind of AGR at all.

“Technologies need to be contextual and need-driven,” they said. “What are the values of the people who use the space that you're deploying a technology in? Do the people in that space actually need it? If we're not discussing gender at all, or race at all...it doesn't necessarily lead to a better world.”

They said that one way to solve this problem is to do a better job of teaching social sciences such as ethics and gender studies to computer science students. The more inherent biases are studied, the easier they are to avoid when designing new technologies. “The average [computer science] student is never going to take a gender studies class,” Keyes said. “They're probably not even going to take an ethics class. It'd be good if they did.”

Facial recognition software needs to allow for self-ID: just sweep your hand over the camera and say, "I'm not the dindu you're looking for."
 
That's a strange way of saying even a mindless algorithm can tell troons are men.

A new Chappie movie or a Short Circuit reboot where the bulk of the movie is about teaching the AI the differences between heavy metal fans and "women", glam metal fans and "women", how Grace Jones and Annie Lennox are actually women despite the short hair, David Bowie, but there's also punk chicks with weird hair colors and...
 

Algorithms mostly work on skull shape, though. A troon is going to have a male skull. They're better at recognizing it from an actual skull but you usually can't just scalp someone you're trying to identify.
 

It's an uphill battle to accurately distinguish a man dressed like his idea of a whore from '80s Dee Snider.
 

No matter what, though, an AI is never, ever going to detect that some insane troon, looking exactly like he did ten minutes ago, declared himself to be a woman ten minutes ago.

He's still objectively a man and the algorithm will detect that no matter how loudly he screeches.
 

Agreed, and that's why comp-sci students taking gender studies classes won't make algorithms woke. How would that even work? A pink pixie haircut flags a man as a woman, and a hairstyle overrides the rest of the facial recognition system? That was my original point.

edit: the obvious solution is to connect all kinds of facial recognition in all public places to a database that can check people's gender identity, so they can be correctly identified and tagged with it and [pizza chain] knows who looks at their storefront. That it enables wholesale surveillance of absolutely everyone (so no one is misgendered) is just a fluke of the system.
 

They should combine the algorithms so they can detect someone's real gender and whether they're a troon, so it can then misgender them and mock them and make fun of them for being a tranny, and make attack helicopter jokes, and post a Kiwi Farms thread about them with their dox and most embarrassing social media posts.
 

Nah, just constantly check everyone against a central database and make it mandatory to do so in the EU and its adjacent territories. For marketing purposes.
 
I think we should just go old school with armbands. Worked well for the Nazis. It will be easier to round up the terves in re-education camps that way.

It goes badly with the new school of thought giving people armbands and crayons and letting them make up their own group with their own set of super-specialties. Labels are in fashion; being unlabeled is a sign of fascism.
 
Oh, it doesn't matter what symbol they draw. The armband itself tells me all I need to know.
 