The viral AI avatar app Lensa undressed me—without my consent

My avatars were cartoonishly pornified, while my male colleagues got to be astronauts, explorers, and inventors.


By Melissa Heikkilä

A grid of the author's Lensa avatars. MELISSA HEIKKILÄ VIA LENSA

When I tried the new viral AI avatar app Lensa, I was hoping to get results similar to some of my colleagues at MIT Technology Review. The digital retouching app was first launched in 2018 but has recently become wildly popular thanks to the addition of Magic Avatars, an AI-powered feature which generates digital portraits of people based on their selfies.

But while Lensa generated realistic yet flattering avatars for them—think astronauts, fierce warriors, and cool cover photos for electronic music albums— I got tons of nudes. Out of 100 avatars I generated, 16 were topless, and in another 14 it had put me in extremely skimpy clothes and overtly sexualized poses.

I have Asian heritage, and that seems to be the only thing the AI model picked up on from my selfies. I got images of generic Asian women clearly modeled on anime or video-game characters. Or most likely porn, considering the sizable chunk of my avatars that were nude or showed a lot of skin. A couple of my avatars appeared to be crying. My white female colleague got significantly fewer sexualized images, with only a couple of nudes and hints of cleavage. Another colleague with Chinese heritage got results similar to mine: reams and reams of pornified avatars.

Lensa’s fetish for Asian women is so strong that I got female nudes and sexualized poses even when I directed the app to generate avatars of me as a male.

One of the avatars generated when the author set the app to male. MELISSA HEIKKILÄ VIA LENSA

The fact that my results are so hypersexualized isn’t surprising, says Aylin Caliskan, an assistant professor at the University of Washington who studies biases and representation in AI systems.

Lensa generates its avatars using Stable Diffusion, an open-source AI model that generates images based on text prompts. Stable Diffusion is built using LAION-5B, a massive open-source data set that has been compiled by scraping images off the internet.
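
For context, this is roughly what text-to-image generation with the open-source Stable Diffusion release looks like through Hugging Face's diffusers library; the checkpoint name and prompt below are illustrative, and Lensa's production setup is not public:

```python
# A minimal sketch of text-to-image generation with open-source Stable Diffusion
# via the diffusers library. Checkpoint and prompt are illustrative only.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumption: any SD 1.x checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# The model turns a text prompt into an image sampled from what it learned
# from the LAION training data.
image = pipe("portrait of an astronaut, digital art").images[0]
image.save("avatar.png")
```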

And because the internet is overflowing with images of naked or barely dressed women, and pictures reflecting sexist, racist stereotypes, the data set is also skewed toward these kinds of images.

This leads to AI models that sexualize women regardless of whether they want to be depicted that way, Caliskan says—especially women with identities that have been historically disadvantaged.

AI training data is filled with racist stereotypes, pornography, and explicit images of rape, researchers Abeba Birhane, Vinay Uday Prabhu, and Emmanuel Kahembwe found after analyzing a data set similar to the one used to build Stable Diffusion. It’s notable that their findings were only possible because the LAION data set is open source. Most other popular image-making AIs, such as Google’s Imagen and OpenAI’s DALL-E, are not open but are built in a similar way, using similar sorts of training data, which suggests that this is a sector-wide problem.

As I reported in September when the first version of Stable Diffusion had just been launched, searching the model’s data set for keywords such as “Asian” brought back almost exclusively porn.

Stability.AI, the company that developed Stable Diffusion, launched a new version of the AI model in late November. A spokesperson says that the original model was released with a safety filter, which Lensa does not appear to have used, as it would remove these outputs. One way Stable Diffusion 2.0 filters content is by removing images that are repeated often. The more often something is repeated, such as Asian women in sexually graphic scenes, the stronger the association becomes in the AI model.
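
To make the "safety filter" concrete: the open-source pipeline in diffusers loads an optional CLIP-based safety checker by default, and an app has to opt out to get unfiltered output. A minimal sketch, with an illustrative checkpoint and prompt; whether Lensa's pipeline looks anything like this is an assumption:

```python
# Sketch of the optional safety checker bundled with the open-source
# Stable Diffusion pipeline in diffusers. Checkpoint and prompt are illustrative.
from diffusers import StableDiffusionPipeline

# The pipeline loads a CLIP-based safety checker by default; an app that wants
# unfiltered output has to opt out, e.g. by passing safety_checker=None.
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

result = pipe("portrait of a person, studio lighting")
print(result.nsfw_content_detected)  # a True entry means that image was flagged and blacked out
```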

Caliskan has studied CLIP (Contrastive Language Image Pretraining), which is a system that helps Stable Diffusion generate images. CLIP learns to match images in a data set to descriptive text prompts. Caliskan found that it was full of problematic gender and racial biases.

“Women are associated with sexual content, whereas men are associated with professional, career-related content in any important domain such as medicine, science, business, and so on,” Caliskan says.
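
For readers who want a concrete picture of that matching step, the sketch below scores one image against two captions using the openly released CLIP weights via Hugging Face transformers; the file name and captions are illustrative, not Caliskan's actual audit setup:

```python
# A minimal sketch of the image-text matching CLIP performs. Bias audits of the
# kind described here compare scores like these across demographic groups.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("selfie.jpg")  # hypothetical input photo
captions = ["a photo of a doctor", "a photo of a lingerie model"]

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
scores = model(**inputs).logits_per_image  # one similarity score per caption
print(scores.softmax(dim=-1))  # which caption CLIP considers the better match
```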

Funnily enough, my Lensa avatars were more realistic when my pictures went through male content filters. I got avatars of myself wearing clothes (!) and in neutral poses. In several images, I was wearing a white coat that appeared to belong to either a chef or a doctor.

But it’s not just the training data that is to blame. The companies developing these models and apps make active choices about how they use the data, says Ryan Steed, a PhD student at Carnegie Mellon University, who has studied biases in image-generation algorithms.

“Someone has to choose the training data, decide to build the model, decide to take certain steps to mitigate those biases or not,” he says.

The app’s developers have made a choice that male avatars get to appear in space suits, while female avatars get cosmic G-strings and fairy wings.

A spokesperson for Prisma Labs says that “sporadic sexualization” of photos happens to people of all genders, but in different ways.

The company says that because Stable Diffusion is trained on unfiltered data from across the internet, neither they nor Stability.AI, the company behind Stable Diffusion, “could consciously apply any representation biases or intentionally integrate conventional beauty elements.”

“The man-made, unfiltered online data introduced the model to the existing biases of humankind,” the spokesperson says.

Despite that, the company says it is working to address the problem.

In a blog post, Prisma Labs says it has adapted the relationship between certain words and images in a way that aims to reduce biases, but the spokesperson did not go into more detail. Stable Diffusion has also made it harder to generate graphic content, and the creators of the LAION database have introduced NSFW filters.

Lensa is the first hugely popular app to be developed from Stable Diffusion, and it won’t be the last. It might seem fun and innocent, but there’s nothing stopping people from using it to generate nonconsensual nude images of women based on their social media images, or to create naked images of children. The stereotypes and biases it’s helping to further embed can also be hugely detrimental to how women and girls see themselves and how others see them, Caliskan says.

“In 1,000 years, when we look back as we are generating the thumbprint of our society and culture right now through these images, is this how we want to see women?” she says.



Our hot virtual Asian Melissa also wrote this article about Greg Rutkowski:
 
Ha ha, based.

Lensa AI app causes a stir with sexy “Magic Avatar” images no one wanted

Early reports said the app produced unwanted sexualized depictions of the women who used it.

BENJ EDWARDS

A selection of male and female "Magic Avatars" generated by the Lensa AI app, including a beard that cannot be contained.

Over the past week, the smartphone app Lensa AI has become a popular topic on social media because it can generate stylized AI avatars based on selfie headshots that users upload. It's arguably the first time personalized latent diffusion avatar generation has reached a mass audience.

While Lensa AI has proven popular among people on social media who like to share their AI portraits, press coverage has focused heavily on the app's reported tendency to sexualize depictions of women since the AI avatar feature launched.

A product of Prisma Labs, Lensa launched in 2018 as a subscription app focused on AI-powered photo editing. In late November 2022, the app grew in popularity thanks to its new "Magic Avatar" feature. Lensa reportedly utilizes the Stable Diffusion image synthesis model under the hood, and Magic Avatar appears to use a personalization training method similar to Dreambooth (whose ramifications we recently covered). All of the training takes place off-device and in the cloud.
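
For the technically curious, a Dreambooth-style workflow fine-tunes the base model on a handful of a user's photos bound to a rare identifier token, then drops that token into ordinary prompts. A minimal sketch of the generation step, assuming a hypothetical fine-tuned checkpoint directory and the conventional "sks" placeholder; Lensa's actual pipeline is not public:

```python
# Sketch of generation after a Dreambooth-style fine-tune, which binds a rare
# identifier token to a user's selfies. The "./personalized-model" directory
# and the "sks" token are hypothetical.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("./personalized-model").to("cuda")

image = pipe("a portrait of sks person as an astronaut, digital painting").images[0]
image.save("magic_avatar.png")
```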

In early December, women using the app noticed that Lensa's Magic Avatar feature would create semi-pornographic or unintentionally sexualized images. For CNN, Zoe Sottile wrote, "One of the challenges I encountered in the app has been described by other women online. Even though all the images I uploaded were fully-clothed and mostly close-ups of my face, the app returned several images with implied or actual nudity."

The reaction in the press grew as other women reported similar experiences, and a range of publications ran headlines covering Lensa's sexualized output.

Meanwhile, the same sexualization issue didn't appear in images of men uploaded to the Magic Avatar feature. For MIT Technology Review, Melissa Heikkilä wrote, "My avatars were cartoonishly pornified, while my male colleagues got to be astronauts, explorers, and inventors."

This is one of the dangers of basing a product on Stable Diffusion 1.x, which can easily sexualize its output by default. The behavior comes from the large quantity of sexualized images found in its image training data set, which was scraped from the Internet. (Stable Diffusion 2.x attempted to rectify this by removing NSFW material from the training set.)

The Internet's cultural biases toward the sexualized depiction of women online have given Lensa's AI generator its tendencies. Speaking to TechCrunch, Prisma Labs CEO Andrey Usoltsev said, "Neither us, nor Stability AI could consciously apply any representation biases; To be more precise, the man-made unfiltered data sourced online introduced the model to the existing biases of humankind. The creators acknowledge the possibility of societal biases. So do we.”

In response to the widespread criticism, Prisma Labs has reportedly been working to prevent the accidental generation of nude images in Lensa AI.

On Tuesday, we experimented with Lensa AI by uploading 15 images of a woman (and also 15 images of a man), then paid for Magic Avatar representations of each. The woman's results we saw weren't sexualized in any obvious way, so Prisma's efforts to reduce the incidence of NSFW-style imagery might be working already.

A selection of female Magic Avatars that we generated using the Lensa AI app.

A selection of male Magic Avatars that we generated using the Lensa AI app.

This does not mean that Lensa AI's tendencies to sexualize female subjects don't exist. It's worth noting that due to how Stable Diffusion generates images, it's likely that different women will see different results depending on how closely their face (or input photos) resemble photos of a particular actress, celebrity, or model in the Stable Diffusion dataset. In our experience, that can heavily dictate how Stable Diffusion renders the rest of a person's body and its context.

Using Lensa AI is easy, but it's a product with a yearly subscription fee (currently $50) and a one-time fee for training Magic Avatar images—$3.99 for 50 avatars, $5.99 for 100 avatars, or $7.99 for 200 avatars. Customers upload 10-20 selfies taken from different angles and then wait roughly 20-30 minutes to see the results. Users must confirm they are 18 or older to use the service.

Those technically inclined could experiment with Dreambooth instead for free, but it currently requires some knowledge of development tools to get working.



Why why why would you ever pay for this, what the actual fuck. Yeah, sexism of the "AI" is bad, but the white men who make these pointless AI projects have shown that they do not care time and time again. So, sexism is to be expected. Why you would pay for that is... mind-boggling.
 
Create your own AI to portray you as an ugly cunt. Then again, for there to be actual value to the AI, transformation is somewhat necessary.

"AI training data is filled with racist stereotypes, pornography, and explicit images of rape, researchers Abeba Birhane, Vinay Uday Prabhu, and Emmanuel Kahembwe"

These people might merit some attention, sounds like there could be potential.
 
Using Lensa AI is easy, but it's a product with a yearly subscription fee (currently $50 per year) and a one-time fee for training Magic Avatar images—$3.99 for 50 avatars, $5.99 for 100 avatars, or $7.99 for 200 avatars.
Bulk price 4 cents an avatar, but why would you ever need 200 of the same character?
 
How are Asian American women more historically disadvantaged than others? I never read stories of black women complaining about AI sexualizing them.
Maybe if we use the logic about the stories you don't hear.

You know, if it's a violent crime with a nondescript suspect... If it's a sex crime with a nondescript suspect...

Unfortunately not a provable hypothesis, since you don't hear the stories you don't hear. Closest I can think of to a story of an Asian woman being low on the oppression stack was when one got murdered and the story about Mexican men being oppressed came out instead.
 
There is relatively little porn of black women compared to women of other races. Thus the AI is in fact racist for not showing us enough naked black ladies! This is an outrage! I won't stand for this injustice!
 
like every article posted here i skimmed over the first paragraph and then scrolled down. is she upset that an AI generated a sexy image of herself?
The same type of women go out publicly dressed in a provocative way and demand to get male attention ONLY from the men they want to be noticed by and feel like they've been personally wronged if it comes from anyone else.

Western society has indoctrinated them with the belief that it is their birthright to wield absolute power over sexuality itself.
 