US Politics General - Discussion of President Biden and other politicians

Show was fire till he got to his aliens segment. What is this shit about, and how is he sure it isn't some deranged guy? Any thread on the Farms covering this?

Onion Link of the thread on the Ayy Lmaos on the Farms. And I think the point of Tucker's highlighting of it might be a bit different from what you got: He seems to be implying that this should be a news story because... well it's exactly the sort of thing the Media loves to sperg and talk about. But instead they are ignoring it to talk about the sacred cow of Ukraine and how we must worship it.

I think there's an argument to be made about the media being true propaganda for the government, and not just the "profit-seeking gossip" that a lot of people insist they are as a cope to pretend there isn't an obvious conspiracy against regular people.
 
For any fellow northern US kiwis, stay safe outside

The wildfires from Canuckistan have been pushing their smoke to our neck of the woods, and it hit NY today. The air smells like burnt wood, and everything outside looks a cloudy orange from the sun reflecting off the smoke.

The news is in a panic and it's gotten to the point where even schools are keeping the kids inside all day.
This shit is a yearly occurrence in most of the western US in late summer/early fall. These fuckers are probably all still wearing masks anyway, what's the problem?
 
There is a reason there are no real YouTube competitors: hosting and streaming video is expensive in hardware and bandwidth, and only Google was able to handle the massive cost of doing it at a loss for so many users at once. Even then they never turned a good profit, only barely cracking positive numbers in the good years.

It is all gonna collapse under its own weight.
It reminds me of the old joke about the Irishman who buys potatoes for $1.50/lb and sells them for $1.20/lb. "But Paddy," his friend asks, "how do you make any profit?" To which he replies: "Volume!"
 

Capitol Insurrection Network Map

The below link will take you to the website of START, a think tank focused primarily on terrorism.
They have produced a map of the "networks" of retards that attended J6
START J6 Network Map
 
Trannies really tried aping the shit Californians do on a daily basis and then turning it around on them; it's fucking hilarious. Seriously, imagine calling parents uneducated in fucking California - birthplace of the Karen.
He... actually does have a point. Holy shit. Journalists fucking love UFO gossip. Why isn't this more prevalent? I actually haven't seen this hit the top anywhere.
 
Another point is that well… if the goal is a UFO psyop, wouldn’t TPTB be ORDERING their rags to report on it nonstop?

I’ll be honest, nothing around the UFO story makes sense. I’m still healthily skeptical of its veracity (as actual alien craft) but I’m not entirely convinced that it’s all a psyop either, because if it was it would already be all over the news.
 
It could be a double psyop, a maskirovka like Q-Anon was: using a designated conspiracy-theory outlet to distract and control people in an Operation Trust type of way. But it doesn't really fit, because it requires a level of competence that TPTB haven't really shown themselves to still have. Though it could just be an attempt that they are fucking up like everything else.
 

I'd think the US Government would enjoy anything, even a legit UFO leak, if it meant distracting people a bit from how the war in Ukraine is going, the Soros DAs, the open southern border, the economy, smaller banks going under, doctors coming out saying Pfizer was lying about everything, corporations following ESG guidelines organizing crazier and crazier shit, Biden executive orders on DEI. The list could go on.

The US Government and Corporations are actively nuking the country from the inside. They'd like people to not notice that.
 
I'm glad most people are not surprised by this.
Instagram Helps Pedophiles Find Child Pornography and Arrange Meetups with Children (MUST READ)
Researchers discovered that Instagram has become a breeding ground for child pornography.

The Wall Street Journal study found that Instagram enabled people to search hashtags such as '#pedowhore' and '#preteensex,' which allowed them to connect to accounts selling child pornography.

Furthermore, many of these accounts often claimed to be children themselves, with handles like "little slut for you."

Generally, the accounts that sell illicit sexual material don't outright publish it. Instead, they post 'menus' of their content and allow buyers to choose what they want.

Many of these accounts also offer customers the option to pay for meetups with the children.

HOW THIS WAS ALL UNCOVERED:

The researchers set up test accounts to see how quickly they could get Instagram's "suggested for you" feature to give them recommendations for such accounts selling child sexual content.

Within a short time frame, Instagram's algorithm flooded the accounts with content that sexualizes children, with some content linking to off-platform content trading sites.

Using hashtags alone, the Stanford Internet Observatory found 405 sellers of what researchers labeled "self-generated" child-sex material, or accounts purportedly run by children themselves, with some claiming to be as young as 12.

In many cases, Instagram actually permitted users to search for terms that the algorithm knew might be associated with illegal material.

When researchers used certain hashtags to find the illicit material, a pop-up would sometimes appear on the screen, saying, "These results may contain images of child sexual abuse" and noting that the production and consumption of such material cause "extreme harm" to children.

Despite this, the pop-up offered the user two options:

1. "Get resources"
2. "See results anyway"

HOW PEDOPHILES EVADED BEING CAUGHT:

Pedophiles on Instagram used an emoji system to talk in code about the illicit content they were facilitating.

For example, an emoji of a map (🗺️) would mean "MAP" or "Minor-attracted person."

A cheese pizza emoji (🍕) would be abbreviated to mean "CP" or "Child Porn."

Accounts would often identify themselves as "seller" or "s3ller" and state the ages of the children they exploited by using language such as "on Chapter 14" instead of stating their age more explicitly.

INSTAGRAM "CRACKDOWN":

Even after multiple posts were reported, not all of them would be taken down. For example, after an image was posted of a scantily clad young girl with a graphically sexual caption, Instagram responded by saying, "Our review team has found that [the account's] post does not go against our Community Guidelines."

Instagram recommended that the user hide the account instead to avoid seeing it.

Even after Instagram banned certain hashtags associated with child pornography, Instagram's AI-driven hashtag suggestions found workarounds.

The AI would recommend the user try different variations of their searches and add words such as "boys" or "CP" to the end instead.

The Stanford team also conducted a similar test on Twitter.

INSTAGRAM VS TWITTER:

While they still found 128 accounts offering to sell child sexual abuse material (less than a third of the number they found on Instagram), they also noted that Twitter's algorithm didn't recommend such accounts to the same degree as Instagram, and that such accounts were taken down far quicker than on Instagram.

@elonmusk just tweeted the Wall Street Journal's article 2 minutes ago, labelling it "extremely concerning."

With algorithms and AI getting smarter, unfortunately, cases like this become more common.

In 2022, the National Center for Missing & Exploited Children in the U.S. received 31.9 million reports of child pornography, mostly from internet companies, which is a 47% increase from two years earlier.

How can social media companies, especially Meta, get better at regulating AI in order to prevent disgusting cases such as this one?
.....
Study
Instagram Connects Vast Pedophile Network
The Meta unit’s systems for fostering communities have guided users to child-sex content; company says it is improving internal controls

By Jeff Horwitz and Katherine Blunt
June 7, 2023 7:05 am ET
Instagram, the popular social-media site owned by Meta Platforms, helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage-sex content, according to investigations by The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst.
Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them. Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests, the Journal and the academic researchers found.
Though out of sight for most on the platform, the sexualized accounts on Instagram are brazen about their interest. The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as “little slut for you.”
Instagram accounts offering to sell illicit sex material generally don’t publish it openly, instead posting “menus” of content. Certain accounts invite buyers to commission specific acts. Some menus include prices for videos of children harming themselves and “imagery of the minor performing sexual acts with animals,” researchers at the Stanford Internet Observatory found. At the right price, children are available for in-person “meet ups.”
The promotion of underage-sex content violates rules established by Meta as well as federal law.
In response to questions from the Journal, Meta acknowledged problems within its enforcement operations and said it has set up an internal task force to address the issues raised. “Child exploitation is a horrific crime,” the company said, adding, “We’re continuously investigating ways to actively defend against this behavior.”
Meta said it has in the past two years taken down 27 pedophile networks and is planning more removals. Since receiving the Journal queries, the platform said it has blocked thousands of hashtags that sexualize children, some with millions of posts, and restricted its systems from recommending users search for terms known to be associated with sex abuse. It said it is also working on preventing its systems from recommending that potentially pedophilic adults connect with one another or interact with one another’s content.
Alex Stamos, the head of the Stanford Internet Observatory and Meta’s chief security officer until 2018, said that getting even obvious abuse under control would likely take a sustained effort.
“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” he said, noting that the company has far more effective tools to map its pedophile network than outsiders do. “I hope the company reinvests in human investigators,” he added.

Alex Stamos, director of the Stanford Internet Observatory in Palo Alto, Calif.

Technical and legal hurdles make the full scale of the network hard for anyone outside Meta to measure precisely.
Because the laws around child-sex content are extremely broad, investigating even the open promotion of it on a public platform is legally sensitive.
In its reporting, the Journal consulted with academic experts on online child safety. Stanford’s Internet Observatory, a division of the university’s Cyber Policy Center focused on social-media abuse, produced an independent quantitative analysis of the Instagram features that help users connect and find content.
The Journal also approached UMass’s Rescue Lab, which evaluated how pedophiles on Instagram fit into the larger ecosystem of online child exploitation. Using different methods, both entities were able to quickly identify large-scale communities promoting criminal sex abuse.
Test accounts set up by researchers that viewed a single account in the network were immediately hit with “suggested for you” recommendations of purported child-sex-content sellers and buyers, as well as accounts linking to off-platform content trading sites. Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children.
The Stanford Internet Observatory used hashtags associated with underage sex to find 405 sellers of what researchers labeled “self-generated” child-sex material—or accounts purportedly run by children themselves, some saying they were as young as 12. According to data gathered via Maltego, a network mapping software, 112 of those seller accounts collectively had 22,000 unique followers.
Underage-sex-content creators and buyers are just a corner of a larger ecosystem devoted to sexualized child content. Other accounts in the pedophile community on Instagram aggregate pro-pedophilia memes, or discuss their access to children. Current and former Meta employees who have worked on Instagram child-safety initiatives estimate the number of accounts that exist primarily to follow such content is in the high hundreds of thousands, if not millions.
A Meta spokesman said the company actively seeks to remove such users, taking down 490,000 accounts for violating its child safety policies in January alone.
“Instagram is an on ramp to places on the internet where there’s more explicit child sexual abuse,” said Brian Levine, director of the UMass Rescue Lab, which researches online child victimization and builds forensic tools to combat it. Levine is an author of a 2022 report for the National Institute of Justice, the Justice Department’s research arm, on internet child exploitation.
Instagram, estimated to have more than 1.3 billion users, is especially popular with teens. The Stanford researchers found some similar sexually exploitative activity on other, smaller social platforms, but said they found that the problem on Instagram is particularly severe. “The most important platform for these networks of buyers and sellers seems to be Instagram,” they wrote in a report slated for release on June 7.

The Palo Alto campus of Meta Platforms.

Instagram said that its internal statistics show that users see child exploitation in less than one in 10,000 posts viewed.
The effort by social-media platforms and law enforcement to fight the spread of child pornography online centers largely on hunting for confirmed images and videos, known as child sexual abuse material, or CSAM, which already are known to be in circulation. The National Center for Missing & Exploited Children, a U.S. nonprofit organization that works with law enforcement, maintains a database of digital fingerprints for such images and videos and a platform for sharing such data among internet companies.
Internet company algorithms check the digital fingerprints of images posted on their platforms against that list, and report back to the center when they detect them, as U.S. federal law requires. In 2022, the center received 31.9 million reports of child pornography, mostly from internet companies—up 47% from two years earlier.
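(For anyone curious how that screening works mechanically, here is a rough sketch of fingerprint matching against a known-image list. It is illustrative only: real systems such as PhotoDNA use perceptual hashes that survive re-encoding and resizing, and the fingerprint list and function names below are invented.)

```python
import hashlib

# Hypothetical list of known-bad fingerprints, standing in for the NCMEC
# hash database that platforms match uploads against.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    # Real deployments use perceptual hashes (e.g., PhotoDNA) so that
    # near-duplicates still match; SHA-256 here only matches exact copies.
    return hashlib.sha256(image_bytes).hexdigest()

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known fingerprint and should be
    blocked and reported; False only means it passed the known-image check."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

A lookup like this can only flag material whose fingerprint is already on the list, which is why, as the article notes below, newly produced images and the accounts advertising them slip past the automated screening.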
Meta, with more than 3 billion users across its apps, which include Instagram, Facebook and WhatsApp, is able to detect these types of known images if they aren’t encrypted. Meta accounted for 85% of the child pornography reports filed to the center, including some 5 million from Instagram.
Meta’s automated screening for existing child exploitation content can’t detect new images or efforts to advertise their sale. Preventing and detecting such activity requires not just reviewing user reports but tracking and disrupting pedophile networks, say current and former staffers as well as the Stanford researchers. The goal is to make it difficult for such users to connect with each other, find content and recruit victims.
Such work is vital because law-enforcement agencies lack the resources to investigate more than a tiny fraction of the tips NCMEC receives, said Levine of UMass. That means the platforms have primary responsibility to prevent a community from forming and normalizing child sexual abuse.
Meta has struggled with these efforts more than other platforms both because of weak enforcement and design features that promote content discovery of legal as well as illicit material, Stanford found.
The Stanford team found 128 accounts offering to sell child-sex-abuse material on Twitter, less than a third the number they found on Instagram, which has a far larger overall user base than Twitter. Twitter didn’t recommend such accounts to the same degree as Instagram, and it took them down far more quickly, the team found.
Among other platforms popular with young people, Snapchat is used mainly for its direct messaging, so it doesn’t help create networks. And TikTok’s platform is one where “this type of content does not appear to proliferate,” the Stanford report said.
Twitter didn’t respond to requests for comment. TikTok and Snapchat declined to comment.
David Thiel, chief technologist at the Stanford Internet Observatory, said, “Instagram’s problem comes down to content-discovery features, the ways topics are recommended and how much the platform relies on search and links between accounts.” Thiel, who previously worked at Meta on security and safety issues, added, “You have to put guardrails in place for something that growth-intensive to still be nominally safe, and Instagram hasn’t.”
The platform has struggled to oversee a basic technology: keywords. Hashtags are a central part of content discovery on Instagram, allowing users to tag and find posts of interest to a particular community—from broad topics such as #fashion or #nba to narrower ones such as #embroidery or #spelunking.

A screenshot taken by the Stanford Internet Observatory shows the warning and clickthrough option when searching for a pedophilia-related hashtag on Instagram.

Pedophiles have their chosen hashtags, too. Search terms such as #pedobait and variations on #mnsfw (“minor not safe for work”) had been used to tag thousands of posts dedicated to advertising sex content featuring children, rendering them easily findable by buyers, the academic researchers found. Following queries from the Journal, Meta said it is in the process of banning such terms.
In many cases, Instagram has permitted users to search for terms that its own algorithms know may be associated with illegal material. In such cases, a pop-up screen for users warned that “These results may contain images of child sexual abuse,” and noted that production and consumption of such material causes “extreme harm” to children. The screen offered two options for users: “Get resources” and “See results anyway.”
In response to questions from the Journal, Instagram removed the option for users to view search results for terms likely to produce illegal images. The company declined to say why it had offered the option.
The pedophilic accounts on Instagram mix brazenness with superficial efforts to veil their activity, researchers found. Certain emojis function as a kind of code, such as an image of a map—shorthand for “minor-attracted person”—or one of “cheese pizza,” which shares its initials with “child pornography,” according to Levine of UMass. Many declare themselves “lovers of the little things in life.”
Accounts identify themselves as “seller” or “s3ller,” and many state their preferred form of payment in their bios. These seller accounts often convey the child’s purported age by saying they are “on chapter 14,” or “age 31” followed by an emoji of a reverse arrow.
Some of the accounts bore indications of sex trafficking, said Levine of UMass, such as one displaying a teenager with the word WHORE scrawled across her face.
Some users claiming to sell self-produced sex content say they are “faceless”—offering images only from the neck down—because of past experiences in which customers have stalked or blackmailed them. Others take the risk, charging a premium for images and videos that could reveal their identity by showing their face.
Many of the accounts show users with cutting scars on the inside of their arms or thighs, and a number of them cite past sexual abuse.
Even glancing contact with an account in Instagram’s pedophile community can trigger the platform to begin recommending that users join it.

Sarah Adams, a Canadian mother of two, has built an Instagram audience combatting child exploitation.

Sarah Adams, a Canadian mother of two, has built an Instagram audience discussing child exploitation and the dangers of oversharing on social media. Given her focus, Adams’ followers sometimes send her disturbing things they’ve encountered on the platform. In February, she said, one messaged her with an account branded with the term “incest toddlers.”
Adams said she accessed the account—a collection of pro-incest memes with more than 10,000 followers—for only the few seconds that it took to report to Instagram, then tried to forget about it. But over the course of the next few days, she began hearing from horrified parents. When they looked at Adams’ Instagram profile, she said they were being recommended “incest toddlers” as a result of Adams’ contact with the account.
A Meta spokesman said that “incest toddlers” violated its rules and that Instagram had erred on enforcement. The company said it plans to address such inappropriate recommendations as part of its newly formed child safety task force.
As with most social-media platforms, the core of Instagram's recommendations is based on behavioral patterns, not on matching a user's interests to specific subjects. This approach is efficient in increasing the relevance of recommendations, and it works most reliably for communities that share a narrow set of interests.
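(A toy example of why purely behavioral recommendation amplifies tight communities: suggest the accounts whose follower sets overlap most with yours. This is a hypothetical sketch, not Instagram's actual system; the account names and follow graph are made up.)

```python
from collections import Counter

# Hypothetical follow graph: account -> set of follower ids.
followers = {
    "acct_a": {1, 2, 3, 4},
    "acct_b": {2, 3, 4, 5},
    "acct_c": {3, 4, 5, 6},
    "acct_mainstream": {7, 8, 9},
}

def suggest_for(account: str, k: int = 2) -> list[str]:
    """Recommend accounts whose follower sets overlap most with `account`'s.

    Purely behavioral: no topic understanding is involved, so a tightly
    interlinked niche cluster dominates its own members' suggestions.
    """
    base = followers[account]
    overlap = Counter({
        other: len(base & theirs)
        for other, theirs in followers.items()
        if other != account
    })
    return [acct for acct, score in overlap.most_common(k) if score > 0]

print(suggest_for("acct_a"))  # ['acct_b', 'acct_c'] — the niche cluster, never the mainstream account
```

Because no notion of subject matter enters the calculation, a small, densely interlinked cluster keeps recommending itself, which is the dynamic the researchers describe.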
In theory, this same tightness of the pedophile community on Instagram should make it easier for Instagram to map out the network and take steps to combat it. Documents previously reviewed by the Journal show that Meta has done this sort of work in the past to suppress account networks it deems harmful, such as with accounts promoting election delegitimization in the U.S. after the Jan. 6 Capitol riot.
Like other platforms, Instagram says it enlists its users to help detect accounts that are breaking rules. But those efforts haven’t always been effective.
Sometimes user reports of nudity involving a child went unanswered for months, according to a review of scores of reports filed over the last year by numerous child-safety advocates.
Earlier this year, an anti-pedophile activist discovered an Instagram account claiming to belong to a girl selling underage-sex content, including a post declaring, “This teen is ready for you pervs.” When the activist reported the account, Instagram responded with an automated message saying: “Because of the high volume of reports we receive, our team hasn’t been able to review this post.”
After the same activist reported another post, this one of a scantily clad young girl with a graphically sexual caption, Instagram responded, “Our review team has found that [the account’s] post does not go against our Community Guidelines.” The response suggested that the user hide the account to avoid seeing its content.
A Meta spokesman acknowledged that Meta had received the reports and failed to act on them. A review of how the company handled reports of child sex abuse found that a software glitch was preventing a substantial portion of user reports from being processed, and that the company’s moderation staff wasn’t properly enforcing the platform’s rules, the spokesman said. The company said it has since fixed the bug in its reporting system and is providing new training to its content moderators.
Even when Instagram does take down accounts selling underage-sex content, they don’t always stay gone.
Under the platform’s internal guidelines, penalties for violating its community standards are generally levied on accounts, not users or devices. Because Instagram allows users to run multiple linked accounts, the system makes it easy to evade meaningful enforcement. Users regularly list the handles of “backup” accounts in their bios, allowing them to simply resume posting to the same set of followers if Instagram removes them.


In some instances, Instagram’s recommendations systems directly undercut efforts by its own safety staff. After the company decided to crack down on links from a specific encrypted file transfer service notorious for transmitting child sex content, Instagram blocked searches for its name.
Instagram’s AI-driven hashtag suggestions didn’t get the message. Despite refusing to show results for the service’s name, the platform’s autofill feature recommended that users try variations on the name with the words “boys” and “CP” added to the end.
The company tried to disable those hashtags amid its response to the queries by the Journal. But within a few days Instagram was again recommending new variations of the service’s name that also led to accounts selling purported underage-sex content.
Following the company’s initial sweep of accounts brought to its attention by Stanford and the Journal, UMass’s Levine checked in on some of the remaining underage seller accounts on Instagram. As before, viewing even one of them led Instagram to recommend new ones. Instagram’s suggestions were helping to rebuild the network that the platform’s own safety staff was in the middle of trying to dismantle.
A Meta spokesman said its systems to prevent such recommendations are currently being built. Levine called Instagram’s role in promoting pedophilic content and accounts unacceptable.
“Pull the emergency brake,” he said. “Are the economic benefits worth the harms to these children?”
 
So the Pizzagaters were 100% correct about everything. They were unfairly slandered as Nazis and conspiracy theorists.

Of course they were slandered. Comet Ping Pong was connected to the most wealthy and powerful people in Washington.

Has anyone seen New York today? The Canadian wildfires have turned the place into Blade Runner 2049.

That's odd.





 
One week, if this is to be believed. Best second week of Pride to come.
 