Plagued Pizzagate / Pedogate - Batshit Conspiracy Nuts Who See Pedos Everywhere

Doesn't have to be. How nice of him to go shopping for art that's entirely about physical, emotional, and psychological torment though.

And no, it wasn't.

http://www.biljanadjurdjevic.com/

Specifically, her Paradise Lost series.
http://www.biljanadjurdjevic.com/ParadiseLost.html

Go on. Click the image. View a few of her pieces.

SENSING A PATTERN HERE. A WEE BIT OF AN INTEREST IN TORMENT AND LOSS OF INNOCENCE INVOLVING CHILDREN THAT CROSSES BETWEEN ARTISTS AND EVEN ARTISTIC MEDIUM.

She's got a few paintings of normal forest scenery in the mix. But it turns out they're part of a weird-looking installation.

The one I really hate, though, is this: http://www.biljanadjurdjevic.com/LivingInOblivion.html

What the hell is up with the feet being so prominent here? All those lined-up little-girl feet remind me of Dan Schneider. Nothing good can come from a freak obsessed with children's feet. There's another in the Living In Oblivion collection of a boy touching and looking at his feet. Again, really prominent feet.

There's even a painting of Santa dead on a slab called "The Last Days Santa Claus". That really screams loss of innocence. The Hotbed collection looks like a bunch of girls led into the forest, then the aftermath of them lying dead on the ground. They're either face up or face down and look ashen.

You've got to be pretty abnormal to paint this stuff and pretty abnormal to buy paintings like this.
 
What the hell is up with the feet being so prominent here? All those lined-up little-girl feet remind me of Dan Schneider. Nothing good can come from a freak obsessed with children's feet. There's another in the Living In Oblivion collection of a boy touching and looking at his feet. Again, really prominent feet.
I read once (don't ask me to cite it, I wouldn't even know where to begin looking for it) that the girls are in some sort of disciplinary pose that has some association with sexual abuse.
 
You've got to be pretty abnormal to paint this stuff and pretty abnormal to buy paintings like this.
Did some searching, and this page from 2009 claims that these paintings were inspired by the Yugoslav War. She was born in the 1970s so she fits the age range. Make of that info what you will.

I'm actually not dismissing the idea that the Podestas might be closet sick fucks, it's just that a lot of the "evidence" is incredibly flimsy.
 
View attachment 315428

I don't know if this image was posted earlier in the thread, but it's a tubby diapered emo kid with PIZZAGATE IS REAL written in shit on his chest.
Wait a moment... wasn't that the same guy that tried making a lolcow thread on himself not too long ago to appease his humiliation fetish? I thought he would've fucked off with this shit after someone found his YouTube channel.

edit: Thanks @hood LOLCOW for helping to refresh my memory on who it was, that is definitely Sean Miller.
 
So I found a crazy for you guys.

Her name is Lisa Vunk (dob August 11, 1966)

lisa.png


Her YT is here: https://www.youtube.com/channel/UCBLP4lVGPjzfzbszJD51xYA/videos

Her website is a batshit mix of yoga/Maya/spiritual bullshit: https://www.transcendance.us/about.html (archive)

She runs (archive) a 'reclaim your Sexual Sovereignty' workshop.

Learn how to identify, honor, and eradicate shame and disempowering subconscious limiting beliefs about your sexuality. Find out how to access and experience deeper levels of self-love, self empowerment, and self-expression that will reflect as deeper levels of communication and connection in all your relationships not just sexual ones. Be empowered to more fully engage in and enjoy your birthright, Sacred Sexuality. Come Get Your GodDess On!!!

She claims (archive) to have been a Playboy Bunny, but now she's just an old fat lady.

She is currently in Ubud, Bali, home of many crazy yoga/spiritual bullshit types.

This is her photo after being arrested by the police

44461540_999135390293086_7713062231918772224_n.jpg


She is in hospital (source/archive) under sedation after getting into an argument with hotel staff (tip to white people: don't fuck with the locals in their own country). Possibly following alcohol consumption?

She has been involved with various pedo conspiracy crazyshit.

Videos here: https://www.youtube.com/channel/UCBLP4lVGPjzfzbszJD51xYA/videos

She's actually concentrating more on a Hampstead paedo church/school McDonalds conspiracy, but she ties it all to Pizzagate.

There's a 'Hoaxtead' website debunking it, which I ain't got time to go into, but judging by the quality of the crazies following this, my guess is it's about as true & valid as Pizzagate.

Lisa gets her page on Hoaxtead, for being an exceptional individual:

https://hoaxteadresearch.wordpress....rner-lisa-vunk-in-assault-over-burnt-chicken/

shutter.png

She was also arrested in a liquor store after getting argumentative there.

Judging by the array of alcohol/cigarettes ('tobacco is an ancient medicine') on her latest FB live (which is quite something!) she's a full-on alcoholic.
booze.jpg

booze2.jpg

Apparently she's a shitty mom & had her son taken off her, no surprises there.
lisavunks.png

FB1 (http://archive.is/l5OQB)
FB2 (http://archive.is/hVaNP)
Twitter (http://archive.is/DO32o) (sample Tweet: Pizzagate is REAL & Hostel 2 ISN'T Fiction)

This is a sample video where Lisa explains over 2.5 hours the connections between 666, the New World Order, pizza gate, and global child paedophile gangs.
 
Yeah, I get the impression she has some serious coin stashed away from an ex-husband or something. Wine is NOT cheap in Bali, like $25 a bottle minimum, and it looks like she gets through a good number of them.

Edit: This was her stepfather

https://vineyardgazette.com/obituar...l-manager-and-coach-who-mentored-young-people

"He met his wife Judy Frank, an Islander, in 1968. They were married the same year on May 4 and lived in Edgartown where they raised their three children."

From what I read of her ramblings, she says she was a bastard and was formerly called Lisa Frank. So it sounds like Norman made Judy an honest woman (but I'm guessing Judy herself maybe had some $$$ from her family) and adopted bastard daughter Lisa.

"In addition to Judy, his wife of 49 years, he is survived by his son Tyler, his daughter Tabatha and her husband Rob Greene, and his daughter Lisa."

I'm assuming from that there is some kind of longterm $$$$ in the family.
 
So while I do think the whole "Pizzagate" thing is absolutely a paranoid conspiracy ...

... You can't convince me that Podesta is a normal, well-adjusted guy. If I were to visit anyone's home and saw THAT kind of artwork all over the place, I'd think I wouldn't make it out of that house alive and that I'd end up in that person's basement fridge or something.
 
So while I do think the whole "Pizzagate" thing is absolutely a paranoid conspiracy ...

... You can't convince me that Podesta is a normal, well-adjusted guy. If I were to visit anyone's home and saw THAT kind of artwork all over the place, I'd think I wouldn't make it out of that house alive and that I'd end up in that person's basement fridge or something.
Of course Podesta isn't normal; he's a billionaire. Having too much money fucks people's worldview up nine times out of ten.
 
So remember how people were claiming that secret videos exist of Hillary Clinton eating the face of a child as part of a Satanic ritual, and how ‘deep fake’ AI technology will eventually allow for the creation of convincing fake videos of this nature? Well, that particular conspiracy theory has a name now: Frazzledrip. And as the following Vox article about YouTube’s problems with systematically promoting extremist videos and disinformation points out, it turns out YouTube’s algorithms have been systematically promoting the hell out of Frazzledrip. Surprise!:

Vox.com

YouTube’s conspiracy theory crisis, explained
Why a Democratic representative asked Google’s CEO about the most bizarre conspiracy theory you’ve never heard of.

By Jane Coaston
Dec 12, 2018, 4:15pm EST

The three-and-a-half-hour hearing with Google CEO Sundar Pichai and the House Judiciary Committee wasn’t exactly a showcase of deep knowledge of technology. One Republican representative complained that all of the Google results for the Obamacare repeal act and the Republican tax bill were negative. Rep. Steve King (R-IA) had to be told that Google does not make the iPhone. Rep. Louie Gohmert (R-TX) demanded that Google be held liable for Wikipedia’s “political bias.”

But one lawmaker, Rep. Jamie Raskin (D-MD), raised an actually important and pressing issue: the way YouTube’s algorithms can be used to push conspiracy theories.

“The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events.” He was alluding to the Pizzagate conspiracy theory which led to an armed gunman showing up at a DC-area pizzeria in 2016 — a conspiracy theory spread, in part, on YouTube.

Raskin asked about another especially strange conspiracy theory that emerged on YouTube — “Frazzledrip,” which has deep ties to the QAnon and Pizzagate conspiracy theories. He asked Pichai, “Is your basic position that [Frazzledrip] is something you want to try to do something about, but basically there is just an avalanche of such material and there’s really nothing that can be done, and it should be buyer beware or consumer beware when you go on YouTube?” adding, “Are you taking the threats seriously?”

Raskin’s questions were getting at an important issue: YouTube, which Google purchased for $1.65 billion 12 years ago, has a conspiracy theory problem. It’s baked into the way the service works. And it appears that neither Congress nor YouTube itself is anywhere near solving it.

YouTube and conspiracy theories, explained

One billion hours’ worth of content is viewed on YouTube every single day. About 70 percent of those views come from YouTube’s recommendations, according to Algotransparency, a website that attempts to track “what videos YouTube’s recommendation algorithm most often recommends.”

YouTube’s content algorithms are incredibly powerful — they determine what videos show up in your search results, the suggested videos stream, on the home page, the trending stream, and under your subscriptions. If you go to the YouTube homepage, algorithms dictate which videos you see, and which ones you don’t. And if you search for something, it’s an algorithm that decides which videos you get first.



As Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina, wrote in the New York Times in March, the YouTube advertising model is based on you watching as many videos as they can show you (and the ads that appear before and during those videos).

Whether the subject of the original video selected was right-leaning or left-leaning, or even nonpolitical, the algorithm tends to recommend increasingly more extreme videos — escalating the viewer, Tufekci wrote, from videos of Trump rallies to videos featuring “white supremacist rants, Holocaust denials, and other disturbing content.”

Watching videos of Hillary Clinton and Bernie Sanders, on the other hand, led to videos featuring “arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11,” Tufekci wrote.

On Algotransparency’s website, which tries to reverse-engineer YouTube’s recommendation algorithm, I entered two terms to find out what the algorithm would recommend for a user with no search history based on those terms. First up was “Trump.” (You can try this yourself.)

The first recommended video was from MSNBC, detailing James Comey’s testimony before the House Judiciary and Oversight committees. The second recommended video was a QAnon-themed video — relating to the conspiracy theory alleging President Donald Trump and Robert Mueller are working together to uncover a vast pedophile network including many prominent Democrats (and actor Tom Hanks). (“D5” refers to December 5, which QAnon believers argued would be the day when thousands of their political enemies would be arrested.)

Next, I tried “Hillary Clinton.” The top three recommended videos based on YouTube’s algorithm are all conspiracy-theory driven, from a video from an anti-Semitic YouTube channel that argues Freemasons will escape from the United States on private yachts after America’s eventual collapse to a user alleging that Hillary Clinton has a seizure disorder (she does not) to one alleging that Hillary Clinton has had a number of people murdered (also untrue.)

I spend a lot of time consuming content about conspiracy theories — but these results weren’t tailored to me. These results were based on a user who had never watched any YouTube videos before.

This isn’t a flaw in YouTube’s system — this is how YouTube works. Which brings us to Frazzledrip.

How YouTube helped spread the weirdest conspiracy theory of them all

The conspiracy theory behind Frazzledrip is this, as “explained” on the fake news website, YourNewsWire.com in April: Hillary Clinton and former Clinton aide Huma Abedin were filmed ripping a child’s face off and wearing it as a mask before drinking the child’s blood in a Satanic ritual sacrifice, and that video was then found on the hard drive of Abedin’s former husband, Anthony Weiner, under the code name: “Frazzledrip.”

“The #Hillgramage 2018”. I’m shaking tonight with this drop. This would be an appropriate outcome. From #Epstein to #CometPingPong and everything in between, We’ve been waiting for this. We’re coming @HillaryClinton, I’m sharpening my pitchfork right now. #FRAZZLEDRIP #Pizzagate https://t.co/NFbCc4AZTt — ImMikeRobertson (@ImMikeRobertson) April 15, 2018

For the record: This is not true. There is no such video, and no such thing ever happened. But as Snopes has detailed, multiple conspiracy theories of the Trump era, including QAnon and Pizzagate, overlap, and all of them hold that Hillary Clinton is a secret child pedophile and murderer.

You have probably never heard of Frazzledrip. Most people haven’t heard of Frazzledrip, or QAnon, or perhaps even Pizzagate. But on YouTube, there are hundreds of videos, each with thousands of views, dedicated to a conspiracy theory alleging that a former presidential candidate ripped a child’s face off and wore it as a mask. And there’s markedly little that YouTube, or Google, or even Congress, seems able to do about it.

“It’s an area we acknowledge there is more work to be done”

Here’s how Pichai answered Raskin’s question: “We are constantly undertaking efforts to deal with misinformation, but we have clearly stated policies, and we have made lots of progress in many of the areas over the past year. … This is a recent thing but I’m following up on it and making sure we are evaluating these against our policies. It’s an area we acknowledge there is more work to be done.”

While explaining that YouTube takes problematic videos on a case by case basis, he added, “It’s our responsibility, I think, to make sure YouTube is a platform for freedom of expression, but it needs to be responsible in our society.”

But it isn’t easy to balance a platform that claims to be for freedom of expression with societal responsibility. It’s not illegal to believe in conspiracy theories, or to think that the September 11th attacks were an inside job (they weren’t) or that the Sandy Hook shootings never happened (they did) or that Hillary Clinton is a child-eating pedophilic cannibal (this, it must be said, I suppose, is untrue).

YouTube could radically change its terms of service — in a way that would dramatically limit the freedom of expression Pichai and his colleagues are attempting to provide. Or it could invest much more heavily in moderation, or change its algorithm.

But all of that would be bad for business. As long as YouTube is so heavily reliant on algorithms to keep viewers watching, on a platform where hundreds of hours of video are uploaded every minute of every day, the conspiracy theories will remain. Even if YouTube occasionally bans conspiracy theorists like Alex Jones, users will continue to upload videos about Frazzledrip, or QAnon, or videos arguing that the earth is flat — and YouTube’s algorithms, without any change, will keep recommending them, and other users will watch them.

———-

“YouTube’s conspiracy theory crisis, explained” by Jane Coaston; Vox.com; 12/12/2018

“You have probably never heard of Frazzledrip. Most people haven’t heard of Frazzledrip, or QAnon, or perhaps even Pizzagate. But on YouTube, there are hundreds of videos, each with thousands of views, dedicated to a conspiracy theory alleging that a former presidential candidate ripped a child’s face off and wore it as a mask. And there’s markedly little that YouTube, or Google, or even Congress, seems able to do about it.”

Hundreds of videos dedicated to pushing the idea that Hillary Clinton ripped a child’s face off and wore it as a mask. That’s a thing on YouTube. And YouTube’s algorithms appear to love it:


How YouTube helped spread the weirdest conspiracy theory of them all

The conspiracy theory behind Frazzledrip is this, as “explained” on the fake news website, YourNewsWire.com in April: Hillary Clinton and former Clinton aide Huma Abedin were filmed ripping a child’s face off and wearing it as a mask before drinking the child’s blood in a Satanic ritual sacrifice, and that video was then found on the hard drive of Abedin’s former husband, Anthony Weiner, under the code name: “Frazzledrip.”


So it should be no surprise that when people with no YouTube search history do a search for “Hillary Clinton”, the top three results are all anti-Hillary conspiracy theories:


On Algotransparency’s website, which tries to reverse-engineer YouTube’s recommendation algorithm, I entered two terms to find out what the algorithm would recommend for a user with no search history based on those terms. First up was “Trump.” (You can try this yourself.)

The first recommended video was from MSNBC, detailing James Comey’s testimony before the House Judiciary and Oversight committees. The second recommended video was a QAnon-themed video — relating to the conspiracy theory alleging President Donald Trump and Robert Mueller are working together to uncover a vast pedophile network including many prominent Democrats (and actor Tom Hanks). (“D5” refers to December 5, which QAnon believers argued would be the day when thousands of their political enemies would be arrested.)

Next, I tried “Hillary Clinton.” The top three recommended videos based on YouTube’s algorithm are all conspiracy-theory driven, from a video from an anti-Semitic YouTube channel that argues Freemasons will escape from the United States on private yachts after America’s eventual collapse to a user alleging that Hillary Clinton has a seizure disorder (she does not) to one alleging that Hillary Clinton has had a number of people murdered (also untrue.)

I spend a lot of time consuming content about conspiracy theories — but these results weren’t tailored to me. These results were based on a user who had never watched any YouTube videos before.



And that’s why Democratic lawmakers were asking Google’s CEO Sundar Pichai about what, if anything, YouTube was planning on doing about the fact that its platform is aggressively pushing things like Frazzledrip. But it doesn’t sound like YouTube has any plans at all other than to explain that “there is more work to be done”:


But one lawmaker, Rep. Jamie Raskin (D-MD), raised an actually important and pressing issue: the way YouTube’s algorithms can be used to push conspiracy theories.

“The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events.” He was alluding to the Pizzagate conspiracy theory which led to an armed gunman showing up at a DC-area pizzeria in 2016 — a conspiracy theory spread, in part, on YouTube.

Raskin asked about another especially strange conspiracy theory that emerged on YouTube — “Frazzledrip,” which has deep ties to the QAnon and Pizzagate conspiracy theories. He asked Pichai, “Is your basic position that [Frazzledrip] is something you want to try to do something about, but basically there is just an avalanche of such material and there’s really nothing that can be done, and it should be buyer beware or consumer beware when you go on YouTube?” adding, “Are you taking the threats seriously?”

Raskin’s questions were getting at an important issue: YouTube, which Google purchased for $1.65 billion 12 years ago, has a conspiracy theory problem. It’s baked into the way the service works. And it appears that neither Congress nor YouTube itself is anywhere near solving it.



Here’s how Pichai answered Raskin’s question: “We are constantly undertaking efforts to deal with misinformation, but we have clearly stated policies, and we have made lots of progress in many of the areas over the past year. … This is a recent thing but I’m following up on it and making sure we are evaluating these against our policies. It’s an area we acknowledge there is more work to be done.”

While explaining that YouTube takes problematic videos on a case by case basis, he added, “It’s our responsibility, I think, to make sure YouTube is a platform for freedom of expression, but it needs to be responsible in our society.”

But it isn’t easy to balance a platform that claims to be for freedom of expression with societal responsibility. It’s not illegal to believe in conspiracy theories, or to think that the September 11th attacks were an inside job (they weren’t) or that the Sandy Hook shootings never happened (they did) or that Hillary Clinton is a child-eating pedophilic cannibal (this, it must be said, I suppose, is untrue).

YouTube could radically change its terms of service — in a way that would dramatically limit the freedom of expression Pichai and his colleagues are attempting to provide. Or it could invest much more heavily in moderation, or change its algorithm.


But as the following article notes, the idea that YouTube simply can’t realistically address the flood of far right conspiracy theory videos getting systematically pushed on users is bogus. How so? Because when you put terms like “Hillary Clinton” into Google’s video search, you don’t end up with a flood of far right conspiracy theory videos. And YouTube is owned by Google. In other words, this algorithmic performance is clearly a choice by YouTube. A choice rooted in a business model of feeding users more and more extreme content to keep them watching:

HmmDaily

YouTube Already Knows How to Stop Serving Toxic Videos

Tom Scocca
Published on Dec 12, 2018 11:49PM EST

Vox has an explainer today about how YouTube automatically steers users toward deranged and conspiratorial videos, even if the users have started out with perfectly ordinary interests. This has been explained before, but it keeps needing to be explained because it’s so incomprehensible in normal human terms: YouTube’s algorithms, which are built to keep people watching as many videos and video ads as possible, have apparently followed that instruction to the conclusion, as Zeynep Tufekci wrote in the New York Times, “that people are drawn to content that is more extreme than what they started with—or to incendiary content in general.”

The humans who run YouTube (and run its algorithms) aren’t exactly proud of the fact that their product showcases misogynist rants or pseudoscientific nonsense or apocalyptic conspiracy theories. But their position is that what happens inside their black box is extremely hard to correct or regulate, and on the scale at which YouTube operates, it’s impossible to apply human judgment to every case. They wish there was a way to serve up video recommendations without poisoning people’s minds till someone believes it’s necessary to invade a pizza parlor with an assault rifle, but that’s a real tough computational challenge.

What this line of defense leaves out is a very basic, obvious fact: YouTube already has access to an algorithm that can sort through videos without promoting unhinged fringe material. It’s called Google. YouTube is part of Google. When and if Google’s search algorithms start giving Google users fringe results, Google treats that as a failure and tries to fix the algorithms.

In the Vox piece, Jane Coaston writes about what happened when she searched “Trump” on a site that tracks YouTube’s video recommendations:

The first recommended video was from MSNBC, detailing James Comey’s testimony before the House Judiciary and Oversight committees. The second recommended video was a QAnon-themed video — relating to the conspiracy theory alleging President Donald Trump and Robert Mueller are working together to uncover a vast pedophile network including many prominent Democrats (and actor Tom Hanks). (“D5” refers to December 5, which QAnon believers argued would be the day when thousands of their political enemies would be arrested.)

Here is what came up when I tried a search for “Trump” on the “Videos” tab of Google.com:

[see image of Google search results showing a lack of conspiracy theory results]

Searching “Hillary Clinton” recommendations from YouTube led Coaston straight to conspiracy theories, including murder. Here’s “Hillary Clinton” on a Google video search:

[see image of Google video search results showing a lack of conspiracy theory results]



It’s true that Google and YouTube are different services, with different architecture. Google was built to index the Web and sort through existing material; YouTube hosts video content itself. That distinction, though, isn’t as big as it might seem—Google video search points toward video on the websites of various news organizations, such as the Washington Post or AP News, and YouTube has to point to YouTube, but the Washington Post and AP News are also YouTube content providers. Pretty much everyone is.

And so YouTube doesn’t have to pick out Pizzagaters or MRAs or neo-phrenologists. It has the power to send viewers in the opposite direction. The people who run YouTube made the choice to teach its algorithms to value trash—even if they thought they were teaching the system to value something more neutral, like viewing time. There was a time, in living memory, when the YouTube recommendation system was less aggressive and it acted like Google: stacking up more and more songs by the same band you were listening to, say, or the same subject you were watching a clip about, until you’d had all you wanted and were done.

There is no way to be done in the fever swamps. What distinguishes the people in charge of YouTube from the people in charge of Google search is that the goal of Google search is to settle on satisfying results. The goal of YouTube is to keep people unsettled and unhappy, so they keep watching and keep seeing more ads. A less poisonous index would encourage people to leave the site after they got what they’d come there for. The algorithm the company really doesn’t want to tinker with is the one that tells it to make the most money it possibly can.

———-

“YouTube Already Knows How to Stop Serving Toxic Videos” by Tom Scocca; HmmDaily; 12/12/2018

“The humans who run YouTube (and run its algorithms) aren’t exactly proud of the fact that their product showcases misogynist rants or pseudoscientific nonsense or apocalyptic conspiracy theories. But their position is that what happens inside their black box is extremely hard to correct or regulate, and on the scale at which YouTube operates, it’s impossible to apply human judgment to every case. They wish there was a way to serve up video recommendations without poisoning people’s minds till someone believes it’s necessary to invade a pizza parlor with an assault rifle, but that’s a real tough computational challenge.”

There’s just nothing that can be done. The algorithms are too complex. That’s basically YouTube’s excuse. But as the article points out, all YouTube would have to do is adopt an algorithm closer to Google’s video search. That’s pretty much it. And that’s how YouTube used to operate. But when the goal of the service is to maximize eyeballs, an algorithm that effectively radicalizes the audience is what that business model demands:


What this line of defense leaves out is a very basic, obvious fact: YouTube already has access to an algorithm that can sort through videos without promoting unhinged fringe material. It’s called Google. YouTube is part of Google. When and if Google’s search algorithms start giving Google users fringe results, Google treats that as a failure and tries to fix the algorithms.

In the Vox piece, Jane Coaston writes about what happened when she searched “Trump” on a site that tracks YouTube’s video recommendations:

The first recommended video was from MSNBC, detailing James Comey’s testimony before the House Judiciary and Oversight committees. The second recommended video was a QAnon-themed video — relating to the conspiracy theory alleging President Donald Trump and Robert Mueller are working together to uncover a vast pedophile network including many prominent Democrats (and actor Tom Hanks). (“D5” refers to December 5, which QAnon believers argued would be the day when thousands of their political enemies would be arrested.)

Here is what came up when I tried a search for “Trump” on the “Videos” tab of Google.com:

[see image of Google search results showing a lack of conspiracy theory results]

Searching “Hillary Clinton” recommendations from YouTube led Coaston straight to conspiracy theories, including murder. Here’s “Hillary Clinton” on a Google video search:

[see image of Google video search results showing a lack of conspiracy theory results]



And so YouTube doesn’t have to pick out Pizzagaters or MRAs or neo-phrenologists. It has the power to send viewers in the opposite direction. The people who run YouTube made the choice to teach its algorithms to value trash—even if they thought they were teaching the system to value something more neutral, like viewing time. There was a time, in living memory, when the YouTube recommendation system was less aggressive and it acted like Google: stacking up more and more songs by the same band you were listening to, say, or the same subject you were watching a clip about, until you’d had all you wanted and were done.


So as we can see, YouTube could probably address this fairly easily. But doing so might reduce eyeballs and cut into the company’s profits somewhat. It’s all a reminder that the assumption that profit-maximization is good for society as a whole is one of those social meta-algorithms that really needs to be addressed too. Especially in an era when serving up disinformation is a proven method for maximizing profits. Otherwise, get ready for our Frazzledrip-on-steroids future.
 
But as Snopes has detailed, multiple conspiracy theories of the Trump era, including QAnon and Pizzagate, overlap, and all of them hold that Hillary Clinton is a secret child pedophile and murderer.

God this sentence is so poorly written. "Is a secret child pedophile and murderer" lol.

And also it isn't just the "Trump era" where all this shit overlapped. For years retards have contended that all mass shootings and terror attacks were just the US government false flagging, which seems like a bretty big "overlap" to me.

Imagine fucking up on writing an article about the low hanging fruit that is the pizzagate autism.
 
Ok sure, I mean, the pizzagate people are all crazy, but nobody is questioning that she wears the skin of children, right? I thought that was established fact. Wasn't that one of her campaign promises? Babyleather pantsuits for everyone?
Have you noticed, though, that as more time passes, the more woke people end up sunk because of depravity? *twilight zone theme song*
 
Remember that one comment from Stephen Colbert where he said that people who were researching PizzaGate at the time were fucking losers?

Doesn't it seem weird that he called the whole thing a joke while being friends with one of the guys whose email got hacked by NOT Russia, and despite what WikiLeaks had said in its tweets about the FAKE emails? The only way for these so-called food-code-name emails to be faked is if they not only waited years to make shit up but also pulled a fast one and claimed it's just a prank bro.

And he has shown on his Twitter that the emails that got leaked are real emails, due to the way the code is made for them. If, for example, the code came from Google? That means it's not fake.

It's kinda funny that the minute either Alex Jones or random people started talking about PizzaGate? EVERYONE IN THE MAINSTREAM MEDIA! Including Fox News, all said the emails were fake news. They never showed why they were fake, nor did they show the pictures or clues about why that is. It was like they were worried that some of that so-called FAKE NEWS was gonna turn out to be real for some reason?

Almost like their plan was gonna be destroyed the minute they were honest about the emails, left and right. This is the same dinosaur that claimed devil worship was on the rise in the 80s and 90s.

It just seems weird that Mainstream Media was afraid something was gonna happen if those so called fake emails were true.

They could be full of shit like everyone in mainstream media has said. But what if this so-called child trafficking in any government, not just the US, were true?

You could even say this shit is BEYOND our little minds and those of many of these insane conspiracy nutjobs.

But then again? Kiwi Farms was taken down during or before the PizzaGate shit kicked into high gear last year in 2016. So who am I to judge?
 