That article is from May 2022 and it's just some sociopath butcher saying he might someday try it.
The first womb transplant was in India.
I know I don't. Letting the audience have a say in things has never improved a show. Keep the clout chasers and parasocial weirdos in the text chat where they belong.
Yeah, I didn't think most people missed call-ins. I only had them because Ralph did, I think.
I heard a lot of these weirdos playing grab ass in the Telegram VC and I could absolutely do without them on my biweekly comfy show.
Let me tell you right now, you are going to get the most Melvin-sounding mouth breathers calling in and talking about either Sonic or eugenics.
I doubt the trouble of vetting the calls will actually be worth the value of the call-ins.
I have a good speaking voice but I definitely have a bad headset mic and sometimes don't enunciate.
Nothing destroys interest in a livestream quite like call-ins from utter nobodies. Their mics suck, they'll have speech impediments or obnoxious accents, and they have no sense of when to interject and when to give others the floor. People can contribute in other ways besides having their voices represented on the streams.
It's different when it's a stream appearance by Destiny, Nick Rekieta, Ethan Ralph, or Turkey Tom. They enhance the show. No amount of vetting could make a call-in by KF poster #418 an enhancement to the stream.
Viewer call-ins are a means of grifting the parasocial freaks in the audience by making them feel more involved.
It would just be mouthbreathing jimcels calling in to ask about Josh's favorite anime while giggling like schoolgirls, and schizos like Elaine calling in to scream "YOUR AUDIENCE DESERVES THE TRUTH ABOUT YOU BEING A PEEEDOUGHFILE!!! REMEMBER THE 6 GORILLION ON BLOCKLAND!!" And maybe some awkward parasocials like the sit-on-your-face-Josh guy making unwanted advances. Call-ins suck unless it's a lolcow or an interesting person being interviewed.
I don't think he said anything about believing it, as it's obviously a fake story.
@Null bro this is like the 3rd time you've fallen for a fake race-baiting Reddit post. Reddit loves these tailor-made stories about white people getting beaten up and/or cucked because muh racism, and none of them bother to show any skepticism because it's exactly what they want to hear. It hits every note that they want, a story perfectly suited for the cesspit it's posted in.
TBF, who here hasn't seen/heard at least one more extreme story that turned out to be true? I know I have.
I don't think he said anything about believing it, as it's obviously a fake story.
I think as far as vetting goes, if Josh has mods (or just specific users) that he trusts who participate in specific threads, he could maybe figure out some way to have those mods/users pre-vet some of the active/productive participants from that specific thread and get them a call-in link. I'd still want this to be used sparingly, like during people streams or if something so huge happens that it completely dominates a given MatI stream. Definitely not something I'd want to see regularly or announced ahead of time at all; if that happens you'll get people begging or trying to 'earn' a spot by means of quantity-over-quality posts and shitting up threads in the process.
I doubt the trouble of vetting the calls will actually be worth the value of the call-ins. That's a lot of extra work / money spent on something that would at best be a nifty addition to the roster of content.
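For what it's worth, the pre-vet-and-link idea is mechanically trivial. Here is a minimal sketch of how mod-issued, expiring call-in links could work; the secret, function names, and token format are all hypothetical illustrations, not anything the forum or stream software actually uses:

```python
import hashlib
import hmac
import time

# Hypothetical shared secret between whoever vets users and the stream software.
CALL_IN_SECRET = b"replace-with-a-long-random-secret"

def issue_call_in_token(username: str, stream_id: str, ttl_seconds: int = 3600) -> str:
    """A trusted mod runs this for a pre-vetted user; the link dies after ttl_seconds."""
    expires = int(time.time()) + ttl_seconds
    payload = f"{username}|{stream_id}|{expires}"
    sig = hmac.new(CALL_IN_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def verify_call_in_token(token: str, stream_id: str) -> bool:
    """The stream side admits a caller only if the signature checks out,
    the token is for this stream, and it has not expired."""
    try:
        username, token_stream, expires, sig = token.split("|")
    except ValueError:
        return False
    payload = f"{username}|{token_stream}|{expires}"
    expected = hmac.new(CALL_IN_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return (hmac.compare_digest(sig, expected)
            and token_stream == stream_id
            and int(expires) > time.time())
```

The point of the expiry is that a leaked link is only good for one stream, which would cut down on the begging-for-spots problem.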
Yes, but I automatically dismiss predditors as pathological liars when it comes to raycism and other woke shit.
TBF, who here hasn't seen/heard at least one more extreme story that turned out to be true? I know I have.
It'd take 10 average Kiwis to come up with the testosterone needed to be unafraid of programmers in dressers.
About these call-ins: consider the mind of the man who knows Null has insane tranny coders after him and would still willingly put his voice out there.
Don't act like you wouldn't be scared shitless if you went to get some socks out of your drawer in the morning only to have a troon in thigh highs pop out and accost you.
It'd take 10 average Kiwis to come up with the testosterone needed to be unafraid of programmers in dressers.
To top it off, the Space Lazor is ready from the UK.
Asylees and Refugees With Relevant Information Should Contact the Justice Department
The Justice Department filed a lawsuit today against Space Exploration Technologies Corporation (SpaceX) for discriminating against asylees and refugees in hiring. The lawsuit alleges that, from at least September 2018 to May 2022, SpaceX routinely discouraged asylees and refugees from applying and refused to hire or consider them, because of their citizenship status, in violation of the Immigration and Nationality Act (INA).
In job postings and public statements over several years, SpaceX wrongly claimed that under federal regulations known as “export control laws,” SpaceX could hire only U.S. citizens and lawful permanent residents, sometimes referred to as “green card holders.” Export control laws impose no such hiring restrictions. Moreover, asylees’ and refugees’ permission to live and work in the United States does not expire, and they stand on equal footing with U.S. citizens and lawful permanent residents under export control laws. Under these laws, companies like SpaceX can hire asylees and refugees for the same positions they would hire U.S. citizens and lawful permanent residents. And once hired, asylees and refugees can access export-controlled information and materials without additional government approval, just like U.S. citizens and lawful permanent residents.
“Our investigation found that SpaceX failed to fairly consider or hire asylees and refugees because of their citizenship status and imposed what amounted to a ban on their hire regardless of their qualification, in violation of federal law,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division. “Our investigation also found that SpaceX recruiters and high-level officials took actions that actively discouraged asylees and refugees from seeking work opportunities at the company. Asylees and refugees have overcome many obstacles in their lives, and unlawful employment discrimination based on their citizenship status should not be one of them. Through this lawsuit we will hold SpaceX accountable for its illegal employment practices and seek relief that allows asylees and refugees to fairly compete for job opportunities and contribute their talents to SpaceX’s workforce.”
The department’s lawsuit alleges that SpaceX discriminated against asylees and refugees based on citizenship status at multiple stages of the hiring process. For example:
- SpaceX discouraged asylees and refugees from applying for open positions, through public announcements, job applications and other online recruiting communications that excluded asylees and refugees.
- SpaceX failed to fairly consider applications submitted by asylees and refugees.
- SpaceX refused to hire qualified asylee and refugee applicants and repeatedly rejected asylee and refugee applicants because of their citizenship status.
- SpaceX hired only U.S. citizens and lawful permanent residents from September 2018 to September 2020.
SpaceX recruits and hires for a variety of positions, including welders, cooks, crane operators, baristas and dishwashers, as well as information technology specialists, software engineers, business analysts, rocket engineers and marketing professionals. The jobs at issue in the lawsuit are not limited to those that require advanced degrees.
Asylees and refugees are migrants to the United States who have fled persecution. To obtain their status, they undergo thorough vetting by the United States government. Under the INA, employers cannot discriminate against them in hiring, unless a law, regulation, executive order or government contract requires the employer to do so. In this instance, no law, regulation, executive order or government contract required or permitted SpaceX to engage in the widespread discrimination against asylees or refugees that the department’s investigation found, as explained in the complaint.
Because SpaceX works with certain goods, software, technology and technical data (referred to here as export-controlled items), SpaceX must comply with export control laws and regulations, including the International Traffic in Arms Regulations and the Export Administration Regulations. Under these regulations, asylees, refugees, lawful permanent residents, U.S. citizens and U.S. nationals working at U.S. companies can access export-controlled items without authorization from the U.S. government. Therefore, these laws do not require SpaceX to treat asylees and refugees differently than U.S. citizens or green card holders. Find more information here on how employers can avoid discrimination when complying with export control requirements.
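Stated as a rule, the press release's point is compact enough to write down. Below is a simplified sketch of the access rule as described above; it is illustrative only, not legal advice, and the status labels are made up for the example:

```python
# Statuses that, per the press release, can access export-controlled items
# at a U.S. company without additional government authorization.
US_PERSON_STATUSES = {
    "u.s. citizen",
    "u.s. national",
    "lawful permanent resident",
    "asylee",
    "refugee",
}

def can_access_export_controlled_items(citizenship_status: str) -> bool:
    """Simplified: asylees and refugees stand on equal footing with
    citizens and green-card holders under the export control laws."""
    return citizenship_status.lower() in US_PERSON_STATUSES

# The hiring rule SpaceX allegedly applied was the narrower set below;
# the lawsuit's point is that the law imposes no such restriction.
SPACEX_CLAIMED_STATUSES = {"u.s. citizen", "lawful permanent resident"}
```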
The United States seeks fair consideration and back pay for asylees and refugees who were deterred or denied employment at SpaceX due to the alleged discrimination. The United States also seeks civil penalties in an amount to be determined by the court, as well as policy changes to ensure SpaceX complies with the INA's nondiscrimination mandate going forward.
Please contact the department’s Civil Rights Division’s Immigrant and Employee Rights Section (IER) at IERSpaceXcase@usdoj.gov or 1-888-473-3845 if you are or were an asylee or refugee who experienced any one of the following at any point in time:
(1) You applied to a job at SpaceX and were rejected.
(2) You were discouraged from applying to SpaceX because you were not a U.S. citizen or lawful permanent resident.
(3) A recruiter or other SpaceX employee told you that SpaceX could only hire U.S. citizens and/or lawful permanent residents.
IER is responsible for enforcing the anti-discrimination provision of the INA. Among other things, the statute generally prohibits discrimination based on citizenship status and national origin in hiring, firing, or recruitment or referral for a fee; unfair documentary practices; retaliation; and intimidation.
Learn more about IER’s work and how to get assistance through this brief video. Applicants or employees who believe they were discriminated against based on their citizenship, immigration status or national origin in hiring, firing, recruitment or during the employment eligibility verification process (Form I-9 and E-Verify); or subjected to retaliation, may file a charge. The public can also call IER’s worker hotline at 1-800-255-7688 (1-800-237-2515, TTY for hearing impaired); call IER’s employer hotline at 1-800-255-8155 (1-800-237-2515, TTY for hearing impaired); e-mail IER@usdoj.gov; sign up for a free webinar; or visit IER’s English and Spanish websites. Subscribe for e-mail updates from IER.
Complaint
25 April 2023 - The European Commission adopted the first designation decisions under the Digital Services Act (DSA).
The European Commission designated 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) that reach at least 45 million monthly active users.
Very Large Online Platforms:
- Alibaba AliExpress
- Amazon Store
- Apple AppStore
- Booking.com
- Facebook
- Google Play
- Google Maps
- Google Shopping
- Instagram
- LinkedIn
- Pinterest
- Snapchat
- TikTok
- Twitter
- Wikipedia
- YouTube
- Zalando
Very Large Online Search Engines:
- Bing
- Google Search
Following their designation, the companies will now have to comply, within four months, with the full set of new obligations under the DSA. These aim at empowering and protecting users online, including minors, by requiring the designated services to assess and mitigate their systemic risks and to provide robust content moderation tools.
Last year the European Union enacted a new set of regulations known as the Digital Services Act (DSA), designed to harmonize content regulations across the EU and create specific processes for online content moderation. The DSA applies to many different online services - from marketplaces and app stores to online video sharing platforms and search engines.
As a result, we have adapted many of our long-standing trust and safety processes and changed the operation of some of our services to comply with the DSA’s specific requirements. We look forward to continued engagement with the European Commission and other stakeholders, including technical and policy experts, as we progress this important work.
Planning ahead of today’s regulatory landscape
First and foremost, safety is good for users and good for our business. That’s why over many years across Google, we’ve made significant investments in people, processes, policies and technologies that address the goals of the DSA. A few examples:
Our Priority Flagger program (originally established in 2012 as YouTube’s Trusted Flagger program) addresses the objectives of the DSA’s Trusted Flagger provision, prioritizing review of content flagged to us by experts.
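Mechanically, prioritized review of expert flags is just a review queue ordered by flag source. The sketch below illustrates the concept; the source labels, weights, and class names are hypothetical, not Google's actual pipeline:

```python
import heapq
import itertools
from dataclasses import dataclass, field

# Hypothetical priorities: lower number = reviewed sooner.
SOURCE_PRIORITY = {"priority_flagger": 0, "automated_system": 1, "regular_user": 2}

@dataclass(order=True)
class FlaggedItem:
    priority: int
    seq: int  # tie-breaker: FIFO within the same priority level
    content_id: str = field(compare=False)
    source: str = field(compare=False)

class ReviewQueue:
    """Flags from trusted experts jump ahead of ordinary user reports."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()

    def flag(self, content_id: str, source: str) -> None:
        item = FlaggedItem(SOURCE_PRIORITY[source], next(self._counter),
                           content_id, source)
        heapq.heappush(self._heap, item)

    def next_for_review(self) -> FlaggedItem:
        return heapq.heappop(self._heap)

# Usage: a regular-user flag queued first still gets reviewed after an
# expert flag that arrived later.
q = ReviewQueue()
q.flag("video-123", "regular_user")
q.flag("video-456", "priority_flagger")
assert q.next_for_review().content_id == "video-456"
```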
We give YouTube creators the option to appeal video removals or restrictions where they think we’ve made a mistake. The YouTube team reviews all creator appeals and decides whether to uphold or reverse the original decision. The DSA will require all online platforms to take similar measures and establish internal complaint-handling systems.
In the summer of 2021, after talking with parents, educators, child safety and privacy experts, we decided to block personalized advertising to anyone under age 18. The DSA will require other providers to take similar approaches.
Since launching YouTube’s Community Guidelines Enforcement Report in 2018 to increase transparency and accountability around our responsibility efforts, we’ve continued to publicly share a range of additional metrics, such as the Violative View Rate, to give more context about our work to protect users from harmful content.
And there are many other examples, over many years, of how we have continually introduced the types of trust and safety processes envisioned by the DSA.
Alongside these efforts, the Google Safety Engineering Center in Dublin, focused on content responsibility, has consulted with more than a thousand experts at more than a hundred events since its founding. The center helps regulators, policymakers and civil society get a hands-on understanding of our approach to content moderation and offers us valuable opportunities to learn from and collaborate with these experts.
Tailoring transparency and content moderation to the EU’s requirements
Complying at scale is not new to us. We have invested years of effort in complying with the European Union’s General Data Protection Regulation, and have built processes and systems that have enabled us to handle requests for more than five million URLs under Europe’s Right to be Forgotten.
Now, in line with the DSA, we have made significant efforts to adapt our programs to meet the Act’s specific requirements. These include:
Expanding ads transparency: We will be expanding the Ads Transparency Center, a global searchable repository of advertisers across all our platforms, to meet specific DSA provisions and providing additional information on targeting for ads served in the European Union. These steps build on our many years of work to expand the transparency of online ads.
Expanding data access for researchers: Building on our prior efforts to help advance public understanding of our services, we will increase data access for researchers looking to understand more about how Google Search, YouTube, Google Maps, Google Play and Shopping work in practice and to conduct research on systemic content risks in the EU.
We are also making changes to provide new kinds of visibility into our content moderation decisions and give users different ways to contact us. And we are updating our reporting and appeals processes to provide specified types of information and context about our decisions:
Shedding more light on our policies: We are rolling out a new Transparency Center where people can easily access information about our policies on a product-by-product basis, find our reporting and appeals tools, discover our Transparency Reports and learn more about our policy development process.
Expanding transparency reporting: More than a decade ago, we launched the industry’s first Transparency Report to inform discussions about the free flow of information and show citizens how government policies can impact access to information. YouTube also publishes a quarterly Transparency Report describing its Community Guidelines enforcement. In the months ahead, we will be expanding the scope of our transparency reports, adding information about how we handle content moderation across more of our services, including Google Search, Google Play, Google Maps and Shopping.
Analyzing risks - and helping others do so too: Whether looking at risks of illegal content dissemination, or risks to fundamental rights, public health or civic discourse, we are committed to assessing risks related to our largest online platforms and our search engine in line with DSA requirements. We will report the results of our assessments to regulators in the EU and to independent auditors, and will publish a public summary at a later date.