Call of Duty Thread - Potential return to form? Or nothing but cope on the horizon? You decide!

Source | Archive

Caltech and Activision Publishing Team Up to Combat Bad Behavior Online

Written by Emily Velasco, December 15, 2022

Whether it is trolling, racism, sexism, doxing, or just general harassment, the internet has a bad behavior problem. Researchers from Caltech and Activision Publishing, a video game publisher, are working together to bring their combined expertise to address this behavior in video games.

Because this kind of toxic behavior makes the internet an unpleasant place to be, there have been many attempts over the years to make sure people behave themselves online. In the earlier days of the internet, websites often relied on moderators—volunteers or staff—who were trained to keep discussions and content civil and appropriate. But as the internet continued to grow and harmful behaviors became more extreme, it became apparent that moderators needed better tools at their disposal.

Increasingly, the online world is moving toward automated moderation tools that can identify abusive words and behavior without the need for human intervention. Now, two researchers from Caltech, one an expert in artificial intelligence (AI) and the other a political scientist, are teaming up with Activision on a two-year research project that aims to create an AI that can detect abusive online behavior and help the company's support and moderation teams to combat it.

The sponsored research agreement involves Anima Anandkumar, the Bren Professor of Computing and Mathematical Sciences, who has trained AI to fly drones and study the coronavirus; Michael Alvarez, professor of political and computational social science, who has used machine learning tools to study political trends in social media; and Activision's data engineers, who will provide insight into player engagement and game-driven data.

Alvarez and Anandkumar have already worked together on training AI to detect trolling in social media. Their project with the team that works on the Call of Duty video games will allow them to develop similar technology for potential use in gaming.

"Over the past few years, our collaboration with Anima Anandkumar's group has been very productive," Alvarez says. "We have learned a great deal about how to use large data and deep learning to identify toxic conversation and behavior. This new direction, with our colleagues at Activision, gives us an opportunity to apply what we have learned to study toxic behavior in a new and important area—gaming."

For Anandkumar, the important questions this research will answer are: "How do we enable AI that is transparent, beneficial to society, and free of biases?" and "How do we ensure a safe gaming environment for everyone?"

She adds that working with Activision gives the researchers not only access to data about how people interact in online games, but also to their specialized knowledge.

"We want to know how players interact. What kind of language do they use? What kinds of biases do they have? What should we be looking for? That requires domain expertise," she says.

Michael Vance, Activision's chief technology officer, says the firm is excited to work with Caltech.

"Our teams continue to make great progress in combating disruptive behavior, and we also want to look much further down the road," Vance says. "This collaboration will allow us to build upon our existing work and explore the frontier of research in this area."

... what? :story:
Thank God I don’t play multiplayer games much anymore, AI moderation sounds like a nightmare. Shit talking in video games should be a human right goddammit!
 
Thank God I don’t play multiplayer games much anymore, AI moderation sounds like a nightmare. Shit talking in video games should be a human right goddammit!
Yeah, there's already a huge percentage of MWII players who say they were falsely accused and banned for cheating by the existing automated anti hacking tools. Imagine that but for something as subjective as speech detection.
 
I sure hope that no no-good, rotten, scoundrel, Internet 4chan troll uses this as an excuse to script a crawler to find the public email addresses of every Activision and Caltech employee with a LinkedIn profile. It sure would be tragic (basically The Holocaust 2) if someone then used a burner account to email the word "faggot" to every last one of them. Saying bad words on the Internet onToLoGicALly eViL.
 
I sure hope that no no-good, rotten, scoundrel, Internet 4chan troll uses this as an excuse to script a crawler to find the public email addresses of every Activision and Caltech employee with a LinkedIn profile. It sure would be tragic (basically The Holocaust 2) if someone then used a burner account to email the word "faggot" to every last one of them. Saying bad words on the Internet onToLoGicALly eViL.
Better yet, do the "I'm just pretending to be retarded" angle and watch as people get banned for calling out the dumbfuckery
 
Yeah, there's already a huge percentage of MWII players who say they were falsely accused and banned for cheating by the existing automated anti hacking tools. Imagine that but for something as subjective as speech detection.
It’s sad comparing how the world used to be to how shit the world has become. The 2009 Modern Warfare 2 multiplayer chat was so infamously toxic that it became the baseline for how future video games judged toxicity. Honestly at certain points the banter was even better than the game, so much so that my K/D ratio was 0.01 or lower because I would just kill players on my team to get a reaction. Here is a video with 4 million views showcasing the nostalgia people feel when thinking of MW2 lobbies.
Fast forward to 2022 and we have big brother telling us we can’t have banter in a video game because it may hurt a complete stranger’s feelings.
 
I dunno. Haven't they been talking about policing call of duty chat for the better part of a decade now? And I still hear people calling each other nigger. Not as much as in 2010 mind you, but probably about as much as I hear on Kiwifarms.
 
I didn't even know it got added. I thought it was going to be like a week long/month long event similar to past events.

Same gay shit as the End of Verdansk.
Don't you love these Monday afternoon streamer events?

Although I will always insist that The Destruction of Verdansk was done for an hour in the middle of the week because they knew they fucked it up so bad and didn't want a lot of people playing it. I'd assume something similar is happening here just based off the half hour or so of footage I saw while I was on the shitter yesterday. People were going to lengths to get those damn keys only to get killed in seconds.
 
Don't you love these Monday afternoon streamer events?

Although I will always insist that The Destruction of Verdansk was done for an hour in the middle of the week because they knew they fucked it up so bad and didn't want a lot of people playing it. I'd assume something similar is happening here just based off the half hour or so of footage I saw while I was on the shitter yesterday. People were going to lengths to get those damn keys only to get killed in seconds.
From what little I saw of the event, it was a decent idea massacred by the servers absolutely shitting themselves.
 
From what little I saw of the event, it was a decent idea massacred by the servers absolutely shitting themselves.
The Destruction of Verdansk? It was executed in about the shittiest way possible, which was only made worse since it had like 6 months or more of hype surrounding it.

Without going into details, I'm convinced that it was literally thrown together simply because the community hyped themselves into a frenzy the second the nuke was introduced into Warzone's story. The whole thing, especially when the Zombies became involved, just reeked of them rushing it out to get it over with.
 
It’s sad comparing how the world used to be to how shit the world has become. The 2009 Modern Warfare 2 multiplayer chat was so infamously toxic that it became the baseline for how future video games judged toxicity. Honestly at certain points the banter was even better than the game, so much so that my K/D ratio was 0.01 or lower because I would just kill players on my team to get a reaction. Here is a video with 4 million views showcasing the nostalgia people feel when thinking of MW2 lobbies.
Fast forward to 2022 and we have big brother telling us we can’t have banter in a video game because it may hurt a complete stranger’s feelings.
I'm certain that MWII sold as well as it did because of nostalgia. It's riding on the coattails of MW2, you know, the CoD that cemented itself in gaming and pop culture. But it's just a reskinned MW2019, faults and all.

Honestly, MWII comes across as a generic modern military shooter with no personality. And worse monetization.
 
Shipment is a horrible, retarded map that should be retired permanently
Honestly I hear more actual black guys in COD calling each other niggers than white people doing it.
 