Google Tests A.I. Tool That Is Able to Write News Articles - Looks like journalism will be automated before trucking

The product, pitched as a helpmate for journalists, has been demonstrated for executives at The New York Times, The Washington Post and News Corp, which owns The Wall Street Journal.


Google is testing a product, known internally as Genesis, that can take in information and produce news stories. Alastair Grant/Associated Press

By Benjamin Mullin and Nico Grant
Published July 19, 2023 Updated July 20, 2023, 12:47 a.m. ET

Google is testing a product that uses artificial intelligence technology to produce news stories, pitching it to news organizations including The New York Times, The Washington Post and The Wall Street Journal’s owner, News Corp, according to three people familiar with the matter.

The tool, known internally by the working title Genesis, can take in information — details of current events, for example — and generate news content, the people said, speaking on the condition of anonymity to discuss the product.

One of the three people familiar with the product said that Google believed it could serve as a kind of personal assistant for journalists, automating some tasks to free up time for others, and that the company saw it as responsible technology that could help steer the publishing industry away from the pitfalls of generative A.I.

Some executives who saw Google’s pitch described it as unsettling, asking not to be identified discussing a confidential matter. Two people said it seemed to take for granted the effort that went into producing accurate and artful news stories.

Jenn Crider, a Google spokeswoman, said in a statement that “in partnership with news publishers, especially smaller publishers, we’re in the earliest stages of exploring ideas to potentially provide A.I.-enabled tools to help their journalists with their work.”

“Quite simply, these tools are not intended to, and cannot, replace the essential role journalists have in reporting, creating and fact-checking their articles,” she added. Instead, they could provide options for headlines and other writing styles.

A News Corp spokesman said in a statement, “We have an excellent relationship with Google, and we appreciate Sundar Pichai’s long-term commitment to journalism.”

The Times and The Post declined to comment.

Jeff Jarvis, a journalism professor and media commentator, said Google’s new tool, as described, had potential upsides and downsides.

“If this technology can deliver factual information reliably, journalists should use the tool,” said Mr. Jarvis, director of the Tow-Knight Center for Entrepreneurial Journalism at the Craig Newmark Graduate School of Journalism at the City University of New York.

“If, on the other hand, it is misused by journalists and news organizations on topics that require nuance and cultural understanding,” he continued, “then it could damage the credibility not only of the tool, but of the news organizations that use it.”

News organizations around the world are grappling with whether to use artificial intelligence tools in their newsrooms. Many, including The Times, NPR and Insider, have notified employees that they intend to explore potential uses of A.I. to see how it might be responsibly applied to the high-stakes realm of news, where seconds count and accuracy is paramount.

But Google’s new tool is sure to spur anxiety, too, among journalists who have been writing their own articles for decades. Some news organizations, including The Associated Press, have long used A.I. to generate stories about matters including corporate earnings reports, but they remain a small fraction of the service’s articles compared with those generated by journalists.

Artificial intelligence could change that, enabling users to generate articles on a wider scale that, if not edited and checked carefully, could spread misinformation and affect how traditionally written stories are perceived.

While Google has moved at a breakneck pace to develop and deploy generative A.I., the technology has also presented some challenges to the advertising juggernaut. While Google has traditionally played the role of curating information and sending users to publishers’ websites to read more, tools like its chatbot, Bard, present factual assertions that are sometimes incorrect and do not send traffic to more authoritative sources, such as news publishers.

The technology has been introduced as governments around the world have called on Google to give news outlets a larger slice of its advertising revenue. After the Australian government tried to force Google to negotiate with publishers over payments in 2021, the company forged more partnerships with news organizations in various countries, under its News Showcase program.

Publishers and other content creators have already criticized Google and other major A.I. companies for using decades of their articles and posts to help train these A.I. systems, without compensating the publishers. News organizations including NBC News and The Times have taken a position against A.I.’s sucking up their data without permission.

Source (Archive)
 
total journo death. i hear, from some good reliable sources, that anyone afraid of losing their job to automation can just learn to code.
here you go journos, start brushing up:
Why the fuck would you curse them to being JavaScript devs JFC we have too many of those as it is
 
There really is no valid argument for AI journalism. At all. It’s a terrible idea and the non-stop “it’ll free people up” excuse is already overused.

Most people don’t trust major news outlets now. If AI is given any influence the industry will be destroyed. Which I suspect is the goal in some cases.
 
That was actually a language on my short list along with C++ I wanted to try...
I'd also say skip C++, it is trash; just stick with good ol' C.
I'd also highly recommend Lua. Sure, trannies use it for gamedev, but it isn't infected anywhere near to the extent that Rust is, and it is still a great language and easy to learn. Professionally the options are a bit limited, but for personal projects it is what I always tend to use.
 
In college, while discussing the then-popular Occupy protests, I noted that there weren't many scientists, engineers, or tech workers at those protests. A journalism student said, "they're all going to be replaced by computers!" I often think about the irony of that statement now, and how obvious it is that journalism is another one of those jobs for people who are not really all that useful, at least when it's done by people who went to college for it. (I don't think James O'Keefe or Tim Pool, who make a decent living/grift out of it, even went to college.)
 
Is... is no one else going to say it?
Fine, fine, for fuck's sake.

Will we even know the difference?

“If this technology can deliver factual information reliably, journalists should use the tool,” said Mr. Jarvis, director of the Tow-Knight Center for Entrepreneurial Journalism at the Craig Newmark Graduate School of Journalism at the City University of New York.

“If, on the other hand, it is misused by journalists and news organizations on topics that require nuance and cultural understanding,” he continued, “then it could damage the credibility not only of the tool, but of the news organizations that use it.”
Information is fast becoming a trust-based system, and somehow you faggots have managed to miss that you've turned your entire profession's reputation to shambles in the last 8 years. Most of the people who still trust "the news" wouldn't care if it was written by a journalist, a mandrill, or a fucking toaster; those who don't would check their windows if you claim the sky is blue.

While Google has moved at a breakneck pace to develop and deploy generative A.I., the technology has also presented some challenges to the advertising juggernaut. While Google has traditionally played the role of curating information and sending users to publishers’ websites to read more, tools like its chatbot, Bard, present factual assertions that are sometimes incorrect and do not send traffic to more authoritative sources, such as news publishers.
Of all the words of tongue or pen,
The saddest are, "Alex Jones was right again".

Maybe if your "authoritative sources" weren't getting pantsed 24/7 by random asshole blogs that bothered to follow the money (and really, really did not like what they found), you wouldn't be in this situation.
 
Didn't know there was a difference but alright, thanks man!
They're similar enough that I was taught Java in school and all but immediately picked up JavaScript and C#. You have to learn a few different instructions/terms/tricks, but for the most part it's pretty transferable, as long as you actually understand the concepts rather than just learning to repeat rote instructions.
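To illustrate that transferability, here's a toy Java snippet (the class and method names are invented for this example): the same C-family structure carries over almost line-for-line to C# (swap `String` for `string`) and to JavaScript (drop the type annotations).

```java
// Count how many words in an array are longer than a threshold.
// The loop body would look nearly identical in C# or JavaScript.
public class WordCounter {
    public static int countLongWords(String[] words, int minLength) {
        int count = 0;
        for (String word : words) {       // enhanced for loop, like C#'s foreach
            if (word.length() > minLength) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        String[] sample = {"ai", "journalism", "google", "tool"};
        System.out.println(countLongWords(sample, 4)); // prints 2
    }
}
```

The point is the shared skeleton (braces, typed declarations, `for`/`if` syntax), not any one language's standard library; the library calls are where the real relearning happens.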
 
There really is no valid argument for AI journalism. At all. It’s a terrible idea and the non-stop “it’ll free people up” excuse is already overused.
There absolutely is. "AI journalism" is just a good search engine.
Will we even know the difference?
We won't, and this is really the best part if no one assassinates Elon Musk. There's no need for a journalist to "write" articles if an AI could write one customized for you from various content feeds.
 
Google is testing a product that uses artificial intelligence technology to produce news stories, pitching it to news organizations including The New York Times, The Washington Post and The Wall Street Journal’s owner, News Corp, according to three people familiar with the matter.
That explains the drop in quality of NYT’s articles.
“Quite simply, these tools are not intended to, and cannot, replace the essential role journalists have in reporting, creating and fact-checking their articles,” she added. Instead, they could provide options for headlines and other writing styles.
Problem is, journalists don't fact-check their articles, and the US legal system effectively punishes them if they do. In 1964, in New York Times v. Sullivan, the Supreme Court created a constitutional shield protecting journalists from defamation liability, and in 1968, in St. Amant v. Thompson, it clarified that this shield gets stronger the worse a job the journalist did in their reporting. The Court itself admitted that the rule "puts a premium on ignorance, [and] encourages the irresponsible publisher not to inquire."

Journalism never recovered after this, for the Supreme Court made it the supreme law of the land that if a journalist wants any protection for their reporting, that reporting must not be worth enough to wipe your ass with.
 
They're similar enough that I was taught Java in school and all but immediately picked up JavaScript and C#. You have to learn a few different instructions/terms/tricks, but for the most part it's pretty transferable, as long as you actually understand the concepts rather than just learning to repeat rote instructions.
Just about every language in regular use these days uses C-like syntax, and as such they're all very easy to jump between. The few exceptions are getting deader every day, except for SQL which is going to haunt these lands for the next thousand years.

On topic: AI has been able to out-write the modern journo since Smarterchild. I don't know why it's taken so long for companies to discard the mountain of chaff clogging up their publications.
 
I don't know why it's taken so long for companies to discard the mountain of chaff clogging up their publications.
Hallucination and libel laws. It's not a stretch for an AI to go on a journo-rant against a private figure in an article and hallucinate that they're also a pedophile or have a criminal record. Unless the editor is checking every detail, which they won't ever do, the AI will have just libeled a private individual. Public figures are harder to nail with those laws, but private individuals? You'll be in legal hell. And "well, the AI got it wrong" doesn't protect them from publishing lies.

Mouthbreathing journalists aren't much better, but at least in that situation they can try to shove the legal matter off onto the writer, especially if they're freelance. If it came from an AI run by the publication, and run through a publication editor, it's all on them.
 