Rescuing Local Journalism, One AI Tool At A Time – USC Viterbi

Photo credit: Geber86

In a digital age where traditional journalism is shrinking, journalists are few and far between. Local news organizations are shutting down for lack of readership and revenue; national publications are shedding jobs. Journalists’ workloads keep growing to compensate, but there is only so much a human can do.

Then there is artificial intelligence. Alexander Spangher, a graduate student at the University of Southern California’s Information Sciences Institute, set out to reduce this manual burden in a four-pronged project that will automate many of the more tedious aspects of journalism.

A former data scientist at The New York Times, Spangher saw an opportunity to use his computing skills to help reporters drowning in a flood of information while under pressure to make the right choices. In short: what is newsworthy, and can AI help?

“We’re trying to take the really boring, mundane parts of the job and make them easier,” Spangher explained, “so journalists can chase more stories, find more sources, use more sophisticated resources, promote their work on more platforms, and cover more relevant topics overall.”

Spangher was inspired by the compelling role journalism plays in supporting democracy and the need to preserve it.

“If journalists didn’t tell us about our world, we wouldn’t understand our communities or how to get involved. We wouldn’t be able to cast an informed vote,” Spangher said. “There is so much crucial information that journalists uncover for us. I am always impressed by the role of these journalists in liberal democracy. For example, one study found that for every dollar spent by a news outlet, $1,000 benefited society. Another found that local journalism is the single greatest factor in good government.”

The first component of his solution trains a model to judge the newsworthiness of information for a given newspaper: the model predicts whether a piece of text will land on the front page or end up in the middle or back pages.

“This ‘front page vs. not’ model covers such a small part of the spectrum of newsworthiness – there are so many other layers, like the stories that aren’t covered at all. So we weren’t sure it would do anything useful. But we applied it to material that journalists consistently use in their reporting – like the minutes of a Los Angeles City Council meeting – and found some really interesting leads! For example, one about the LA City Council’s enforcement of equal pay between women’s and men’s soccer teams, which our algorithm predicted would make the front page but which hasn’t been covered yet,” Spangher explained. “There are all these important stories that journalists just haven’t had the time to find and cover.”
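The article gives no implementation details, but the “front page vs. not” idea can be pictured as an ordinary binary text classifier. Everything below is an assumption for illustration: the toy headlines, the labels, and the choice of TF-IDF features with logistic regression, not the project’s actual model or corpus.

```python
# A minimal sketch of a "front page vs. not" classifier.
# Toy data and model choice are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "Council votes to enforce equal pay for women's soccer team",
    "City budget shortfall forces cuts to emergency services",
    "Routine approval of sidewalk repair contract",
    "Minutes of the subcommittee on parking signage",
]
labels = [1, 1, 0, 0]  # 1 = front page, 0 = buried inside the paper

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Score a new item, e.g. a line pulled from council meeting minutes.
lead = ["Council debates pay equity enforcement for city teams"]
prediction = model.predict(lead)[0]
```

In practice such a model would be trained on years of archived articles labeled by where the newspaper actually placed them, then run over public records like meeting minutes to flag likely front-page material.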

Spangher’s adviser, Jonathan May, a research associate professor of computer science at USC, says local news has been hit hardest by the technological revolution, even though the need for it has never been greater.

“A big problem with local journalism is that even in the best of times, it’s difficult to keep it going,” May noted. Some areas are “news deserts” that no longer get the coverage they once did, but tools like these “can increase the limited bandwidth of a local beat reporter.”

For example, regarding the LA City Council meeting notes, the goal is “to take these public records and surface the most important aspects of the meeting that are worth following up on,” May added.

The second part of the project focuses on whom to interview once a story idea is in place, with a model that “helps journalists find sources they wouldn’t have considered, perhaps more diverse sources that can still meet their information needs,” Spangher noted.

One of the most time-consuming tasks for a journalist is posting and promoting their story on social media. The goal of the third part of Spangher’s work is to develop a system that can optimize a journalist’s article for promotion on different platforms such as Facebook, LinkedIn or Twitter. The AI helper could formulate different versions of the article’s promotion to get the best performance and the widest reach.

“The question at the end is what will work on the different platforms based on the dynamics of the platform, how it’s performing and what people are paying attention to. All three are largely algorithmic,” Spangher explained. “So it makes sense that one algorithm should be able to meet the needs of another algorithm.”
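The article doesn’t describe how the promotion system works internally; as a purely illustrative sketch, the shape of the task (one article in, several platform-tailored blurbs out) might look like this. The platform names, character limits, and truncation logic are all assumptions; a real system would learn platform-specific phrasing rather than truncate.

```python
# Illustrative only: one article summary in, per-platform promo text out.
# Limits are assumed values, not the platforms' actual constraints.
PLATFORM_LIMITS = {"twitter": 280, "linkedin": 700, "facebook": 500}

def promo_variants(headline: str, summary: str) -> dict:
    """Produce a crude per-platform blurb, trimmed to each limit."""
    variants = {}
    for platform, limit in PLATFORM_LIMITS.items():
        text = f"{headline}: {summary}"
        variants[platform] = text[:limit]
    return variants

variants = promo_variants(
    "Council votes on pay equity",
    "The LA City Council moved to enforce equal pay for city teams.",
)
```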

The final piece of Spangher’s work examines how stories evolve – or don’t – after they are published. He amassed a large dataset of articles and their successive versions with the intention of “tracing the arc of a story over time.” Spotting trends and tracking “which stories are evolving and which stop evolving are important questions,” he stressed. He received an Outstanding Paper Award at NAACL 2022 for this work.
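The article doesn’t say how versions are compared; one minimal way to picture “tracing the arc of a story” is to measure how much text changes between successive versions. The sample versions and the similarity metric below are assumptions for illustration, using only the Python standard library.

```python
# Sketch: how much does an article change from one version to the next?
# A story "stops evolving" when successive change ratios approach zero.
import difflib

versions = [
    "Council meets to discuss pay equity.",
    "Council meets to discuss pay equity for city soccer teams.",
    "Council votes to enforce pay equity for city soccer teams.",
]

def change_ratio(old: str, new: str) -> float:
    """Fraction of the text that changed between two versions (0 to 1)."""
    return 1.0 - difflib.SequenceMatcher(None, old, new).ratio()

arc = [change_ratio(a, b) for a, b in zip(versions, versions[1:])]
```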

Spangher is adamant that, despite the remarkable impact of these new tools, AI will never replace the journalist. “The core of journalism is a fundamentally human process,” he said.

However, one of the most challenging aspects of this work, according to Spangher, was figuring out “how to think about tools that we think will be helpful without scaring journalists or making them feel like machine learning was taking their jobs.”

The research is still in its infancy, but Spangher’s ultimate goal is for these tools to be available in some form to news organizations and reporters everywhere.

“In maybe five or 10 years, I’m hoping to explore the possibility of creating a start-up that’s like a service platform that really brings all these tools together and helps journalists along the entire story production pipeline,” said the graduate student.

Published on 11/28/2022

Last updated on November 28, 2022