
Protecting Local News in the New AI Economy

The ground is once again shifting under the nation’s beleaguered local news industry. For the past twenty years, traffic has been the cornerstone of the web economy: the larger a news site’s audience, the more advertisers are willing to pay. That model, tenuous at best given the dire financial straits news outlets find themselves in, is giving way to a new, AI-centered web economy where content, not traffic, is king, explains David Gehring, founder of Distributed Media Lab. Gehring spoke with EMS Editor Peter Schurmann about what this means for the local news industry and what needs to happen to ensure its survival in this new AI era. This interview has been edited for length and clarity.

How would you describe AI’s relationship to local news at this moment?

Relatively non-existent. Some AI products are focused on how to use AI for content creation… in other words, on aiding newsrooms to do what they do in creating content. I’m not as interested in that. I’m much more interested in the distribution and monetization side of the game. How do you position local media and local information in a way that doesn’t get destroyed economically as we move from our current digital economy, which is based on traffic and scale, to an AI economy, which is really based on content?

What does that transition from a web economy based on traffic to an AI economy based on content look like?

David Gehring is the co-founder and CEO of Distributed Media Lab. (Image via DML)

Take for example the New York Times. Say hypothetically the New York Times has 150 million monthly unique visitors. And they are publishing 100 stories a day. They make money because of their 150 million visitors, not because of the 100 stories per day that they publish. Now, as we move into the AI economy, the value of the New York Times to an AI platform is its content, not its audience, because that content is what is used to train the technology that AI is built on.

Meanwhile, the local news industry across North America is composed of four or five thousand independent publishers producing 15,000-25,000 stories per day. Those stories are not going to command the same economic value from an AI platform as the 100 stories the New York Times is producing, because the New York Times is a single publisher that can work out a deal with an AI company in a way that thousands of individual publishers can’t.

What does that mean for local media?

The strength of democracy depends on vibrant local journalism. Vibrant local journalism depends on a sustainable business model. But the fact that the local media industry is inherently disaggregated… means everyone’s audience is being monetized individually. And under the current model, the way to aggregate that audience is to have third-party web advertising companies put code on all those sites so they can monetize it on the media’s behalf. In the AI economy, where content is more valuable than audience, the same question persists: how do we aggregate local media, but in a way that benefits publishers?

And do you see any viable paths forward on that front?

The key question is: what are the technologies we can create that would enable the local media industry to organize itself, rather than relying on third-party proprietary technologies operated by independent companies that are venture-funded and need to extract value for the sake of their shareholder returns? What we’re living through right now is an opportunity not to make the same mistakes that were made 20 years ago, when publishers were content to relinquish the value of their web-based content to the ad tech industry. In doing so, they disconnected themselves from their customers, who are the advertisers. And I think AI platforms should be interested in establishing a proper economic relationship with content providers, because they don’t want to get down the road and have the game change on them through regulatory developments.

The strength of democracy depends on vibrant local journalism. Vibrant local journalism depends on a sustainable business model.

-David Gehring, Co-Founder and CEO of Distributed Media Lab

A couple of bills in California right now – AB886 and SB1327 – are looking to charge big tech to bolster local media. Are these examples of the kinds of regulatory frameworks you mention?    

The crux of SB1327 is to establish a tax on digital ads, which is not necessarily a good or bad thing as much as it is irrelevant to the whole AI conversation. It’s not future-facing: when you establish a tax to generate revenue from the way ads work today, you effectively codify the way ads work today. And that is not how they will work tomorrow.

The original version of AB886 was a link tax, meaning that when Google returned a link and a snippet in a search result, they would have to pay the publisher. That got gutted in later versions of the bill. But if Google has to pay for that, they’ll want to do it less. And with AI, Google search just returns an answer, with no links back to the source. So, search ceases to be a valid referral source.

(Note: State Assemblymember Buffy Wicks, who authored AB886, announced Wednesday her office had secured an agreement between the state and major tech companies for funding of up to $250 million for local journalism in the state.)

Where do you see ethnic media in all of this?

Ethnic media content in an AI economy is valuable because it’s being created, written and edited by humans. This is high-quality content if it is published in a way that allows it to be efficiently ingested into an LLM (a Large Language Model, the kind of AI system trained on such content). The problem is LLMs are focused on English now. Maybe we’re somewhere with Spanish, but hardly. Outlets publishing in a language other than English are not even on the field. So, the big question is: what are the tech standards we need publishers to adhere to, so that their content is structured for efficient ingestion into LLMs and tagged so its use can be tracked?
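For illustration, one existing way publishers already structure and tag stories for machines is schema.org’s NewsArticle vocabulary expressed as JSON-LD embedded in the page. The minimal Python sketch below builds such a record; the specific field values, the licensing URL, and the tracking identifier scheme are illustrative assumptions, not a standard Gehring or Distributed Media Lab has published.

```python
import json

# Minimal, illustrative sketch: package one local-news story as a schema.org
# NewsArticle JSON-LD record so it can be embedded in the article page and
# parsed by syndication partners or AI crawlers. Field values, the license
# URL, and the identifier scheme are hypothetical examples.
article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "City council approves new transit budget",
    "inLanguage": "vi",  # language tag keeps non-English content discoverable
    "datePublished": "2024-08-21",
    "author": {"@type": "Person", "name": "Staff Reporter"},
    "publisher": {"@type": "Organization", "name": "Example Community News"},
    "identifier": "example-news:2024:001234",  # hypothetical ID for usage tracking
    "license": "https://example.com/content-licensing-terms",
    "articleBody": "Full human-written article text goes here...",
}

# Embed the record in the page so it can be ingested without scraping prose.
jsonld_tag = (
    '<script type="application/ld+json">'
    + json.dumps(article, ensure_ascii=False)
    + "</script>"
)
print(jsonld_tag)
```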

How does your company propose to tackle this question?

I participated in launching an open-source project back in 2014, now referred to as the AMP project, which was designed to create an open-source specification for content syndication on the web. Basically, structuring a webpage in a way that could be easily rendered on other domains. That project still exists. We’re exploring ways to revisit some of those original ideas, which were intended to structure content in a way that makes it much easier to manage through syndication and perhaps also to deliver to AI platforms. I am just a big fan of commercial solutions whenever possible, so we don’t get 10 or 15 years down the road and need government intervention, which risks making everything unintentionally upside down and backwards.
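On the consuming side, a syndication partner or an AI platform could pull those embedded records straight out of a published page. The short Python sketch below does this with only the standard library; it is an assumption about how such an ingestion step might look, not a description of Distributed Media Lab’s or the AMP project’s actual tooling.

```python
import json
from html.parser import HTMLParser

# Illustrative sketch: extract structured article records from a page's
# JSON-LD blocks, the kind of efficient ingestion an aggregator, syndication
# partner, or AI crawler could perform. Hypothetical example, not DML's code.
class JSONLDExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_jsonld = False
        self.records = []

    def handle_starttag(self, tag, attrs):
        # Only collect <script type="application/ld+json"> contents.
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self.in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_jsonld = False

    def handle_data(self, data):
        if self.in_jsonld and data.strip():
            self.records.append(json.loads(data))

# Example page containing one embedded NewsArticle record.
page_html = (
    '<html><head><script type="application/ld+json">'
    '{"@type": "NewsArticle", "headline": "Example story", "inLanguage": "es"}'
    "</script></head></html>"
)
extractor = JSONLDExtractor()
extractor.feed(page_html)
for record in extractor.records:
    print(record.get("headline"), record.get("inLanguage"))
```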

David Gehring is the co-founder and CEO of Distributed Media Lab, which works with publishers to aggregate and syndicate content across the web.
