BoSacks Speaks Out: They Didn’t Fix the News Desert, They Stripped It for Parts

By Bob Sacks

Fri, Apr 3, 2026

There is a long and deeply unsettling article from Poynter titled “An AI company set out to fix news deserts. Instead, it copied local journalists’ work.” It is worth your time, not because it is shocking, but because it is familiar. We have seen this movie before. New technology arrives wrapped in civic virtue, investors nod solemnly about access and inclusion, and somewhere in the middle of all that noble language, somebody decides that stealing is a business model.

Let’s be blunt about what happened.

Nota did not set out to destroy local journalism. They set out to save it. That is precisely what makes this story so dangerous. The villains are never interesting when they twirl their mustaches. The dangerous ones are the people who convince themselves they are doing good while quietly cutting the legs out from under the people actually doing the work.

The pitch sounded irresistible. Use AI to bring bilingual coverage to underserved communities. Fill the news deserts. Produce stories in minutes, not days. Do it for less than ten dollars a story. Scale local news the way software scales everything else. In PowerPoint land, it probably looked like mercy with a profit margin.

Instead, according to Poynter, the company copied the work of 53 journalists across 29 outlets over a six-month period. Quotes were reused. Photos were reused. Story structures were reused. Sentences were reused. In some cases, the material came from Nota’s own paying clients. That takes a special kind of chutzpah, sheer nerve. Or perhaps a special kind of startup brain fog, where every ethical red light is reclassified as a growth opportunity.

This was not AI having a bad day. This was management making a choice.

One contractor reportedly described a workflow that was as crude as it was effective: search the county, find local reporting, run it through Nota’s tools, and publish the result under a byline. When concerns about photo rights came up, leadership apparently waved them away. That tells you everything you need to know. Once the institution decides output matters more than process, the ethics are already gone. The memo just has not been written yet.

Then came the predictable dodge.

Nota’s CEO described the sites as “live experiments,” not really public facing, just limited tests, nothing to see here. Maybe a handful of views per story. Perhaps only the engineers saw them. That might have been slightly more believable had there not been a press release, public promotion, LinkedIn posts encouraging people to share, a Wall Street Journal feature, and even a donation widget asking readers to support local journalism. You do not build a tip jar and then claim you were only doing private lab work. That is not an experiment. That is publishing. Sloppy, dishonest publishing, but publishing all the same.

What Nota built was not a solution to news deserts. It was a laundering system.

Take real reporting from real journalists. Feed it through a large language model. Strip away the attribution. Repackage it as low cost local coverage. Then point to the output as proof that AI can solve the collapse of community news.

That is not innovation. That is recycling with a stolen label.

And here is the part too many people in tech still do not understand: local reporting is not just the act of assembling sentences into article-shaped objects. It is judgment. It is memory. It is context. It is knowing which city council squabble actually matters, which school board argument is a proxy for something bigger, which bakery deserves coverage because it says something true about the community around it. It is knowing the people, the neighborhoods, the rhythms, the history. That does not come from scraping. That comes from showing up.

One of the most revealing examples in the Poynter piece involved Dina Weinstein, a journalist with decades of county-level experience. She visited a bakery, interviewed staff in Spanish, photographed the scene, researched local demographics, and wrote a story grounded in lived reporting. Nota reportedly lifted nearly all of it. That is the difference in a single example. One person did journalism. The other side performed extraction.

That is the real divide in this business now. Not human versus machine. Not old media versus new media. It is creation versus extraction.

I have said before, and I will say again, AI will absolutely have a place in local journalism. It should. Use it to transcribe meetings. Use it to search public records. Use it to translate stories into Spanish or other languages so communities are actually served. Use it to reduce routine burdens so editors and reporters can spend more time on the work that requires human judgment. There is no virtue in making a reporter waste two hours doing what a machine can do in six minutes.

But that only works if the machine is supporting journalism, not impersonating it.

The deeper problem here is not the software. It is the incentive structure. Contractors were reportedly pushed to produce 10 to 15 stories a day, with little editorial guidance and no meaningful oversight. That is not a newsroom. That is a content mill wearing civic drag. When the quotas cannot be met honestly, management has already chosen dishonesty. The only remaining question is how long it takes for someone outside the building to notice.

And yes, the underlying crisis is real. More than 50 million Americans live in areas with little or no reliable local news. That is a national failure. It should concern anyone who claims to care about civic life, democratic accountability, or whether people know what the hell is happening two miles from their front door. But a real crisis does not justify a fake solution. You do not save local news by cannibalizing the few local reporters left standing.

In some of the Nota examples, the AI-generated output was not only uncredited, it was wrong. Dates were off. Facts were stale. Legislative status was misrepresented. Lawmakers cited were no longer in office. This is what happens when a system is designed to mimic the appearance of journalism rather than uphold its standards. The result is not coverage. It is contamination.

That is not filling a news desert. That is salting the earth.

The industry should pay very close attention here, because this is not an isolated embarrassment. It is a warning. We are entering a period where more companies will claim to “support journalism” by automating away the journalists, and more investors will nod along because the unit economics sound delicious. Cheap content always sounds smart right up until you need accuracy, trust, accountability, or any of the other old-fashioned ingredients that make journalism journalism.

Nota did not fix the news desert. They found one and mined it.

That difference is not semantic. It is moral. It is strategic. And for publishers watching this unfold, it should be a reminder carved in stone: if the future of news is built on extraction instead of reporting, then the desert is not the problem. The builders are.

BoSacks Newsletter - Since 1993

BoSacks Speaks Out

Copyright © BoSacks 2026