Why I’ll Quit Publishing the McKinley Park News

Published May 24, 2023

Since 2016, the McKinley Park News has served Chicago’s McKinley Park neighborhood with unique, free news coverage and reporting, neighborhood calendars, memberships, interactive features and community support.

We’ve published tens of thousands of event listings and nearly 1,000 original local news articles on topics usually covered nowhere else, with ongoing reporting following important stories and regular scoops on neighborhood news with appeal that often reaches beyond McKinley Park.

We’ve supported dozens of our neighborhood’s community groups, non-profits, schools, elected officials and others through our free “Institutions” program; we’ve also shared their news and updates and published neighborhood resources and guides that no other news source offers.

Step by step, we’ve also been working very hard to build this effort toward a “sustainable” local news business: an enterprise that can support the livelihood of a neighborhood journalist.

But all of this means nothing if AI is allowed to steal from us.

Terms of Service

From our beginning, I’ve rigorously protected our original content and Service from unauthorized use: everything from ejecting attempted spammers from our membership rolls to placing on every page of our website a copyright notice and link to our Terms of Service, which clearly define what’s allowed and what’s not.

One explicitly prohibited use is unauthorized access to and employment of our Service for training of automation systems, informing machine learning or algorithms, integration into large language models, or any of the other technologies currently falling under the rubric of “AI.”

Large Larceny Models

Imagine my despair — but perhaps not my surprise — when I discovered the McKinley Park News had seemingly been hoovered en masse into multiple large language models, all without asking permission, providing notice, or offering any remuneration.

In case you don’t know, large language models (or “LLMs”) are trained on monstrous amounts of content scraped from the Internet, and they build human-seeming responses based on the statistical likelihood of word relationships. They’re the basis for a new way users will interact with software, as well as the seed for an arms race among the biggest players and upstarts in the tech industry, supported by billions of dollars in profits and funding.
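To make the “statistical chances of word relationships” idea concrete, here is a toy next-word predictor. This is a deliberately tiny illustration with an invented corpus and invented function names; real LLMs operate at an incomparably larger scale with far more sophisticated models, but the underlying intuition of predicting likely next words is the same.

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in a tiny corpus,
# then predict the statistically most likely next word.
corpus = "the news covers the park and the news covers the neighborhood".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    # Return the word most often seen after `word` in the corpus.
    return follows[word].most_common(1)[0][0]

print(most_likely_next("the"))  # "news" follows "the" most often in this corpus
```

Scale that counting up to billions of scraped pages, including publications like ours, and the model’s “knowledge” is simply a statistical reflection of the human writing it ingested.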

However, not a cent of this money is going to local news publishers like the McKinley Park News, despite the reliance on our content to build and power these commercial products and services. It doesn’t matter that we’re just a tiny part of the whole: Given the uniqueness of our content, I’ll bet any McKinley Park neighborhood AI query will rely on the McKinley Park News to provide an answer.

Competing Against Ourselves

We won’t even get any web traffic, eyeballs or audience benefit from AI’s theft of our content, since its users stay inside its interface. There’s simply no reason to visit a local news website anymore: “Hey AI, tell me all about the top stories the McKinley Park News reported on this week.”

So not only is our content being stolen, it’s then being used as the basis of products and services that directly compete against us, all the while attracting massive funding, monetization and revenue that we’ll never see. Even the thieves’ customers will be able to easily launch products that directly compete with us, built from our own content and labor. How is this fair, legal, or in any way sustainable?

Robots Will Replace Us

Stretching beyond theft of our content is theft of our expertise. The language, editing and context used to present the McKinley Park News not only represent untold thousands of hours of my human labor, but are also the result of a career’s worth of hard-won expertise and knowledge applied to this project and business.

In much the same way that a for-profit writing school can’t use our content without authorization as the basis of its curriculum, neither can an AI or any automated process purloin our expertise by training itself on the high quality and editorial soundness of our content.

AI’ed Myself Out of a Job

It makes me sick to my stomach to think about human copy editors who will be fired from newsrooms because they can be replaced by automated processes trained in part on our excellent and consistent grammar. I might even be putting myself out of a future job by publishing my own content and sharing my own expertise, assuming AI is allowed to continue stealing all of it.

Local news outlets are already strapped for revenue, and content syndication is a well-established and often fruitful strategy to support operations and profits. By stealing our content, large language models rob us of all past, current and future licensing opportunities and revenue from classic content syndication, as well as from new paid services like AI training or whatever other solutions we’d care to develop based on the output of our human labor.

Perpetual Theft

This applies to not only the LLMs themselves, but also to every one of their customers, who will be able to employ our content and expertise for their labor benefit without remuneration or credit to creators, including the McKinley Park News.

The theft lasts forever: The large language models are only going to keep getting bigger and building on what they’ve already created. Content and expertise from publications like the McKinley Park News could be used without authorization for decades, supporting ongoing profits and benefits for others, but providing nothing for ourselves.

Grim Legal Options

Unfortunately, my initial research on accessible legal options is grim. This is a “new frontier” for law, so neither media attorneys nor their legal institutions have yet developed approaches or practices for dealing with these problems (although our research and outreach are continuing).

Likewise, legal and journalism institutions seem to have completely dropped the ball on this fundamental problem with AI and how it’s going to destroy local news outlets like the McKinley Park News. It’s simply not a topic under consideration.

Floundering or Complicit

Even worse, many institutions and news enterprises are wholeheartedly embracing, adopting and advocating for AI technology without any attempt at considering ethics or ensuring they’re not using products and services based on stolen goods. Everyone seems to be either floundering or complicit.

Of course, government is way behind the times on protecting local news from this theft. The U.S. Copyright Office uses an antiquated standard to let publications bundle content into “serial” editions that offer qualified copyright protection for all the material within.

However, websites are explicitly excluded from this protection: It’s only something enjoyed by print. I’d apparently have to register each individual online news article — at a cost of $95 each, plus all the labor to manage filing — to get direct copyright protection. There’s neither time nor money for this at a tiny micro-local news outlet like the McKinley Park News.

Perpetuating Harm and Inequity

In addition to the media law firms I’ve contacted being gun-shy about what seems to be a straightforward tort, many of them are automatically disqualified as potential resources, since they’re already doing business with the giant tech firms selling large language model services based on stolen content, including ours.

As I wrote in a reply to one of the lawyers who had to turn us down: “I would encourage you and your colleagues … to consider the caliber of clients you represent and how their actions with support from firms like yours may perpetuate harm and inequity, including and especially for local news.”

Grim Product Options

Our technical options to fight this are as grim as our legal ones. Right now, all of our news is free and available to the public, and it attracts good and growing traffic. To shield it from unauthorized AI use, I could put all of our content and features behind our paywall, so it would be unavailable to AI … and to everyone else who didn’t log in.

Even keeping most of our content and features available at no cost, removing our news from public access would kill our traffic, audience and community. It would be especially bad for our advertising revenue.

Subscriptions Are Not Enough

Make no mistake: Advertising must be a primary part of our local news revenue strategy. Analysis over the past year with the Metro Media Lab at Northwestern University’s Medill School identified the likely number of paid subscribers we could attract, given our audience and coverage. It’s not nearly enough to fund a neighborhood news business.

There’s certainly plenty of money to be made in advertising: It’s how Google gets most of its money, and it’s how newspapers used to make lots of money. For AI startups, advertising is one of their biggest attractors of funding, as investors salivate at AI’s ad monetization potential.

However, just like all the other money going to AI now and in the future, independent publishers like the McKinley Park News won’t see any revenue from AI advertising. Indeed, AI’s theft of our content may power competitors who sell advertising directly against us in our own market: another example of how our stolen content will be used against us.

People Are Behind This

I don’t think it’s coincidence that some of the biggest companies investing in and launching AI are also some of the biggest names in online advertising, including those who corruptly manipulate all ends of the online advertising marketplace via a cross-industry monopoly that should have been broken up long ago by the U.S. Federal Trade Commission.

One of the fallacies of the media’s reporting on contemporary AI (and other new technologies like blockchain) is falling prey to the tech industry’s masking itself in obscure terminology, and then assigning blame to the technology itself instead of to the people behind it.

Move Fast, Break News

However, it is people making these decisions to research, develop, market and sell products that include stolen goods taken from the McKinley Park News and, seemingly, many others. “It’s OK because we’re stealing from everyone” is no excuse.

A motto of tech and business bros is “move fast and break things.” For businesses that profit from pirate AI, one of the things they are breaking is local news, including businesses like ours, maybe forever.

Tech Won’t Act Responsibly

Thieves will never do the right thing, and we cannot count on the tech industry to police and regulate its own AI use. They’ve already shown us they are not to be trusted.

The McKinley Park News calls for the following things to remedy this situation.

For all Large Language Model developers, operators and purveyors:

  • Immediately remove any McKinley Park News content, language and information from your Large Language Models and any other services, products, servers and development environments. We have no active agreements to license our Service with anyone, so if you have our content in your product or service, or are using it to develop such, you’re using it without authorization.
  • Ensure that in the future, you do not inadvertently include our content in your products and services, including your Large Language Models, without authorization.
  • Have your business team get in touch with us to discover how authorized licensing of our high-quality content can make your AI believable, relevant and current, especially for the unique, micro-local news topics we cover.

For Large Language Model developers, operators and purveyors who have included our content and information in their products and services:

  • Have your comptroller contact us immediately about invoice receipt and billing for past and current unauthorized use of our content in your product or service and toward support of your solicitation of investment and other funding.
  • Immediately halt all business and other operations that are connected in any way to your products and services that employ our content without authorization, including advertising your services, soliciting funding or partnerships, and offering options to investors.

For Large Language Model business customers and enterprise users who employ LLM services toward their own products, services and/or labor benefits:

  • Immediately discontinue use of any Large Language Model-powered product or service that includes unauthorized content from the McKinley Park News. The LLMs do not have a license to use or benefit from our content and expertise, and neither do you.

For journalism and media institutions, including schools, associations, conferences and advocacy organizations:

  • Place an immediate moratorium on any AI training, advocacy or adoption of products and services that include unauthorized content from the McKinley Park News. If you don’t, you are training others to steal from us.
  • Develop and internally implement an AI and LLM ethics policy and usage standards that consider and prohibit unauthorized use of content from local news publishers like the McKinley Park News.
  • Be leaders in our industry by adopting policy positions and best practices for AI and news issues based on a solid media philosophy that anticipates, rather than reacts to, new technology development and related industry upheaval.

For government elected and regulatory officials at the municipal, state and federal levels:

  • Place an immediate moratorium on any government use of Large Language Models that have been built on stolen content, such as that from the McKinley Park News.
  • Adopt regulation to compel developers, operators and purveyors of Large Language Models and related services to ensure they do not include or use any unauthorized content.
  • Adopt regulation to compel developers, operators and purveyors of Large Language Models and related services to use and license third-party content and information following the basis of an opt-in relationship, where third parties must explicitly consent to such use, rather than an opt-out relationship, which compels publications like the McKinley Park News to engage in arbitrary, patchwork and often ineffective measures that require extra cost and labor.
  • Overhaul the antiquated standards of the U.S. Copyright Office to support accessible copyright protection for sources of original news, including those solely publishing online.

For leadership and fellow members of LION Publishers and the Chicago Independent Media Alliance (CIMA):

  • Quickly and formally adopt a motion to include the following requirement for membership:
    If a member organization uses a Large Language Model or similar AI technology or service, the member organization must qualify that the service does not contain or use unauthorized content or expertise, including any taken from fellow member organizations.
  • Abstain from, or proceed with caution on, experimentation with AI and LLM services for your own publication, since engaging with such services may waive your rights and limit your options for restitution if your content has also been purloined.
  • Come together and collectively organize for the best benefit of our publishing businesses and organizations in regard to AI and LLMs that wish to employ our content. We’re all starved for revenue, right? They want our content! We should be able to make money off this new revenue stream, on our own terms.

For any consumer and user who’s engaging with contemporary AI based on Large Language Models, including all the recent chat tools:

  • Don’t do it: You’re likely assisting those who are stealing content and expertise from the McKinley Park News and so many others.
  • Realize that a likely condition of your use of AI is that it learns from all the information and interaction you give it: for its products, services and profits. Just like the McKinley Park News, you won’t see any remuneration either, while others profit from your contribution.

From what I’ve seen thus far, the tech industry’s responses to concerns about AI theft have ranged from proposals to bully all publishers into a corporate-controlled licensing regime, to schemes that require us to add a new tag to our web pages or else the AI will steal their content from us. “Solutions” like this are both unfair and unacceptable.
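For context on that “new tag” approach: the opt-out signals proposed so far amount to publisher-side files and markup along these lines. This is a sketch; CCBot is the crawler run by Common Crawl, a corpus widely used for LLM training, and “noai” is a nonstandard directive some platforms have adopted. Honoring any of it is entirely voluntary on the crawler operator’s part.

```text
# robots.txt: asks Common Crawl's scraper (a common LLM training source) to stay out
User-agent: CCBot
Disallow: /

# Nonstandard page-level signal some platforms use; compliance is voluntary
<meta name="robots" content="noai, noimageai">
```

Either way, the burden falls on the publisher: our content stays exposed unless every operator volunteers to comply, which is exactly the arbitrary, patchwork opt-out arrangement I object to.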

Don’t Be a Thief

I have no problem with the technology itself: Do you want to use a Large Language Model for yourself, your organization, or your business? Great! Just make sure it doesn’t include or employ content stolen from the McKinley Park News.

I’m infuriated about the current state of affairs, and I’m ready to fight! I hope that other publishers and creators want to as well. Please reach out through this website’s communication channels or via my contact information, below: I’d like to start getting like-minded publishers, creatives and other allies together for action.

Goodbye, Local News

If this situation is not remedied, there will be simply no reason for me to continue publishing the McKinley Park News, writing neighborhood journalism, or trying to launch my own local news business. Everything I publish will be stolen and used against me in rival interfaces and products I’ll never be able to compete against.

That’s why I’ll have to quit publishing the McKinley Park News if things don’t change. I’d imagine the same applies to countless other independent publishers and media outlets. If AI is allowed to steal from us, AI will destroy us, including things we value and rely on like local news.

I thank you for your attention and for your interest in the McKinley Park News and other victims of larcenous AI and the people behind it.

Justin Kerr
Publisher, McKinley Park News
(312) 560-1115

Audrey Teabow
This is horrible to hear about! Without local events and news and all things going on, I’m going to feel like a dachshund dog-paddling in circles in a tiny pool.
You are a fantastic journalist and I hope this works out somehow.

Justin Kerr
Thanks, Audrey! Your unwavering encouragement and support is always very much appreciated.
