Tech Leaders Once Cried for AI Regulation. Now the Message Is ‘Slow Down’


The real puzzle of this bill, which Schiff himself referred to in a committee meeting this week as a “first step,” is that no one knows whether using copyrighted work for AI training is legal. (Maybe it’s not such a puzzle when one notes that Schiff is running for a US Senate seat and that the bill is supported by all of Hollywood’s unions and trade groups.) Despite the current lack of reporting requirements, multiple suits have been filed against AI companies by creators who have identified their works in the data sets. (I should disclose here that I sit on the council of the Authors Guild, which is among a horde of plaintiffs suing OpenAI and Microsoft, and a supporter of Schiff’s bill. I speak for myself here.) The success of those lawsuits ultimately depends on whether the courts determine that the companies are violating the fair use provision of copyright laws.

Whatever tack the courts take, it will be based on copyright law that didn’t anticipate an artificial intelligence that could suck up all the prose and images the world has to offer. Figuring out what fair use means in the age of AI is a job for Congress. That’s the kind of difficult decision that legislators need to make when dealing with an innovation that alters the landscape. And like privacy and other tech-driven modern problems, it is exactly the kind of decision that our 21st-century lawmakers manage to avoid. (Schiff staffers told me, “We are certainly waiting to see the fair use issue play out in the courts.”)

So it’s no wonder that Levie, when I reached him on the phone after his day in DC, told me he’s feeling pretty good about the system. “The overall message from Congress is, ‘Let’s get this right, there’s not a lot of points for moving too fast.’ They’re taking it with a high degree of thoughtfulness, versus ‘Let’s just rush to have something to say that we are regulating.’” It turns out that Levie didn’t have to stop the government single-handedly. It’s already hit the brakes.

Time Travel

AI legislation was touted as a slam dunk last May, when it seemed that the government was all in on passing laws to contain a possible menace. I wasn’t so sure. The headline of my Plaintext essay was “Everyone Wants to Regulate AI. No One Can Agree How.” A year later, despite signs of reined-in ambition, that’s still the case.

The White House has been unusually active in trying to outline what AI regulation might look like. In October 2022—just a month before the seismic release of ChatGPT—the administration issued a paper called the Blueprint for an AI Bill of Rights. It was the result of a year of preparation, public comments, and all the wisdom that technocrats could muster. In case readers mistake the word blueprint for mandate, the paper is explicit about its limits: “The Blueprint for an AI Bill of Rights is non-binding,” it reads, “and does not constitute US government policy.” This AI bill of rights is less controversial, and less binding, than the one in the US Constitution, with all that thorny stuff about guns, free speech, and due process. Instead it’s kind of a fantasy wish list designed to blunt one edge of the double-edged sword of progress. So easy to do when you don’t provide the details!