Why you should be skeptical of the Senate’s push to regulate AI
Artificial intelligence technology could help solve some of the world’s biggest problems. But it also evokes sci-fi-style fears and public concerns about its potential abuse. So, it’s not surprising that the federal government is increasingly moving toward regulating AI — yet a Tuesday Senate hearing reminds us exactly why we should be deeply skeptical of this push.
The Senate Judiciary Committee hearing featured numerous industry insiders and experts who all agreed, to varying extents, that the federal government should step in and regulate AI. The senators, Republicans and Democrats alike, almost all seemed similarly inclined; the only real distinctions among them were the scale of regulation they wanted and their particular areas of concern.
This level of consensus is your first red flag. The fact that even the business leaders featured, such as OpenAI CEO Sam Altman, were lobbying the Senate for more regulation should immediately tip you off that something sinister is at play here.
Sen. Dick Durbin (D-IL) acted bewildered that a businessman was asking to be regulated, but as the Washington Examiner’s Tim Carney points out, this is actually an incredibly common phenomenon. In everything from banking to Uber, we have seen industry incumbents lobby for more regulations because they know these barriers to entry will block their competitors and, as established entities, they can weather them more easily. It’s a crony-capitalist way of pulling up the ladder behind you to ensure long-term profits by using the government to block your would-be competitors.
https://twitter.com/scottlincicome/status/1658621592409079814
This is a corrupt economic practice known as “rent-seeking,” and we must fiercely guard against it as the conversation about AI regulation unfolds.
That said, some reasonable-sounding ideas were discussed during the hearing, such as rules requiring that users be informed whenever they are interacting with an AI bot, and transparency requirements making it publicly known which data sets the AI models (which are really more like interactive search engines at this point) are trained on. But other ideas were floated that could very obviously rig the industry toward incumbents and block small competitors.
Sen. Lindsey Graham (R-SC), for example, suggested that the federal government ought to issue licenses to AI developers and only permit those with the government permission slip to work on this technology. While this might sound reasonable enough at first glance, we’ve already seen this kind of licensing regime used in countless other sectors to favor politically connected companies and erect costly hurdles that competitors must struggle to clear. It also threatens innovation, because cutting-edge technological progress often comes from decentralized exploration and even individual innovators, not just big companies that can lobby for approval to do this work.
https://twitter.com/adamkovac/status/1658495384803573760
Some of the senators and experts also discussed appointing new regulators or even creating a new agency to regulate AI. They pointed to nuclear energy regulators and the Food and Drug Administration as models this could potentially follow. But taking the same approach to AI that we took with nuclear energy and pharmaceutical approval would be an unmitigated disaster.
Nuclear energy regulators have made it all but impossible to build more nuclear power plants in the U.S., even though nuclear power is a safe, cheap, and zero-carbon-emission form of energy. Meanwhile, the FDA’s slow, ultra-conservative approval process is well documented to have blocked potentially lifesaving innovation. AI is central to the future of our economy and the world, so we should be looking at the exact opposite of these models.
Any AI regulation should be narrowly tailored to address specific problems, avoid creating and empowering cumbersome bureaucracy, and not create barriers to competition.
Lawmakers should also approach regulation with humility, acknowledging the immense limitations of our geriatric politicians’ understanding of this incredibly complex technology and the distinct lack of competency that plagues our government bureaucrats. And Congress shouldn’t forget that it has another option: Leave these complicated questions to the market and consumers closest to the industry to sort out for themselves.
This column originally appeared in the Washington Examiner.