
One senator’s big idea for AI

With other lawmaking thin on the ground, Sen. Gary Peters is quietly pushing an idea: On AI, the government should start by regulating itself.

Sen. Gary Peters (D-Mich.) arrives for a classified briefing.

The abrupt rise of generative artificial intelligence has kicked off a flurry of action on Capitol Hill, with Senate Majority Leader Chuck Schumer leading the charge through his sweeping proposal to regulate all aspects of the AI landscape.

But with no clear legislative targets or even a strong sense of what needs solving, concern is growing that Schumer and the three senators tapped as his AI lieutenants could end this term without passing meaningful rules for the AI industry.

There is an exception to the legislative inertia: Sen. Gary Peters (D-Mich.).

While not a headline name in the broader conversation about AI in Washington, Peters — who chairs the Senate Committee on Homeland Security and Governmental Affairs — pushed several AI bills through Congress in the years preceding this spring’s sudden hype cycle. He’s already sent two AI bills to the Senate floor this year. And last week Peters quietly introduced a third bill, the AI LEAD Act, which is scheduled for a markup on Wednesday.

His bills focus exclusively on the federal government, setting rules for AI training, transparency and how agencies buy AI-driven systems. Though narrower than the plans of Schumer and others, they also face less friction and uncertainty — and may offer the simplest way for Congress to shape the emerging industry, particularly when it’s not clear what other leverage Washington has.

“The government is going to be one of the largest purchasers of AI systems,” said Daniel Ho, a member of the White House’s National AI Advisory Committee and associate director at Stanford University’s Institute for Human-Centered AI, “so the standard that it sets will have a pronounced impact on responsible AI innovation.”

Not all of Peters’ ideas are new ones, and the track record of his previous efforts suggests they may take longer than expected to have an effect. But because Peters has a reputation as an inside operator skilled at getting his priorities into bigger bills, his moves are attracting the attention of the tech industry.

“Sen. Peters is a workhorse, and he grinds away at issues, and he makes progress that way,” said Craig Albright, vice president for U.S. government relations at BSA | The Software Alliance.

Overall, Washington’s effort to regulate AI over the past few months has been frenzied but unfocused and hard to track. Competing proposals have been introduced in both chambers, largely eclipsed by Schumer’s promise to deliver a big AI regulation — an effort that still lacks many specifics. On Tuesday, Schumer announced that a separate amendment changing how the Pentagon approaches AI would be included in the Senate’s year-end defense bill. The majority leader called it the Senate’s “first opportunity this year to pass real AI legislation.”

In an interview with POLITICO, Peters said he doesn’t know how Schumer chose his three AI lieutenants — Sens. Todd Young (R-Ind.), Mike Rounds (R-S.D.) and Martin Heinrich (D-N.M.) — but added that he and his staff are in regular contact with Schumer and the group on AI, and that the majority leader is committed to passing his government-focused bills this Congress.

Peters’ bills aren’t radically new ideas; in fact, AI experts said the senator and his staff are largely running with recommendations painstakingly developed well before this spring’s runaway AI hype cycle. But they suggested that’s no small feat in a Congress riven by dysfunction and delay.

“They have been very effective at putting pen to paper,” said Ho.

The senator’s Transparent Automated Governance Act, which advanced out of his committee last month, would require agencies to be open about their AI use and create an appeals process for citizens who believe they’ve been wronged by an automated system.

“They operate in a fairly opaque fashion, it’s sometimes difficult to know exactly how they arrived at the ultimate decision that was made,” Peters said. He called that particularly problematic on the government side, “where some federal AI systems may be making decisions related to benefits, for example.”

If that bill becomes law, Peters believes those rules will eventually trickle out into the private sector.

“It allows us to test some of these ideas and see how it actually works in practice, and that could be a model for what we want to do broadly in the commercial side,” Peters said.

In addition to the TAG Act, Peters has also pushed out of committee his AI Leadership Training Act, which would establish new AI training programs at federal agencies. Both bills have been sent to the Senate floor, where they await either a standalone vote or (more likely) inclusion in a must-pass package somewhere down the line.

The most recent bill, Peters’ AI LEAD Act, is scheduled for a markup in the senator’s committee on Wednesday. That bill would require federal agencies to appoint a “chief AI officer” who serves as the point person on the acquisition and use of AI systems by that agency. It would also create a “chief AI officers council” — modeled on similar interagency confabs for chief acquisition, information and data officers — that would meet regularly to swap notes and sketch out a government-wide AI strategy.

Peters said he plans to advance more AI bills later this year, including a proposal to overhaul the government’s moribund procurement process for advanced technologies.

“Weapons systems, particularly those driven by AI systems, are coming very quickly,” Peters said. “Our adversaries are developing them very quickly. And the current procurement system we have in the federal government is simply not equipped to deal with fast-moving technology.”

Peters said his engagement with AI started several years ago, when he tried (and failed) to pass a bill that would have boosted federal support for self-driving cars.

“When I was meeting with the engineers on this, they basically told me the self-driving car is the moonshot for artificial intelligence,” Peters said. He added that he came away from those conversations “thinking an awful lot about AI in all of those applications, from a military perspective to commercial.”

In 2019 Peters introduced the Deepfake Report Act, legislation that directed the Department of Homeland Security to conduct an annual study of how AI is supercharging the quality of forged videos. The bill was signed into law as part of 2020’s year-end defense bill. Peters’ AI in Government Act, which tasked the Office of Management and Budget with crafting guidance on agency use of AI, also became law in 2020.

Last year’s sprawling CHIPS and Science Act included a Peters bill, the AI Scholarship-for-Service Act, that established a recruitment pipeline for AI experts at all levels of government. The senator’s AI Training for Acquisition Workforce Act, signed into law last September, required the creation of new training programs for federal workers who purchase and manage AI systems. And his Advancing American AI Act, included in last year’s defense bill, gave further guidance to OMB as it sketches out a government-wide AI framework.

The fate of those laws suggests one possible hurdle for Peters’ new legislation. The impact of his AI bills will ultimately depend on how seriously they’re taken by the federal agencies they affect, and adoption of his prior provisions has not always been quick. OMB, for example, is only now crafting the agency guidance required by Peters’ AI in Government Act, which became law more than two and a half years ago — an eternity in the fast-evolving AI landscape.

“We can’t be operating at the normal speed that we’re used to when it comes to agency implementation of laws,” said Peters, who explicitly criticized OMB for dragging its feet. The senator said he won’t hesitate to haul agency leaders before his committee if he thinks he needs to “press them on moving forward with legislative mandates.”

Some of the bills Peters has yet to unveil this Congress could face stiff headwinds — particularly his plan to reform federal procurement processes, which the senator said could be “broader than just AI systems.”

Christopher Padilla, vice president of government and regulatory affairs at IBM, praised Peters for attempting to streamline how the federal government procures AI systems. But he also warned it would be a heavy lift.

“Federal procurement regulations are like layers and layers and layers of an onion, right?” Padilla said. “Peeling it back and trying to reform it is a very good goal, [but] it’s very hard to do.”

Peters ultimately wants his committee to go even further, including an examination of how federal agencies could limit the potential for AI to undermine or destroy civilization. “It’s my intent to have a hearing bringing in some leading philosophers to actually talk about some of those broader issues related to AI’s potential impact on humanity,” Peters said.

But for now, Peters is regularly putting up singles and doubles when it comes to AI. And while bills related to federal AI leadership or transparency might feel small-ball when compared to the ambitious efforts building elsewhere in the Senate, Peters said Congress ultimately needs to “take first steps.”

“It’s important to get individual members acquainted with the issue and comfortable with the issues associated with AI, and then you can go broader and bigger,” Peters said. “But you’ve got to start somewhere. And that’s what we’re doing.”