Georgia lawmakers are taking small steps to limit the vast reach of artificial intelligence, treading lightly on a topic the White House has warned could prompt a fierce response if states go too far.
The legislative session ends Thursday. But lawmakers have already sent Gov. Brian Kemp two proposals aimed at keeping humans in the loop amid the proliferation of AI products and services.
One proposal would ensure that a human decides whether a patient’s medical procedure is covered by insurance. The other would require AI companion chatbots — designed to mimic personal, sometimes romantic, relationships — to periodically remind people that they are speaking with a robot, not a person.
The bills are among the first AI proposals to make it to the governor’s desk since President Donald Trump signed an executive order in December that threatened to withhold billions of dollars in federal broadband funds from states that enact “onerous and excessive” AI laws. That order is designed to protect innovation in an industry the administration believes is key to national and economic security. AI companies and executives have also donated heavily to Trump and his allied committees and projects.
Georgia Republicans, wary of angering Trump, have taken steps to ensure the bills don’t run afoul of the White House. State Sen. Kay Kirkpatrick said she reached out to the Trump administration to confirm they were OK with the AI insurance bill. The companion chatbot bill wouldn’t take effect until next summer, giving Congress room to act before then.
“We shouldn’t have a patchwork of 50 different rules. But we can’t expect that Congress will move as expeditiously as we would like them to,” said state Rep. Todd Jones, R-Cumming. “This gives Washington, if passed, about a year and three months to get it right.”
The insurance measure builds upon Georgia’s recent push to put guardrails on an industry practice known as “prior authorization,” in which insurance companies require patients to obtain approval before undergoing treatment. Failure to do so can leave patients paying the full cost themselves.
Insurance companies have been using computers to help make prior authorization decisions for years. The computers can screen requests a lot faster than humans, reducing wait times. But Georgia and other states have been working to make sure humans are still involved.
A state law passed in 2021 required denials be reviewed by a “clinical peer,” defined as a health care provider licensed in a relevant specialty. But the rise in artificial intelligence has stoked fears about a surge in denials of coverage. A survey from the American Medical Association earlier this month found 61% of physicians said they fear insurers’ use of unregulated AI would increase coverage denials.
“I’m not trying to keep them from using AI. They can use it all they want to approve medical claims or to determine eligibility or any other mundane task,” said Kirkpatrick, the Republican sponsor of the bill and a retired orthopedic surgeon.
“But when it comes to denying somebody’s medical care, I just wanted to be very clear that a human needs to be in the loop.”
Companion AI chatbots act as digital friends and romantic partners. They have advanced features, including remembering past conversations and asking unprompted, emotionally tinged questions.
A joint study led by researchers at the nonprofit Common Sense Media and Stanford University found these bots to be particularly potent for young people, whose brains are not fully developed. The researchers cited the case of a 14-year-old boy who died by suicide after forming “an intense emotional bond” with an AI companion modeled after a character from the “Game of Thrones” franchise.
Senate Bill 540 would require these chatbots to disclose every three hours that they are chatbots. For minors, the disclosure would be required once every hour.
“It is imperative that we start putting some type of guardrails around how the chatbots are going to interact, not just with adults, but also with minors,” Jones said.