ChatGPT for State and Local Agencies? Not So Fast.


Tasks that can benefit from automation, such as software development, traffic management or other rote work, are good candidates for ChatGPT, but those that need more subjectivity still require human intervention, experts say.

ChatGPT is dominating headlines, and the new technology is getting mixed reviews as it is applied to government business challenges from tourism to education. What it means for state and local governments is just as unclear.

ChatGPT is an artificial intelligence model that interacts conversationally. Developed by OpenAI, an AI research and deployment company, it can answer questions, help write correspondence with constituents or automate rote tasks, for instance. On the surface, that all sounds great.

But Arthur Holland Michel, a senior fellow at the Carnegie Council for Ethics in International Affairs, isn’t so sure. “In terms of the negatives, there are the obvious ones, like the fact that these systems have shown a propensity to produce untrue information and … to reflect certain societal biases, particularly against marginalized and vulnerable groups,” Holland Michel said. “We still don’t really know what is lost when humans stop doing some of the jobs that this technology is potentially capable of.”

ChatGPT could work in areas that don’t require human subjectivity, such as transcribing and summarizing calls from constituents about nonserious issues, but not for assessing whether someone can get a driver’s license. That may require considerations such as whether the applicant has a history of speeding tickets or a previously revoked license. “With something like a license, that’s a critical application, [and] you really want to make sure that there’s a high degree of reliability in the system,” Holland Michel said.

Michael Ahn, associate professor at the John W. McCormack Graduate School at the University of Massachusetts Boston, agrees. Tasks that can benefit from automation, such as traffic management, are good candidates for ChatGPT, he said, while areas that need more subjectivity still require human intervention.

But ChatGPT presents other opportunities, he added. For instance, it has the potential to increase the number of citizen developers because the technology can do the coding, rather than governments relying on experts in Python or C++. Where agencies historically have had to find a software development company to make and implement code for them, the development of the technology marks the start of “an era where potentially government officials can create applications and apps themselves using ChatGPT,” Ahn said. “Now, government can come up with a service very quickly whenever they see a need for a particular service.”

He likens ChatGPT’s launch to the advent of 3D printers. Traditionally, when companies have an idea for a product, they come up with a design and specs, find a manufacturer and send the job off to the factory. 3D printers cut out the middleman and the lengthy manufacturing process. “You can come up with the idea and then you print it right there,” he said.

But, he added, “this assumes government will work with ChatGPT or similar technologies and they will implement it correctly.” To do that, education and training are crucial, both of the people using ChatGPT and of the technology itself, which is trained on data. 

“If they use ChatGPT in government, they need to focus a lot of effort in making sure that data is transparent and that bias is addressed,” Ahn said, referring to research that shows how using biased data in AI can result in biased outcomes. “Government needs some kind of feedback or oversight to make sure that doesn’t happen, and that will require [agencies] to have some kind of review process or transparency in the kind of data that’s in use.”

Overall, he’s excited about the potential of the “groundbreaking” technology. “It provides a real platform to port individually customizable government services,” he said. “If used properly, this can really increase the convenience of citizens dramatically.”

Holland Michel is more bearish on the technology, viewing it as trendy, not revolutionary. “Something that happens often with emerging technologies is that when the technology first comes out, everyone gets really excited and comes up with a billion ways to use it,” he said. “The reality is often that a new technology will, sooner or later, reveal itself to be poorly suited to most of those imagined applications…. When you talk about using a text generator to come up with innovative ideas, or government programs, that to me feels a little bit out there.” 

But with companies like Microsoft announcing the incorporation of ChatGPT into major tools such as Word, Excel, PowerPoint, Outlook and Azure, the technology may be tough for governments to escape. ZenCity, a platform for fostering local governments’ community engagement, announced that it is using ChatGPT in software to help communications staff write press releases and social media posts.  

Holland Michel recommends that agencies use their power to set strict rules limiting the tech’s use until they see how ChatGPT plays out, but Ahn welcomes the possibilities.

“It’s a force to be reckoned with in our society,” Ahn said of AI. “We are coming into the era or age of collaboration between humans and machines…. Government can do a lot of good things with it.”

Stephanie Kanowitz is a freelance writer based in northern Virginia.

This article was changed March 30 to correct the spelling of Michael Ahn's name.