Keeping AIs on an ethical short leash is important, but there are other aspects to consider.
Fueled by government and industry partnerships as well as a plethora of private companies, the United States is poised to take the lead in the worldwide development and use of artificial intelligence and its close cousins machine learning, cognitive computing, deep learning and advanced expert systems. The technology is already revolutionizing many industries and making inroads into government, and its evolution shows no sign of slowing down.
The federal government has been a proponent of AI and its intrinsic advantages for some time. The AI in Government Act of 2020 (H.R. 2575) passed the House and has been placed on the legislative agenda in the Senate. The bill would create centers of excellence in the General Services Administration that will help agencies adopt AI technologies and plan for their use across government.
Military and intelligence agencies have also been actively working to integrate AI into their capabilities. They are actively studying the ethics of the technology, including what AI should and should not be allowed to do. In February, the Defense Department adopted five ethical principles for using AI. The intelligence community released its own Artificial Intelligence Ethics for the Intelligence Community, which is very similar to the DOD plan, though with a few variations better suited to civilian and non-battlefield deployments.
Keeping AIs on an ethical short leash is important because on some level people fear AIs, or at least highly mistrust them. Quite a few sci-fi movies and TV shows feature a power-hungry AI trying to take over the world or eliminate humanity. It’s unlikely that anyone would be stupid enough to build an AI that wants to kill people, much less give it a platform to do so. But it turns out that with AI, we should have been worried about a different kind of power craving.
An article in The Print magazine recently covered this year’s virtual Semicon West conference, which is generally attended by those who manufacture computer chips. At the show, Applied Materials CEO Gary Dickerson warned his colleagues during the keynote address that the use of AI would spike power consumption in data centers to the point where it might make them difficult to maintain.
“AI has the potential to change everything,” Dickerson said. “But AI has an Achilles’ heel that, unless addressed, will prevent it from reaching its true potential. That Achilles’ heel is power consumption. Training neural networks is incredibly energy-intensive when done with the technology that’s available today.”
As an example of the scope of the problem, Dickerson said that data centers today consume about 2% of the world’s electricity supply. Because of the use of AI, he predicted, that demand would shoot up to 15% by 2025.
The problem, I think, is not just the chips and hardware, but the fact that AIs are generally not optimized to use computing resources efficiently. Most of them grab as much power as they need, or whatever is available, to complete their tasks. To test this out, I experimented with some AIs in my test lab that I had been planning to cover in a future column.
One of the things that I can do in my lab is monitor the exact power consumption of various devices and machines being reviewed. I do that to confirm that devices are as efficient as they claim, or to check to see how much standby power they drain when not in use. But I can also apply a standard electrical payment rate to determine how much each task or operation that a machine performs will cost.
For example, on a test workstation, it costs just one cent’s worth of electricity to open a Microsoft Word file, and almost double that to open Adobe Photoshop. You generally don’t think of individual computer tasks as costing money, but doing something like opening a file causes the computer to draw on more resources such as the disk drive, graphics card and memory. That in turn generates heat, which forces the cooling system to draw more power. My calculation is not perfectly precise, because that would require accounting for the system’s thermal design power, which individual actions generally can’t measurably affect. But it does show the relative power hunger of different components and programs.
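The back-of-the-envelope arithmetic behind those per-task costs is simple enough to sketch. The wattage, duration and electricity rate below are illustrative assumptions, not my actual lab measurements:

```python
# Sketch of the cost-per-task arithmetic: energy drawn above idle,
# converted to dollars at a flat electricity rate. All numbers here
# are hypothetical.

RATE_PER_KWH = 0.13  # assumed flat rate, dollars per kilowatt-hour

def task_cost(extra_watts: float, seconds: float,
              rate: float = RATE_PER_KWH) -> float:
    """Cost of a task drawing `extra_watts` above idle for `seconds`."""
    kwh = extra_watts * seconds / 3_600_000  # watt-seconds -> kilowatt-hours
    return kwh * rate

# e.g. a task that pushes a workstation 80 W above idle for five minutes
print(f"${task_cost(80, 300):.4f}")
```

Individual tasks cost fractions of a cent at typical rates; the same formula simply scales up for an AI workload that keeps the machine pinned for hours at a time.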
Looking at the different AIs that I had in my lab, I first used one that was designed to scan my incoming email and generate automatic responses based on my previous interactions. When it was initially ingesting data it ran the workstation pretty hard, consuming 53 cents’ worth of power above what the workstation would normally need over the same period. Thereafter, it consumed between two and three cents’ worth of power every time an email came in, though it used more when updating its database or learning new information.
Another AI that I tested is designed to look at programming code to search for vulnerabilities and then suggest alternative fixes. It can also be set to automatically make changes to the code, which I allowed. That AI only pushed the workstation when I was actively feeding it code, but when it was active it was quite a beast. The workstation’s internal fans sounded like jet engines preparing for takeoff. Had the AI been constantly on duty, the workstation would have consumed about 1,300 kilowatt-hours of electricity over a calendar year, which is about five times more than if the machine were idle or performing less intensive tasks.
Based on those results, it’s easy to see how AIs could force data centers to consume five or six times more power than they do right now. I’m not sure what all the ramifications are of having nearly one-sixth of the world’s total power output flowing into data centers, but it’s something we should think about. Even the effects on the environment and global warming should probably be taken into consideration.
We are doing a good job at keeping AIs on the right side of ethics, but perhaps we should also find ways to curb their appetite for power. It might be time to add some kind of resource-efficiency guideline to those ethics statements to help keep future AIs in check, before the power consumption problem becomes too big to manage and puts the brakes on our otherwise lightning-fast AI development programs.
John Breeden II is an award-winning journalist and reviewer with over 20 years of experience covering technology. He is the CEO of the Tech Writers Bureau, a group that creates technological thought leadership content for organizations of all sizes. Twitter: @LabGuys