John Breeden II is an award-winning journalist and reviewer with over 20 years of experience covering technology and government. He is currently the CEO of the Tech Writers Bureau, a group that creates technological thought leadership content for organizations of all sizes. Twitter: @LabGuys
The 2016 Rio Olympics kick off this week with all the celebrations, anticipation, medal counting, competition and doping controversies such an event entails. And that last item, oddly enough, got me thinking about the encryption government uses to protect its most sensitive documents and communications.
Doping refers to an athlete taking banned substances to improve performance, with the most common forms being stimulants and hormones. Russia was accused of state-sponsored doping on a massive scale this year, and while its athletes are going to be allowed to compete, they are going to be subjected to extra scrutiny and testing.
It’s always been a bit of a cat-and-mouse game between athletes who dope and the organizations tasked with catching them. Detection technology is constantly improving, but so are the methods doctors use to hide or mask the presence of those illegal substances.
Because of that, more than one sample is taken from many athletes participating in major events. Tests are run on one sample immediately and used to clear athletes of any wrongdoing. However, those other samples are put on ice and can be re-tested at any future date within several years of the original collection.
Why? Because detection technology is constantly evolving. An athlete may be able to beat the system in place at the time, only to get caught years later with new testing.
Which brings up encryption. In many ways, the creation of encryption algorithms and the ability to break them have been in competition for a long time, just like doping and the tests to unmask it.
Back in 1977, the government standardized on the Data Encryption Standard, a cipher with a 56-bit key that was considered secure at the time because computers were not powerful enough to break it. Government documents stored using DES were thought to be perfectly safe.
But computers evolved, grew more powerful, and people found ways to network them together to multiply that power even further. In 1999, the Electronic Frontier Foundation and distributed.net worked together to publicly crack a DES-encrypted message in just 22 hours.
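Some back-of-envelope arithmetic, using only the figures above, shows the scale of that 1999 effort. The calculation assumes the entire 56-bit keyspace was searched; in practice a brute-force search usually hits the right key partway through, so the true testing rate was somewhat lower.

```python
# Rough upper bound on the key-testing rate implied by the
# 1999 public DES break: 56-bit keyspace exhausted in 22 hours.
DES_KEYSPACE = 2 ** 56        # 72,057,594,037,927,936 possible keys
SECONDS = 22 * 3600           # 22 hours, in seconds

# Assumes the full keyspace was searched; the real rate was lower
# because the winning key turned up before the search finished.
rate = DES_KEYSPACE / SECONDS
print(f"~{rate:.1e} keys tested per second")
```

That works out to nearly a trillion keys tested every second, using 1999-era hardware.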
Suddenly, just like athletes who might initially be able to hide their doping only to later get caught by improved testing standards, the government found itself relying on a protection scheme that could no longer keep its secrets hidden.
In 2001, the National Institute of Standards and Technology adopted a new encryption standard for government use to replace DES: the Advanced Encryption Standard. AES supports three key lengths, either 128, 192 or 256 bits. The 256-bit variant is the strongest, though all three are considered effectively unbreakable by brute force with today’s technology.
But NIST is already looking toward the future of encryption, hoping to improve the standards before encryption-breaking technology can catch up.
While traditional computing, or even distributed computing, would find the current level of AES encryption impossible to break even given hundreds of years or more to try, the wildcard is quantum computers.
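The same kind of arithmetic makes the "hundreds of years or more" claim look, if anything, conservative. Even granting a classical attacker a generous trillion key tests per second, roughly the pace implied by the 1999 DES break, the smallest AES keyspace is hopeless:

```python
# How long would exhausting a 128-bit AES keyspace take at a
# (generously assumed) rate of one trillion keys per second?
AES128_KEYSPACE = 2 ** 128        # about 3.4e38 possible keys
RATE = 1e12                       # keys per second (assumption)
SECONDS_PER_YEAR = 3600 * 24 * 365

years = AES128_KEYSPACE / RATE / SECONDS_PER_YEAR
print(f"~{years:.1e} years")      # on the order of 10**19 years
```

For comparison, the universe is roughly 14 billion years old, a factor of about a billion short.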
A recent internal report from NIST clearly states the danger quantum computing could pose to government encryption. The report notes that “If large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use.” The only solution, according to NIST, is to “begin now to prepare our information security systems to be able to resist quantum computing.”
Once confined to the realm of science fiction, quantum computers are today built inside completely blacked-out, soundproof enclosures and kept at temperatures close to absolute zero. Any disturbance, even a photon in a beam of light or a sound wave, can massively disrupt their calculations.
Whereas digital computers use binary calculations, with each bit set to either a one or a zero, quantum computers seem to defy everyday intuition: each of their bits, called quantum bits or qubits, exists in a quantum state that blends one and zero until it is measured. Working together, qubits give these machines the theoretical power to attack certain problems, codebreaking among them, vastly faster than a digital computer, which must try each possibility one at a time. And that is really bad news for encryption.
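One way to put numbers on that threat, for ciphers like AES at least, comes from Grover's algorithm, a well-known quantum search technique not discussed above: it cuts a brute-force search from 2^n steps to roughly 2^(n/2), effectively halving a key's length against a quantum attacker. A quick sketch of what that does to each AES key size:

```python
# Grover's algorithm reduces brute-force search over 2**n keys to
# roughly 2**(n // 2) quantum steps, halving the effective key length.
for bits in (128, 192, 256):
    effective = bits // 2         # security left against a quantum attacker
    print(f"AES-{bits}: ~2**{bits} classical steps, "
          f"~2**{effective} quantum steps")
```

Even halved, 128 bits of effective security remains formidable, which is why the more urgent worry in NIST's report is public-key cryptography, where the quantum speedup is far more dramatic.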
Right now, most quantum computers exist only in the laboratories of agencies like NASA and companies like Google, and they support only a few qubits. There is also a steep learning curve to programming a quantum machine to work on a problem and output useful results. That said, larger-scale working models could start to become more common within the next 10 years, according to some estimates.
NIST isn’t taking any chances and is working on preliminary evaluation criteria so it can accept proposals for quantum-resistant public-key cryptography standards in 2017, with the new standard targeted for deployment within three to five years.
That should put government encryption ahead of encryption-breaking quantum computing development by at least several years, though given the amazing potential of those machines, creating a puzzle they can’t quickly solve will be no easy feat.
If quantum-resistant encryption algorithms for government can be created, then NIST and whoever it ends up partnering with certainly would deserve a gold medal for competing and winning in a game increasingly stacked against them.