Former NSA Director: Big Data Is the Future

Gen. Keith Alexander, Commander, United States Cyber Command // Carolyn Kaster/AP

The National Security Agency has been in the business of collecting information for a long time, but technological advancements over the past decade are the primary driver of the intelligence community’s ability to collect data on the grandest scales.

That those advancements continue at a frenetic pace – bounded, it seems, only by Moore’s Law – lends an air of inevitability to the production of data by humans, sensors and machines, and its consumption by governments, corporations and other organizations: For better or worse, data is going to get a lot bigger.

And it’s already quite big.

According to Gen. Keith Alexander, who retired in March after eight years as the director of the NSA, the world will produce some 3.5 zettabytes of information in 2014 – enough to fill the hard drives of 3.5 billion high-end desktop computers.

“We’re living in the age of big data and we have to figure out how to harness it,” said Alexander, speaking at the American Council for Technology-Industry Advisory Council’s (ACT-IAC’s) Management of Change conference on Monday.

“That’s what the future is going to be about,” Alexander said. “Think about 3.5 zettabytes of data. Big data is absolutely vital. The changes that will come to our nation in science, technology, biomedical and health care will be phenomenal.”

Yet with great potential comes heightened risk, as evidenced by last June’s intelligence leaks perpetrated by former NSA contractor Edward Snowden.

Snowden downloaded as many as 1.7 million files containing classified or highly sensitive information, and ironically, that pile of the NSA’s own data has given the public, over the last ten months, a clearer picture of how the agency takes advantage of technology to collect information on everyone. In his speech, Alexander painted the NSA as “victims of a crime,” citing Snowden’s actions, which Alexander subtly hinted “might have something to do with the country (Snowden’s) sitting in (Russia).”

The NSA, Alexander said, “faithfully did” its job, with the exception of one thing.

“The one thing we failed in was protecting data from those we trusted,” Alexander said. “Now we have a new responsibility. Big data is something we’ll all have to work with.”

Alexander suggested stronger continuous monitoring efforts as one way to mitigate some of the risks posed by the era of big data. Tracking data down to the cell level, wherein every piece of data has its own security infrastructure built in, is a further step toward ensuring that larger data sets aren’t entirely compromised by one bad actor or an insider with an unclear agenda.

“Not everyone needs all the data,” Alexander said. “Only that data that you have a need for can you access and decrypt. We have to come up with a more defensible architecture.”
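The cell-level idea Alexander describes, where each datum carries its own key and need-to-know label so that one stolen credential exposes only the cells it covers, can be sketched in a few lines. This is a toy illustration only: the class names, the hash-based stream cipher, and the label scheme are assumptions for demonstration, not the NSA's actual architecture.

```python
# Toy sketch of cell-level security: each cell is encrypted under its
# own derived key and guarded by a need-to-know label, so a reader with
# one clearance cannot decrypt cells carrying a different label.
import hashlib


def _keystream(key: bytes, length: int) -> bytes:
    """Derive a keystream from a per-cell key (illustrative construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


class Cell:
    """A single datum wrapped with its own key and access label."""

    def __init__(self, plaintext: str, label: str, master_secret: bytes):
        self.label = label
        # Per-cell key: derived from a master secret plus this cell's label,
        # so no two labels share key material.
        self._key = hashlib.sha256(master_secret + label.encode()).digest()
        data = plaintext.encode()
        self.ciphertext = bytes(
            a ^ b for a, b in zip(data, _keystream(self._key, len(data)))
        )

    def read(self, clearances: set) -> str:
        """Decrypt only if the reader holds this cell's label."""
        if self.label not in clearances:
            raise PermissionError(f"no need-to-know for label {self.label!r}")
        data = bytes(
            a ^ b
            for a, b in zip(
                self.ciphertext, _keystream(self._key, len(self.ciphertext))
            )
        )
        return data.decode()


secret = b"example-master-secret"
cells = [
    Cell("routine report", "UNCLASSIFIED", secret),
    Cell("sensitive source", "TOP-SECRET", secret),
]

analyst = {"UNCLASSIFIED"}     # holds only one clearance
print(cells[0].read(analyst))  # -> routine report
try:
    cells[1].read(analyst)     # denied cell-by-cell, not dataset-wide
except PermissionError as err:
    print(err)
```

The point of the sketch is the failure mode: compromising the analyst's credential leaks only the cells whose labels it covers, rather than the whole data set, which is the "more defensible architecture" Alexander argues for.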

The NSA may have been one of the first organizations to dive into big data, but it’s clear now that big data is a reality for the rest of the world, too. And Alexander challenged civilian agencies and industry to lead in big data or face the uncertain prospect of not being the world’s top dog in technology.

“I really feel strongly that this area needs leadership,” Alexander said. “We stand for freedom as a nation and with our allies. We need to lead in this area. We need you to lead in this area, we need vision and persistence to make it happen.”
