Emerging Tech


Agencies Often Lack Strong Authentication and it’s a Big Problem

By Frank Konkel // April 24, 2015


The cyberbullies of the world like to beat up on the U.S. government.

The Office of Management and Budget’s annual Federal Information Security Management Act report to Congress revealed that agencies reported nearly 70,000 cyberincidents in fiscal 2014, a 15 percent increase over the previous year. My colleagues at Nextgov did an excellent job visually explaining the vast array of cyberthreats agencies face today, but what’s particularly troubling is that many of the cyber-beatings the government takes are preventable.

The FISMA report states that U.S. Computer Emergency Readiness Team incident reports “indicate that in FY 2013, 65 percent of federal civilian cybersecurity incidents were related to or could have been prevented by strong authentication implementation. This figure decreased 13 percent in FY 2014 to 52 percent of cyberincidents reported to US-CERT.”

Before we go further, here is the FISMA report’s definition of strong authentication:

The use of an “identification authentication technology to ensure that access to federal systems and resources is limited to users who require it as part of their job function. Strong authentication requires multiple factors to securely authenticate a user: (1) something the user has, such as a PIV card ...
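The “something the user has” factor can be illustrated with a time-based one-time password check, where the user’s device and the server share a secret and derive matching short-lived codes. This is a hypothetical sketch following the TOTP scheme (RFC 6238), not the report’s method — federal PIV cards actually rely on certificate-based smart card authentication:

```python
# Minimal TOTP sketch (RFC 6238): the possession factor is a device holding
# a shared secret; the server independently derives the 6-digit code and
# compares. Illustrative only -- PIV cards use certificates, not TOTP.
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, timestep: int = 30, digits: int = 6, now=None) -> str:
    """Derive the current one-time code from a shared secret."""
    counter = int((now if now is not None else time.time()) // timestep)
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


def verify(secret: bytes, submitted: str, now=None) -> bool:
    # Constant-time comparison avoids leaking code digits via timing.
    return hmac.compare_digest(totp(secret, now=now), submitted)
```

The point of the second factor: a stolen password alone no longer grants access, because the attacker also needs the device that holds the secret.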

Innovation in Federal IT? Sorry, Not in the Budget This Year

By Frank Konkel // April 23, 2015


The new federal chief information officer, Tony Scott, recently noted that more than 80 percent of the government’s IT budget is spent on legacy technology. He’s not alone in thinking government has its technology investment priorities backward. 

A survey by software provider SolarWinds of 123 public sector IT managers and directors unsurprisingly found “budget limitations” as the top barrier to new IT adoption.

The government IT gurus surveyed in this research were polled in December, before the Obama administration revealed a budget forecast that actually increases IT spending 2.7 percent over last year. Yet they would most certainly understand that money spent on ancient mainframes running legacy applications is money not spent on cloud computing pilots, improved analytics or other strategic innovation investments.

Phrased another way, you know those polled would crank up the Geto Boys and go all "Office Space" on various legacy technologies if they could, but they can’t.

Some of the other survey statistics won’t surprise anyone familiar with federal IT.

A majority (55 percent) indicated security and compliance concerns detracted from new IT adoption, and 53 percent outright said the need to continue supporting legacy technology kept them from ...

Martha Dorris Moves Positions at GSA

By Frank Konkel // April 21, 2015


Martha Dorris will take her talents to another directorate within the General Services Administration.

Dorris, currently the director of the Office of Innovative Technologies in GSA’s Office of Citizen Services and Innovative Technologies, will become the director of the Office of Strategic Programs within GSA’s Office of Integrated Technology Services on May 1.

Dorris replaces Maynard Crum, who had been serving as acting director in that role.

ITS Assistant Commissioner Mary Davie tweeted out the news over the weekend and Jason Miller of Federal News Radio was the first to report it.

More important than the news, though, is that Dorris is going to bring her customer-centric skillset over to ITS. I’ll have more to write about Dorris soon, but suffice it to say, she’s excited about the new gig.

“I will be helping ITS and the Office of Strategic Programs transform into a more customer-centric, category management focused organization,” Dorris said. “Being customer-centric means listening to our customers' feedback and determining both the experience they are having currently and the products and services that may be needed in the future.”

With Dorris, ITS gets immediate customer experience cred, and with the Obama administration’s recent emphasis on improving customer experience ...

Tech Giants Team Up to Unleash NOAA’s Data

By Frank Konkel // April 21, 2015

Engineers work on the CERES Instrument which will be integrated onto the Joint Polar Satellite System spacecraft, scheduled for launch in early 2017. // NOAA

Just over a year ago, the National Oceanic and Atmospheric Administration sought industry feedback to help it better share the more than 20 terabytes of weather data it produces every day.

Today, Commerce Secretary Penny Pritzker revealed the culmination of NOAA’s efforts: a big data project between NOAA and a handful of tech giants.

NOAA will partner with major cloud computing providers Amazon Web Services, Google Cloud Platform, IBM, Microsoft and the Open Cloud Consortium to explore ways the agency might better unleash its vast environmental data stores for the public good.

The individual collaborations between the companies and NOAA will be established through cooperative research and development agreements, which provide the framework for each data alliance.

According to a Commerce statement, the data alliances “will work to research and test solutions for bringing NOAA’s vast information to the cloud, where both the public and industry can easily and equally access, explore and create new products from it, fostering new ideas and spurring economic growth.”

It’s clear NOAA wants to better share the data it produces, but it wants to do so without spending additional taxpayer dollars. That’s been its ...

These 3 Steps Could Prevent 85 Percent of All Data Breaches

By Frank Konkel // April 17, 2015


Last year, data breaches of both private sector companies and the federal government dominated headlines.

In short, a lot of organizations got owned. And if early 2015 is any indication, there’s much more to come.

Yet, a great many of these calamities are preventable through basic cybersecurity hygiene, according to Ann Barron-DiCamillo, one of the U.S. government’s foremost cybersecurity experts.

Barron-DiCamillo, the director of the Department of Homeland Security’s Computer Emergency Readiness Team, told an audience at the Symantec Government Symposium on Wednesday that about 85 percent of data breach incidents could be prevented by following three essential steps:

  • Reducing administrative privileges (think Edward Snowden’s access to National Security Agency data);
  • Application whitelisting (not letting unauthorized programs run because, well, why would you?); and
  • Software application patching (a problem for more than a decade).

“These controls, if monitored, would reduce about 85 percent of incidents,” Barron-DiCamillo said. “We’re trying to emphasize the importance of getting back to cyber hygiene.”

Information sharing is also key, Barron-DiCamillo said.

“Cyber has no borders, so it’s important to have those relationships” with the private sector, between agencies and in some cases, international partners, she ...