As we head into 2020, there’s little doubt that organizations across the globe and the U.S. federal government have become increasingly reliant on the public cloud as a core component of digital transformation. The recent award of the controversial Joint Enterprise Defense Infrastructure contract is but one example. IDC predicts that 49% of the world’s stored data will reside in the public cloud by 2025. The United States has had a “Cloud First” policy since 2011, which recently evolved into a “Cloud Smart” policy.
Increased use of public cloud services will provide government agencies with greater flexibility and scalability options, with a dark twist: Attackers will have an increasingly attractive bullseye to target. Expect to see more breaches in 2020 as cloud applications become more ubiquitous.
So, if the federal government is putting such an emphasis on being Cloud Smart, why are so many agencies still acting so “cloud dumb”? Why are they not doing more to protect their data?
Let’s look at three strategies agencies can adopt to get smarter about public cloud security.
Protect the Data
The first thing executives need to understand is that the cloud providers aren’t responsible for protecting data. That falls under the purview of the organizations that are using the cloud infrastructure.
For example, Amazon Web Services’ Shared Responsibility Model clearly states that “AWS is responsible for protecting the infrastructure that runs all of the services offered in the AWS cloud,” while “customers are responsible for managing their data.” This includes managing encryption methods, monitoring user access to data, managing configurations, monitoring system vulnerabilities, implementing patches and observing anomalous user behaviors.
This is a key point that many, including people within the public sector, still do not fully understand. The recent Capital One breach set off a flurry of senatorial inquiries as to who was responsible for the loss of data. While some modicum of blame could be attributed to both sides, some experts said that Capital One may have used a weaker type of encryption or failed to properly store decryption keys, allowing a hacker to access personally identifiable information Capital One was storing in Amazon’s cloud. This has led to debate around whether the company understood or upheld its role in the shared responsibility model.
Thus, the first step toward becoming smart about the public cloud is to get everyone to understand that protecting data begins and ends with the agency. Organizations cannot assume that data is automatically protected just because a public cloud provider is handling it. That’s not the case. Agencies must treat that data just as they would the data that resides on-premises. They must take proactive steps toward protecting their information before it enters the public cloud and while it’s there.
Balance Security and Speed
One of the primary reasons why the cloud has grown so far, so fast, is because it makes things easy. It’s easy for an employee to spin up cloud services all day long, without having to go through IT, so long as they have the budget to do so. After all, who wants to wait months in line to procure a business, communications or enhanced productivity application that they could have today?
That’s not smart, though. Downloading unauthorized applications can expedite workflows and boost productivity, but it can also expose an organization to unnecessary risk and feed the growth of shadow IT. Users who aren’t security savvy frequently make the mistake of storing unencrypted data in cloud servers and failing to set security controls that lock out unauthorized users. Often, no one is monitoring user or admin behavior, a gap similar to the one in the Capital One incident. The last thing you want is to find your agency’s proprietary information on GitHub.
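The misconfigurations described above are mechanical enough to check for automatically. As a minimal sketch, an audit pass over a bucket inventory might look like the following; the field names (`encrypted`, `public_access`) and the inventory records are simplified placeholders, not real cloud-provider API responses, and a production check would pull live configuration through the provider’s SDK instead:

```python
# Minimal sketch of a cloud-storage audit pass. The bucket records below are
# simplified placeholders, not real AWS API responses; a production tool
# would fetch live configuration via the cloud provider's SDK.

def audit_buckets(buckets):
    """Flag buckets that store unencrypted data or allow public access."""
    findings = []
    for bucket in buckets:
        if not bucket.get("encrypted", False):
            findings.append((bucket["name"], "data stored without encryption"))
        if bucket.get("public_access", False):
            findings.append((bucket["name"], "public access not blocked"))
    return findings

if __name__ == "__main__":
    # Hypothetical inventory illustrating one compliant and one risky bucket.
    inventory = [
        {"name": "payroll-archive", "encrypted": True, "public_access": False},
        {"name": "shadow-it-share", "encrypted": False, "public_access": True},
    ]
    for name, issue in audit_buckets(inventory):
        print(f"{name}: {issue}")
```

Even a simple scheduled check like this surfaces the unencrypted-data and open-access mistakes before an attacker does, rather than after data turns up somewhere public.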
Agencies can’t afford to trade security for productivity and instant gratification. Instead, they must strike a balance between users’ need for speed and good cybersecurity hygiene. In their security policies, administrators should clearly articulate the process users must follow when adopting cloud services. Users should be encouraged to work with IT and keep the department apprised of any application or service they are interested in procuring. In some cases, this may mean an application takes two weeks to procure rather than two hours, but that’s a fair trade-off in the interest of better security. As we enter the new decade, we must find ways to bring security into the fold without compromising speed of mission.
The Cloud Smart policy lays out how a more streamlined approach to procurement can be balanced with better security. Agencies should follow that guidance to improve their timelines for application delivery without increasing risk.
Take Steps to Mitigate Risk
Many organizations make the mistake of loading as many applications and as much data as they can into the public cloud, bypassing their security teams and hoping for the best. That’s a “cloud dumb” approach that only exacerbates security challenges.
Instead, agencies must take the time to really understand the cloud, both its pros and cons. Understand the risks associated with the public cloud and how those risks will grow as usage rises. Build policies to help mitigate those risks, and educate users on how to follow them. Keep the most sensitive data and applications on-premises, where they can be more tightly controlled. Implement proper security procedures and tools to monitor user behavior and secure cloud applications. Create a solid architecture designed to protect data stored in the public cloud. As the 2019 Federal Cloud Computing Strategy states, it’s all about procurement, security and workforce modernization.
In the 2020s, we need to be Cloud Smart, not “cloud dumb,” as we move our nation’s critical assets to the cloud. Cloud Smart is about equipping agencies with the tools, knowledge and flexibility they need to move to the cloud according to their mission needs while protecting data.
Eric Trexler is vice president of global governments and critical infrastructure at Forcepoint.