How Edge Computing and Hybrid Cloud Are Shifting the IT Paradigm


With cloud at the center of a whole new operating model, organizations will have distributed DevOps teams spinning up new applications in real time.

Artificial intelligence and 5G, two of tech's biggest buzzwords, are intrinsically linked to edge computing and the hybrid cloud. In simplest terms, 5G fuels edge computing as an extension of the cloud, which will lead to more opportunities for AI and automation. But those opportunities won't be realized if hybrid cloud is treated as a destination or a box to be checked. Instead, it must be understood as a whole new operating model.

For a very simple example of edge computing and its relationship to the cloud, consider a Tesla. The car can only maintain situational awareness if it responds to what its many sensors and cameras see, whether trees, other cars, or traffic, in near real time. There isn't enough time or bandwidth to send that data back to a central public cloud data center and wait for an answer; the latency is simply too high. Instead, the cloud must be brought to the edge.

Thus, a Tesla isn't just a next-generation car; it's an edge compute node. But even with a Tesla, a relatively straightforward use case, building and deploying the edge node is just the beginning. To unlock the full promise of these technologies, an entire paradigm shift is required.

Becoming Cloud Native

With cloud at the center of this new operating model, distributed DevOps teams will spin up new applications in real time. They're often disconnected from the infrastructure team, which is tasked with managing the cloud architecture. The goal is not to deploy a brand new edge node for every mission. Instead, it's to build new applications on the existing node to meet changing demands. A Tesla, once again, has hundreds of sensors in it, including cameras on its sides and back. Recently, developers added a feature that lets drivers see the side cameras while in reverse, not just the rear camera. In other words, developers delivered entirely new functionality without changing the hardware.
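To make the idea concrete, here is a minimal, hypothetical Python sketch of delivering a new capability to an already-deployed edge node purely in software. The EdgeNode class, register_app method, and camera names are illustrative assumptions, not Tesla's or any vendor's actual API.

```python
# Hypothetical sketch only: new functionality arrives as software on an
# existing edge node; the hardware and sensors are untouched.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class EdgeNode:
    """An already-deployed node with fixed sensors and a changeable set of apps."""
    sensors: List[str]
    apps: Dict[str, Callable[[Dict[str, bytes]], None]] = field(default_factory=dict)

    def register_app(self, name: str, handler: Callable[[Dict[str, bytes]], None]) -> None:
        # Deploying a new application is a software update, not a hardware change.
        self.apps[name] = handler

    def on_frame(self, frames: Dict[str, bytes]) -> None:
        # Every registered app works from sensor data the node already collects.
        for handler in self.apps.values():
            handler(frames)


def reverse_multicam_view(frames: Dict[str, bytes]) -> None:
    # New feature: show the side cameras alongside the rear camera when reversing.
    selected = sorted(k for k in frames if k in ("rear", "left_side", "right_side"))
    print(f"rendering reverse view from: {selected}")


node = EdgeNode(sensors=["front", "rear", "left_side", "right_side"])
node.register_app("reverse_multicam_view", reverse_multicam_view)
node.on_frame({"front": b"...", "rear": b"...", "left_side": b"...", "right_side": b"..."})
```

The same pattern underpins the FEMA scenario below: the node in the field stays put while the mission apps running on it change.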

The Federal Emergency Management Agency offers another example. The agency often deploys nodes with portable LTE communications for rapid response in emergencies like hurricanes. In the FEMA model, the first mission includes immediate needs like search and rescue, organizing aid, and facilitating communication between first responders. When the cloud is embraced as an operating model, the same infrastructure can be used for subsequent missions as well, such as long-term care, managing housing and logistics, and distributing money to displaced people. You don't have to drop a new set of servers into the field to support other missions. Instead, the same infrastructure runs different apps. With the shift to 5G and its software-defined nature, we can now deliver smarter programming to the network and the tactical edge as well.

Embracing Risk-Based Security

When agencies build applications to be deployed into a cloud-native environment, they also must ensure security is baked in from the beginning. It should live and move with the application. If agencies don't evolve their security posture to match this new paradigm, they are poised for failure. Put simply, forcing legacy security models onto cloud-native apps is a fool's errand. Those models are centered on protecting the perimeter, both in the physical world with guards, guns, access cards, and so forth, and in the digital world with technology like firewalls. When your server is sitting in a metal box in a field, locking down the hardware still matters, but the security equation naturally changes, and measures like encryption become far more important.

In an edge environment, agencies need to think of security as risk-based. They must ask: What is actually being deployed to the edge, and how much risk and exposure does it carry? Going back to Tesla, the backup cameras mentioned earlier are not critical. If they're lost, there's some risk, but the car can still function. Security for that application therefore doesn't need to be as stringent as it is for, say, steering control. At the edge, security controls should change as the risk profile does. In a cloud-native world, risk becomes another dial to tune as the application moves out to the edge, just like power, bandwidth, storage, memory, and compute.
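As an illustration of that dial, here is a small, hypothetical Python sketch in which the security controls packaged with an application are chosen from its risk tier rather than applied uniformly. The tier names and controls are assumptions for illustration, not any agency's actual policy.

```python
# Hypothetical sketch only: risk is a deployment dial, alongside power,
# bandwidth, storage, memory, and compute. Tiers and controls are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class SecurityControls:
    encrypt_at_rest: bool
    encrypt_in_transit: bool
    require_hw_attestation: bool
    audit_level: str


# Lower-risk apps (a camera viewer) get lighter controls than
# safety-critical ones (steering control).
CONTROLS_BY_RISK = {
    "low": SecurityControls(False, True, False, "summary"),
    "moderate": SecurityControls(True, True, False, "detailed"),
    "high": SecurityControls(True, True, True, "full"),
}


def controls_for(app_name: str, risk_tier: str) -> SecurityControls:
    """Select the security controls that travel to the edge with the app."""
    controls = CONTROLS_BY_RISK[risk_tier]
    print(f"{app_name} ({risk_tier} risk): {controls}")
    return controls


controls_for("reverse_camera_viewer", "low")
controls_for("steering_control", "high")
```

The point is not the specific controls but that they are selected and packaged per application, so they move with it to the edge.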

The Bottom Line

A hybrid cloud approach underpins edge computing, but there isn't a fixed number of stops along the way from the data to the cloud. Often there are many destinations, depending on the application. Being cloud-native lets federal agencies leverage cloud infrastructure for a wide range of objectives. Even as the underlying infrastructure becomes more complex, software simplifies it. Agencies must shift their entire paradigm, developing apps in a distributed manner and baking security in from the start.

Only by seeing the cloud as an operating model rather than a finish line will the benefits of 5G and AI be realized. Those transformations won't happen if organizations don't take the necessary steps to become cloud-native and cloud-secure.

Steve Orrin is the Intel Federal chief technology officer and Cameron Chehreh is the Dell Federal chief technology officer.