Migrating a big system can be a huge undertaking. But proven technologies are standing by to help you achieve it.
Since the introduction of COBOL in 1959 and its standardization 10 years later, it has served as the backbone of mainframe business computing. In fact, the name itself is an acronym for “Common Business-Oriented Language.” Although seldom used in scientific work, COBOL has been widely used by the Defense Department, which resulted in its extensive adoption by other users, both inside and outside of government. But even with updates, COBOL’s fundamental architecture has become outdated, limiting its ability to make use of important innovations, including automation software for application development and security.
Today, COBOL is still in wide use for applications on mainframe computers. However, a variety of factors have contributed to a steady decline in its popularity, including major advancements in computing technology such as commercial clouds, and the steady drumbeat of retirements among programmers experienced in COBOL. Even so, many DOD systems still run on that platform, meaning that they lack the flexibility and features available in more recent languages and, by extension, limit the armed forces’ ability to focus on critical missions. Perhaps most important, at least from a technical standpoint, is that as a programming language built for mainframe computers in private data centers, COBOL is simply unsuited to the operating environments common in today’s clouds.
As a result of COBOL’s limited legacy system functionality, compounded by the increasingly desperate search for scarce COBOL engineers, our company was engaged as part of a consortium tasked with modernizing and transitioning a major DOD system from its long-term home in government mainframe data centers to a Java-based system running on the AWS GovCloud. However, the strategies we devised through our Application Modernization and Cloud Centers of Excellence, and the phasing of that migration, could apply equally well to Microsoft Azure, Google Cloud Platform, and other major commercial cloud service providers, as well as to hybrids involving multiple vendors. Any of them would allow DOD to leverage the high-value services provided by cloud operators—an important advantage at a time of tight budgets and limited technical manpower.
Modernization, however, is not something DOD does just to have a shiny new system in place; it’s only undertaken if it can increase the organization’s overall effectiveness. In the case of this particular project, the system was a key component of DOD’s mission-critical defense IT program, used servicewide and globally. But it suffered from serious reliability and scalability issues, was poorly documented, lacked modern features like microservices and containerization that would allow for easier future enhancements, and, at the time we started work, was running on hardware more than 50 years old.
What the project needed to accomplish was an extensive system modernization that included migration to an affordable hosting environment, accomplished without downtime, data loss, loss of functionality or diminished performance, and done with minimal risk of security or mission impact. It was an ambitious project of a sort DOD had never attempted before.
We started the project by identifying and then evaluating alternative solutions with a view to formulating a high-level technology roadmap for the transition. One option was a total manual rewrite and re-architecting of the system. But that would have exceeded the program’s time constraints, carried a high risk of failure, and cost too much. Another strategy was to replace it entirely with a newly procured system, but that would have sacrificed DOD’s carefully crafted business rules, so it was rejected. A COBOL emulator re-host solution was also considered as a stopgap measure, but we decided it would have failed to meet the system’s future-state architectural requirements. In every case, the need for training was a major consideration. In the end, we settled on an automated COBOL-to-Java code refactoring solution, to be implemented in three phases over approximately three years:
- Phase One: Automated Code Conversion. This involved automated COBOL-to-Java refactoring, with the resulting code running on an Intel x86/RHEL platform.
- Phase Two: The Purge. This involved further refactoring the Java code to remove any lingering remnants of COBOL and to enhance ease of maintenance.
- Phase Three: The Big Move. All of the remaining infrastructure was migrated to the AWS GovCloud, including development, staging, production and support. The cloud service also enabled us to satisfy DOD’s stringent cybersecurity requirements.
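To make the first two phases concrete, here is a minimal, hypothetical sketch of the kind of transformation involved. The project’s actual tooling and code are not public, so all names and values below are illustrative: Phase One conversions typically produce Java that still mirrors COBOL structure (string-backed working-storage fields, flag variables, one method per paragraph), while the Phase Two “purge” refactors that output into idiomatic Java with real types and return values.

```java
import java.math.BigDecimal;

public class RefactorDemo {

    // Hypothetical Phase One output: the converter mirrors COBOL idioms,
    // carrying a zoned-decimal amount as a fixed-width string and
    // signaling success through a flag field rather than a return value.
    static class ConvertedBilling {
        String wsAmountDue = "0000000.00";   // was: PIC 9(7)V99
        String wsStatusFlag = "N";           // was: an 88-level condition

        void computeTotal(String base, String surcharge) {
            // COBOL COMPUTE emulated over string-backed decimals
            BigDecimal total = new BigDecimal(base).add(new BigDecimal(surcharge));
            wsAmountDue = total.toPlainString();
            wsStatusFlag = "Y";
        }
    }

    // Hypothetical Phase Two result: plain typed parameters, a real
    // return value, and no COBOL working-storage remnants to maintain.
    static BigDecimal totalDue(BigDecimal base, BigDecimal surcharge) {
        return base.add(surcharge);
    }

    public static void main(String[] args) {
        ConvertedBilling legacy = new ConvertedBilling();
        legacy.computeTotal("100.50", "7.25");
        System.out.println("phase-one total: " + legacy.wsAmountDue);

        BigDecimal modern = totalDue(new BigDecimal("100.50"), new BigDecimal("7.25"));
        System.out.println("phase-two total: " + modern.toPlainString());
    }
}
```

Both paths compute the same result; the difference the purge phase buys is maintainability — future developers work against ordinary Java types instead of reverse-engineering COBOL conventions embedded in generated code.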
Most important is that the project delivered a variety of benefits designed to meet today’s critical defense challenges. In addition to delivering data where it needs to be for the people who need it, these include reduced security risks and technical debt from legacy hardware and software; cost savings from lower hosting expenses through economies of scale; the ability to incorporate valuable new application functionality such as artificial intelligence and machine learning; the preservation of critical DOD business logic; the use of cloud services including virtualization and metering; and the transition toward an Agile DevSecOps approach to system management. The shift to the cloud allowed DOD to reduce the system’s recurring annual operating costs by approximately 70%. It also increased flexibility, creating the opportunity to rapidly field critical new capabilities in support of the mission. Cloud services like backup, disaster recovery, auto-scaling, and elastic compute and storage let the organization do far more with less, and enable faster integration at the right speed.
But perhaps the biggest obstacle to any transition is security, a pressing concern now more than ever. That’s not to say legacy systems are inherently insecure; rather, transitioning to an unfamiliar environment always carries the risk of the unknown. From our perspective, however, all the major cloud service providers have gone through extensive accreditation processes. They are focused on security, but security is a shared responsibility, and the only way to be confident an application is secure is to make security your own priority and then monitor for it. That means using automation and architecture to verify your security posture, which can be a challenge, particularly when you’re interfacing with someone else’s legacy systems. Even so, for any major migration project, leveraging automation needs to be a central goal. And, as we learned, embracing a DevSecOps approach to implementation needs to be a key element of any serious technology roadmap.
Bill Osterhout is the director of Cloud & IT Solutions for Array Information Technology.