Federal agencies continue to confront cloud migration challenges a decade on


Most U.S. federal agencies began moving their data to cloud-based services about ten years ago. In 2011, the White House issued its Cloud First strategy, requiring agencies to evaluate safe and secure cloud technologies. This marked the first step toward accelerating cloud adoption among government agencies in a bid to reduce costs and improve the efficiency of services provided to citizens.

Since then, many agencies have moved more and more of their infrastructure to cloud platforms. Recent research found that nearly two-thirds of federal IT leaders are either using or starting to use the cloud for mission-critical applications. Despite this uptick in adoption, however, many federal agencies continue to grapple with cloud migration challenges.

Federal agencies face several challenges when deciding to migrate to cloud platforms, one of the biggest being legacy applications and technologies. Before cloud adoption became a major federal initiative, agencies operated from on-premises environments and within datacenters. Because of the sensitivity of the data housed in these networks, and because that data and those systems often do not migrate or interoperate cleanly with newer technology, some agencies have continued to run outdated operating systems, legacy databases, and other aging information systems: the mission and the protection of data matter more than the next big-ticket technology upgrade.

With the rise of the Internet of Things (IoT) and Big Data, and the difficulty of mining data as efficiently as before, technologies have been reworked to handle IoT's highly connected behavior and the vast amounts of data produced minute by minute. As a result, the way data and technology now integrate is far different from what the federal government is used to in on-premises and datacenter environments. An easy answer would be for agencies to move what they can to the cloud and leave everything else on-premises or in a datacenter; however, the challenge of legacy applications and technologies bleeds over into other areas such as budget, security, and governance, risk, and compliance (GRC), and the very factors that make this challenge so pressing to resolve also make that recommendation unviable.

This challenge is so pressing because remaining in these stagnant environments exposes a more serious problem, not just for the federal government but for any business or organization whose portfolio includes legacy systems. Although these legacy systems have not kept pace with current technologies, the threat landscape has. With quantum computing on the rise and threat actors positioning to use it against information systems, it is imperative that the federal government continue to move away from older technologies. Moving to the cloud, especially to platforms authorized under frameworks such as FedRAMP and FedRAMP+ (the DoD Cloud Computing Security Requirements Guide (SRG) Impact Level authorizations), not only addresses GRC, protects data, and ensures a consistent level of security across platforms and systems, but also adds a strong layer of defense-in-depth against these attacks.

For the federal government to feel comfortable with such a drastic change in how it approaches cybersecurity, there needs to be standardization. Standardizing the compliance requirements for protecting federal data through the frameworks mentioned above is only the starting point for how the government is overcoming the challenges of migrating to the cloud. Different parts of the government own different types and classifications of data: FedRAMP addresses civilian federal agencies, while FedRAMP+ (the DoD SRG Impact Levels) addresses the Department of Defense and the specific data tied to its missions and business functions. All of these frameworks are derived from the NIST SP 800-53 control catalog, which greatly adds to that standardization, and the unique needs of each part of government are addressed through detailed overlays, which provide standardization as well. When migrating to the cloud, this standardization gives the government assurance that the protective measures it employed in its on-premises and datacenter environments, which are FISMA-reportable and derived from NIST 800-53 security control requirements, are the same protective measures employed in cloud systems that hold FedRAMP and/or FedRAMP+ authorizations.
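To make that idea of shared control baselines concrete, here is a minimal, hypothetical sketch in Python that compares the NIST 800-53 control IDs selected by an agency's on-premises baseline against those in a cloud authorization baseline. The file names and the flat JSON-list layout are illustrative assumptions, not an official FedRAMP or OSCAL format.

```python
import json

# Hypothetical inputs: each file is assumed to hold a JSON list of
# NIST 800-53 control IDs, e.g. ["AC-2", "AC-17", "SC-7", ...].
ONPREM_BASELINE = "onprem_baseline.json"
CLOUD_BASELINE = "fedramp_moderate_baseline.json"

def load_controls(path):
    """Load a baseline file and normalize control IDs to upper case."""
    with open(path) as f:
        return {c.strip().upper() for c in json.load(f)}

def compare_baselines(onprem_path, cloud_path):
    """Report which controls the two baselines share or miss."""
    onprem = load_controls(onprem_path)
    cloud = load_controls(cloud_path)
    return {
        "common": sorted(onprem & cloud),        # covered in both environments
        "onprem_only": sorted(onprem - cloud),   # covered on-prem but not in the cloud baseline
        "cloud_only": sorted(cloud - onprem),    # added by the cloud authorization
    }

if __name__ == "__main__":
    report = compare_baselines(ONPREM_BASELINE, CLOUD_BASELINE)
    for bucket, controls in report.items():
        print(f"{bucket}: {len(controls)} controls")
```

A gap analysis along these lines, however it is tooled, is the practical payoff of everything deriving from the same 800-53 catalog: the two environments can be compared control for control.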

It is impossible to simply lift legacy systems, drop them into a cloud environment, and expect everything to operate as usual. Migrating to the cloud requires strategic planning and assurance that the data remains protected throughout the move. The level of effort is not trivial, and it cannot be done haphazardly, but with the threat landscape expanding and advanced persistent threats becoming more sophisticated in the style, scale, and variety of their attacks, the migration should be done quickly and treated as a top priority.

The mandate found in the Federal Information Security Modernization Act (FISMA) of 2014 is one reason federal agencies may seem to lag in adopting cloud solutions. Under FISMA, agencies are required to follow the NIST Risk Management Framework, with controls drawn from the NIST 800-53 series, and to report on various elements of their information systems and technologies. That requirement remains regardless of cloud adoption. The confusing part for agencies is bringing their cloud solutions into a "FISMA" reporting realm: although cloud solutions share some FISMA reporting elements, the way cloud solutions are managed is quite different. When agencies, cloud service providers (CSPs), and other stakeholders each devise their own procedures for managing cloud solutions, the result is roadblocks or even show-stoppers to adoption. Just as the security requirements for information systems and cloud environments are standardized through NIST 800-53, there should be a standardized methodology for cloud adoption. The frameworks mentioned here provide standardized methodologies, processes, and procedures for receiving authorizations, but the segment from identifying the need for a cloud solution to actually adopting one is missing and is made up as agencies go along.

Migration to the cloud is not a nice-to-have for federal agencies; it is a must. Not just for the cost savings and the standardization of cybersecurity requirements, but to address the threat landscape that is here today and still to come. Programs such as FedRAMP and FedRAMP+ have done an excellent job of creating standardization that meets information security requirements and of helping agencies migrate to the most secure platforms possible. These frameworks also continually improve based on feedback from federal agencies and on requirements set forth in presidential and executive directives. And although much has been done, improvements continue, and more initiatives are in motion, there remains a pressing need for a standard methodology that guides agencies through these processes in a clear, succinct way and gives high assurance of successful cloud migration.


Dr. Stephanie Carter is Principal of FedRAMP Advisory Services at Coalfire, a cybersecurity consulting firm.
