MicroPact Blog

Gartner Symposium/ITxpo 2015: Challenges with Cloud Initiatives in Government

Lessons On Managing The Cost of Migrating To The Cloud

By: Sandeep Khare

October 7, 2015

"In theory there is no difference between theory and practice, but in practice there is." - Yogi Berra (professional baseball player: May 1925 - September 2015)

In recent years, governments across the globe have taken bold steps to move their data and many of their applications into cloud environments. They are working to implement cloud-first policies driven partly by market forces, but mostly by the rising expectations of citizens who want reliable services delivered quickly and efficiently. The problem is that many are not getting the desired results, and one of the biggest reasons is cost.

At Gartner Symposium/ITxpo 2015, Analyst and Research Director Neville Cannon discussed why government cloud initiatives can become a burden and what lessons have emerged from real-life cloud migrations.

Neville outlined four key factors that have resulted in much higher costs than originally forecast, in turn causing agencies to move to an alternative approach or abandon their initiatives altogether.

Factor One: Private Cloud Failures

Neville Cannon speaks at the Gartner Symposium/ITxpo

As part of their cloud-first strategies, many government agencies prefer private clouds contained within their own IT environments. Many leaders see this as a way of maintaining control over their data while enhancing security. Neville pointed out that one of the biggest myths about the public cloud is that it is not secure. In fact, many well-established public cloud vendors provide environments that are far more secure and less prone to data breaches or external hacking.

Private cloud initiatives may become costly or fail because:

  • Agencies focus on moving more and more things to the cloud without clear procedures or forecasts of how much it will cost.
  • The wrong technologies are used.
  • There is a culture of overpromising and underdelivering.
  • Efforts focus on the wrong benefits.
  • Customers do not see the value, so the agency ultimately moves away from using these systems.

Neville noted that NASA had deployed a private cloud within its own IT environment, but the initiative was stopped after three years because it was not yielding the desired value for stakeholders: its performance was poor, and its ability to push content to users did not scale well. NASA eventually moved to a public cloud, which turned out to be much more reliable, robust, well supported, easily scalable, and equally secure. More importantly, by moving to the public cloud, NASA decreased the risk of failure.

Another factor to consider is that maintaining private cloud environments often costs more. There is also a high risk of failure, as private clouds tend not to employ the latest technologies that can scale easily to handle large volumes of data transactions and unpredictable user loads.

As an example of successful cloud use, Neville pointed to the CIA, which began using Amazon Web Services (AWS) public cloud technology for many of its applications last year. Amazon pushed 500 functional upgrades in just one year, something that would not have been possible within a private cloud. The AWS environment proved to be extremely agile, cost-effective, and highly secure, with cutting-edge technology.

Factor One Takeaway: Consider using a public cloud from a known, well-established vendor if possible. Alternatively, if you intend to use a private cloud, keep it simple and focused.

Factor Two: Unexpected Costs Will Occur

Unplanned or unexpected costs are another major factor Neville addressed that can make cloud-focused investments very expensive. Many organizations do not fully understand the intricacies of a cloud environment or the compliance-related regulations that apply to many government systems.

In one example, a federal agency had 44 self-declared applications in the cloud, but an audit found 130, and none of them met FedRAMP certification. Reportedly, there was no oversight when the agency started moving its applications to the cloud. Audits like these can suddenly add to the overall cost and may require moving applications back out of the cloud environment. Another federal agency cited spent more than $20 million on a cloud initiative before the vendor went bankrupt; because there were no clearly laid-out recovery plans, retrieving the data became a herculean task.

Factor Two Takeaway: IT or another supervising body is needed to maintain oversight of SaaS acquisitions across the organization and prevent an excessive number of applications from moving to the cloud. Additionally, there should be a plan for exiting the cloud if it comes to that.

Factor Three: Legacy Migration Is Not Free

Another major challenge that many agencies face is what to do with legacy applications. Should they be moved to the latest technology? Should they stay in-house or be transitioned to the cloud? Should they be re-hosted or re-architected?

Taking a cloud-only approach may not be the best path when it comes to legacy applications because of the cost involved in reprogramming them. Oftentimes, a lack of proper documentation makes it difficult to understand the original business requirements. Additionally, many of these applications have been running reliably for many years, possibly for decades – there is no guarantee that moving them to cloud will result in the same level of stability.

In a success story Neville cited, the Dutch government consolidated its 64 data centers down to four, three of them managed by commercial vendors. It is taking a phased approach to sharing applications, improving the underlying architecture as it moves selected systems into a cloud environment.

Factor Three Takeaway: Legacy migrations to the cloud must be done with extreme caution, slowly, and in phases. Look for vendors who specialize in a specific type of legacy application and have the technical knowledge to support such a migration.

Factor Four: Poor Operational Practices

Cloud is not just an IT change; it brings change at the organizational level. Many organizations do not think about the services and change-management capabilities required to accompany a migration to a cloud environment. This can be compounded by vendors who do not know how to engage with CIOs or other business leaders on the government side, or how to communicate the right strategy for moving assets into the cloud. Operational practices that fall short of the highest standards, or a lack of oversight, may result in redundant or low-value applications moving to the cloud, which in turn leads to higher overall costs.

Factor Four Takeaway: It’s important to focus resources where there is potential for maximum return, such as the value delivered to stakeholders. It also helps to carefully select the cloud attributes you want to exploit, such as scalability, agility, and/or interoperability with other applications, and to decide what data you want to share. Lastly, consider the design of the system: keep it simple, focus on the end user, and include tools to monitor performance and system usage.

Look to Scalable Secure Hosting Solutions

It is essential that government agencies look to scalable enterprise solutions capable of accommodating increased user traffic and content through highly secure data-hosting operations such as MicroPact’s. MicroPact is a FedRAMP-compliant Cloud Service Provider (CSP). The MicroPact Product Suite is available to federal agencies under FedRAMP via both Platform as a Service (PaaS) and Software as a Service (SaaS) models. Additionally, entellitrak is accredited and secure, with C&As based on NIST 800-53, DIACAP, and DCID 6/3.

About the Author

Sandeep Khare worked at MicroPact from 2003-2016, most recently in the Marketing department.