
Cloud Migration for Government Agencies in 3 Simple Steps


Federal and government agencies are being mandated to move data center services to the cloud and accelerate their cloud migration journeys. Instead of continuing to pay to operate and maintain their on-premises data centers, government agencies can move their data backup and disaster recovery to the cloud cost-effectively. Everything in the data center (clients, software, and servers) can be virtualized in the cloud. All employees need is a laptop and a secure connection.

Here are three simple steps that government agencies can take to begin their cloud migrations: 

  1. Consolidate: Implement Qumulo’s file data platform on prem for your scale-out NAS 
  2. Extend: Set up backup and recovery to the cloud with Qumulo continuous replication 
  3. Transform: Cut your workflows over to the public cloud, seamlessly, when ready 

Running one centralized file system (a global namespace) both on-premises and in the public cloud means your live data is already there when you are ready to move your workflows to the cloud.

Government Use Cases for Scalable Storage on Hybrid and Public Clouds

Qumulo’s file data platform provides massively scalable storage on hybrid and public clouds with the robust enterprise features government organizations need in a global namespace. A rich set of data services is built into the platform, including: 

  • single volume with multi-protocol file access for Linux, Windows, and Mac clients and applications
  • data protection features such as snapshots, replication, and quotas
  • data awareness – real-time analytics to oversee and control data use
  • programmable API – enables full automation and integration
  • data security – encryption at rest and in flight
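As a rough illustration of how a programmable API enables automation, the Python sketch below flags directories that are approaching their quota from a capacity-analytics payload. The payload shape, field names, and threshold are hypothetical assumptions for illustration only, not Qumulo's actual API response format.

```python
# Hypothetical sketch: flag directories nearing their quota limit from a
# capacity-analytics payload. The payload shape below is an assumption
# for illustration, not Qumulo's actual API response format.

def directories_near_quota(analytics, threshold=0.85):
    """Return paths whose used capacity meets or exceeds `threshold` of their quota."""
    flagged = []
    for entry in analytics:
        quota = entry.get("quota_bytes")
        if quota and entry["used_bytes"] / quota >= threshold:
            flagged.append(entry["path"])
    return flagged

# Example payload (fabricated values for illustration only)
sample = [
    {"path": "/research/genomics", "used_bytes": 90 * 2**30, "quota_bytes": 100 * 2**30},
    {"path": "/research/imaging",  "used_bytes": 20 * 2**30, "quota_bytes": 100 * 2**30},
    {"path": "/scratch",           "used_bytes": 50 * 2**30, "quota_bytes": None},
]

print(directories_near_quota(sample))  # ['/research/genomics']
```

A script like this could run on a schedule and alert administrators before a directory fills its quota, which is the kind of automation the built-in analytics and API are designed to support.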

Federal and government agencies such as the National Institutes of Health (NIH), the Department of Energy (DoE), and the Virginia Tech Transportation Institute (VTTI) are benefiting from using public or hybrid cloud services. For illustration, I’ve highlighted two use cases below. 

Use Case 1: Collaborative Research Across Multiple Sites, Agencies, and Entities

The COVID-19 pandemic has highlighted the need for multi-agency and even international collaborative research. However, traditional data centers do not have a means to provide access to digital resources (files) produced and analyzed by distributed research teams. 

They don’t have the infrastructure, security, and technology required to establish a “global namespace” whereby users at different locations can work together and share data in real time.  

To solve this problem, agencies can use the public cloud as a centralized “data center” where researchers from all over the world can use cloud-native compute resources to run their applications with a global namespace on the cloud and on-premises.

For example, the Institute for Health Metrics and Evaluation (IHME) leveraged Qumulo, Western Digital, and Microsoft Azure to host data visualizations. This enabled IHME to deliver COVID-19 visualizations within days of the initial requests, and rapidly scale to meet the needs of governments and healthcare officials tasked with keeping communities safe around the world.

“Visualizations are core to IHME communications with policymakers for the scientific papers that are rigorously peer-reviewed by journals. Qumulo is critical to enabling us to distill hundreds of millions of data points into a single visualization, which allows policymakers to easily view the results and communicate them to their teams,” said Serkan Yalcin, Director of IT Infrastructure, IHME.

Use Case 2: Global Continuity of Operations on the Cloud

With data created and accessed from endpoints around the world, it is difficult to build a global continuity-of-operations infrastructure that ensures all data remains accessible in the event of an outage or disaster. 

By utilizing public cloud services for backup, archive, and cloud disaster recovery, customers can build a flexible, on-demand architecture that takes advantage of all the high availability features of cloud infrastructure providers. With Qumulo, organizations can easily and affordably leverage the value of massive data sets distributed across on-premises data centers and multiple clouds, with security built in. 

The Qumulo file data platform provides multi-tiered data protection for your agency’s digital assets, and runs the identical file system, features, and user experience on-premises and on the public cloud. For example:

  • Continuous Replication moves only changed data, instantly and without major bandwidth implications
  • Instant Failover and Failback allows customers to turn cold sites into hot sites immediately 
  • Qumulo SHIFT for Amazon S3 allows customers to move archive data to S3 or lower-cost storage in AWS 
  • Backup/recovery and archive capabilities include: on-prem to on-prem, on-prem to cloud, cloud to cloud, one to many, many to one, and many to many
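To make the "moves only changed data" idea in continuous replication concrete, here is a minimal, self-contained Python sketch of generic delta detection: split data into fixed-size blocks, hash each block, and compare against the hashes from the previous pass so only changed blocks need to cross the wire. This is a simplified illustration of the concept, not Qumulo's actual (snapshot-based) replication implementation.

```python
import hashlib

# Generic sketch of delta detection: hash fixed-size blocks and compare
# with the previous pass, so only changed blocks are transferred.
# Illustration only; not Qumulo's actual replication mechanism.

BLOCK_SIZE = 4096

def block_hashes(data: bytes):
    """Hash each fixed-size block of `data`."""
    return [
        hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
        for i in range(0, len(data), BLOCK_SIZE)
    ]

def changed_blocks(old_hashes, new_data: bytes):
    """Return indices of blocks that differ from the previous pass."""
    new_hashes = block_hashes(new_data)
    return [
        i for i, h in enumerate(new_hashes)
        if i >= len(old_hashes) or old_hashes[i] != h
    ]

old = b"a" * BLOCK_SIZE + b"b" * BLOCK_SIZE
new = b"a" * BLOCK_SIZE + b"c" * BLOCK_SIZE  # only the second block changed
print(changed_blocks(block_hashes(old), new))  # [1]
```

Transferring only block 1 instead of the whole file is what keeps the bandwidth cost of frequent replication passes low.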

Federal and government agencies can support continuity of operations while easing migration with a global namespace: one centralized file system on-premises and on the public cloud.

  • Eliminate risk: do all your testing on the backup in the cloud
  • Lower costs: stop storing secondary copies in your data center
  • Ease migration of workflows: with a nondisruptive cutover to public cloud 

“At SDSC, delivering critical analysis and results is paramount, yet high-performance computing workloads are incredibly dependent upon their storage system. As an organization, we are moving towards integration of cloud for both compute and storage, as a part of our science gateways. As a result, it’s important for us to make leading cloud technologies available via our Research Data Services division,” said Amit Majumdar, Ph.D., Director of Data Enabled Scientific Computing.

Learn More

Take a test drive. Demo Qumulo in our interactive Hands-On Labs.

Subscribe to the Qumulo blog for customer stories, technical insights, industry trends and product news.
