
Three Ways Qumulo Helps Top Research Institutes Grow New Business Models to Advance Discovery and Innovation


Like many of you, we’re looking forward to the Bio-IT World virtual conference coming up on October 6-8. This is always an interesting event for industry leaders focused on biomedical research, drug discovery and development, and healthcare research.

Academic institutions, scientific research labs, bio-sciences firms and others are under constant pressure to speed innovation and time to results. As a result, they need an IT infrastructure and file data platform that can:

  • Offer extreme scalability to keep up with massive data growth, especially in artificial intelligence (AI) and machine learning (ML)
  • Manage billions of small and large files and enable consistent high performance
  • Improve overall collaboration by enabling regionally dispersed teams to quickly and easily access shared data resources for processing, analytics, and simulations
  • Support hybrid and cloud environments as needed for burst and elastic computing, plus the ability to leverage cloud applications not available in their data center

Organizations focused on research computing and high performance computing (HPC) workloads are utilizing Qumulo’s file data platform to accelerate sequencing, simulations, and analytics to speed research, learning, and discovery. With Qumulo they’re able to focus on their science instead of their storage.

Here are three ways that scientific research labs, research institutes, and academic organizations have been able to shift their focus from managing an inefficient storage system to developing innovative new forecasting models, growing their business to better serve end users, and giving management informed insight into their data usage, system capacity, and performance.


Implement Data-Based Forecasting Models Faster – The Institute for Health Metrics and Evaluation (IHME) is an independent global health research organization based at the University of Washington School of Medicine. In February 2020, IHME was urgently tasked with pandemic modeling to help hospital systems and state governments.
Almost overnight, IHME needed massive additional data resources. By coupling two powerful technologies, Qumulo for on-prem data processing and Microsoft Azure for hosting data visualizations, IHME was able to respond within 48 hours of the initial request and rapidly scale to meet the needs of governments and healthcare officials tasked with keeping communities safe around the world.

The agility of Qumulo’s architecture enabled IHME to accommodate the sudden growth in both data ingest and processing demands without the need to re-architect. Qumulo made it possible to ingest, process, and leverage data coming in from a wide range of internal and external sources. Since the COVID-19 pandemic hit, IHME’s Qumulo systems have helped analyze up to 20x more data every day.


Grow a Managed Service Offering for High Capacity, High Performance Workloads – Few research organizations can afford their own supercomputer and advanced storage systems, so they depend on specialized Managed Service Providers (MSPs) like the San Diego Supercomputer Center (SDSC) for remote computing and storage capacity for data-intensive research. As an MSP, SDSC’s data-intensive gateways and client technology stacks must support high-performance, high-capacity storage for massive amounts of big data, much of it unstructured. Its previous data platform lacked the capacity and features necessary to support big data, fast access, and advanced analytics.

Qumulo’s file data platform was simple to deploy, manage, and access – critical qualities for both SDSC and its clients. As a result, the Center can focus its staff and resources on highly impactful grants from the National Science Foundation, the National Institutes of Health, and other funding agencies rather than on managing an inefficient IT infrastructure.

Since deploying Qumulo’s file data platform, SDSC has acquired a new set of customers, including more than two dozen University of California research labs and departments.


Shift File Data Storage From a “Black Box” to a Precious Resource – The University of Utah’s Scientific Computing and Imaging (SCI) Institute is an internationally recognized facility specializing in visualization, scientific computing, and imaging analysis. With a strong focus on medical research, the Institute found that as its imaging data sets grew, its legacy platform could no longer meet its capacity and performance needs.

Nick Rathke, Assistant Director of Information Technology, noted a “fundamental difference with Qumulo: the ability to linearly scale capacity and performance at the same time.”

That’s a real benefit for grant-funded research organizations like the SCI Institute because it allows them to incrementally fund and expand storage capacity and performance as needed.

“It’s impossible to make smart decisions about data when your storage is a figurative black box,” notes Rathke. “Qumulo lets me know in an instant how the data is used, who touches it and how often – so storage is no longer a technical issue, but a management decision.”

If you’d like to learn more about Qumulo for high performance computing, read our solution brief here or download this Technical Insights paper by The Evaluator Group, which discusses today’s healthcare research data and storage challenges and the requirements for high performance, scalability, and cloud.

Contact us here if you’d like to set up a meeting or request a demo.
