This is part three of a four-part blog series that will take a closer look at the visual effects (VFX) workflow. In our previous posts, we covered the components of the VFX workflow, as well as the rendering process. Today we’re discussing the common data-related challenges that studios and artists face with VFX workflows.
Complex VFX Workflows, Unstructured Data, and Legacy Storage: What VFX Studios Need to Know
A visual effects (VFX) production pipeline is complex, with many interdependent processes.
One of the biggest challenges is managing the sheer volume of data required to produce photorealistic imagery. A single digital creature might be composed of hundreds, if not thousands, of digital assets, and a single shot can require assembling terabytes of data to be rendered and/or composited.
Volumetric data, which is essential to many bread-and-butter effects, including clouds, dust, water, fire, and fluids, is another example of extreme complexity. The data is challenging both because of its large footprint and because it often requires conversion to other formats before it can be used by other tools.
In general, the driver behind the growth of VFX is that technology has become better, cheaper, and faster, so animation and special effects have grown increasingly ambitious. In addition, visual effects and animation became critical during the COVID-19 pandemic for completing shows, commercials, and movies that otherwise would have been shot on set.
These complex effects require lots of unstructured data management and a modern file data platform that is designed to handle billions of files. The emphasis is on “file,” because VFX is a file-based workflow; files are the medium of exchange between applications that were not necessarily written by the same company. Workflows must integrate across applications, and efficient unstructured data management is the way to do that.
What VFX studios need to know: Legacy scale-out and scale-up solutions were not designed to handle today’s data volumes, file types, applications, and workloads. Data isn’t the same as it was 10 years ago. For this reason, VFX studios should not settle for outdated storage technology. Legacy storage systems can’t provide the visibility, control, and scale that you need to manage your data. (That’s why Qumulo built a better file data platform.)
The New Enterprise Data Storage Playbook
Unstructured data is everywhere, and it’s growing at uncontrollable rates. CIOs and IT leaders are turning to scalable storage solutions to manage this data and remain competitive. Block storage, object storage, and file storage each have unique capabilities and limitations, meaning enterprise-level storage systems are not “one size fits all” solutions.
Learn why unstructured data matters — and which storage solution is right for you.
Legacy Storage Pains: Performance, Scalability, Flexibility, and Visibility
Organizations that rely on unstructured data management platforms as heavily as VFX studios are generally sensitive to a variety of data storage pains. These include performance, scalability, visibility, and flexibility.
As more studios move to 8K and higher resolutions, performance will become increasingly critical. Storage systems that are too slow can starve the render farm or keep artists from working while rendering is in progress.
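To see why resolution drives storage performance requirements, a back-of-envelope calculation helps. The sketch below estimates the sustained throughput one uncompressed 8K stream demands; the frame dimensions, channel count, and bit depth are illustrative assumptions (16-bit half-float RGB at 24 fps), not any particular studio's format.

```python
# Rough estimate of sustained throughput for uncompressed 8K playback/rendering.
# Assumed format: 7680x4320, 3 channels, 16-bit (2-byte) half-float, 24 fps.

def frame_bytes(width, height, channels=3, bytes_per_channel=2):
    """Size of one uncompressed frame in bytes."""
    return width * height * channels * bytes_per_channel

def stream_gbps(width, height, fps=24):
    """Sustained throughput in gigabytes per second for one stream."""
    return frame_bytes(width, height) * fps / 1e9

per_frame = frame_bytes(7680, 4320)  # one 8K UHD frame
print(f"frame size: {per_frame / 1e6:.0f} MB")               # ~199 MB
print(f"24 fps stream: {stream_gbps(7680, 4320):.1f} GB/s")  # ~4.8 GB/s
```

Even one such stream approaches 5 GB/s sustained; a render farm reading source assets for many frames in parallel multiplies that figure, which is why a slow storage system becomes the bottleneck.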
Scaling is another challenge. The ever-increasing amount of data that VFX shops generate can easily fill up their data storage system.
Scalability and performance problems, which you would expect in any organization dealing with an ever-growing number of files, are exacerbated for VFX studios because their schedules cannot tolerate the delays involved in adding more resources.
Depending on the type of data storage system they’re using, demands for more capacity may mean that the studio must replace the entire system or buy new metadata controllers and storage shelves. With tight deadlines, there’s usually no time to build out the physical infrastructure.
Visibility is another limitation of legacy file storage systems. Most VFX shops use a treewalk script or program to manage their file data. Treewalks are extremely slow when file systems are large; administrators can wait for days to get answers, and by then it’s too late to use the data in any way that matters.
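To make the cost of a treewalk concrete, here is a minimal sketch of the kind of script such a report typically relies on. It is a simplified illustration, not any studio's actual tooling: it issues one metadata operation per file, so the runtime grows linearly with file count, and at billions of files a single pass can take days on legacy storage.

```python
import os

def treewalk_usage(root):
    """Naive capacity report: stat() every file under root.
    One metadata operation per file means cost scales linearly
    with file count -- the core reason treewalks are slow at scale."""
    total_bytes = 0
    file_count = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                total_bytes += os.lstat(path).st_size
                file_count += 1
            except OSError:
                # File deleted mid-walk: another drawback is that the
                # answer is already stale by the time the walk finishes.
                continue
    return file_count, total_bytes
```

A platform with built-in real-time analytics avoids this entirely by maintaining aggregate metadata as files change, rather than recomputing it from scratch.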
Here’s a common scenario: A small VFX shop might, when it first starts, use a storage system the team has put together themselves. As the business grows, that system will need to be replaced. The artists will need a system that’s fast enough for them to keep working while frames are being rendered. Moving to a scalable, commercial file data platform with enterprise features will help them take on more projects and create more sophisticated effects.
The Challenges of Rendering in the Cloud
The ability to leverage the cloud is a major benefit for VFX studios, but cloud rendering is not without its own set of challenges.
Many of the same considerations for on-premises rendering apply in the cloud, but there are some cloud-specific issues to keep in mind.
Unfortunately, while High-Performance Computing (HPC) resources in the cloud are readily available, traditional file-based storage solutions are often inadequate, or are versions of legacy file storage with some patches applied to make them “cloud-ready.” Problems include lack of protocol support, performance and capacity limitations, and complexity in setting up the cluster. (RELATED: Block Storage vs Object Storage vs File Storage: What’s the Difference?)
It’s important that studios match the performance of their on-premises render farm in the cloud. They need to be able to scale performance and capacity separately, to take advantage of the flexible resources the public cloud offers. Studios also need the ability to transfer files from the on-premises cluster to the instance in the cloud, and then return the results back to the on-premises cluster. (RELATED: How to Copy Objects Between S3 Buckets and Clusters.)
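One common way to gain confidence in that round trip, whatever transfer tool is used, is a checksum manifest: hash every file before pushing a batch to the cloud, then hash the returned results and compare. The sketch below is a hedged, stdlib-only illustration; the function names are our own, and real pipelines would layer this over their actual transfer mechanism.

```python
import hashlib
import os

def checksum_manifest(root):
    """Map each file's path (relative to root) to its SHA-256 digest.
    Build this before sending a batch to the cloud, then rebuild it on
    the returned results to confirm nothing was corrupted in transit."""
    manifest = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            digest = hashlib.sha256()
            with open(path, "rb") as fh:
                # Read in 1 MiB chunks so large media files don't
                # have to fit in memory.
                for chunk in iter(lambda: fh.read(1 << 20), b""):
                    digest.update(chunk)
            manifest[os.path.relpath(path, root)] = digest.hexdigest()
    return manifest

def verify_round_trip(before, after):
    """Return the relative paths whose contents changed or went missing."""
    return sorted(p for p, d in before.items() if after.get(p) != d)
```

Running `verify_round_trip` on the manifests from before and after the transfer yields an empty list when every file survived intact, and names the damaged files otherwise.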
Most of the file-based cloud systems cannot address all of these requirements. Qumulo Cloud Q was designed to solve all of the pains outlined above.
- Performance: Ultra-fast performance handles massive files with ease.
- Scalability: Simple modular design. Just add nodes to scale to petabyte levels. Gain performance and capacity in minutes. No disruption or downtime.
- Flexibility: Runs on-premises and/or in the cloud.
- Visibility: Monitor performance, capacity, and usage with built-in real-time analytics.
The Definitive Guide to Qumulo on AWS
Qumulo simplifies migrations to the cloud for unstructured data stored in file systems, making Cloud Q for AWS an attractive choice for many workflows.
Qumulo has several helpful resources for learning more about VFX and how our file data platform helps speed VFX production pipelines. Check out our additional resources and data sheets on VFX, Animation, and Media & Entertainment to learn how we’ve helped companies like FuseFX and Cinesite Studios bring movies and animation programming to audiences faster!
In part 4, the last of this series, we’ll share examples of how VFX studios are innovating faster with Qumulo’s file data platform.
VFX 101 series
- Part 1: The History of VFX and Its Workflows for Modern Film and Animation Production
- Part 2: What is Rendering and Why are Render Farms Important?
- Part 3: Common Data Challenges For VFX (And How to Solve Them)
- Part 4: VFX Rendering In The Cloud With Qumulo
Special thanks to Matt Ashton and Kristi Whitman for their contributions to this blog series.