Creating 3D Immersive Experiences at Falcon's Creative Group

Posted April 12, 2018. Filed under Company.

This is a guest post from QF2 user Falcon's Creative Group. Visit their website to learn more about the exciting work that they do.

In 2016, National Geographic was invited to document the restoration of The Edicule, a small structure within the Church of the Holy Sepulchre, which encloses the remains of a cave that has been venerated since at least the fourth century A.D. as the tomb of Jesus Christ.

Once the restoration was complete, they wanted to create an exhibition for their museum in Washington, D.C. that would let visitors virtually visit the church and learn about its history. They knew no one was better suited to the work than Falcon's Creative Group, in Orlando, Florida. Falcon’s Creative Group specializes in immersive experiences for theme parks, museums, hotels and more.

The result was Tomb of Christ: The Church of the Holy Sepulchre Experience, a 10-minute 3D experience. Audiences stand in front of a 270-degree screen that goes below the level of their feet and up to the ceiling. Visitors feel as if they're flying through the building.

You can get an idea of the experience here:

 

The projects that Falcon's Creative Group takes on require that they stay on top of the latest advances in live-action filmmaking, computer-generated imagery (CGI), visual effects (VFX), virtual reality (VR), mixed reality (MR), and audio design. Saham Ali, Director of Technology at Falcon's Creative, knows that working with demanding technologies requires file storage that's fast, scalable, and easy to manage.

In this post, Saham explains the storage requirements for 3D and VR workloads and talks about how QF2 helped Falcon’s Creative Group bring The Church of the Holy Sepulchre to life.

Turning LIDAR scans into a 3D experience

Falcon's Creative is one of the first companies to render a 360-degree dome where objects come out of screen space into audience space and back into screen space, and can be viewed from any angle without breaking the stereo effect. To deal with the volume of data that VR generates, we switched to GPU-based rendering, but found that the existing storage couldn't handle the workloads and was maxing out on performance. So, before starting on the project with National Geographic, we brought in Qumulo File Fabric (QF2) on a four-node QC104 cluster.

National Geographic handed over the massive amounts of data they had scanned so we could replicate them as 3D objects. The scans themselves were shot at 10K resolution. An average EXR file that merged all the AOVs and was rendered in Maya would be anywhere from 2GB to 2.5GB per frame, with a frame rate of 30 fps. (AOVs, or Arbitrary Output Variables, provide a way to render any arbitrary shading network component into different images.)
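To get a sense of the scale those numbers imply, here is a rough back-of-envelope calculation (the frame size, frame rate, and runtime come from this post; the rest is simple arithmetic, not figures from Falcon's pipeline):

```python
# Back-of-envelope sizing for the merged-AOV EXR frames described above.
# Assumes ~2.25 GB per frame (midpoint of 2-2.5 GB), 30 fps, 10-minute runtime.

frame_size_gb = 2.25      # average merged-AOV EXR frame, in GB
fps = 30                  # playback frame rate
runtime_s = 10 * 60       # 10-minute experience

total_frames = fps * runtime_s                        # 18,000 frames
total_size_tb = total_frames * frame_size_gb / 1000   # decimal TB, rough

print(f"Frames: {total_frames:,}")                    # Frames: 18,000
print(f"Approx. frame data: {total_size_tb:.1f} TB")  # ~40.5 TB for one pass
```

Even before counting render layers, intermediate versions, and the LIDAR source data, a single pass of final frames at those sizes runs to tens of terabytes.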

A compositing artist could use anywhere from two to six read nodes, each of them reading an EXR file. With the old storage, they would scrub a frame and then have to step away from their computer while the scanline slowly crawled down the screen, and that was when nothing was even rendering.
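As a rough illustration of why scrubbing was so punishing, consider the read bandwidth a single artist can demand (the node count and frame size come from this post; the scrub rate is a hypothetical target, not a measured figure):

```python
# Rough per-artist read-bandwidth estimate for interactive scrubbing with
# multiple EXR read nodes. The scrub rate is an assumption for illustration.

frame_size_gb = 2.25   # average merged-AOV EXR frame (2-2.5 GB per the text)
read_nodes = 6         # worst case cited: six read nodes in one comp
scrub_fps = 4          # hypothetical: frames per second while scrubbing

per_artist_gb_s = frame_size_gb * read_nodes * scrub_fps
print(f"Peak read demand for one artist: ~{per_artist_gb_s:.0f} GB/s")  # ~54 GB/s
```

Multiply that by several artists plus an active render farm hitting the same storage, and it is easy to see how the old system reached its ceiling.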

As soon as we migrated to QF2, there was perhaps a second's delay but then that line would tear down the screen 100 times faster, and that was with rendering taking place. People could work while renders were happening. I never had to worry about how many nodes were accessing the storage at any given time and I didn't have to schedule when the artists could work. The problem of not working when we rendered disappeared, which is awesome. We threw everything we had at QF2 and nobody complained that things were slowing down.

The polygon problem

We knew there would be technical challenges when we agreed to take on the National Geographic project. One was the complexity of the LIDAR scans, where a single section of the tomb could be made up of 5 to 10 billion polygons. However, that was just the beginning. Because audiences would experience a seamless flythrough, multiple rooms needed to be loaded simultaneously, depending on what the camera would see.

Billions and billions of polygons would require many terabytes of RAM, so a brute force approach was impossible. We had to develop a caching mechanism that would drastically reduce the amount of RAM required. What we developed generated these very large caches and they all sat on QF2. It was such a relief when we knew we could actually render with the limited hardware we had and that QF2 could keep up and not negatively affect the artists.
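The post doesn't describe the internals of that caching mechanism, but the general idea can be sketched: keep only the room caches the camera currently needs resident in RAM, stream the rest from shared storage on demand, and evict the least recently used entries once a memory budget is reached. The following is a hypothetical illustration under those assumptions, not Falcon's actual implementation; RoomGeometryCache and load_room_cache are made-up names.

```python
from collections import OrderedDict


class RoomGeometryCache:
    """Minimal LRU cache for pre-baked room geometry caches.

    Hypothetical sketch: the baked caches live on shared storage (e.g. a
    QF2 export) and only the rooms the camera can currently see are kept
    in RAM, bounded by a byte budget.
    """

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.entries = OrderedDict()  # room_id -> (data, size_in_bytes)

    def get(self, room_id, loader):
        if room_id in self.entries:
            self.entries.move_to_end(room_id)   # mark as most recently used
            return self.entries[room_id][0]
        data = loader(room_id)                  # stream from shared storage
        size = len(data)
        # Evict least recently used rooms until the new one fits the budget.
        while self.used + size > self.budget and self.entries:
            _, (_, old_size) = self.entries.popitem(last=False)
            self.used -= old_size
        self.entries[room_id] = (data, size)
        self.used += size
        return data


def load_room_cache(room_id):
    # Placeholder loader: in production this would read a baked geometry
    # cache from shared storage (e.g. /mnt/qf2/tomb_caches/<room_id>.bin).
    return b"\x00" * 1024  # dummy payload so the sketch runs standalone


# Per frame, request only the rooms the camera can see; everything else
# stays on disk until it is needed again.
cache = RoomGeometryCache(budget_bytes=64 * 2**30)  # 64 GiB RAM budget
for room_id in ("edicule", "rotunda"):              # hypothetical visible set
    geometry = cache.get(room_id, load_room_cache)
```

The point of the sketch is the trade-off the paragraph describes: RAM is bounded by the budget rather than by the total polygon count, while the large caches themselves stay on fast shared storage.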

Falcon's Creative Group at NAB Show 2018

After his presentation at NAB Show 2018, Saham Ali, Director of Technology at Falcon's Creative Group, spoke with Qumulo about why he uses universal-scale storage in his data center and the advice he has for those building their own studios.

 

Download the Falcon's Creative Group Case Study

You can read more about the work of Falcon's Creative Group by downloading this case study (PDF, 4 pages).

