Stage Presence

by Jeromy Hopgood and Jeremy Gibson Bond

A Collaboration That Resulted in a New Approach to Design Communication

You don’t have to look far these days to see news about virtual reality (VR) and how it’s revolutionizing our world. For almost as long as the technology has been available, people have asked how VR can change the way we experience live performances like theater or concert events. For the authors of this article, the question became not how it can change the performance itself, but rather how it might change the ways in which we create those performances. Finding an answer led to a collaboration between Eastern Michigan University’s Jeromy Hopgood and Michigan State University’s Jeremy Gibson Bond. They set out to envision a new approach to using VR as a tool for facilitating design communication between designers and directors, as well as a time- and cost-saving pre-visualization tool.

“The approach of our multidisciplinary program is to combine the traditional academic components of theatrical design and technology with interrelated programs from across campus,” explains Hopgood of the Entertainment Design & Technology program at Eastern Michigan University. “A cornerstone of the program is ‘learning by doing,’ with an emphasis on each student taking on design or technology projects in their primary area of interest. These projects could be anything from designing for a theater piece, to dance productions, interactive media projects, or runway shows.”

As faculty in the ED&T program, a big part of Professor Hopgood’s job is mentoring student designers through the production process—from concept to opening night. “One of the hardest things for any designer, whether a first-timer or a seasoned veteran, is communicating your design in a way that helps the director understand both its possibilities and its limitations,” says Hopgood. “This can be especially true of scenic design. Many directors admit to a disconnect between seeing a design on the page and understanding its spatial relationship to the real world. Even a color model only goes so far toward ensuring that what the director sees in their head is accurate to what they will have in the theater. While programs like SketchUp allow students to create 3D renderings and even animated walk-throughs of scenery, these still don’t give directors the feeling of actually being on set. Virtual reality, however, can offer the next logical step in this process—a tool that allows the director to literally walk around the design and experience it in a realistic fashion. The question was how to do it.” As luck would have it, this worked out the way all good theater projects do: through collaboration.

Setting the Stage

Professor J. Bond specializes in game design and development as a Professor of Practice in the Media & Information Department at MSU. He explains, “Prior to teaching at MSU, I was a Professor of Practice at the University of Southern California’s Interactive Media and Games Division, and in both graduate school and my teaching at USC, I worked extensively with VR.” When MSU hired J. Bond in 2016, the school purchased an HTC Vive VR headset for him. At the time, he was collaborating with his wife, Professor Melanie Schuessler Bond, EMU’s costume designer and digital design professor, and it was then that M. Bond had her first experience with room-scale VR. Seeing the possibilities, she encouraged J. Bond to develop a software application that would allow users to experience a scenic design through VR. That idea eventually became Stage Presence, the software resulting from Hopgood and J. Bond’s collaboration.

For the first test, EMU Technical Director John Charles provided a Vectorworks 3D model for the set of an upcoming production of Macbeth. “Using the HTC Vive VR headset and Stage Presence in one of the theater spaces, director Lee Stille was able to experience a virtual walk-through of the set,” describes J. Bond. “After just a few minutes, Lee discovered an issue with the placement of a wall that would lead to blocking complications. Had he not seen the set in VR, the wall would have been built before the problem was noticed, at which point moving the wall would have been both costly and time-consuming. Because of Stage Presence, the change took minutes on the plans. On that first day, Stage Presence proved a huge hit at EMU, and for three hours there was a nonstop parade of faculty, students, and even the Dean of the College of Arts & Sciences waiting to take a turn in VR.”

Professor Melanie Bond’s concept was now proven, and she put together a grant to get the ED&T program its own HTC Vive and a laptop to run Stage Presence. The main requirement for the laptop is a high-performance graphics card; the current recommendation is an NVIDIA GTX 1070. Since getting the equipment, Stage Presence has been used for every mainstage show at EMU, including student-designed productions.

The Technology

Stage Presence is built in the Unity game engine and runs on the HTC Vive VR headset. While other headsets can provide similar image quality to the Vive, only the Vive allows a true “room-scale” experience in which users can walk around the set and feel like they are actually there, with an absolute minimum of motion sickness. Unlike other headsets that use a camera (PlayStation VR and Oculus Rift) or a gyroscope (phone-based VR like Google Cardboard) to track rotation and motion, the Vive uses “lighthouses” positioned in opposite corners of the physical space, which has a maximum size of about 15’x10’. Each lighthouse sweeps a carefully timed infrared laser across the space, and sensors on the Vive headset and controllers record when they are hit by this laser. The precise timing allows the headset and controllers to know exactly where they are in the interaction volume to within 1 mm. This method of tracking drastically improves immersion and decreases motion sickness: the virtual world reacts to users’ movements just as the real world does, and each real-world step is matched 1:1 in the virtual world.
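The geometry behind this kind of tracking can be sketched simply (the sweep period, function names, and 2D simplification below are illustrative only, not Valve’s actual protocol): a laser hit timestamp converts to a bearing angle from the lighthouse, and bearings from two lighthouses at known positions intersect at the sensor.

```python
import math

# Illustrative rotor period; the real hardware sweeps at its own fixed rate.
SWEEP_PERIOD_S = 1 / 60  # one full laser revolution

def hit_time_to_angle(t_hit, t_sync):
    """Convert the delay between the sync pulse and the laser hit
    into a bearing angle (radians) from lighthouse to sensor."""
    return 2 * math.pi * (t_hit - t_sync) / SWEEP_PERIOD_S

def triangulate_2d(base_a, angle_a, base_b, angle_b):
    """Intersect two bearing rays from lighthouses at known positions
    to recover the sensor's position (horizontal plane only)."""
    ax, ay = base_a
    bx, by = base_b
    da = (math.cos(angle_a), math.sin(angle_a))  # ray direction from A
    db = (math.cos(angle_b), math.sin(angle_b))  # ray direction from B
    denom = da[0] * db[1] - da[1] * db[0]        # 2D cross product
    t = ((bx - ax) * db[1] - (by - ay) * db[0]) / denom
    return (ax + t * da[0], ay + t * da[1])
```

The real system tracks many sensors across both horizontal and vertical sweeps, which is what makes full six-degree-of-freedom pose recovery possible.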

In Use

Three different scenic designers at Eastern Michigan have used Stage Presence on five separate shows. Hopgood comments, “The results have been overwhelmingly positive. In each instance, there were different challenges aided by the use of VR. When I designed both scenery and projections for a production of Christmas Carol’d, the backdrop was composed of a series of three-dimensional floating elements, such as windows, clock faces, and doors, used as projection surfaces. Since the play is a fast-paced retelling of the Dickens classic, the unit set had to accommodate quick scenic shifts and overlapping scenes. Terry Heck-Seibert, the director for the production, specifically requested to use Stage Presence as an opportunity to better understand the set and how the projection surfaces would work. It let Heck-Seibert better plan the fast-paced transitions and discuss projection design options.”

Hopgood offers another example, “For a student-designed production of A Raisin in the Sun, set designer Aaron Delnay created a 3D model of the set with SketchUp to use in Stage Presence. One strength of the Stage Presence software is the ability to use a number of different 3D modeling applications to create the set, instead of being tethered to any single one. For this production, the theater space was something between a thrust and a traditional proscenium, with complicated sightlines. Using Stage Presence allowed the designer and director to literally look at the set from the problem spots in the auditorium and determine the adjustments needed to accommodate sightlines. This example was a particularly beneficial collaboration, as understanding how best to deal with these types of hurdles can be a unique teaching challenge.”
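The sightline question the team was evaluating by eye reduces to simple plan-view geometry (a hypothetical sketch; Stage Presence itself does not expose such a calculation): a seat can see a point on stage only if the straight line between them crosses no scenic wall.

```python
def segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 strictly crosses segment p3-p4 (2D plan view)."""
    def cross(o, a, b):
        # Signed area test: which side of line o->a does point b fall on?
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(p3, p4, p1)
    d2 = cross(p3, p4, p2)
    d3 = cross(p1, p2, p3)
    d4 = cross(p1, p2, p4)
    # Endpoints of each segment must straddle the other segment's line.
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def seat_can_see(seat, stage_point, walls):
    """A seat sees a stage point if the sightline crosses no wall segment."""
    return not any(segments_intersect(seat, stage_point, w[0], w[1])
                   for w in walls)
```

Checking every seat in a ground plan against every key stage position this way is essentially what walking the auditorium in VR lets a designer and director do intuitively, in three dimensions and in minutes.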

The Possibilities

“The exciting thing is how many challenges can be alleviated with the technology, even at this early point,” says J. Bond. “Stage Presence is a remarkable tool for facilitating the design process, especially with scenery.” “That said,” Hopgood adds, “we see a lot of potential for other design areas as well. Currently, we are looking at the possibility of plugins for Unity that can pull lighting information from a plot. Pre-visualization for projection design is another possibility. With the built-in audio capabilities of the Vive headset, it is also possible to integrate sound design elements into the VR experience. Another exciting option is using VR as a collaborative tool for envisioning design elements—with a director in the VR headset and a designer on the computer creating a ‘virtual white model’ of sorts.”

J. Bond does clarify that “there are some limitations to how we can currently use the software. For instance, in multi-set designs, it is less time-consuming to simply create individually rendered designs of the different scenes and switch between them inside of Stage Presence, rather than trying to manipulate the transition inside of a single file.” Still, the advantages of the software are numerous, with a wealth of possibilities yet to be explored. Both professors conclude, “We’re excited to see where the technology takes us.”

If you would like to check out the current beta version of Stage Presence and see how easy it is to incorporate it into your productions, you can download the software and instructions from http://stagepresence.net.