Faculty and students integrate live performances into VR, providing a unique theatrical experience

Links in this article are preserved for historical purposes, but the destination sources may have changed.

Wednesday, December 12, 2018

The action took place on stage, but audience members of a cutting-edge theater performance at the University of Iowa were virtually transported by elevator to multiple levels of an early 1900s hotel, including a meadow on the roof.

The performance, titled Elevator #7, was a collaboration between UI faculty and students in computer science, theater arts, dance, and art and art history that created an extraordinary new artistic experience.

“The basic idea was to explore what it would mean to have a theatrical experience in a virtual environment,” says Joseph Kearney, professor in the Department of Computer Science and interim dean of the College of Liberal Arts and Sciences.

Creative team

Faculty
Joseph Kearney: co-creator, virtual reality technical director
Daniel Fine: co-creator, artistic director, live video
Alan MacVey: co-creator, writer, director
Bryon Winn: co-creator, sound design
Monica Correia: co-creator, 3-D design
Joseph Osheroff: primary performer

Students
Marc Macaranas: live video, choreography
Sarah Gutowski: 3-D modeling, texture
Xiao Song: virtual reality programmer
Ryan McElroy: sound design, sound board operator
Chelsea Regan: costume designer
Actors: Ashlynn Dale, Octavius Lanier, Chris Walbert, Shelby Zukin
Production assistants: Courtney Gaston, Nick Coso

The project represents the next generation of immersive theater, building on works such as the site-specific, interactive theatrical production Sleep No More, by Punchdrunk Theater Company in New York City, and mixed-reality performances that have premiered at film festivals around the world. While public interest and commercial investment in virtual reality (VR) are booming, project leaders say there are few examples of successful integration of live performance into virtual worlds—at least as purely artistic endeavors.

“There is a lot of research being done using live performance and theatrical techniques within virtual reality, but the end goal often is not to solely create a piece of art,” says Daniel Fine, assistant professor of digital media in performance. “More often they are using it for activities such as training first responders to deal with a viral outbreak.”

Kearney says he and Alan MacVey, professor and chair of the UI Department of Theatre Arts and director of the Division of Performing Arts, had been talking on and off for many years about combining virtual reality and theater. Finally, they decided to put the idea into action. Fine and Kearney applied for and received a UI Creative Matches grant, which allowed them to purchase additional trackers and software for generating the virtual environments, and to pay undergraduate and graduate students to develop interactive VR apps and build 3-D scenes. More than a year later, they rolled out the performance to a live audience on Dec. 6.

Walking into a virtual hotel
The premise of the UI’s performance was simple: One audience member at a time is escorted onto the stage of E.C. Mabie Theatre, where they find a desk, a chair, and a chandelier. Curtains hide the back of the stage and also the seats where the usual audience sits. While music plays, an actor portraying a concierge welcomes the audience member, who then puts on a VR headset.

The curtain at the back of the stage rises, revealing a green screen, a row of computers, and crew members manning lights, sound, live video, and other technical elements of the show. The audience member, however, sees none of this. Instead, they still see the concierge, table, chair, and chandelier, which now appear in the lobby of an early 20th century hotel.

The concierge invites them to step into an elevator. Then begins a series of scenes that include rising two floors and then falling into the basement. There the audience member tries to find a circuit breaker to fix the elevator, finds a prisoner in a cage, peers into a tiny door to see two people trying to avoid being crushed by giant tap-dancing feet, and discovers an electrician who tries to solve the problem. At last, rising on the elevator again, they arrive at a beautiful meadow on the roof of the hotel.

The show is a mix of live and prerecorded action. The giant tap-dancing feet are prerecorded, but the concierge and the people trying to avoid those feet are actors who appear in real time.

“We’re trying to blur reality and virtual reality,” Kearney says.

One of the first—and most difficult—decisions that project leaders had to make was how to embed the live actor into the virtual environment. One option was to have the actor also wear a VR headset. This would mean they would be in the same physical space as the audience member, but they would appear as an avatar. Even with the most realistic-looking avatar possible, the team risked entering the “uncanny valley”—the unsettled feeling people get when simulations or robots closely resemble humans but aren’t quite perfect.

In the end, they decided to use green screen technology to play live video of the actor into the audience member’s headset. While the actors are on the same stage as the audience member, they are not standing next to them, even though the audience member may perceive them to be.

“What feels more real in this virtual world? The real representation of a person or a digital avatar?” Fine says. “In theater, we have thousands of years of experience and mostly know that this will work or this won’t. Someone has done it all before. We don’t know if this method we spent a year developing is the right approach. What’s exciting is the research, and these tests will help us find out.”

Because the actors are being projected on an invisible video screen in the virtual space, they appear 2-D, not 3-D. However, Kearney says this wasn’t a problem.

“While it’s a two-dimensional person, your head fills in a lot of information, particularly when the actor turns, gestures, and moves aside as you enter the elevator,” Kearney says. “When you watch TV, people don’t look flat. As long as we can prevent certain viewpoints, it looks 3-D.”
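The compositing approach described above amounts to a standard chroma key: pixels in the live camera feed that are close to the key green are replaced, so only the actor remains against the virtual hotel. A minimal pure-Python sketch of that idea, with made-up pixel values and a hypothetical tolerance (the production's actual pipeline is not described in this detail):

```python
def chroma_key(frame, background, key=(0, 255, 0), tol=100):
    """Replace near-green pixels in a video frame with backdrop pixels.

    frame, background: lists of (r, g, b) tuples of equal length.
    tol: how close (by Manhattan distance) a pixel must be to the key
         color before it is keyed out. Illustrative value only.
    """
    out = []
    for fg, bg in zip(frame, background):
        # Distance from this pixel to the key green.
        dist = sum(abs(c - k) for c, k in zip(fg, key))
        out.append(bg if dist < tol else fg)
    return out

# One row of video: a skin-tone "actor" pixel flanked by green-screen pixels.
frame = [(0, 250, 5), (200, 160, 130), (10, 240, 0)]
lobby = [(90, 70, 50)] * 3  # hotel-lobby backdrop pixels
print(chroma_key(frame, lobby))  # actor pixel survives, green is replaced
```

In the show, the keyed video is then mapped onto a flat, invisible plane in the virtual scene, which is why the actor reads as 2-D up close but convincing from controlled viewpoints.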

Overcoming challenges
One of the obstacles project leaders faced with Elevator #7 involved space and movement. Kearney says movement can be difficult in a virtual environment because the user’s head must be tracked so the virtual environment knows where they are and can show the correct images. While tracking technology is improving, it’s still limited to a confined space—in the case of the show on the E.C. Mabie Theatre stage, sensors tracked the audience member in a 16-foot-by-16-foot space.

“That’s all you can move around,” Kearney says. “So, we played with having a combination of the person moving and being moved. In this case, using an elevator to expand the space of experience.”
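The “moving and being moved” trick Kearney describes can be sketched as adding a scripted vehicle offset to the tracked head position: the user only ever walks within the 16-foot play area, but the elevator’s scripted height carries them through an arbitrarily tall building. A hypothetical sketch (the names, units, and clamping behavior are assumptions, not the production’s code):

```python
PLAY_AREA = 16.0  # feet; the tracked square on the Mabie Theatre stage

def virtual_position(tracked, elevator_height):
    """Map a tracked head position to a position in the virtual hotel.

    tracked: (x, y, z) in feet relative to the stage. Horizontal axes are
    clamped to the tracked play area; the elevator's scripted height is
    added to the vertical axis, "moving" the user without them walking.
    """
    x, y, z = tracked

    def clamp(v):
        return max(0.0, min(PLAY_AREA, v))

    return (clamp(x), y + elevator_height, clamp(z))

# The user stands on the same spot on stage, but the elevator has risen
# two floors (say 24 ft), so the camera is placed high in the virtual hotel.
print(virtual_position((8.0, 5.5, 8.0), 24.0))  # -> (8.0, 29.5, 8.0)
```

The same offset can go negative for the fall into the basement, which is how a fixed physical footprint expands into a multistory space of experience.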

Along with the technical obstacles the project presented, the actors faced their own unique challenges. To look at the audience member, the actor looked into a camera in front of the green screen. Next to the camera was a monitor that showed what the audience member was seeing. Nearby was another monitor that showed different camera angles of the actor.

“I have to constantly be aware of where I’m standing and looking,” says Joe Osheroff, a lecturer in acting in the UI Department of Theatre Arts. “There are different marks on the floor that I have to hit because if I’m on the wrong mark, I can be cut off or I can appear to be 10 feet tall. The plane of existence I’m in is different from what is being projected in the audience’s headset.”

Performing in front of a green screen isn’t new for actors, but Osheroff says he’d usually have something to look at, such as an object on a fishing pole that later would be turned into a computer graphic.

“This is a totally different experience,” Osheroff says. “I don’t think there are a lot of actors out there who would know how to do this. And I still don’t know how to do it. But that’s one of the exciting parts of this project, that we are trying to do something totally new and there is no blueprint to follow.”

This was just one way in which participants found themselves in the Wild West.

“We’re all pioneers here,” Fine says. “We know how we’d normally do a play. We know how we’d normally do a film. We know how we’d do—for lack of a better term—a game or 3-D experience in a head-mounted display. We have workflows for those, but not a combination of all three. It’s been exciting to make things up as we go.”

Plans for version 2.0
While project leaders say a few things they wanted to accomplish weren’t possible this time around, they hope to build on the production in the future.

“We were very ambitious and just couldn’t get to everything,” Fine says. “Hopefully some of those things will be in version 2.0.”

“This is just the tip of the iceberg for what we can do,” Kearney adds.

One thing that project leaders would like to incorporate into a future project is more advanced haptic feedback—or simulating the sense of touch. Elevator #7 uses some haptic elements, including audience members feeling heat from a space heater when near flames in a boiler, being tapped on the shoulder to make them turn around to see another virtual actor, and experiencing shaking when the elevator shudders to a stop—caused by people banging on a wall behind the audience member.

To make the experience more communal, they also want to experiment with networked VR to allow multiple people to be in the virtual environment together. This would make it possible to have more than the 30 audience members who participated in this round’s 10- to 15-minute theatrical experience.

“What if you have a whole cast of actors on a virtual reality set interacting with each other and with the audience?” says Osheroff. “This is just a microcosm of what we can look forward to with the technology in theater.”

Kearney and Fine say the UI is getting in on the ground floor of experimenting with the possibilities of applying VR to live theater.

“Everyone is trying to figure out what works and what doesn’t work,” Kearney says. “What technology and processes work best? How do you interact with the environment? And that doesn’t even touch on augmented reality, which is also burgeoning.”

The UI is well positioned to pave the way in this area, with VR experts working on the Virtual Soldier Research Program and National Advanced Driving Simulator within the UI College of Engineering’s Center for Computer-Aided Design, and the Hank Virtual Environments Lab in the UI College of Liberal Arts and Sciences. In addition, it has a long tradition in storytelling thanks to the Iowa Writers’ Workshop and the Iowa Playwrights Workshop.

“What kind of stories can you tell in this kind of world that you can’t tell in traditional theater?” Fine says. “I think where we’re heading is the ability to enter fantasy and dreamscapes and to switch worlds immediately and be immersed in a world that we can’t in traditional theater. Storytelling is storytelling is storytelling, but some stories require different modes in which to tell them.”

Osheroff says he’s blown away by what the team has created.

“I love having the opportunity to be involved in an experiment like this,” Osheroff says. “This isn’t happening anywhere else that I’m aware of. And if this is going to be a hallmark of what theater can become, I feel good about being in on it right now. Where else would I be able to do this but the University of Iowa?”