By John Luxford, CTO & Co-founder - Flipside
With Flipside’s first two show productions in full swing, we've now been through a number of production days with live actors working inside the platform. We learned a ton as a result, and wanted to share what we learned with you.
This first post explores some of the more general lessons we learned that have helped streamline our productions and helped us empower our actors to do the best job they can.
Processes need to be honed, but they also need to be documented. These are living documents that will evolve rapidly, but you won’t be able to iterate on them as fast if you don’t have them written down to begin with.
We currently have documentation covering each of our core production processes.
These act as checklists to make sure we don’t miss a step that may have cost us time, or worse, a usable end result.
Because Flipside is still in beta, our “known issues and workarounds” document is critical. Its purpose is to give us quick, actionable fixes we can apply when an issue arises, without having to improvise a creative solution on the spot. Not having a workaround ready can quickly eat into your production time.
At first, there were little things that we would have to reset between takes, and early on some of these even required restarting the app, which doesn’t help the actors stay in the right frame of mind. Context switching hurts creativity, and our goal isn’t just to be a virtual studio, but to use that opportunity to eliminate as much of the friction that goes into show production as possible.
So we iterated on ways of reducing the time between takes as much as possible. We now have a process that is impressively automated, with one person manning the camera switcher and director tools, while the actors are free to concentrate their attention on what they do best.
Actors need to learn and get into their characters: how they move, how they talk. They also need to get comfortable acting with a VR headset on. One request we got was for a simple mirror scene, so the actors could practice their parts while seeing themselves from the front, side profile, and back all at the same time. Actors can now hop in there and see exactly how their movements translate to their virtual counterparts.
The actors need to know when something is about to change, or where they should be standing and facing to be in frame. For example, we added virtual marks for each camera position, which update prior to the next camera change so the actors can know to move into place or turn if needed for the next shot.
This can also be as simple as counting down to “Action!” in VR when the director clicks record instead of starting recording immediately after pressing the button. These little things add up to make a more intuitive experience for everyone.
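As a sketch of the countdown idea (the function and callback names here are hypothetical, not Flipside's actual API): instead of recording the instant the director presses the button, insert a short announced countdown so the actors are never caught mid-adjustment.

```python
import time

def start_recording_with_countdown(record_fn, announce_fn, count=3, interval=1.0):
    """Announce a countdown to the actors in-headset, then start the take,
    rather than recording the instant the director clicks the button."""
    for n in range(count, 0, -1):
        announce_fn(str(n))   # e.g. show/say "3", "2", "1" in the actors' view
        time.sleep(interval)
    announce_fn("Action!")
    record_fn()               # only now does the recording actually begin
```

In practice `announce_fn` would drive an in-VR overlay or audio cue, and `record_fn` would trigger the actual capture.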
We put a lot of work into making eye contact feel right and blinking feel natural, because eye tracking isn't available in consumer VR headsets just yet.
We also devised our own system for more natural neck and spine movement, as well as arms that emulate traditional animation techniques, emphasizing style over accuracy. Today's full-body inverse kinematics options still don't feel quite natural, and the closer you get to the character feeling alive, the more you risk falling into the uncanny valley.
The more you play into the strengths of the medium, the more the quality of the content can shine. Counter-intuitively, the better things get, the more noticeable the remaining issues become.
We quickly realized that even with lip syncing and natural eye movements, the avatar faces felt dead. To solve this, we created an expression system the actors control with the joystick on their motion controllers. It lets them express four distinct emotions and also blend between them, smoothly transitioning from happy to upset, for example, while blending naturally with the lip syncing.
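One way to map a thumbstick to four blendable emotions is to place one emotion at each pole of the stick's two axes and let deflection set the blend weights, with the remainder going to a neutral face. This is a minimal sketch of that approach; the specific emotion-to-axis assignments are hypothetical, not Flipside's actual mapping.

```python
def expression_weights(x, y):
    """Map a thumbstick position (x, y in [-1, 1]) to blend weights for four
    emotions on the stick's poles: up=happy, down=upset, right=surprised,
    left=angry (hypothetical mapping). Weights are proportional to deflection
    toward each pole; leftover weight goes to "neutral", so centering the
    stick relaxes the face back to its resting expression."""
    weights = {
        "happy":     max(y, 0.0),
        "upset":     max(-y, 0.0),
        "surprised": max(x, 0.0),
        "angry":     max(-x, 0.0),
    }
    total = sum(weights.values())
    if total > 1.0:  # diagonal deflections: normalize so weights sum to 1
        weights = {k: v / total for k, v in weights.items()}
        total = 1.0
    weights["neutral"] = 1.0 - total
    return weights
```

Because the weights vary continuously with stick position, moving the stick from one pole toward another produces the smooth happy-to-upset transitions described above, and the resulting weights can feed the same blend-shape rig as the lip sync.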
With a little practice, these expressions can become reflexive actions for the actors, giving them a new level of expressive control as they embody their characters.
There are lots of unsolved problems in VR, most notably locomotion without causing motion sickness. But there are subtler causes of motion sickness too, which can include anything that creates even slight disorientation.
One of the strangest examples we encountered in Flipside was in our preview monitors (which are just flat virtual screens for the actors to see the 2D output). We found that there was a perceived parallax in the preview monitors which caused a tiny amount of motion sickness over time. Nothing crazy, but present nonetheless. The solution we came up with was to flip the video on the preview screens horizontally. This had the effect of making any on-screen text appear backwards, but eliminated the perceived parallax which slowly caused discomfort for the actors.
The reason this is so critical is that actors are likely spending prolonged periods of time in the virtual sets, doing several takes before they get it just right, or doing batches of episodes in a single shoot. Anything that causes discomfort can potentially cut your shoot short in an unpleasant way.
These are some of the more general lessons we took away from working hands-on with live actors in a virtual world. They’ve helped us hone our vision, and Flipside is already way better because of it.
Stay tuned for the next post in this 3-part series on live acting in virtual reality. We have lots more to share! And if you're a content creator, make sure to sign up for early access to Flipside Studio!