- by John Luxford
Director Mode is an experimental new feature that makes you an invisible director, hiding you from both video output and motion capture recordings. Director Mode also lets you move and scale the world with your hands like you can in Set Builder mode.
Moving and scaling the world while recording makes it possible to capture shots from any angle, and even move the world with one hand and the handheld camera with the other to simulate jib arms and other complex shots. So many new possibilities!
Press Alt + D to toggle Director Mode on/off.
Now when you enter Set Builder mode, your character disappears and your hands are replaced with controllers, making it clear you're in Set Builder mode. This also makes precise placement and interactions easier when arranging your sets.
The Flipside Creator Tools have been updated to version 0.12, and include some really cool changes too.
The Flipside Creator Tools are now in their own window in Unity, instead of appearing as an overlay to the Scene window. This avoids problems where the Scene window may be too small, cutting off visibility of some of the Creator Tools buttons.
In the Flipside Creator Tools menu, choose Open Creator Tools to open the new window. You can dock it anywhere in the Unity layout that works best for your workflow.
The new Animation facial expression mode triggers Unity animations to control facial expressions. This not only makes bone-based facial expressions possible, but anything the Unity animation system can do too!
It's also saved us a ton of time importing characters with bones but no blend shapes. Double win!
Learn more in our user manual about connecting facial expressions to Unity animations.
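For Unity developers curious how animation-driven expressions can work, here's a minimal sketch. This is our illustrative example only, not the actual Creator Tools API: it assumes an Animator controller with one trigger parameter per expression, each leading to a clip that poses the face bones.

```csharp
using UnityEngine;

// Illustrative sketch: drives facial expressions through Unity's Animator
// instead of blend shapes. Assumes the Animator controller defines one
// trigger parameter per expression (e.g. "Smile", "Frown"), each
// transitioning to an animation clip that poses the face bones.
public class FacialExpressionDriver : MonoBehaviour
{
    [SerializeField] private Animator animator;

    // Fire the trigger for an expression defined in the controller.
    public void PlayExpression(string expressionTrigger)
    {
        animator.SetTrigger(expressionTrigger);
    }
}
```

Because the expressions are ordinary Animator states, anything the animation system supports (blending, layers, bone animation) works the same way.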
Any auto-playing animations and audio sources in custom sets will now be synchronized to restart on Play, Record, and Add Role, and to pause/resume when playback is paused too, so they stay in sync with recorded parts.
Any cameras that you add to a custom set are now automatically added to the Flipside camera system. You can even attach animations to cameras to achieve any camera movements imaginable.
Trigger your own custom Unity events in your sets! Unity events can do things like enable or disable objects, trigger sounds, animations, and particle effects, and more. There are three new ways of triggering Unity events in your sets:
These changes open up so many new possibilities for interactions on sets, and we can't wait to see what you guys do with them!
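For readers new to Unity events, here's a minimal, hypothetical sketch of the kind of UnityEvent hook-up described above; the component and event names are our own for illustration, not part of the Creator Tools.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Illustrative sketch: exposes a UnityEvent in the Inspector, where you can
// wire it to enable/disable objects, play an AudioSource, set an Animator
// trigger, start a ParticleSystem, and so on.
public class PropTouchTrigger : MonoBehaviour
{
    public UnityEvent onTouched;

    // Fire the event when another collider enters this prop's trigger volume.
    private void OnTriggerEnter(Collider other)
    {
        onTouched.Invoke();
    }
}
```

Anything you can wire up in the Inspector's UnityEvent panel can then respond when the event fires, with no extra code.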
- by John Luxford
Pixels was a live animated improv show by the internationally acclaimed improv duo Stephen Sim and Caity Curtis. They used Flipside Studio to create an animated show in real-time in front of a live audience, wearing HTC Vive headsets to perform the characters while the audience watched on screen.
It was also the first public test of our upcoming cross-platform multiplayer that's going to completely change the way we make animated shows forever.
They sold out 9 out of 10 shows at the Winnipeg Fringe Festival 2018, and the response was incredible! The opening night of the festival experienced severe thunderstorms, so a big thank you to those who braved the weather to see the show that first night.
Stephen and Caity had a dozen different characters and a dozen different locations they could choose from, as well as any of the characters and sets created by our community, which resulted in some hilarious moments.
The sets were simple photograph backdrops and the characters were all ordinary people, but when the first live audience members gasped at the opening of the show, it became immediately apparent why this was something truly different.
There are several things that real-time animation makes possible, whether for live theatre audiences, live streaming audiences watching from home or anywhere, or both.
Virtually unlimited locations you can jump to at a moment's notice enables you to take your show anywhere, including to dynamically react to the audience's suggestions and reactions.
Locations can also be dynamic themselves, like the driving scene shown above. As you can see, we've barely scratched the surface of what storytellers are able to do now as a result of this technology.
Just like locations, an actor can instantly become any character of their choosing to augment the experience of the show.
Improvisers have always had to inform the audience through verbal or physical cues as to what their character's role was, but now for the first time they can become that character in an instant.
Creating instant immersion like that frees improvisers to go deeper and explore new directions that were only previously possible through elaborate stage design and costume work.
The director can also dynamically affect the production as it happens. With the power to control cameras and more, the director can present the audience with something more akin to an edited TV show or movie with dynamic shots that theatre could never do live before.
Picture not just actors as part of the stage performance now, but also virtual camera operators getting close-ups and otherwise impossible shots, and still others animating environmental elements of the scene like lighting changes, sounds, moving props, or animals.
Real-time animation gives superpowers to improvisers and sketch comedy troupes, turning them into real-time immersive storytellers. With the democratization of motion capture brought about by the combination of consumer-ready virtual reality hardware and software like Flipside Studio, real-time animation is going to explode both online and off.
All of this also brings a level of audience participation to animated content that only live theatre could do before. Suggestions can come to life right before the audience's eyes, and audience members can even be invited "on stage" (live or online) to be part of the experience.
And as computer graphics continue to make leaps and bounds over the coming years, real-time animation is also going to grow to encompass every style from The Simpsons to Toy Story, and even photorealistic animation like Planet of the Apes or Avatar.
There will always be new worlds and new stories to bring to life, and for years that's been getting faster and faster to do.
Now, it's instant.
- by John Luxford
Meet the new Crocodile and Lion characters. Along with our existing Elephant and Giraffe characters, these two round out our jungle characters collection nicely.
We've added a Vlogger Living Room set you can use to quickly make vlog-style recordings. This set is already set up with cameras and everything, so you can jump right in and get creating.
HTC Vive users can now press the index finger trigger to switch between open hand and pointing hand poses, making button pressing easier and adding more expressiveness to hands.
When you're in VR for a long time, you can easily lose track of time. Our new clock prop will help you keep track of time back in the real world.
We've added the ability to scale preview monitors just like you can with ordinary props. Just grab a monitor with two hands and make it any size you want.
We've also made loads of small fixes and quality of life improvements throughout the app:
- by John Luxford
Hey there Flipsters!
Here's another update we think you're going to love, so let's jump right in!
We spent a ton of time honing our lip syncing and we're finally ready to unleash the results of those efforts. With some clever hacking, Flipside's lip sync responsiveness is now more than double what it was in previous releases!
This is a huge leap forward for improving the quality of Flipside's output, and will make a noticeable difference for everyone.
This is another frequently requested feature, and it makes a big difference when using the handheld camera. Just press up/forward to zoom in or down/backward to zoom out. It feels very natural to use.
Thanks again to all our users. We're working hard to bring a level of polish to everything in Flipside in order to make your productions faster, smoother, and better.
Every week we say this, but we're still just getting started. There's so much more to come that Flipside is only going to get better and better until it's simply the most natural way to create animations, period. So thanks for coming on this journey with us!
The Flipside Team
- by John Luxford
As you may remember, we recently renamed our company Flipside XR from our former name, The Campfire Union.
As The Campfire Union, we experimented a lot with VR as a medium to really understand its strengths, constraints, and particularly, its creative potential. We experimented with everything from games, to virtual relaxation, training, and 360 video, but we just kept coming back to what we all were outside of VR: artists and performers.
Officially, we started Flipside in spring of 2016, but its conception goes much further back than that. All the way back to Peg Jam 2014 in fact, where Les and some friends made Party Sketch 3D, which is kind of like Tilt Brush meets charades (the charades concept would become one of our first experiments in Flipside).
In 2015, we ventured into our next creative VR experiment with a dance / music performance app called Lightshow. Lightshow was inspired by art forms like fire spinning, poi, hooping, and gloving. Lightshow recorded your dance movements and people could watch your performances over the web.
Lightshow was made on the Oculus Rift DK2 with a pair of Razer Hydra controllers. Because the Hydras used electromagnetic sensors, they used to freak out when you moved your hand too close to the metal edge of our whiteboard, which happened a lot because we were crammed into a tiny little office back then.
We learned a lot from that experiment, let it stew for a while, and turned that learning into Flipside about a year later by combining the performance elements of Lightshow with the basic multiplayer code from our Lost Cities VR game, which was midway through development at that time.
Here's a video of Les and John demoing Flipside in May 2016:
You can even see Lightshow's light streams on my hands in the Flipside demo around the 0:50 mark. But this time, we had also decided to focus more on characters, because we realized that the performer is at the center of any performance. We knew we needed to go deep on characters, so we started down that path right away in Flipside.
The next thing we built was Flipside's Magic Pencil, which lets you draw your own props and use them to improvise. We even built a timer and a random drawing suggestion that would appear only for the person in VR, and we played a big guessing game with the Winnipeg Alternate Reality Club (our local AR/VR meetup). Super fun night!
We definitely knew we had a long road ahead to achieve our vision for Flipside, but that's life for what was then a team of four people bootstrapping a startup with service work, creating a 360 video experience with the CMHR, and finishing their first VR game (with multiplayer no less!). So I'm rather proud of what we accomplished that year :)
Here’s the earliest blog post we could find about Flipside:
That was also the year that Les won a whopping $100,000 pitching Flipside in a local pitch competition, which was the spark that enabled us to shift from service work to working on Flipside full-time. That led to us joining Boost VC’s Tribe 9, where we met lots of other awesome sci fi startups (as Boost VC likes to call us), and a seed funding round in 2017.
At Boost, we met San Francisco comedian Jordan Cerminara and created a YouTube series called Earth From Up Here together, about an alien newscaster named Zeblo Gonzor who delivered weekly updates about the strange things happening down here on Earth. That show proved what people could do with Flipside, and helped us learn from working in collaboration with a writer and actor to help shape and improve the app for future users.
Fast forward to March 2018, and we were finally ready to take all that learning and share it with the wider VR creative community. We first released Flipside in early access on SteamVR and soon after on Oculus Home as well. The positive reviews and the response were amazing!
Since then, we’ve made over a dozen updates with tons of improvements and new features to help Flipside’s creators. Custom character and set importing evolved into shared characters and sets. Our early camera controls evolved into a complete in-VR camera switcher (and we have lots more up our sleeves too).
Most recently, we brought Flipside on stage in the form of a TEDxWinnipeg talk that John did in June about how virtual avatars are revolutionizing our sense of identity. And hot on the heels of that, we partnered with local improv duo Stephen and Caity to put on a live animated improv theatre production called Pixels that uses Flipside to render the show in real-time as they act it out in a pair of HTC Vive headsets. The show opens for 10 days starting this Thursday, July 19.
It's safe to say that at this stage of VR and AR (or just XR), no one knows how to define an "XR show" or "XR entertainment" yet. We have some examples of it, but we've barely scratched the surface (just like two years in, we’ve barely scratched the surface of what Flipside is going to grow into).
And certainly no one company should solve that all by themselves, either. We don't live in an echo chamber, and we're going to discover more and faster if we all put our heads and hearts into it.
That's why our philosophy centers on the belief that, in figuring that out, the more the merrier. It's a big world, and we believe there's room for all our voices as we discover how to create beyond reality's limits, together.
Join us in discovering new formats, techniques, and rules for storytelling in a whole new immersive medium.
- by Rachael Hosein
This update is one that many of you have been waiting for! We have some great bug fixes, a new character, and best of all — an in-VR camera controller!!!
The feature that so many of you have requested is finally here! We now have a camera switcher that lets you control your camera moves and settings in VR. We'll be polishing it up as we go, but figured best to get it in your hands as soon as possible. To access the in-VR camera switcher, press Alt + S on your keyboard.
This mode works hand-in-hand with the in-VR camera switcher. To hide the camera switcher interface that appears on your computer screen, press Alt + S. This will let you record full screen output through OBS and will pop up your in-VR camera switcher.
We've had this character in the hopper for a while now, and we figure it's about time we introduced you to Bones, the skeleton version of our first dinosaur character!
We've added a new dynamic light prop to the set builder, which you can find under the Show Tools category. The dynamic light lets you place and control your own lighting on any set, and you can even grab lights to modify your lighting during your shows!
A big thanks to everyone who submitted an entry to our very first creator contest. We received some great entries and announced our winner on Friday. Congratulations to our winner Dorothy Jean Thompson (@vrgamedevgirl). Check out her entry below!
- by Rachael Hosein
This update includes some great new additions: shared characters and sets, in-VR controllers for the slideshow and teleprompter, actor marks for framing shots and knowing where to stand, numerous bug fixes, and our first Flipside contest!
You may have noticed some additional fields recently added to the website when uploading your characters and sets. Those of you who have guessed it are right! Any characters or sets that our creators choose to share are now available for all on the sets and character palettes.
This also means that from now on, more characters and sets are going to be available to everyone, and on an ongoing basis, not just when we update Flipside itself.
Controlling your slideshow is now much easier! We've added a slideshow controller prop that lets you see your current slide, move to your next and previous slides, and jump to the first or last slide.
Similar to the slideshow, we've also added an in-VR controller for your teleprompter!
To help with setting up your shots during set building, we've added an actor mark prop to the show tools palette. Use the actor mark as a stand-in when setting up your cameras and dressing your set. When recording, the mark will appear as an 'x' on the ground so you know exactly where to stand. The 'x' works just like other user interface elements such as the palette: it's only visible to you unless you want your audience to see it.
With this update comes the very first Flipside Creator contest! You could win a $50 Steam Gift Card. We’ll give you the script and you make a recording using Flipside Studio.
Visit the Contests page for more details.
- by John Luxford
In this tutorial post, we will show you how to prepare an Adobe Fuse character model for importing into Flipside Studio.
Adobe Fuse is a 3D modeling app that makes it easy to create unique human characters in minutes, without having to be an experienced 3D artist.
This tutorial assumes you have installed and set up both Adobe Fuse and the Flipside Creator Tools.
Launch the Adobe Fuse app and create a new character by choosing File > New Model.
To build your character, under the Assemble tab, first choose a head from the list on the right, followed by a torso, legs, and arms.
Next, click on the Customize tab to reveal the available customization options. Feel free to modify these to your liking.
Next, click on the Clothing tab to choose the clothing for your character. You can choose from a variety of tops, bottoms, shoes, hair, hats, and more.
Lastly, click on the Texture tab to modify the textures on your character. This lets you control all aspects of your character's textures and how they look and feel.
To finish rigging your new character, Fuse relies on another Adobe product called Mixamo. When Fuse sends your character to Mixamo, it will automatically rig the character for you, but before completing the import process there is one setting you will need to change.
To send your character from Fuse to Mixamo, click on the Send to Mixamo button in the top right corner. This may take some time to process in both applications, so you'll need to wait for the exporting and importing processes to finish before continuing.
When you see your character animated in the Mixamo Auto-Rigger, before clicking Finish, change the Facial Blendshapes setting to Enabled. This ensures that your character's facial expressions can be connected to Flipside Studio's facial expressions and lip syncing capabilities.
After clicking Finish, Mixamo may show a "Proceed with this new character?" warning. Click on the Use This Character button to proceed.
You will now be shown a page with two download options. Make sure to choose the download option for 3D Software, and not the option for game engines.
When Mixamo asks you to choose your download settings, leave the default options as is.
You are now ready to import your character into the Flipside Creator Tools! Here are some links to guide you through the next steps:
- by Rachael Hosein
This update has some new character additions, new features, and as always some bug fixes and polish.
Now that Flipside characters support realistic hair, cloth, and tail movement, we thought we'd release some showcase characters so everyone can see how awesome this feature is.
Let's all take a moment to welcome Elephant, Giraffe and Warrior Goblin to the Flipside character crew!
You no longer have to peek through the bottom of your headset to see your Twitch chat! We've added a Twitch chat viewer to the Show Tools category of the Set Builder, so you can view and display your Twitch channel's chat conversation inside your Flipside shows!
We know that your connection to your audience is a vital aspect of the Twitch experience, and now you can interact naturally with your audience as you stream your shows from Flipside.
There's a new version (v0.10) of the Flipside Creator Tools that makes it WAY simpler to map your character's facial expressions. This was the most challenging aspect of character importing, and this update ought to make it much easier. It also has little preview buttons to test how each expression looks before building and uploading your characters into Flipside!
We've also added a marker that shows you where the origin point of your scene is when creating sets. This was one of the most confusing aspects of adjusting your set's position in the scene, and it should be much easier now.
You may be thinking...what loading notices and alerts? We originally had the loading notices and in-app alerts pinned to your hand, which went unnoticed for some of our users. We've moved them to pop up right in front of you so you don't miss the important messages (but mainly so you can check out our awesome loading animation in full view).
We've added a small watermark to the video output from Flipside because we need to promote Flipside to the world. We've had several requests from users to add one, but we can foresee some users wanting it removed too, so if you're one of those users talk to us and we can work something out.
- by Rachael Hosein
In this tutorial post, we will show you how to import a set into Flipside.
This tutorial assumes you have installed and set up the Flipside Creator Tools.
For this tutorial, we're going to use Cube Room by Naomi Chen from Google Poly. You can source any model you would like to use for your set, all you will need is an .obj file for the model, a .mtl file for its material, and any associated textures.
Launch Unity with the Flipside Creator Tools set up.
If using Google Poly to get your set model:
If you have a model from elsewhere:
Since you don't need the default camera and light, delete them from the Hierarchy panel.
Convert your scene into a Flipside set by choosing Flipside Creator Tools > Create Set From Current Scene.
With the root set object selected in the Hierarchy panel, go to the Inspector panel to access the Set Info.
If the set model is under a Creative Commons Attribution license:
Position the set so it's on the floor of your scene:
In the scene window, click Build Set Bundle. Once the set building process is complete, click on Find Set Bundle File to locate your set file.
Once your set has been added to your Creator account, launch the Flipside app and check out your new set!
For a more detailed breakdown of how to import your own set, visit Creating a Custom Set in the Flipside documentation.