Posts by Lux

Flipside 1.3 goes big (and small) on characters, high (and low) on voices

- by Lux

Flipside 1.3 is out today and introduces the ability to scale any character to be anything from super tiny to absolutely huge, and we're sure you're going to have a ton of fun with it.

This opens up so many new creative possibilities for our users, from the Honey, I Shrunk the Kids variety to Godzilla-scale giants. Combine that with any character you choose and the possibilities are endless!


To change a character's scale, open the Characters menu and use the +/- buttons in the Scale section under the character preview mirror. Click Reset at any time to change back to the character's original scale.

Pitch shift voice effect

Flipside 1.3 also adds the ability to change the pitch of your voice in your recordings and to others over multiplayer. This is a great way to help you get even more into character, achieve fun effects like making yourself sound like a chipmunk, or just change up your voice for a little added privacy. Whatever the reason, Flipside's got you covered.


Click on the Settings tab on the dashboard menu and you'll see Pitch shift under a new Voice section of the settings. There's even a fun preview tool to hear what you'll sound like at different pitches.
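For the curious, shifting a pitch by n semitones simply multiplies the audio's frequency by 2^(n/12). This isn't Flipside's code, just the underlying relationship, sketched in a couple of lines of Python:

    # Each semitone multiplies the frequency by the twelfth root of two.
    def pitch_ratio(semitones):
        """Frequency multiplier for a shift of the given number of semitones (+ up, - down)."""
        return 2 ** (semitones / 12)

    print(pitch_ratio(12))   # one octave up -> 2.0 (classic chipmunk territory)
    print(pitch_ratio(-5))   # five semitones down -> ~0.75, a noticeably deeper voice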

Connecting with friends just got easier

You now have the ability to add your Meta Quest friends to your list of friends in Flipside, so there's no need to hunt for them in the Multiplayer search. They'll appear in the dashboard friend list automatically too, so your friends are always just an invite away.



📝Read Our Release Notes 

Find all of the information from this update here.


Download the newest version now for your Meta Quest 2 and 3, Rift S, and Pico 4 VR headsets!

Quest Store   Pico Store

📚Flipside for Education

It's back-to-school time, and we know that instructors are getting geared up to provide future innovators with the skills they need to succeed. Are you an educator interested in using Flipside in the classroom? Join our educators' mailing list and let us know.


🙌Thanks For Your Feedback

Thanks to everyone who notified us of bugs and shared feature requests! We love the ideas and appreciate when you share your feedback.

CLICK HERE TO FILE A BUG REPORT 🐛

MAKE A FEATURE REQUEST 💡


🔌Connect with community

We’d love to welcome you to the Flipside Studio Community Discord channel where you can connect with other creators, share your #MadeInFlipside creations, submit feature requests, and share bug reports.

Join the Community
 

How to make VR music videos in Flipside

- by Lux

Here's an example VR music video made in Flipside. It's a song called "So Tired" by Flipside's CEO, sung by a zombie who, you can imagine, may be a little tired of his undead existence.

"So Tired" - words & music by Johnny Broadway (@johndeplume), music video by Rachael Hosein. Watch it in VR on Flipside (https://www.flipsidexr.com).

The video is best viewed in Flipside itself, where you get a front row seat to the graveyard punk performance, so we recommend hopping into Flipside on your Meta Quest 3 or Pico 4 VR headset. That said, the flat video is pretty good too.

Here's how to make a VR music video like that for your own songs (or ones generated with AI apps like Suno AI).

Getting the audio ready

The first step is to export the lead vocal stem file from your song. This way you have the original song file and a second file with just the lead vocals.

If you've recorded it yourself, this can be done by muting all tracks except the lead vocal in your DAW and exporting that as a new WAV file.

If you don't have access to the original recording files, you can use the free StemRoller app to separate the tracks for you. StemRoller will separate out the vocals as well as the bass, drums, and other instruments, but for our purposes we just need the vocals.wav file that it generates.

You should now have two audio files:

  • The original song as a WAV file
  • The vocal track as a WAV file

Log into the Flipside Creator Portal, then click on the Audio tab and upload these two files to your private audio collection in Flipside.
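Before uploading, it can be worth a quick sanity check that the vocal stem and the full mix line up. Here's a minimal Python sketch using the standard library's wave module; it assumes standard PCM WAV files, and the file names are only placeholders:

    import wave

    def wav_info(path):
        """Return (sample_rate_hz, duration_seconds) for a PCM WAV file."""
        with wave.open(path, "rb") as w:
            rate = w.getframerate()
            frames = w.getnframes()
        return rate, frames / rate

    # Placeholder file names - use whatever you exported from your DAW or StemRoller.
    song_rate, song_len = wav_info("song.wav")
    vocal_rate, vocal_len = wav_info("vocals.wav")

    print(f"Full mix:   {song_rate} Hz, {song_len:.1f} s")
    print(f"Vocal stem: {vocal_rate} Hz, {vocal_len:.1f} s")

    # The vocal stem should start at the very beginning of the song so the lip sync
    # lines up, which means both files should report (nearly) the same duration.
    if abs(song_len - vocal_len) > 0.1:
        print("Warning: the two files are different lengths - re-export the vocal stem.")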

Setting things up in Flipside

In Flipside, you can either record the animated band members together with friends over multiplayer, or record them one after another by adding a part successively for each band member.

One thing to note: there is currently a bug that prevents props from being added to a recording after the fact. You can still add a prop mid-recording, but it will pop into view at that moment, so any props you need from the start should be taken out and ready before you begin the initial recording.

In the Sets menu, choose the set you want the performance to take place on, then arrange your props so they're where you want them and easy to grab while recording the performance itself.

You should now be on set and almost ready to record. Let's start with the lead singer part, assuming you're recording alone.

Go to the Characters menu and choose the character you want to perform the lead. It's also a good idea to calibrate at this time, so click on the Calibrate button in the bottom right corner of the Characters menu, then stand in a T-pose for a few seconds as instructed.


The audio setup in Flipside

On the main menu, choose Audio Browser then go to All Audio > Imported. You should see the two audio tracks you uploaded through the Flipside Creator Portal earlier. Click on each one and choose Add to Audio Controller.

Next, click on the audio track that has the lead vocal only and choose Voice from the drop-down menu. This will ensure that track powers your lead singer character's voice as you record.

Next, on the Audio Controller, check the checkbox next to each track so they're both selected to play in sync.

Lastly, click Play on Record to ensure they'll both start playing automatically at the start of your recording.

Recording the VR music video

Get your character into position to start the recording, grabbing any props you need like the microphone, for example, then on the main menu click Record and wait for the countdown.

With the lead vocal track powering your character's voice, you're free to focus on performing the physical parts of the performance.

When you're finished, click Stop then choose to Keep the recording.

Once the recording has finished saving, load the recording and click the Mute icon next to the lead singer's part so the lead vocal is no longer heard doubled, since we only needed it to power the mouth movements.

If you're overdubbing parts for additional band members, you can now change characters, get into new positions, and choose Add New Part. Do this until the recording is complete.

If you've made it this far, you probably have a brand new VR music video on your hands, which you can publish to your channel using the Publish button while that recording is loaded.

You've now created a VR music video people can watch and feel like they're right inside the experience.

Have fun!

 

Some Thoughts About Creation Tools

- by Lux

By Lux (Flipside's CEO & Co-Founder)

What are creation tools?

When you make art, you create something new out of the resources and materials at your disposal. Paper, pencil, pen, paintbrush. Hands, hips, feet, voice, instruments. Computers. Controllers and headsets. Software.

Some materials are repurposed from other things, or intended for other purposes. Some art uses only your own body, like dancing or singing. Some artists use very precise tools, like a saxophone or a violin. The range of what's possible to create with is almost limitless.

Software makes up some of the most sophisticated tools artists have ever had available to them: Ableton Live for music, Photoshop for images, Final Cut for video, Flipside for spatial content. And, more and more, AI is influencing each of these areas too.

Final Cut

This is a wide and varied category of software called creation tools, made for the express purpose of helping others create.

The rest of this post is going to focus on software creation tools, since that's what we make. I believe software creation tools have the potential to influence the art that's made with them in more ways than physical tools do, which makes asking questions about the nature of creation tools important for developers.

What it means to make a creation tool

A question I try to think about from time to time is, what does it mean to make tools for others to create with?

What are the implications of this question? What are the values that arise from a given position on it?

Blender

When you make software that helps others make something that's theirs, it comes with a certain responsibility. You're supposed to get out of the way and try not to influence how it looks and feels or sounds, because it's not about you and it's not your art; it's about assisting their creation.

A creation tool maker makes the invisible stuff, the glue that holds together the seams of a digital creation. That's our mission in a nutshell.

There's a responsibility in holding someone else's creative work in your own creative hands and not inadvertently dictating what the end result should be. User-generated content should be made by the user, but default art assets, default settings a user doesn't know they can change, feature choices and limitations, and even workflows built into how you use an app, all influence the things that can be made or the direction users typically go creatively within a creation tool.

There are also limitations that may influence the work no matter what you do, like the 8-bit sounds of early video game consoles, or early monitors with only 256 colours. In our case, rendering for VR is more costly than rendering for mobile or desktop, so we choose default art styles based on what will still look great but perform well on VR headsets, and so we tend to steer away from realism in our default choices as a result, knowing this will have some influence on our creators and their output.

Photoshop

Photoshop work area (ver. 0.63, Oct. 1988), image from https://www.firstversions.com/

A creation tool's influence on the art that's made with it

Our tools are inherently going to have some influence on the art that's made with them. As a tool maker, being aware of that is important so we can understand, and choose as carefully as we're able, what that influence is going to be.

Some of it is temporary, since software is always in flux, and necessarily includes temporary limitations that become more flexible or smoothed out over time. Developers have the eternal struggle of knowing that what's around the corner is going to make X or Y feature better, but that users have to wait and live with the limitations that exist today.

Tilt Brush

I touched on feature design, workflow choices, and choosing default settings earlier, and these are hard problems even for experienced developers. User experience is ever-evolving, especially in areas like VR that are still not fully charted territory.

A developer must understand that when you choose a default setting, most users are unlikely to ever change it. So if a capability is only exposed by changing that setting, it may never be discovered or used by the vast majority of creators using your tool.

Similarly, a workflow with a default path running from steps A to B to C, where a "See More Options" choice might open up steps D and E, will lead all but a small percentage of creators to stick to the A-B-C path. Creators are busy and they don't have time to crawl through every possible option in your software when they're just trying to get something done, so it's important to choose default paths carefully and know where you might be exposing or hiding certain features.

Guides and templates

When a new user first starts using your creation tool, they'll need guidance on how to accomplish what they want to make with it. Contextual tool tips are great for explaining things in the moment, and can even be used to highlight important elements in the user interface. Tutorials and references accessible within the app can be great resources, but they often become overwhelming in sheer volume, or go forgotten and unused because they're implemented out of context.

Onboarding tutorials are a great way to walk the user through a core set of features or recommended workflow and ensure they successfully perform each of the steps. This can be a very powerful way to help a user through their first few steps with your app, but remember, you're teaching a certain workflow that users are unlikely to deviate from unless they become power users over time or happen upon other, less visible features.

Ready Player Me Wizard in Flipside

After the initial onboarding, the user needs to feel confident creating something with the app. This is where guides and templates come in. They need to create something, but not necessarily their thing, yet.

Templates, or in our case, default characters, props and sets, can be a great way to get users going quickly by reducing the choices they have to make in order to get to that first creative reward. But even then, a user can hop into character and not know what to say. So we introduced an AI script generator that will write you a monologue, a dialogue, a poem, or a bunch of awful dad jokes.

Each of these works together to reinforce the creative flow when a user is still new to the tools and doesn't have that freedom of familiarity to just hop into being creative. This is central to our thinking on how AI fits into Flipside as a creation tool. We don't see it as a replacement for human creativity, but rather as an assistant that can jump in to grease the wheels where needed.

There are many more areas where we think about guiding users and balancing that against leading them too far, plus all the other questions I asked earlier in this post, but this post would be much too long if I were to include them all.

The importance of creation tools

We believe creation tools are an important category of software, and have dedicated our lives to making the first generation of creation tools for VR, AR, and the metaverse. We believe this work is important for several reasons:

  • Creation tools help keep our creative spirits alive and healthy.
  • Creation tools help open up new possibilities for the kinds of art we can create as a society.
  • Creation tools lower barriers and democratize the ability for many people to participate in new forms of creation.

Flipside

Flipside's core mission is to help everyone bring their imagination to life and share it with the world. That's a tall order, but Flipside has lowered the barrier to content creation in what is one of the most challenging areas of technology, and most importantly, we've facilitated the creation of thousands and thousands of pieces of content - sketches, jokes, dances, music, reenactments, improv, and more - that otherwise wouldn't have existed.

We are honoured to be building Flipside for creators like you and everyone who has joined us on this journey, and we thank each and every one of you for believing in and wanting to build this crazy new world of spatial entertainment along with us.

 

Interview with our CEO John Luxford in DesignRush

- by Lux

DesignRush recently spoke with our CEO & Co-Founder John Luxford (aka Lux) about how brands can leverage user-generated content (UGC) in the metaverse.

Read on for Lux's perhaps contrarian views on what the metaverse is, how it's evolving, what role VR and AR play in shaping its future, and how brands can take advantage of these immersive technologies.

Read the full interview here: Learn How Brands Should Leverage UGC in the Metaverse.

 

The Fourth Wall in Spatial and Immersive Content

- by Lux

By Lux (Flipside's CEO & Co-Founder)

The fourth wall is a concept in traditional film and theatre which is essentially the boundary between the performers and the audience.

While there are exceptions, like theatre in the round, in most styles of theatre the audience is looking at the performance from the front of the stage. This is also known as a proscenium, in theatre lingo. In film, this would be the camera itself.

Breaking the fourth wall happens when an actor deliberately addresses the audience, speaking directly to them, or speaking directly into the camera. This was popularized in television shows like BBC's The Office, where actors would turn and look at the camera as if conferring with the audience that something absurd just happened.

This can be used as a way to increase engagement with the audience, add emotional impact, create a memorable moment in the experience, and even make the audience feel like part of the show.

The fourth wall in immersive entertainment

In immersive theatre, there's no obvious fourth wall, yet there's still an audience whose perspective needs to be managed throughout the performance.

Some virtual worlds have obvious fourth walls because they mimic real-world theatres or stages. Flipside has a number of these, from the black box theatre and comedy club to outdoor stages and our newest Laughs N' Riffs stage. In these cases, it's obvious where the audience will be and that actors should craft the experience for that perspective.

Others, like our kitchen, classroom, or campfire sets, have no obvious fourth wall, but one must still be added to the environment so that performers know where the audience is expected to be.

An added question to be asked in crafting an immersive experience is whether the audience is participating in the story or simply watching, and whether this can be used to increase the sense of immersion or potentially take away from it.

In immersive worlds, audience comfort is also a factor to consider. Some audience members may not be comfortable having to engage, just like they may not want to sit in the front row of a stand-up comedy show for fear of becoming part of the show.

All these considerations are important in deciding how you want to craft a piece of immersive entertainment.

Immersive examples of breaking the fourth wall

In immersive entertainment, there are two ends to the spectrum of breaking the fourth wall:

  1. Not addressing the audience at all but drawing their attention in indirect ways.
  2. Addressing the audience as if they're actors in the performance.

Between these ends are other approaches such as giving a knowing nod to the audience but not engaging verbally, and many other ways we have yet to dream up. Immersive performance is a ripe ground for innovation.

There are also examples of breaking the fourth wall for specific purposes, such as:

  • To poll the audience for suggestions, for example when an improv group asks the audience for suggestions for their next improvisation.
  • A choose-your-own-adventure narrative might pause and prompt the audience to choose between two or more forked paths in the storyline.

How we think of the fourth wall in Flipside

We think a lot about the fourth wall in Flipside. Our team has many discussions and plans for evolving the audience's perspective and experience. We look forward to exploring this subject with Flipside's community of creators as Flipside continues to evolve.

Today, when you load a set to record a post, Flipside shows an "Audience" marker where the audience members will spawn when loading the post to watch. This is the first step to something more full-featured that we're working on fleshing out in a variety of ways, such as:

  • Letting the creator adjust and craft the audience position and choose the standing or seating arrangement.
  • Letting creators move or change the audience position during the recording as part of the experience. This could mean teleporting the audience between several defined positions, or moving them between two positions like an experience on rails.
  • Giving the audience agency to control their position through things like virtual vehicles.
  • And of course, we also give audience members the agency to step out of that box and explore more freely, because they are in a virtual world after all.

Audience marker in Flipside

One key challenge to overcome when moving the audience between positions or giving them vehicular controls is simulator sickness. While we already have features like vignetting the user's POV when walking around in Flipside, or rotating the world while invisible, this needs to be considered in any implementation of audience movement as well. For example, a rails experience could be prevented from tilting the audience position while moving it between point A and point B by letting the creator adjust the yaw but not the roll or pitch.
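As a rough illustration of that last constraint (a toy sketch, not Flipside's actual implementation), a rail move can interpolate the audience's position while keeping only the yaw component of its orientation:

    def yaw_only(yaw_deg, pitch_deg, roll_deg):
        """Strip pitch and roll so the audience is turned but never tilted."""
        return (yaw_deg, 0.0, 0.0)

    def rail_step(point_a, point_b, yaw_a, yaw_b, t):
        """Move the audience between two rail points, blending position and yaw only.
        t runs from 0.0 (at point A) to 1.0 (at point B)."""
        position = tuple(a + (b - a) * t for a, b in zip(point_a, point_b))
        yaw = yaw_a + (yaw_b - yaw_a) * t
        return position, yaw_only(yaw, 0.0, 0.0)

    # Halfway along a 10-metre rail that also turns the audience 90 degrees:
    print(rail_step((0, 0, 0), (0, 0, 10), 0, 90, 0.5))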

There's also the challenge of watching content in mixed reality versus fully immersive virtual reality. In mixed reality, the audience may often be arranged more like the audience in theatre in the round, except the content would likely need to be scaled to fit the room instead of being life-sized. And the same piece of content in Flipside may end up being viewed in both of those contexts.

Another consideration is non-headset-wearing audiences: how we might enable those audience members to become avatars within the content, or whether we enable them to drop and control virtual cameras in the scene so they can feel like the director of the show.

Each of these is a powerful way to engage audiences, and ultimately our job is to empower creators to craft the experience for all audiences in a way that makes it great for each of them. It's a tall order, but we've got more than a few tricks up our sleeves that we'll be introducing over time.

 

Words Matter: Spatial vs Immersive

- by Lux

By Lux (Flipside's CEO & Co-Founder)

There's an interesting effect in technology where platforms align themselves with certain terminology in order to try to differentiate from one another. But eventually, most companies converge on common words. Web 2.0, web3, cloud computing, edge computing, the list goes on.

In virtual reality (VR) and augmented reality (AR), these terms were coined long before the technology was ready for an industry to form around them. Then came the Oculus Rift which was dubbed a VR headset. But it could have just as easily been called a VR visor, or the more technical head-mounted display (HMD) that people sometimes use.

Then Microsoft came along and coined mixed reality (despite the fact that mixed reality meant something already) and branded their VR platform Windows MR. Other companies tried to coin a term that would encompass both AR and VR and came up with XR, as in extended reality.

Now Apple has entered the fray with the Apple Vision Pro (AVP) and is calling it a spatial computing device. They even go so far as to discourage app developers from referring to either AR or VR when describing their AVP apps.

Windows MR has since been abandoned, and many companies continue to use XR, but the influence Apple has on shaping industry perception is massive, so my prediction is that we'll all eventually converge around spatial computing, unless they pivot to something else.

But when talking about VR experiences, the word spatial leaves something to be desired. While yes, it does suggest a blend of both AR and VR, there's another word that I think might be lost in the shuffle: Immersive.

You see, spatial refers to the three-dimensionality of what's displayed, but immersive refers to the sensory experience of the user. One is technology-centric and the other is user-centric. Which is why you don't hear about spatial theatre groups, but you do hear about immersive theatre groups, and when users describe an experience they just had, they don't say how spatial it was, they say they felt immersed in it.

This split makes me think of Simon Sinek's TED talk about how people don't buy what you do, they buy why you do it. The word immersive touches on a why. Spatial just describes a piece of technology. Which is odd for Apple, who are the original company to say "here's to the crazy ones" in their Think Different campaign.

Immersive speaks to one of the values of the experience the user has, and the implied belief that immersion offers something to the user experience that non-immersive content can't offer.

But we usually think of immersion as a VR thing, and not an AR thing. I think there are degrees of immersion, and AR still meets several of them. When you lose yourself in the experience, that's immersion, whether you can see the edges or a peek of the man behind the curtain.

In AR terms, this immersion can range from creating the effect that there are portals you can peer through in your room's physical walls, to the effect of "skinning" your reality. Imagine an AR skin that makes your world look like Sin City or The Walking Dead - that would surely be pretty immersive, and yet it's only augmenting your real-world experience.

Projection mapping has become a common technique used by VJs to make stages and walls at festivals and raves feel more immersive, and a similar effect is employed by installations like Van Gogh: The Immersive Experience. There are even immersive audio experiences like Darkfield Radio that immerse the user without using any visuals at all.

The key to immersion is helping the user get lost in the experience. When you lose yourself in something, you lose your sense of time, you forget about your outside cares, you take things in more fully, and you leave having had an experience that may have awed you or moved you in some way that you felt you were a part of. That's the magic of "spatial computing", not the technology.

 

Reflections on our first year in the Quest app store

- by Lux

It's been one year since we officially launched on the Meta Quest app store, and what a year it's been!

Our user base has grown to over 50,000 creators who've made over 160,000 spatial recordings. We're sincerely grateful for every one of our users as well as the positive reviews and messages of encouragement you've sent us. Those words and the content you're making and sharing keep our team motivated every day to make Flipside the best we can make it, so thank you from the bottom of our hearts.

Here are some of the highlights we've had over the past year:

Flipside software updates

Since launch, we've released 14 software updates, including major new features such as:

  • AI script and dialog generator, powered by ChatGPT
  • AI set generator, powered by Blockade Labs
  • Speech-to-text for easier text input
  • Joystick-based walking
  • Flipside channels and publishing
  • Remixing posts and editing recordings
  • Direct messages and sharing posts
  • Flipside tokens and creator tips

Characters, props, and sets

We've substantially expanded the list of built-in characters, props, and sets to now include hundreds of options.

We launched with a library of 55 characters, 49 sets, and 330 props. We now have 111 characters, 69 sets, and too many props to count.

And of course, you can always create custom characters through our Ready Player Me integration, custom sets with our Blockade Labs integration, or use our Flipside Creator Tools plugin for Unity to import your own custom characters, sets, and props. Sky's the limit!

Launching on Pico VR

We've expanded beyond the Meta app stores to include support for Pico 4 headsets too.

Download Flipside on Pico VR

New direction

We've changed the name of our app from Flipside Studio to simply Flipside, emphasizing that Flipside is now a metaverse social media platform where next-gen creators can share their spatial recordings directly with their fans.

Our vision is to build a metaverse powered by imagination, and to democratize the creation of spatial and immersive content so that anyone with an idea can bring that idea to life and share it with the world.

We believe this is the missing piece of the metaverse, enabling real-time content creation so that you just act things out and they're instantly ready to share.

Read more about this new direction here

Content highlights

As more and more creators publish to Flipside, we want to highlight some of our favourites here for you.

 

Flipside is a social media platform built on pure imagination

- by Lux

By Lux (Flipside's CEO & Co-Founder)

What if you were to combine social media with a virtual TV studio? The result would be a social media platform focused on pure imagination and hyper creativity. Sounds crazy, but it makes total sense in the metaverse.

And that is exactly what Flipside is, the first social media platform built on a foundation of pure imagination. You can think of it as the TikTok of immersive entertainment, or like stepping inside of a content creator’s mind.

Why the need for a new social media platform for the metaverse?

Because the thinking that's gone into existing metaverses is too tied to the original vision of what a metaverse is, which makes great fodder for science fiction but isn't a complete picture of what it could look like. It's missing real-time content creation at the heart of it all, which is exactly where Flipside comes in.

Spatial computing empowers a level of real-time content creation never seen before. Instead of animating frame by frame, or building interactivity in a game engine, now you just become a character on a virtual set, and your words and movements become the performance. When you press save, the recording is done. That fast.

Next-gen creators

We’re seeing the emergence of what we call next-gen creators. Creators who are XR native and spatial first. Who feel perfectly themselves embodying an avatar and who understand that identity is something that can shift and morph even from moment to moment. Who step through virtual worlds as fluidly as stepping through a door. The inventors and discoverers of what’s possible in this new spatial computing paradigm.

And we know audiences have been dreaming of the idea of jumping inside the content they're seeing, from Mary Poppins and the gang jumping into a chalk drawing to Alice falling down the rabbit hole long before that.

So here’s our tribute to next-gen creators and their fans. We’ll be there watching your creations and celebrating the vision you bring to the birth of a new space for imagination and endless creativity.

Welcome to Flipside.

 

Flipside Studio is dead, long live Flipside! The social media platform for next-gen creators and their fans

- by Lux

Flipside is so much more than a virtual TV studio – it’s the first social media platform built around pure creativity and imagination.

We’ve come a long way from being a virtual TV studio for spatial content creation. Today we’re announcing Flipside Studio is now simply called Flipside.

With creator channels and social sharing features built right in, Flipside is now a full-blown metaverse social media platform where next-gen creators can post their spatial recordings to their channels to share with their fans.

We’ve heard from many Flipside creators, and the message is clear: Creators need a way to reach their fans directly, not just on other social platforms, where they can only show a window into their spatial creations.

Today, we’re saying loud and clear: We hear you, and that’s exactly what Flipside is moving forward.

To be clear, all of the Flipside Studio features like our camera switcher are sticking around. Flipside will always excel at producing 2D content for cross-promotion on other platforms, but our vision is to help creators build a following that can interact directly with their spatial and immersive content.

Introducing Flipside tokens

Tokens are the virtual currency of Flipside. They provide a way to pay for virtual goods or premium features, but more importantly, they provide a way for creators to generate revenue from their Flipside content.

Our mission is to help creators build sustainable channels where they can earn real money from their content. That’s why we’re introducing creator payouts from day 1 in our token system. When you reach the payout threshold, visit the Flipside Creator Portal and click on the Request Payout button. It’s that easy.

Tokens are currently limited to paying for our AI integrations and sending tips to your favourite channels, but we will be expanding the program to include many new ways for creators to earn tokens over time.

Click here to learn more about our Flipside token program


 

Flipside Studio 2021.1 introduces motion capture export, video renderer, stand-ins, and more

- by Lux

Flipside Studio

We're excited to finally take the wraps off of the new version of Flipside Studio, which includes a number of major improvements and new features.

This is also the first version of Flipside Studio to offer paid subscription plans for added functionality. All of the existing functionality of Flipside Studio remains free for everyone, but the watermark removal and some of the new features below will be paid-only.

Visit our pricing page to learn more.

Motion capture data export

Flipside Studio 2021.1 adds the ability to export motion capture data, making Flipside Studio the easiest way to record multiplayer motion capture sessions for use in any animation software or video game. Motion capture data can be exported in the following ways:

  1. Raw BVH files for use anywhere
  2. Unity animations, imported via Flipside Creator Tools
  3. Blender animations, imported via a new Flipside plugin for Blender

Here's how it works: you record your characters with just your VR headset and controllers, and Flipside Studio exports the full-body character movements. If you have Vive Trackers, you can also do full-body motion capture!

Note: Motion capture data export is a paid feature.
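If you're working from the raw BVH export and want to pull it into Blender by script rather than through the dedicated Flipside plugin mentioned above, Blender's built-in BVH importer can be driven from Python. A minimal sketch (the file path is just a placeholder):

    import bpy

    # Import a Flipside BVH export using Blender's built-in BVH importer add-on.
    # The resulting armature carries the recorded full-body animation.
    bpy.ops.import_anim.bvh(
        filepath="/path/to/flipside_export.bvh",  # placeholder path
        global_scale=1.0,                         # adjust if your scene units differ
        use_fps_scale=True,                       # remap the BVH frame rate to the scene's
    )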

Video renderer

Flipside Studio 2021.1 adds a powerful new video renderer for exporting footage of your Flipside Studio recordings.


Features include:

  • Capture multiple cameras in one render pass
  • Queue multiple renders to run back-to-back
  • Resolutions: 180 (Preview), 720, 1080, 2K, 4K
  • Output formats:
    • PNG or EXR image sequence
    • MP4 video
    • WAV audio (separate tracks for each actor)
    • Depth map
  • Adjustable frame rate
  • Adjustable video encoding settings
  • Adjustable anti-aliasing settings

Note: The video renderer is a paid feature.
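The renderer can output MP4 directly, but if you go the PNG image sequence route (for compositing, say) and later want to assemble the frames into a video yourself, a standard ffmpeg call does the job. Here's a sketch, where the frame-name pattern and frame rate are placeholders you'd match to your render settings:

    import subprocess

    # Assemble a rendered PNG sequence into an H.264 MP4 using ffmpeg.
    subprocess.run([
        "ffmpeg",
        "-framerate", "30",       # the frame rate the sequence was rendered at
        "-i", "frame_%04d.png",   # placeholder pattern for the numbered frames
        "-c:v", "libx264",        # widely compatible H.264 encoding
        "-pix_fmt", "yuv420p",    # keeps the file playable in most players
        "music_video.mp4",
    ], check=True)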

Improved movement retargeting

Retargeting is the process of converting an actor's movement to the shape and dimensions of the character. Flipside Studio's retargeting system was rewritten from scratch for 2021.1 and the result is a massive improvement to the way characters move.

In our old retargeting system, the larger the difference between the actor's and the character's dimensions, the less natural the movement would feel. In the new system, actors still control characters just by moving their bodies, but the movement of the character is more true to the character's dimensions. This means the actor's and character's hands won't always line up, but what the cameras see will be more accurate and look better.

The new retargeting system should be applied automatically, but if your character's movements feel off, you may need to recalibrate to fix it.
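To give a rough feel for what dimension-aware retargeting means, here's a toy illustration (not Flipside's actual algorithm): the actor's reach, measured relative to the shoulder, is re-expressed as a fraction of the character's own arm length.

    def retarget_wrist(actor_wrist_offset, actor_arm_length, character_arm_length):
        """Scale the actor's wrist offset (relative to the shoulder) onto the character.

        The direction of the reach is preserved, but the distance is expressed as a
        fraction of the actor's arm and re-applied to the character's arm, so a full
        reach for the actor becomes a full reach for the character."""
        scale = character_arm_length / actor_arm_length
        return tuple(axis * scale for axis in actor_wrist_offset)

    # An actor with a 0.6 m arm reaching 0.6 m forward drives a character with a 0.3 m arm:
    print(retarget_wrist((0.0, 0.0, 0.6), actor_arm_length=0.6, character_arm_length=0.3))
    # -> (0.0, 0.0, 0.3): the character reaches fully, even though its hand no longer
    #    lines up with the actor's hand in world space.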

Introducing stand-ins

We've added a new feature we call stand-ins, which lets you freeze your character's pose so it can be used as a stand-in reference when setting up your shots, and as an actor mark during your shoot. Point your teleporter at a stand-in and it will lock onto it. Teleport into it and you'll automatically land on your mark for the shoot, and change into the right character too if you weren't already.

And just like Flipside's current actor mark props, you can move them to adjust their positions or remove them individually by grabbing them in Set Builder mode.

While this feature is experimental, you'll find it on the underside of the Characters palette under the Stand-Ins heading. Press the Pose button and you'll be given a 3-second countdown. At the end of the countdown, a stand-in will appear for each user who's in character at the time.

Other improvements

  • Flipside Studio now supports full-hand tracking using the Valve Index controllers.
  • Ambient Occlusion can now be enabled under Settings > Output for improved lighting and shadows. Note that there is a performance cost to enabling Ambient Occlusion.
  • There's a new Microphone tab in the settings which includes several new features to fine-tune Flipside's voice recording and lip-syncing, including:
    • Lip sync gain to amplify your voice for more exaggerated lip-syncing
    • Lip sync noise gate to filter out unwanted background noise
    • Compression settings including attack time, threshold, compression ratio, and compression gain (see the sketch after this list for what the threshold and ratio do)
    • Option to specify whether compression applies only to the recorded voice, only to lip-sync input, or both
  • The teleprompter now supports a few additional character sets, including Greek, Cyrillic, Thai, and Tamil. More will be added in future updates.
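For anyone unfamiliar with the compression terms in that list, here's a minimal sketch of what the threshold and ratio do to an incoming level (illustrative only, not Flipside's actual audio processing):

    def compress_db(input_db, threshold_db=-20.0, ratio=4.0, makeup_gain_db=0.0):
        """Classic downward compression on a level measured in decibels.

        Levels below the threshold pass through unchanged; above it, every `ratio` dB
        over the threshold is reduced to 1 dB over it. Makeup gain then lifts the
        whole signal back up if desired."""
        if input_db > threshold_db:
            output_db = threshold_db + (input_db - threshold_db) / ratio
        else:
            output_db = input_db
        return output_db + makeup_gain_db

    # A loud -8 dB peak with a -20 dB threshold and 4:1 ratio comes out at -17 dB,
    # while a quiet -30 dB signal is left untouched.
    print(compress_db(-8.0), compress_db(-30.0))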

Flipside Creator Tools

In addition to the new BVH to Unity animation converter that works alongside the new motion capture data export feature of Flipside Studio, we've also added the following new features to the Flipside Creator Tools:

FaceMirror lets you animate faces on anything

This release introduces a new FaceMirror component that lets you animate faces on non-character objects in sets, such as props or otherwise inanimate objects.

We've included an example scene under FlipsideCreatorTools/Examples/Example-FaceMirror where you can see how it works and how it's set up. Just attach a FaceMirror to any object, then attach additional FacialExpressionReference components to power the elements of the disembodied face.

FaceMirror-powered faces work over multiplayer in Flipside Studio and can be recorded and played back complete with recorded voices.

More sophisticated animated expressions

We've added a new character expression type called AnimationParameters, which lets you control Unity animation parameters to achieve more sophisticated animated expressions.

Publish directly in Unity

You can now publish your custom characters and sets to Flipside Studio directly in Unity through a unified "Build & Publish" button in the Flipside Creator Tools window. This helps reduce iteration time and eliminates the need to hunt down your bundle files or open a separate browser window, enabling one-click publishing. Just reload the character or set in Flipside Studio to see your changes instantly in VR.

Click here to download the latest version of the Flipside Creator Tools 2021.1.

 

Older posts »