Be A Bird VR Project: Week 3
Today I used the Insta360 Pro 2 camera and a Zoom H3-VR (an ambisonic microphone, recorder and decoder in one unit) to do some green screen filming using the London College of Communication's facilities.
I had to get everything done in record time because, while I had a 3-hour session booked, the tech support was only available for 90 minutes.
The first 30 minutes I invested in setting up and filming the tutorial footage. Dressed as a 6-foot-tall seagull, I did six quick takes, giving the PLAYERS a simple demonstration of the movements they need to perform to glide, steer, dive and climb.
The remaining 60 minutes were spent filming with a very special guest: a world-famous musician whose extraordinary talents, I hope, will come as a nice surprise for those who manage to complete the game and thereby gain access to his sweet flute sounds.
Once the PLAYER has collected enough berries in scene 2, they will be told to “fly to the top of the highest tree”. At the top of the tree will be an EYRIE — the venue in which a short 360 music concert will unfold.
Importantly, when the PLAYER transitions into this third and final scene, the orientation of their body in VR will match the orientation of their body in reality. The idea is that if they spend the last few minutes of the experience feeling like they are lying on the floor looking up at the person speaking to them, a posture that matches reality, the off-boarding process should be much more straightforward than it would be if it only began when they take off the VR headset.
To explain further: during the whole flight experience they will have spent several minutes upside down. In reality they will be flat on their back, head pointing up at the ceiling, at all times; in virtual reality they will feel inverted, looking down over the landscape. This feels absolutely fine when you go into it, but can be a little disorienting for some people when they come out of the experience, i.e. take off the headset and find that they are the right way up again. Doing this re-inversion in the cut from scene 2 to scene 3 should make the off-boarding process more straightforward.
The video footage from the 6 data cards (one for each of the 6 cameras that make up the Insta360 array) takes half an hour per card to transfer, so I am currently waiting with bated breath to see whether Mistika VR will be able to convert each video into a 360 experience straight away. As I still have a whole data card to go, I don't yet know whether the footage will be usable!
One decision I may come to regret is angling the camera array upwards a little so that it would better match the PLAYER's perspective, looking up at the actors from ground level. It has just occurred to me that Mistika might stitch the footage together using maths based on the perfectly reasonable presumption that the camera rig is perfectly upright. Fingers crossed that, with a bit of jiggery-pokery, the result will be salvageable even if it isn't perfect straight after dragging and dropping in the files.
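In principle a fixed tilt is recoverable even after stitching, because an equirectangular frame can be re-levelled with a pure rotation remap. Here's a minimal sketch of the idea in Python/NumPy, assuming the tilt is a single known pitch angle about the rig's horizontal axis (the function name, and the sign convention for the rotation, are my own assumptions, not anything from Mistika):

```python
import numpy as np

def correct_pitch(equirect, pitch_deg):
    """Re-level an equirectangular frame shot with the rig tilted by
    pitch_deg degrees. Assumes the tilt is a pure rotation about the
    rig's horizontal (x) axis. Nearest-neighbour remap, no filtering."""
    h, w = equirect.shape[:2]
    # Output pixel grid -> longitude/latitude
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Spherical -> unit direction vectors
    vx = np.cos(lat) * np.sin(lon)
    vy = np.sin(lat)
    vz = np.cos(lat) * np.cos(lon)
    # Rotate the sampling directions about the x-axis to cancel the tilt
    # (flip the sign of pitch_deg if the horizon moves the wrong way)
    p = np.deg2rad(pitch_deg)
    vy2 = vy * np.cos(p) - vz * np.sin(p)
    vz2 = vy * np.sin(p) + vz * np.cos(p)
    # Back to source pixel coordinates, sample nearest neighbour
    lat2 = np.arcsin(np.clip(vy2, -1.0, 1.0))
    lon2 = np.arctan2(vx, vz2)
    sx = ((lon2 + np.pi) / (2 * np.pi) * w).astype(int) % w
    sy = ((np.pi / 2 - lat2) / np.pi * h).astype(int) % h
    return equirect[sy, sx]
```

In practice you'd do this per frame (or let the stitcher's own horizon-levelling tools do it), but the point is that no information is lost by the tilt itself; only the nadir/zenith coverage of the rig changes.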
Assuming that step goes well, there is also the challenge of syncing the ambisonic recordings and orienting them so that each sound comes from the appropriate location in 3D space. And then there's the process of keying out the green screen cleanly and superimposing the remaining footage over an appropriate CG environment.
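The orientation step for the audio is, at least for a fixed offset, just a channel mix. As a rough sketch, assuming the H3-VR output is first-order AmbiX (ACN channel order W, Y, Z, X with SN3D normalisation) and that the mic and camera fronts differ only by a yaw angle, the soundfield can be turned about the vertical axis like this (function name and sign convention are my assumptions):

```python
import numpy as np

def rotate_ambisonics_yaw(wyzx, yaw_deg):
    """Rotate a first-order AmbiX recording (channels W, Y, Z, X as rows)
    about the vertical axis by yaw_deg degrees, e.g. to line the mic's
    'front' up with the camera rig's front. W (omni) and Z (vertical)
    are unaffected by a pure yaw; only X and Y mix together.
    Assumption: positive yaw turns the scene counter-clockwise seen
    from above -- flip the sign if your toolchain disagrees."""
    a = np.deg2rad(yaw_deg)
    w, y, z, x = wyzx
    x2 = x * np.cos(a) - y * np.sin(a)
    y2 = x * np.sin(a) + y * np.cos(a)
    return np.stack([w, y2, z, x2])
```

A DAW plugin would normally handle this, but it's reassuring that a mic/camera misalignment noticed in post is fixable with a single rotation rather than a reshoot.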
What could possibly go wrong…