Hypnos Main Page
Welcome to the Project Hypnos Wiki. This wiki page collects tips on how to produce 360-degree videos and make them playable on virtual reality devices.
- 1 Important Elements for VR Development
- 1.1 1. Location
- 1.2 2. Shifts in Scale
- 1.3 3. First-person Point of View
- 1.4 4. Circumambulation (Panoramic effect)
- 1.5 5. Feeling of Height
- 1.6 6. Linear Motion
- 1.7 7. Binaural Sound
- 1.8 8. Indirect Control
- 1.9 9. Dynamic trigger
- 1.10 10. Head-Feet Normalization
- 1.11 11. Eye Contact
- 1.12 12. Frame Rate/Resolution
- 1.13 13. Blocking
- 2 Scriptwriting
- 3 Pipeline
- 4 Filming
- 5 Stitch
- 6 Sound
- 7 Compositing
- 8 Tech
Important Elements for VR Development
1. Location
Of course, a good location makes a good background for any story. If you want to put CG (computer graphics) elements into the film/experience, then you should shoot in a green screen room. We are currently using 10 GoPro cameras attached to a rig mount by 360Heros. The big issue we are facing right now is stitching the 10 clips together perfectly. We found that it's easier to stitch the videos into a spherical video if the scene is more complicated, because there is more detail from the different cameras to match. For example, if you shoot in a room with only white walls, in post-production you cannot tell where the stitching lines are. The video clips are all rectangular, and to combine them into a sphere that can be played as 360-degree video there are a lot of overlaps. Therefore, the details in the background become important.
2. Shifts in Scale
In The Chair (https://share.oculus.com/app/sightline-the-chair), you can see how they use different spaces and differently sized objects to create an unexpected virtual world: Shifts in Scale. One second you see something small off to your right, and the next second a big rock sits right in front of you at a 10 cm distance. Then you are in a narrow room that can hold only four standing people at a time, and the next second you are in outer space, watching the Earth rotate under your shoes. Shifts in Scale gives the viewer surprises. Sometimes you feel you are a giant in the world; sometimes you become an ant walking slowly through the scene.
3. First-person Point of View
A first-person perspective is really important for successful immersion, because it lets the viewer feel that he/she is THE character, the one actually in the scene. On the other hand, VRSE (http://vrse.works/) has released three 360-degree videos on their mobile app. One of the three, Clouds Over Sidra, is shot from a third-person perspective with an internal narrator, who tells the story of her life in Syria. Watching different places in Syria with her voice in your ear, you feel emotionally immersed in the film. Therefore, a first-person point of view is good but not necessary.
4. Circumambulation (Panoramic effect)
You may never have heard this word, but it is really important in 360-degree film. Unlike a regular flat video, the viewer can choose which direction to focus on. The front, the left, the right, the back, the top or the bottom: any angle can be a different scene. However, it's easy for the viewer to not know where to focus and to lose the important message we developers/filmmakers want them to see. In this case, the panoramic effect is important. For instance, the viewer is focusing on the front scene where two people are chatting, but at the next second you want them to focus on something happening behind them. Here, we could make the two chatting people walk panoramically from the front to the left and then to the back. The viewer naturally follows them and clearly sees the bomb explode in the scene behind them.
5. Feeling of Height
Everyone knows the Grand Canyon has a transparent walkway called the Skywalk. People feel fear because they can see the top-down view of the Grand Canyon very clearly. In a VR headset, it's easy to create the feeling of height, even in a CG environment. Take a look at this video, Suicide in Cyber Space: http://youtu.be/gX9YG_xEhyg?list=PLNWhOq-W1Ubmnh2AuOWXf1KkQkfzmpQNI. The VR headset can easily deceive your brain: if there is no ground visible, you feel ungrounded and afraid of the height. Your eyes control your brain, even though you know it's not real.
Feeling of Fear
Fear is the easiest emotion to create in a VR headset. The simplest implementation is to make everything black in the virtual environment, then suddenly put a terrifying ghost face in front of the viewer. I guarantee 95% of people would be terrified. It's a totally different world, like being in a dark room where you can't see anything. Everything you see is provided by the developer, and once you cannot control anything you see, fear comes to you. Here is a research paper you can look at, The Oculus Rift and Immersion through Fear: http://www.academia.edu/5387318/The_Oculus_Rift_and_Immersion_through_Fear
6. Linear Motion
Linear Motion is really important for the viewer. What is Linear Motion? It's the camera movement: you have to keep the camera at a linear speed and keep its track smooth. This is really hard to achieve in 360-degree video, because uneven ground, camera shake from a shoulder-mounted rig, shooting in water, and similar situations can make the camera shaky and give the viewer a nauseous feeling. Therefore, there is an easier way to shoot: don't move the camera! That sounds ridiculous for a regular movie; however, in 360 degrees the viewer can look around by themselves. So what you need to work on is making the scene amazing without moving the viewer's camera; make them look around to discover interesting things in your design.
7. Binaural Sound
In 360-degree video, binaural sound, which means 3D sound, becomes very, very, very important. I said very three times because I want you to remember this. It can act as a trigger, leading the viewer to read the message you want them to read and see the key moment you want them to experience. For example, say you stage a car accident behind the viewer, and you want them to experience the moment the car explodes. Binaural sound makes them turn their heads around and see the explosion. However, a simple explosion sound from behind is not enough. You also need to make them hear the brake sound effect clearly before the accident happens. Then they can notice the brakes and turn their heads in time to see the soul-stirring BLOWOUT!
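As a rough illustration of why directional sound can steer the viewer's head, here is a minimal Python sketch (the function name and the constant-power pan law are our own illustration, not from any real audio library) that computes left/right gains for a sound source based on its angle relative to the viewer's head. Real binaural audio also applies HRTF filtering, which this ignores:

```python
import math

def stereo_gains(source_azimuth_deg, head_yaw_deg):
    """Constant-power pan gains for a source at source_azimuth_deg
    (0 = front, 90 = right, 180 = behind) heard by a viewer whose
    head has turned to head_yaw_deg. Only models left/right level
    differences, not full HRTF-based binaural rendering."""
    # Angle of the source relative to where the head is pointing
    rel = math.radians(source_azimuth_deg - head_yaw_deg)
    # Map the left/right component onto a constant-power pan law
    pan = math.sin(rel)              # -1 = hard left, +1 = hard right
    theta = (pan + 1) * math.pi / 4  # 0 .. pi/2
    return math.cos(theta), math.sin(theta)  # (left_gain, right_gain)

# A car explosion directly behind the viewer (180 deg) sounds centered...
print(stereo_gains(180, 0))
# ...until they turn their head; at 90 deg of yaw it sits in one ear.
print(stereo_gains(180, 90))
```

This is the mechanism behind the brake-sound trick above: as the viewer rotates, the interaural balance shifts, and they turn until the sound sits in front of them.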
8. Indirect Control
Indirect Control is, you could say, a kind of illusion of freedom, and it is always important in any experience or game. The viewer seems to have the freedom to look in any direction and at any angle they want; however, they are guided by the elements the designer set up beforehand, and they have no idea they are being controlled. As I said under Binaural Sound and Circumambulation, the viewer will follow the characters' movement or look toward the direction a special sound effect comes from. Indirect Control helps the viewer in the VR world not lose focus on what they need to see and understand. This is the hardest part for the designer of a VR scene, because you cannot physically control the viewer's head and force them to turn in the direction you want. Indirect Control is the only way.
9. Dynamic trigger
A dynamic trigger is a good way to implement Indirect Control. Again, The Chair by SightLine (http://sightlinevr.com/) does a really good job of this. If you don't move your head and just stare straight ahead, nothing changes. In that case, viewers think one of two things: 1) It's broken; I need to take off the Oculus Rift. 2) There must be something I didn't see; let me look in other directions. Most people tend toward the second, because this happens at the beginning: no one thinks the experience is over after merely 10 seconds, so people intuitively start to turn their heads. The dynamic trigger then helps people see what they need to see, such as the story or interesting objects. However, the turning-head type of dynamic trigger is really tiring for some viewers, even if they enjoy what they discover.
10. Head-Feet Normalization
In the virtual world, with Unity, you can put the camera at any height. Our first idea was a little-girl perspective, so we put the camera at a low height in Unity. However, people felt weird because they are not a real little girl; they thought it might be a system failure and did not feel immersed at all. One viewer came in and crossed his legs on the chair, saying, "I feel better this way." The reason is that our brains know how tall we are. The moment what you see does not match your body posture, and the head-to-feet distance you see does not match your real height, you feel weird. That is why Head-Feet Normalization is a key element for viewing a 360-degree film comfortably.
11. Eye Contact
As with traditional interpersonal communication, eye contact is key to conveying non-verbal social cues. We found this to be the same in virtual reality. For the viewer to feel included in the scene, as if taking the role of a character, the actors must make "eye contact" with the camera rig. The opposite can also be used to great effect to alienate the viewer.
12. Frame Rate/Resolution
The frame rate of your final product should be determined early in the production pipeline. In our research, we found that to minimize motion sickness and increase immersion it is best to keep the frame rate above 60 FPS and avoid going below 48 FPS. In addition to the frame rate of the captured video, you must also consider the playback rate (Unity's FPS drops dramatically with large video files), codec FPS limitations, and the rendering cost of the additional frames. The best way to avoid issues is to test your pipeline, in full, prior to final production.
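To make the trade-off concrete, a quick back-of-the-envelope sketch (plain Python; the helper names are our own) relates a target FPS to the per-frame time budget and to how many frames you will have to render:

```python
def frame_budget_ms(fps):
    """Time available to produce each frame at a given rate."""
    return 1000.0 / fps

def total_frames(duration_s, fps):
    """Frames to capture/render for a clip -- doubling the frame
    rate doubles both render time and file size."""
    return int(duration_s * fps)

# At 60 FPS each frame must be ready in ~16.7 ms; dropping to 48 FPS
# relaxes the budget to ~20.8 ms but risks visible judder in the HMD.
print(round(frame_budget_ms(60), 1))   # 16.7
print(round(frame_budget_ms(48), 1))   # 20.8
print(total_frames(120, 60))           # 7200 frames for a 2-minute clip
```

Numbers like these are why the playback rate and render cost have to be budgeted together, not chosen independently.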
13. Blocking
In the continued effort to avoid stitch lines, it is best to keep the actors in line with a single camera in the array. If an actor must cross from one camera to another, it is best to do so farther away from, rather than closer to, the camera array.
Scriptwriting
Scripts for VR films differ in some ways from traditional 2D media.
1. The viewer can look around, and this should be planned for. The film needs built-in moments for viewers to explore the scene, so pacing is a bit slower.
2. If doing a film from a first-person perspective, defining characteristics of the character, such as gender or race, can be problematic. The less specific you are the better, especially in the beginning, as the viewer becomes immersed in being the character. That said, some character specifics do help the story, so it is important to choose what matters and what does not.
3. You will be unable to show close-ups of items; it is better to think of the story as written for theater rather than film. Small objects may need to be described. Also, because the viewer can look around, you will have to make sure plot points aren't missed by focusing the viewer's attention with movement or sound.
4. Similarly, avoid splitting the attention of the viewer regularly. The majority of the action should be kept to a 180 degree plane. It is fine to do this at points, and can create immersion, especially when you want the viewer to feel anxious or overwhelmed. But these should be moments that are planned for, not the regular state.
Pipeline
Camera & Memory
360 Heroes Aluminium Mount
GoPro HERO3+ Cameras ×10
32GB micro SDHC Cards
Stitching & Rendering
Autopano Giga & Autopano Video Pro
Filming
1. Limitation of space
So, how can we film in 360-degree space? There were several issues we didn't expect from regular 2D filming. In production we need to direct the actors in the scene, but because we shot a full 360-degree angle, there was no space to hide ourselves. As a solution, we hid a cellphone in the set, and through FaceTime we could watch the scene. We also did more rehearsals than usual so that we could check the acting and movements beforehand.
2. Camera quality
The GoPro cameras have relatively low image quality, so we need plenty of light in the scene. But as mentioned before, we can't hide our lighting equipment, so we use lights that look like part of the scenery.
3. Ground Replacement
We also struggled with removing the tripod from the footage, so we built the human-o-pod. Because our story is told from a first-person perspective, it looks like a human body in the film. We think it worked really well with our story.
Sound
Gear VR uses a 5.1 surround sound setup. The sound will dynamically move as the viewer turns their head.
Recording and Mixing
1. Mic actors individually. Lav mics are available from Dave Purda; however, the mics we currently have operate on only two channels, so you can only record two actors at a time. If you have more than two actors, you may have to borrow additional mics, get creative with putting two mics on the same channel, or use method 2. When using lav mics, it is better for the actors to wear cotton. Try to hide the mics in such a way that they will not rub against the actor's shirt and create noise. Monitor the audio before recording so you can resolve any problems beforehand.
2. Go into your favorite DAW and create a new project. Set the project to use 5.1 sound (both Logic and Audition can do this). Bring in your recorded dialog as mono sound files; you may have to clean noise out of the recordings. Adjust the panning of the sounds to fit the film. Assuming your actors move, you will have to automate the pan to follow the actors around. In Audition this can be done with the Track Panner.
3. Mix in any other sounds. Don't forget smaller incidental sounds such as footsteps, room noise, ambience, etc. Using a bit of reverb on the master channel is also recommended to tie the audio together: you want all the sounds to sound like they are in the same room, not recorded separately.
Alternatively, using the GoPro audio (method 2):
1. Use the audio from the GoPros. It is recommended that you choose five cameras, one for each surround channel, and pan accordingly.
2. Align the audio very carefully to avoid phasing; if your audio sounds strange or echoey, this is the likely problem. Some DAWs have built-in audio syncing you may use.
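If your DAW lacks automatic syncing, the alignment step can also be scripted. A minimal sketch of the idea (plain Python on toy sample lists; the function name is our own) finds the lag that best lines up two recordings of the same clap by brute-force cross-correlation:

```python
def best_offset(ref, other, max_lag):
    """Find the lag (in samples) at which `other` best matches `ref`.
    Even a few samples of misalignment between two mics capturing the
    same source causes audible phasing."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        # Correlate ref[i] against other[i + lag] wherever they overlap
        score = sum(ref[i] * other[i + lag]
                    for i in range(len(ref))
                    if 0 <= i + lag < len(other))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# A clap recorded by two GoPros, the second starting 3 samples late
clap = [0, 0, 1.0, 0.8, 0.3, 0, 0, 0, 0, 0]
delayed = [0, 0, 0, 0, 0, 1.0, 0.8, 0.3, 0, 0]
print(best_offset(clap, delayed, 5))  # 3 -> shift `delayed` 3 samples earlier
```

Real tracks would use audio samples read from file and a windowed search, but the principle (maximize correlation, then shift by the winning lag) is the same one the DAW's sync feature applies.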
Once your audio mix is done, bounce the 5.1 mixdown. ETC does not currently have a surround sound setup, so you will have to check your mix on the Gear VR.
Premiere lets you import 5.1 audio. Place the audio, then check very carefully that it aligns with the video: find points where a loud noise is made, and check at multiple points, especially lip sync. Placing the audio even one frame off can make a difference.
When you are ready to export your film, go to the export settings, click the Audio tab, and make sure you set the mix to 5.1 (the default is Stereo).
While Oculus VR players can play the audio created in the step above, no player has yet been found that will dynamically move the sound.
If you intend to create an Oculus build, you will be able to create true dynamic binaural sound with the Two Big Ears plugin.
With Two Big Ears, you place a room area and apply the plugin to it as a component. Then any mono sound source you place in the space will be 3D, and HRTF filters will be applied to make the sound seem to come from behind you, above you, etc.
Compositing
Post-production may be done in After Effects and Premiere. It is recommended to work with image sequences out of the stitching software and throughout this portion of the pipeline: this provides the best balance between quality and ease of work.
The final film must be rendered out of Premiere. To play on Gear VR, the film should be an mp4 at 2048x1024 with 5.1 audio.
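A small sanity-check script can catch an export that misses those targets before you copy it to the headset. This is a hedged sketch: the function and example filenames are hypothetical, and the numbers are this project's targets rather than official Gear VR limits.

```python
def check_gearvr_specs(path, width, height):
    """Check an export against this project's Gear VR targets:
    mp4 container and a 2:1 equirectangular frame (2048x1024).
    (Project conventions, not official platform limits.)"""
    problems = []
    if not path.lower().endswith(".mp4"):
        problems.append("container should be mp4")
    if width != 2 * height:
        problems.append("equirectangular video needs a 2:1 aspect ratio")
    if (width, height) != (2048, 1024):
        problems.append("project target resolution is 2048x1024")
    return problems

print(check_gearvr_specs("final_cut.mp4", 2048, 1024))  # [] -> good to go
print(check_gearvr_specs("final_cut.mov", 4096, 2048))
```

Running this as the last step of the export checklist avoids discovering a wrong container or resolution only after sideloading the file.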
Tech
There are several platforms for virtual reality, such as the Oculus Rift, Samsung Gear VR, and Google Cardboard. Each one has its own pros and cons.
The Oculus Rift is connected to a PC, which can have powerful CPUs and GPUs, so it has the capacity to render 4K video. The problem is that the Oculus Rift's screen has a low resolution, and when you watch a video in it you can see that the video is pixelated.
The Samsung Gear VR has a high-resolution screen and a much better user experience thanks to its embedded touch pad. However, it supports only up to 2K video, and it is expensive if the user doesn't already own a Samsung Note 4.
Google Cardboard is small and easy to carry in a bag, and it works well with any phone with a big screen. But the user has to hold the Cardboard while watching the movie, which might break the immersion a little.
YouTube has just released a 360 video channel to which we can upload panoramic videos. Everyone can watch 360-degree video easily without any special device, but the audience cannot get the feeling of immersion that is the key to this new medium.
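A quick way to see why the 2K-vs-4K difference above matters so much in a headset: an equirectangular frame spreads its full width across the entire 360-degree horizon, so the pixels per degree of view are far lower than for a normal video that fills only part of your vision. A tiny sketch (plain Python; the helper name is our own):

```python
def pixels_per_degree(video_width_px, fov_deg=360):
    """Angular resolution of an equirectangular 360 video: the frame's
    full width has to cover the whole horizontal field of view."""
    return video_width_px / fov_deg

# 2K-wide (2048 px) vs 4K-wide (4096 px) equirectangular video
print(round(pixels_per_degree(2048), 1))  # 5.7 px per degree
print(round(pixels_per_degree(4096), 1))  # 11.4 px per degree
```

Since the headset magnifies roughly a 90-to-100-degree slice of that sphere at a time, even 4K source video looks soft compared with the same file viewed flat.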
For the Oculus Rift, we have tried two ways to play the video. The first is to write our own application with the help of Unity and Blender; the other is to use the Whirligig player.
Make Our Own Application
1. Make a projection sphere in the 3D modeling software Blender.
- Go to [[ http://www.enigmatoots.co.uk/#!unwrapping-sphere/cvnq]] and follow the instructions there.
- Flip all the normals on the sphere (W -> 0).
- Export the sphere in the FBX format.
2. Play the Video in Unity
- Import the sphere file in Unity and drag it into the scene.
- Go to [[ http://www.renderheads.com/portfolio/UnityAVProWindowsMedia]] and download the plugin AV Pro Windows Media.
- Add exactly one AVProWindowsMediaManager component to the scene.
- Add an AVProWindowsMediaMovie component to an object in the scene. In our setup, the Folder is "Video/" and the Filename is the name of the video including its extension.
- We then created a folder named Video in the folder that holds the Assets folder and put all the videos there, so that Unity won't automatically convert them (which takes a lot of time and doesn't produce high-quality video).
- In the same object, add a component AVProWindowsMediaMaterialApply and drag the material of the projection sphere to the Material variable.
- Then we can play the video projected on the sphere. Now we have to put the Oculus camera into Unity.
- Go to [[ https://developer.oculus.com/downloads/#version=pc-0.5.0.1-beta]] and download the file Unity 4 Integration.
- Import the package into Unity. After import, you will find two folders, OVR and Plugins, are created under Assets.
- In the OVR folder there is a Prefabs folder. Drag OVRCameraController or OVRPlayerController into your scene, depending on which you want to use.
- Change the position of the Oculus camera to be the center of the sphere.
- After building the project, we can play the 360-degree movie in the Oculus Rift.
If there is no need for interaction or 3D sound, we can use a VR player called Whirligig to play the 360-degree video.
- Go to [[ http://whirligig.xyz/downloads/]] and download the player according to your operating system.
- Put the film into the folder production/media and run the program Whirligig_DirectToRift.
- Press the Space key to open the menu and switch to the video you want to play. The format should be Barrel and the FOV should be 360.
- Set the Rift Display Mode in the OculusConfigUtil to be Direct HMD Access from Apps.
- Play the video.
Play the Video on Gear VR
- Install Note 4 USB driver from: http://www.samsung.com/us/support/downloads
- Plug the Samsung Note 4 into the computer via USB and go to /Phone/Milk VR/. Put the panoramic video there.
- Plug the phone into the Gear VR and launch the Milk VR app. Then find the film in the Download folder using the touch pad.
- Play the film.
Play the Video with Kolor Eyes
- Download the Kolor Eyes app from the store.
- Put the file in the app's folder.
- Play the film.