As the animation studio for Fall 2016, Team Catharsis was tasked with delivering a non-interactive
VR production piece based on a concept handed over to us by a previous team. The concept was titled
'Sea of Stories', and the experience was an attempt to explore virtual reality as an art form. More
specifically, it was to be an aesthetically pleasing exploration of a world, presented in a manner
that allows the viewer to interpret its significance or meaning for themselves. In parallel, the
team was also working on a preproduction concept to be developed the following semester: in a
nutshell, a toy-based sequence delivering a message of cultural unity and harmony through music and
dance depicting the celebration of life.
The following is a list of lessons learned along the way, spanning creative, production, and
technical concerns:
Establish Specific Goals
For a non-interactive, non-narrative piece, the established goal of "a
VR art piece" was vague at best. The goals could have been better established through stronger
visual direction, meaning detailed concept art that the team then sets out to implement.
Such art should not be treated as mere inspiration but as a visual goal against which success is
measured. Because VR experiences are rendered in real time, many art assets need to be generated in
the game engine, and the path to achieving the established visual direction may not be clear. It
may be tempting to simply go with whatever can be achieved, but this compromises the overall visual
direction of the experience.
Adapt to VR
Paradoxically, given the previous point, it is also important to understand
that VR is a three-dimensional medium and concept art has its limitations. Concept sketches of
singular elements with all the details established are more useful than sketches of the world as a
whole. A related paradox: the team should also be able to use technical exploration as
inspiration for visual direction. The platform you use can inspire visuals you wouldn't
normally have considered.
One of the reasons the team may have neglected visual direction early in
the semester is a conflict between the production styles needed to deliver VR experiences. On the
one hand, iterative game-based production is needed to solve design challenges; on the other,
a focused film production style is required for visual fidelity. Early in the semester the team
sacrificed visual direction for the sake of rapid development of a prototype, for reasons listed
in the next point.
Establish Non-Conflicting, Finite Goals
The team believed prototyping was of paramount importance because the original thought process for
the project included design challenges such as communicating an abstract experience arc, which in
turn included a possible solution to the Swayze Effect. Narrative in VR is a legitimate design
challenge in its own right, and it was possibly in conflict with the established goal of creating
an art piece. More time could have been spent thinking about visuals instead of building a
prototype for playtesting.
Rendering in Real Time
Even non-interactive experiences in VR involve a 'gameplay' component,
because everything is rendered in real time. While this may not seem like a technically challenging
aspect, the effort of 'gameplay programming' depends on how much your experience is based on events
versus animations. Be wary of delegating gameplay programming as a part-time role, because it may
compromise the individual's ability to execute their main assigned role. Keep track of framerate as
you go along: you want at least 75 fps, ideally 90. It's easy to get carried away with the project
and end up with 30 fps. Avoid transparent objects and complex particle systems as much as possible.
And finally, try to write frame-rate-independent code from the very beginning, even during the
prototyping stage.
Unreal has advantages and disadvantages compared to Unity. Its Material system is
stellar: node-based and easier than writing shaders in Unity. The Blueprint system, however, may be
tedious for those used to C# in Unity. Compile times are also rather long, which makes small
changes to the project's C++ annoying.
The music of Sea of Stories is possibly its strongest element and adds
tremendous value to the experience. The only way to extract this value is by treating
sound as an equal partner from the very beginning of development.
The importance of spatial audio cannot be overstated: a VR experience relies on indirect control
to concentrate the viewer on the things planned as clues and hints, and spatial audio
can also make panning feel smooth, as it does in the real world.
What we learned during the sound process:
1. Balance the voiceover against the music.
2. Music is the element least likely to be blocked during the design process, and it can inspire or
be inspired by many things from other artists, such as color usage and the dancers' movements.
3. The sound designer should write down a list of the sounds needed for the experience and make
sure the team has seen the list. Ask for feedback.
4. One thing we did well at the beginning was to structure the sounds in Wwise early, which sped
up iteration.
5. Let someone else try the experience. Since there are story snippets inside our experience,
letting people outside the project try it is the most effective way to get proper feedback.
6. Most sound effects other than ambience should be trigger-based. In Wwise, limit the trigger to
play one instance and set it to discard new instances; then create another blend container (or any
container) under the first hierarchy with no such limit. With this setup, the sound is triggered
only once, when the player first touches the trigger box.
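On the engine side, the same "fire once" guard can be enforced before the event is even posted to Wwise. A minimal sketch (the `OneShotTrigger` class and its callback are hypothetical, standing in for the engine's overlap callback and `AK::SoundEngine::PostEvent`):

```cpp
#include <functional>
#include <utility>

// Hypothetical sketch: a trigger volume that posts its sound event only
// on the first overlap, mirroring the "discard new instances" limit
// configured in Wwise.
class OneShotTrigger {
public:
    explicit OneShotTrigger(std::function<void()> postEvent)
        : postEvent_(std::move(postEvent)) {}

    // Called by the engine whenever the player enters the volume.
    void OnPlayerEnter() {
        if (fired_) return;  // later entries are discarded
        fired_ = true;
        postEvent_();        // e.g. post the Wwise event here
    }

private:
    std::function<void()> postEvent_;
    bool fired_ = false;
};
```

Guarding in both places is cheap insurance: the engine-side flag stops redundant event posts, while the Wwise-side limit protects against any trigger that slips through.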