From ETC Public Wiki
  • The upgrade from Kinect 1 to Kinect 2 in the CAVE has added more flexibility for creating robust, immersive worlds inside the CAVE.
  • All the information below comes from my two weeks of research on the Kinect 2 inside the CAVE, so please update this document if your own research turns up anything inaccurate.
  • I will lay out a brief overview of what you can do with the Kinect 2, with links wherever possible to help you further your research. The Kinect 2 is a great tool: it can enhance immersion inside the CAVE through complex gestures that are easy to create.
  • If you have worked with Kinect 1, you will find many things that are similar; for example, you can still work with the pointman.
  • What I found most amazing about the Kinect 2 is gesture creation. The Kinect v2 plugin for Unity lets you use gestures that come with the library, such as Push and Pull.
  • You can also create your own gestures in code, or use the Visual Gesture Builder tool, which applies machine learning techniques to gesture creation. I will talk more about it below.


  • The Kinect 2 is placed above the front screen at an angle so that it can track people standing on the motion floor.
  • I will discuss the Kinect 2 specifically in the context of the Unity game engine.
  • First, make sure the Kinect 2 SDK is installed on the Windows 8 CAVE machine. It should already be installed, but verify that it exists.
  • If it's not installed, you can easily install it from the Kinect website here
  • Next, to make this work with Unity, you need the Kinect v2 Unity plugin, which you can find on the Asset Store. Please contact ETC support to make use of this plugin. There may be newer plugins by now; I worked with this one around March 2015. You can use any other plugin you like, and the general setup will still be the same.


  • The readme file that comes with the plugin contains all of this information. I will mention only the important steps here, because some things need to be done differently in the CAVE.


  • If you are going to use the Kinect 2 inside the CAVE for your scene, you will need to set up the CAVE camera first. There is existing documentation that shows you how to do it; do that before you proceed. It is not mandatory, but it is good practice to do this first.


  • Open the KinectAvatarDemo scene and start it. Check that the Kinect detects you and that the pointman moves with you. If it does, you are good to go.
  • There are two things to take care of first: the Kinect 2 sensor's tracking distance and its angle. The tracking distance matters because you only want to track the person standing on the motion floor, not anyone standing behind the player.
  • The sensor angle matters because the Kinect 2 was originally meant to face the player at a perpendicular angle. In the CAVE, the Kinect faces downward at a 30-degree angle, so we need to adjust these values or the sensor will not track the skeleton data correctly.
  • Attached to the MainCamera is a script called KinectManager. At the top you will see four variables: “Sensor Height”, “Sensor Angle”, “Min User Distance” and “Max User Distance”. All of these values (distances in meters, the angle in degrees) need to be changed, as they affect the way the Kinect sensor tracks body data. If you turn on the “Hint Height Angle” checkbox, it will suggest the height and angle values the sensor needs.
  • If the Kinect 2 has not been moved from its original position recently, the following values should work; otherwise, use the hint checkbox mentioned in the previous point.
    • Sensor Height = 2.26
    • Sensor Angle = -29
    • Min User Distance = 0.5
    • Max User Distance = 4 (a value of zero means unlimited distance; it would track everyone in the room, which we don't want)
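If you ever need to apply these values from a script instead of the Inspector (for example, to switch between a CAVE setup and a desktop test setup), a minimal sketch is below. It assumes the plugin exposes these settings as public fields named sensorHeight, sensorAngle, minUserDistance and maxUserDistance, as the Kinect v2 with MS-SDK asset does; verify the names against your plugin version.

```csharp
using UnityEngine;

// Hypothetical helper: applies the CAVE-specific sensor values to the
// plugin's KinectManager at startup. The field names below are assumptions
// based on the "Kinect v2 with MS-SDK" asset; check your plugin version.
public class CaveKinectSetup : MonoBehaviour
{
    void Awake()
    {
        KinectManager manager = GetComponent<KinectManager>();
        if (manager == null)
            return;

        manager.sensorHeight    = 2.26f; // meters above the motion floor
        manager.sensorAngle     = -29f;  // degrees; tilted down toward the floor
        manager.minUserDistance = 0.5f;  // ignore anyone closer than this (meters)
        manager.maxUserDistance = 4f;    // 0 would mean unlimited and track the whole room
    }
}
```

Attach this to the same MainCamera object that carries the KinectManager script, so the values are set before tracking begins.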


  • Start exploring all the scenes; duplicate them if you want, and keep experimenting with the Kinect 2. The more you explore, the more ideas you will get about what you can do with this amazing tool.
  • The most promising aspect of the Kinect 2 is the gesture creation tool. On the KinectManager script there is a list called “Player Common Gestures”.
  • It includes about 15-16 built-in gestures that you can readily use without having to do anything. Play around with these.
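To react to one of these common gestures from code, you implement the plugin's gesture listener interface. The sketch below assumes the GestureListenerInterface and method signatures from the Kinect v2 with MS-SDK asset as I used it in early 2015; exact names and signatures vary between plugin versions, so treat this as a template and compare it with the listener scripts shipped in the plugin's demo scenes.

```csharp
using UnityEngine;

// Hypothetical listener for the built-in SwipeLeft gesture. The interface
// and method signatures are assumptions based on the "Kinect v2 with MS-SDK"
// asset; compare with the GestureListener scripts in the plugin's demos.
public class SwipeLeftListener : MonoBehaviour, KinectGestures.GestureListenerInterface
{
    public void UserDetected(long userId, int userIndex)
    {
        // Start watching for the gesture as soon as a player is tracked.
        KinectManager.Instance.DetectGesture(userId, KinectGestures.Gestures.SwipeLeft);
    }

    public void UserLost(long userId, int userIndex)
    {
        KinectManager.Instance.DeleteGesture(userId, KinectGestures.Gestures.SwipeLeft);
    }

    public void GestureInProgress(long userId, int userIndex,
        KinectGestures.Gestures gesture, float progress,
        KinectInterop.JointType joint, Vector3 screenPos)
    {
        // Could drive a progress indicator here; not needed for a simple swipe.
    }

    public bool GestureCompleted(long userId, int userIndex,
        KinectGestures.Gestures gesture, KinectInterop.JointType joint,
        Vector3 screenPos)
    {
        if (gesture == KinectGestures.Gestures.SwipeLeft)
            Debug.Log("Swipe left detected for user " + userId);
        return true; // true = restart detection of this gesture
    }

    public bool GestureCancelled(long userId, int userIndex,
        KinectGestures.Gestures gesture, KinectInterop.JointType joint)
    {
        return true;
    }
}
```

In the plugin version I used, listener components attached to the camera were picked up by the KinectManager; check the plugin's readme for how listeners are registered in your version.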


  • Wouldn’t it be cool if you could create your own complex custom gestures for almost anything, say a golf swing, a baseball pitch, or simple interactions like opening and closing a door?
  • The Kinect 2 does this quite well with the “Visual Gesture Builder” (VGB) tool. VGB uses machine learning techniques on recorded body data and a set of sample test cases from users to create custom gestures. After you record gestures with Kinect Studio, you use VGB to tag the gestures you want in the recordings and then build a database of these tagged gestures. You can call this database from any application you want, such as Unity.
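For reference, this is roughly how a .gbd database produced by VGB is loaded on the application side with the official Microsoft.Kinect.VisualGestureBuilder API (the Unity plugin wraps similar calls). The database name "GolfSwing.gbd" is hypothetical; substitute your own file, and note that the frame source also needs a real tracking id from the body frames before any results arrive.

```csharp
using System;
using Microsoft.Kinect;
using Microsoft.Kinect.VisualGestureBuilder;

// Sketch: load a VGB gesture database and listen for discrete gesture hits.
// "GolfSwing.gbd" is a hypothetical database name; use your own .gbd file.
class VgbSketch
{
    static void Main()
    {
        KinectSensor sensor = KinectSensor.GetDefault();
        sensor.Open();

        // One frame source per tracked body; 0 is a placeholder tracking id
        // that must be replaced with a real one from the BodyFrameReader.
        var gestureSource = new VisualGestureBuilderFrameSource(sensor, 0);

        using (var db = new VisualGestureBuilderDatabase(@"GolfSwing.gbd"))
        {
            gestureSource.AddGestures(db.AvailableGestures);
        }

        var gestureReader = gestureSource.OpenReader();
        gestureReader.FrameArrived += (s, e) =>
        {
            using (var frame = e.FrameReference.AcquireFrame())
            {
                if (frame == null || frame.DiscreteGestureResults == null)
                    return;

                foreach (var entry in frame.DiscreteGestureResults)
                {
                    if (entry.Value.Detected)
                        Console.WriteLine(entry.Key.Name +
                            " detected, confidence " + entry.Value.Confidence);
                }
            }
        };

        Console.ReadLine(); // keep the process alive while events fire
    }
}
```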

All the details for using VGB are at the link below

There is a lot of information provided. So use it effectively!