Unity3d & Oculus Rift / Accurate Player Height in the Rift

I've had several people tell me they feel too big, or that the space seems small to them. So I finally spent an evening doing some R&D on the issue.

The short version: the 'Skin Width' on the OVRPlayerController should be set to 0.001 (the default is 0.08).
I believe the skin width adds padding beneath (and all around) the player, which adds unwanted height. Here's hoping that someday Oculus adjusts this default in their OculusUnityIntegration package.
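If you'd rather enforce the fix from code than remember the inspector tweak, something like this works. It's a minimal sketch, assuming the OVRPlayerController sits on the same GameObject as Unity's CharacterController (which is what actually owns the skin width):

    using UnityEngine;

    // Forces a tiny skin width on the CharacterController so the capsule's
    // collision padding doesn't inflate the player's standing height.
    [RequireComponent(typeof(CharacterController))]
    public class FixSkinWidth : MonoBehaviour
    {
        void Awake()
        {
            CharacterController cc = GetComponent<CharacterController>();
            cc.skinWidth = 0.001f; // default is 0.08, which adds unwanted height
        }
    }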

Here are my notes from the evening:

Player stats -
. 6'2" player height (set in the config utility)
. 5'9" actual real-life eye-to-floor measurement (stepped on a tape measure and looked into a mirror to read the height)

I built a little 3d measuring tape prefab, 7 feet high, with a marker at each foot, plus a ruler graphic with the 12 inch marks drawn within every foot. I dropped this into my scene, which was modeled to actual scale, so everything should be accurate!
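If you want to build a similar measuring tape yourself, here's a rough sketch of how the foot markers could be generated from script (the marker size and naming are just placeholders; Unity units are meters, so feet get converted):

    using UnityEngine;

    // Spawns a simple 7-foot measuring pole with a thin marker at every foot.
    // 1 ft = 0.3048 m.
    public class MeasuringTape : MonoBehaviour
    {
        const float FeetToMeters = 0.3048f;

        void Start()
        {
            for (int foot = 1; foot <= 7; foot++)
            {
                GameObject marker = GameObject.CreatePrimitive(PrimitiveType.Cube);
                marker.transform.parent = transform;
                marker.transform.localScale = new Vector3(0.3f, 0.005f, 0.02f);
                marker.transform.localPosition = new Vector3(0f, foot * FeetToMeters, 0f);
                marker.name = foot + " ft";
            }
        }
    }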

Fired up the demo. Press the space bar, and eye height reads as 1.776 m, or about 5'10".

My view of the ruler, looking straight forward, was at a height of about 6'1"

With the 'skin width' set to 0.001, I fired up the demo again. Eye height is now about 5'9.5".

So that's a 4.5" difference, which can make a huge difference when viewing things up close. That basically puts you in heels :D

Then I ran a few other tests, just playing around with the Character Controller's height & radius. You generally don't want to mess with these, but I did find that changing the height there had an effect on the overall height in the demo. The default 2 m height is what had given me the 6'1" viewing height. A height of 1.5 m brought me down to about 5', and a height of 1 m brought me down to 4'1.5".
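For reference, all the imperial readings above are straight unit conversions from Unity's meters; a tiny helper like this makes the readouts easy to sanity-check (my own utility, nothing from the SDK):

    using UnityEngine;

    // Converts a Unity height in meters to a feet-and-inches string,
    // e.g. 1.776f -> "5' 9.9\"".
    public static class HeightUtil
    {
        public static string MetersToFeetInches(float meters)
        {
            float totalInches = meters * 39.3701f;
            int feet = (int)(totalInches / 12f);
            float inches = totalInches - feet * 12f;
            return string.Format("{0}' {1:0.0}\"", feet, inches);
        }
    }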

I was also curious about what effect the positional camera had on the player's height. As soon as the demo launches and the camera turns on, it starts calculating position, which can turn into a small or large shift in your actual sense of viewing height by the time the headset is on your face. What I found was that as long as you reset the view by pressing F2, you get back to the correct viewing height. It did sometimes lock up the positional tracking, though, which would sometimes come unstuck after a bit of voodoo swaying from side to side.
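The sample controller wires up F2 for you, but if you need to trigger the reset from your own script, it's just a keypress check calling the SDK's reset. Note that ResetOrientation() below is my assumption of what the call looks like in this integration; the name varies between OVR package versions, so check the one you have:

    using UnityEngine;

    // Re-centers the view on F2. NOTE: ResetOrientation() is an assumed
    // SDK call -- verify the actual reset method in your OVRDevice /
    // integration version before relying on this.
    public class RecenterOnF2 : MonoBehaviour
    {
        void Update()
        {
            if (Input.GetKeyDown(KeyCode.F2))
            {
                OVRDevice.ResetOrientation(); // assumed call; verify in your version
            }
        }
    }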

I also played around with the OVRCameraController, which I use less often. It has an option for 'use player eye height', which shifts the player's view above the actual placement of the camera prefab in the scene. In my case the view was raised about 2'10" off the ground from a camera placed at 0. I'm guessing this is supposed to simulate a player's sitting height, although for myself it would need to come down 5-6 inches. Turning off 'use player eye height' has the viewer look from the exact position of the OVRCameraController.
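If you want to flip that option at runtime rather than in the inspector, it's presumably just a public bool on the controller. The field name below is my guess from the inspector label, so confirm it against the OVRCameraController source in your integration:

    using UnityEngine;

    // Toggles the 'Use Player Eye Height' option on the E key.
    // UsePlayerEyeHeight is an assumed field name derived from the
    // inspector label -- check your OVRCameraController before using.
    public class EyeHeightToggle : MonoBehaviour
    {
        public OVRCameraController cameraController;

        void Update()
        {
            if (Input.GetKeyDown(KeyCode.E))
            {
                cameraController.UsePlayerEyeHeight = !cameraController.UsePlayerEyeHeight;
            }
        }
    }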

I hope this helps some people, since it had been driving me a bit crazy. And now I need to update my demo!

Setting up the DK2 on a RMBP

Hey,

[Note: all of this exploration was done with a Retina Macbook Pro & DK2]

So this weekend I worked on getting my demo functioning with the DK2. It took quite a bit of fiddling, now that there are even more options & toggles to play with. At first I had settled for an experience with some pretty severe stuttering, but then I found this FPS Display made by merire (https://developer.oculusvr.com/forums/viewtopic.php?f=37&t=11723), which let me drop an FPS counter into my scene to watch the frames per second. I saw numbers ranging from 30, to 60, to finally 75 fps, all from only making changes to the Rift setup, with no scene optimization.
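If you don't want to grab merire's script, a bare-bones FPS readout is only a few lines; this is my own minimal version, not his:

    using UnityEngine;

    // Bare-bones FPS counter: smooths the frame time a little and
    // draws the result in the corner of the screen.
    public class FpsDisplay : MonoBehaviour
    {
        float smoothedDelta;

        void Update()
        {
            smoothedDelta = Mathf.Lerp(smoothedDelta, Time.deltaTime, 0.1f);
        }

        void OnGUI()
        {
            float fps = 1f / Mathf.Max(smoothedDelta, 0.0001f);
            GUI.Label(new Rect(10, 10, 120, 25), fps.ToString("0.") + " fps");
        }
    }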

At first I was playing around with the Rift Display Mode. 'Extend Desktop to the HMD' was where I started, but you can't see or adjust the Rift's refresh rate within the Windows display options, and my scene seemed locked at 60 fps, the same refresh rate as my laptop display.

I flipped over to 'Direct HMD Access from Apps'.

Now, in Unity I'm used to running my demos without DX11, but it seems to be essential for getting the frame rate to max out.
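DX11 is a checkbox under Player Settings, but it can also be flipped from an editor script. I believe PlayerSettings.useDirect3D11 is the relevant editor-only API in Unity 4.x, though double-check against your version:

    using UnityEditor;

    // Editor-only: enables DirectX 11 for standalone builds,
    // equivalent to ticking 'Use Direct3D 11' in Player Settings.
    public static class EnableDx11
    {
        [MenuItem("Tools/Enable DX11")]
        static void Enable()
        {
            PlayerSettings.useDirect3D11 = true;
        }
    }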

I had already set my quality settings to match what was suggested in the Oculus Unity Integration guide (pages 10 & 11).
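Beyond the quality presets, the two script-side settings that matter for hitting 75 Hz on a 60 Hz laptop panel are vsync and the target frame rate. The values here are my reading of the guide's intent, not a quote from it:

    using UnityEngine;

    // Let the Rift's 75 Hz refresh drive timing rather than the 60 Hz
    // laptop panel: disable Unity's vsync and request 75 fps explicitly.
    public class RiftFrameRate : MonoBehaviour
    {
        void Awake()
        {
            QualitySettings.vSyncCount = 0;
            Application.targetFrameRate = 75;
        }
    }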

For the player controller, I tweaked the OVRCameraController.

Turning off Mirror to Display was important: I was seeing around 38 fps with it on, and as soon as I turned it off I saw a solid 75 fps.

Another thing is that I can only get a visual through the _DirectToRift.exe, which is fine; I just can't Build & Run directly from Unity. So I built, then opened the project folder outside Unity and ran the Direct to Rift executable, which displays a black window on my laptop screen while the Rift shows the scene. When I run the regular exe (at least with my current project) I instead get this crazy effect, a bright white glow that grows across the game window on my laptop screen, while the DK2 has a black screen. Otherwise I'm happy to have found something that works with my RMBP, at least for my fairly simple projects! Stay tuned, as I hope to release my little scene to share.


Reflections! - an overview of reflections in Unity3d

I've been doing some research into the options for reflections in Unity3d.

The first is simply using cubemaps, captured with whatever method you prefer: there are various assets that will make them, and scripts you can put together to make them yourself. I purchased Cubemapper, which I think has a great interface for quickly capturing cubemaps, and for updating them as your scene changes.

Cubemaps seem to work great for small objects within a space; for furniture within a room you might get away with a cubemap, especially if the reflection will be blurred and broken up by a rough texture. They work especially well when they can be abstracted.
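For the do-it-yourself route, Unity can render a cubemap from any point in the scene with Camera.RenderToCubemap (a Pro feature in the 4.x days). A minimal sketch, assuming you've created a cubemap asset to render into:

    using UnityEngine;

    // Captures the surrounding scene into a cubemap from this
    // object's position using a temporary camera.
    public class CubemapCapture : MonoBehaviour
    {
        public Cubemap target; // assign a cubemap asset in the inspector

        void Start()
        {
            GameObject rig = new GameObject("CubemapCam");
            rig.transform.position = transform.position;
            Camera cam = rig.AddComponent<Camera>();
            cam.RenderToCubemap(target);
            Destroy(rig);
        }
    }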

My current Shader Forge shader, which allows for a baked complete map & cubemap reflection w/ Fresnel control

But what if you have a reflective floor or wall? A cubemap doesn't project the reflection accurately, so you get a reflection that feels detached from the surface as you move around and view it. A couple of solutions seem to exist for this. The first is box projected cubemap environment mapping (BPCEM), which projects the cubemap onto a box, giving the reflection more structure.

http://forum.unity3d.com/threads/113784-Has-any-one-experimented-with-Box-Projection-Correction-Environment-Mapping
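The core of BPCEM is a small correction to the reflection vector before the cubemap lookup: intersect the reflected ray with the room's bounding box, and re-aim the lookup at that hit point as seen from the cubemap's capture position. In practice this lives in a few lines of shader code, but the math looks like this (a C# sketch of the idea, not the forum thread's exact shader):

    using UnityEngine;

    // Box-projected cubemap lookup: instead of sampling the cubemap with
    // the raw reflection vector, find where the reflected ray exits an
    // axis-aligned box (the room) and sample toward that point from the
    // cubemap's capture position.
    public static class Bpcem
    {
        public static Vector3 CorrectedLookup(
            Vector3 worldPos,   // shaded surface point
            Vector3 reflectDir, // normalized reflection vector
            Vector3 boxMin,     // room bounds, world space
            Vector3 boxMax,
            Vector3 cubemapPos) // where the cubemap was captured
        {
            // Distance along the ray to each boundary plane, per axis.
            Vector3 invDir = new Vector3(1f / reflectDir.x, 1f / reflectDir.y, 1f / reflectDir.z);
            Vector3 tMax = Vector3.Scale(boxMax - worldPos, invDir);
            Vector3 tMin = Vector3.Scale(boxMin - worldPos, invDir);

            // Keep the positive (exit-side) distance on each axis.
            Vector3 tFar = new Vector3(
                reflectDir.x > 0f ? tMax.x : tMin.x,
                reflectDir.y > 0f ? tMax.y : tMin.y,
                reflectDir.z > 0f ? tMax.z : tMin.z);

            float dist = Mathf.Min(tFar.x, Mathf.Min(tFar.y, tFar.z));
            Vector3 hit = worldPos + reflectDir * dist;
            return (hit - cubemapPos).normalized; // direction to sample the cubemap
        }
    }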

My attempt with the box-mapped reflection; notice the issue around the corner of that opening

http://forum.unity3d.com/threads/230523-CubeMapper-Parallax-reflection-demo

The second link suggests the Cubemapper developers plan to make a Unity asset built around this same idea of mapping reflections onto simpler proxy geometry, in the style of box projection mapping. That could be quite cool; if they can get past the limitation of reflections only being box-mapped, it could be a great solution.

Lastly there's screen space reflections (SSR), which takes whatever is currently on screen and reprojects it as reflections on other objects. I'm not sure of the exact technical details, but it seems to work well enough as long as the reflections you need aren't coming from behind you. So while a sphere in front of you can reflect into the ground below it, this method can't reflect objects behind you onto the surface of the sphere, and it definitely won't work for mirrors on a wall. Make sense?

My eyes are set on Candela SSRR, although it's a bit pricey. I might be better off waiting to see what features come packed into Unity 5 before purchasing more assets.

In the end I might need to try and use all 3 methods cobbled together to get the right look.

 

In the meantime I've been honing my ability to organize & bake 3dsmax scenes for importing into Unity. One tool I've been demoing, and think I might pick up, is UV-Packer; I've been quite impressed with how quickly, easily, and tightly it packs a UV space. Granted, you still need to do a half-decent job unwrapping, but I dig the packing skills of this plugin.