So this weekend I took my first serious dive into developing 3D content for the Rift and for real-time viewing. I finally punched in the 4-month trial code from Oculus that came with my DK1 and explored some of the pro features, mostly lightmapping and being able to actually compile a demo for the Rift!
Some initial notes from the experience:
1. When exporting an FBX from 3dsmax, I always make sure units are converted to centimeters.
2. Within the OVRCameraController, make sure 'Use Player Eye Height' is checked.
3. Adjust the OVRPlayerController's 'Skin Width' to 0.001.
4. In the Oculus Configuration Utility, make sure your profile is set up correctly and checked as 'default'.
Using these steps I was able to bring a model of my apartment (which I modeled exactly 1:1 from laser and tape measurements) out of 3dsmax and into Unity3d and view it through the Rift with an almost exact match in size and depth perception.
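The unit math behind step 1 is worth spelling out. Unity treats one world unit as one meter, so exporting the FBX in centimeters and importing with Unity's default FBX scale factor of 0.01 lands everything at real-world size. A minimal sketch (the helper function is just for illustration, not part of any SDK):

```python
# Sanity check for the 3dsmax -> FBX -> Unity unit pipeline.
# Unity works in meters; an FBX exported in centimeters combined with
# Unity's default import "Scale Factor" of 0.01 converts cm -> m.

FBX_IMPORT_SCALE = 0.01  # Unity's default scale factor for FBX files

def max_cm_to_unity_m(length_cm: float) -> float:
    """Convert a length modeled in centimeters in 3dsmax to Unity meters."""
    return length_cm * FBX_IMPORT_SCALE

# A wall measured at 250 cm in the apartment should come in at 2.5 m:
print(max_cm_to_unity_m(250))  # -> 2.5
```

If the import scale is anything other than 0.01 here, the apartment will read as subtly too big or too small in the Rift even when it looks fine on a monitor.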
Based on some comments on the Oculus Developer Forums, grodenglaive passed along Thomas Pasieka's suggestion of using a 1x1x1 meter cube in Unity to compare against imported models for consistency. This seemed to confirm that my model was the correct size when imported at 0.01.
Another test: I checked my player eye height by pressing the spacebar (it was 1.776 m), then made a cube 1.776 m tall and viewed it. I expected the top of the cube to be perfectly aligned with the horizon, but instead I could see its top surface... so something was a bit off in the OVRPlayerController. The final solution was NOT to scale the imported geometry, but to adjust the Skin Width on the OVRPlayerController from 0.08 to 0.001. That 0.08 of skin was exactly what I had been trying to counter with my import scaling of 0.05-0.075. Doh.
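A rough way to see why the cube's top face was visible: if the character controller's Skin Width effectively lifts the capsule (and so the camera) off the floor by that amount, the perceived eye height exceeds the profile value. The numbers below are from my test; the simple additive model is an assumption for illustration, not how Unity documents the behavior:

```python
# Sketch: perceived eye height if Skin Width lifts the capsule off the
# floor by its full value (an assumed, simplified model).

PROFILE_EYE_HEIGHT = 1.776  # meters, from the Oculus profile

def perceived_eye_height(skin_width: float) -> float:
    """Eye height under the assumption that skin width raises the capsule."""
    return PROFILE_EYE_HEIGHT + skin_width

# Default skin width: camera sits ~8 cm above the 1.776 m cube's top face,
# which would make the top surface visible.
print(round(perceived_eye_height(0.08) - PROFILE_EYE_HEIGHT, 3))   # -> 0.08
# Reduced skin width: essentially flush with the cube's top.
print(round(perceived_eye_height(0.001) - PROFILE_EYE_HEIGHT, 3))  # -> 0.001
```

Under this model an 8 cm offset is well within what you'd notice in stereo, which lines up with why scaling the geometry by 0.05-0.075 seemed to "fix" things while actually masking the real problem.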
Also, thanks to owenwp & drash, it seems like it'll all be sorted going forward now.