Vive | Setting up basic teleportation in Unity3d

The best solution at the moment is to follow Theston's work: an all-in-one script that you drop on the controllers. It handles a laser pointer, teleportation, and even some ability to grab objects. See what he's doing here and here.

Download the assets here. These are the steps I go through to integrate the files into my projects:

  • Import the latest SteamVR Plugin
  • Download the zip of the SteamVR Unity Toolkit from GitHub
  • Copy SteamVR_Unity_Toolkit folder into your Unity project's Assets folder
  • In Edit / Project Settings / Graphics, add a new included shader - choose 'Unlit/TransparentColor'
  • Drop the CameraRig prefab into your scene from Assets/SteamVR_Unity_Toolkit/Prefabs
  • Put the Steam VR_Basic Teleport script on the CameraRig (found in the Assets/SteamVR_Unity_Toolkit/Scripts folder)
  • Put the Steam VR_Controller Events & Steam VR_Simple Pointer scripts on both Controllers.
  • Make the floor a physics collider. I break my floor geometry off from the rest and put a Mesh Collider on it. The teleporter will now only work when pointed at the floor.

Linear EXR Texture Baking

Texture Baking (3dsmax/V-Ray -> Unity3d)

I wanted to post this up for future reference. Somehow I've managed to bake textures as .tga all this time without complaint. The problem is that I wanted to try baking my textures as .exr, but when baking more than one object you aren't able to adjust the filetype your baked textures export as, so they default to .tga.

While chatting with a friend about making a baking script for me, we stumbled across this post online - Michiel Quist posts the steps needed to change the default filetype used by the Render to Texture dialog.

  • Find your 3dsmax folder and go into this subfolder ...\UI\MacroScripts
  • Edit 'Macro_BakeTextures.mcr' in a text editing program
  • Find this line: local defaultFileType = ".tga" -- default bitmap file extension

I replaced .tga with .exr to try and take advantage of rendering textures in linear color space, since Unity3d has project settings for working in linear color space.
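The edit above can also be scripted. A minimal sketch in Python, assuming your Macro_BakeTextures.mcr still contains the stock defaultFileType line (point mcr_path at your own 3dsmax install):

```python
# Sketch: patch Macro_BakeTextures.mcr so Render To Texture defaults to .exr.
# Assumes the file contains the stock 'defaultFileType = ".tga"' line.
from pathlib import Path

def set_default_bake_filetype(mcr_path, new_ext=".exr", old_ext=".tga"):
    """Rewrite the defaultFileType extension in the macro script."""
    path = Path(mcr_path)
    text = path.read_text()
    old = 'defaultFileType = "%s"' % old_ext
    new = 'defaultFileType = "%s"' % new_ext
    if old not in text:
        raise ValueError("expected defaultFileType line not found")
    path.write_text(text.replace(old, new))
```

Run it once after each 3dsmax update, since reinstalls can restore the stock macro.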

So far in Unity3d I've done this:

  • Change the imported texture settings to 'advanced' texture type
    - Bypass sRGB Sampling, tick On
    - Encode as RGBM, tick Off
    - Generate Mip Maps, tick Off
    - Wrap Mode, Clamp
    - Format, Automatic Truecolor
  • Camera has HDR ticked. Edit: this has been problematic. One issue is that HDR cameras do not support MSAA. Another is that Unity does not support post corrections to linear .exr textures; you'd need to invest in Amplify Textures 2 if you wanted full control. But even then, sacrificing MSAA doesn't seem worth it at the moment.

  • Batch convert materials to lit/texture, then selectively go around and switch various textures/shaders to standard material to give reflective properties
  • Extra camera screen space adjustments
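For context on why 'Bypass sRGB Sampling' matters: a texture baked in linear space shouldn't get the usual sRGB-to-linear conversion applied again at sample time, or it ends up double-corrected. A rough sketch of the two transfer functions (the standard sRGB piecewise curves, not Unity's exact internals):

```python
# Sketch of the sRGB <-> linear transfer functions (per the sRGB spec).
# A texture authored in linear space (like a linear .exr bake) must skip
# srgb_to_linear on sampling, which is what 'Bypass sRGB Sampling' does.
def linear_to_srgb(c):
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
```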

I'll try to put together a personal 3d scene at home that I can use to share this new process.

Setting up the DK2 on an RMBP


[Note: all of this exploration was done with a Retina Macbook Pro & DK2]

So this weekend I worked on getting my demo functioning with the DK2. It took quite a bit of fiddling now that there are even more options & toggles to play with. At first I had just settled for an experience that had some pretty severe stuttering, but then I found this FPS Display made by merire, which let me drop an FPS counter into my scene to watch the frame rate. I saw numbers ranging from 30 to 60 to finally 75 fps, all through only making changes to the Rift setup, with no scene optimization.

So at first I was playing around with the Rift Display Mode. 'Extend Desktop to the HMD' was where I started, but you can't see or adjust the refresh rate within Windows display options, and my scene seemed to be locked at 60 fps, which matches the 60 Hz of my laptop display.

So I flipped over to 'Direct HMD Access from Apps'.

Now in Unity, I'm used to running my demos without DX11, but it seems to be essential for getting the frame rate to max out.

I had already set my quality settings to match what was suggested in the Oculus Unity Integration guide (page 10 & 11).

For the player controller, I tweaked the OVRCameraController.

Turning off Mirror to Display was important, as I was seeing around 38fps with it on. As soon as I turned it off I saw a solid 75fps.

Another thing is that I can only get a visual through _DirectToRift.exe, which is fine; I just can't Build & Run directly from Unity. So I build, open the project folder outside Unity, and run the Direct to Rift executable, which displays a black window on my laptop screen. When I run the regular exe (at least with my current project) I get this crazy effect - a bright white glow that grows across the game window on my laptop screen - while the DK2 shows a black screen. Otherwise I'm happy to have found something that works with my RMBP, at least for my fairly simple projects! Stay tuned, as I hope to release my little scene to share.

Reflections! - an overview of reflections in Unity3d

I've been doing some research into what options there are for reflections in Unity3d.

The first is simply using cubemaps, created with whatever method you prefer: there are various assets that will make them, and scripts you can put together yourself. I purchased Cubemapper, which I think has a great interface for quickly capturing cubemaps - and for updating them as your scene changes.

Cubemaps seem to work great for small objects within a space, so for furniture within a room you might get away with using a cubemap, especially if the reflection will be blurred and broken up by a rough texture. So they work especially well when they can be abstracted.

my current Shader Forge shader that allows for a baked complete map & cubemap reflection w/fresnel control

But what if you have a reflective floor or wall? The cubemap doesn't project the reflection accurately, and you get a reflection that seems detached from the surface as you move and view it. One solution is box projected cubemap environment mapping (BPCEM), which projects the cubemap onto a box, giving the reflection more structure.

my attempt with the box mapped reflection, notice the issue around the corner of that opening
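The core of BPCEM can be sketched outside a shader. This is a hedged Python sketch of the usual correction - not any particular asset's implementation: intersect the reflection ray with the room's bounding box, then build the cubemap lookup vector from the capture position toward the hit point.

```python
# Sketch of box projected cubemap environment mapping (BPCEM): instead of
# sampling the cubemap with the raw reflection vector, intersect the
# reflection ray with the room's bounding box and point the lookup from
# the cubemap capture position at that hit point.
def bpcem_reflect(world_pos, refl_dir, box_min, box_max, cube_pos):
    # distance along refl_dir to the box slab it is heading toward, per axis
    t = []
    for i in range(3):
        if refl_dir[i] > 0:
            t.append((box_max[i] - world_pos[i]) / refl_dir[i])
        elif refl_dir[i] < 0:
            t.append((box_min[i] - world_pos[i]) / refl_dir[i])
        else:
            t.append(float("inf"))
    fa = min(t)  # nearest box face hit by the ray
    hit = [world_pos[i] + refl_dir[i] * fa for i in range(3)]
    # corrected cubemap lookup direction
    return [hit[i] - cube_pos[i] for i in range(3)]
```

The corner artifacts in the image above come from real geometry not matching the box: the projection is exact only where the walls lie on the box faces.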

In the second link it looks like these guys plan to make a Unity asset that works around this idea of mapping reflections onto simpler geometry in the same style as the box projection mapping. So that could be quite cool. If they can get past the issue of reflections only being box mapped then it could be a great solution.

Lastly there's screen space reflections (SSR), which relies on taking whatever you're looking at and doing calculations to grab those objects and 'render' them as reflections on other objects. I'm not sure of the exact technical details, but it seems to work well enough as long as the reflections you need aren't of things behind you. So while a sphere in front of you could be reflected in the ground below it, this method couldn't reflect objects behind you onto the surface of the sphere. It definitely won't work for mirrors on a wall. Make sense?
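My loose understanding of the technique, sketched as a toy 1D ray march (an illustration of the idea, not how any Unity asset implements it): step the reflected ray forward, comparing its depth against the depth buffer; the first sample where the ray falls behind recorded geometry is the reflection hit. Anything never rendered into that buffer - like objects behind the camera - simply can't show up.

```python
# Toy sketch of the idea behind screen space reflections: march the
# reflected ray in steps across the screen, comparing ray depth against a
# (here 1D) depth buffer; the first pixel where the ray goes behind the
# recorded depth is taken as the reflection sample.
def ssr_march(origin, direction, depth_buffer, steps=64, step_size=0.1):
    x, z = origin          # screen x and view-space depth of the ray start
    dx, dz = direction
    for _ in range(steps):
        x += dx * step_size
        z += dz * step_size
        col = int(x)
        if 0 <= col < len(depth_buffer) and z >= depth_buffer[col]:
            return col     # reflection samples the color at this pixel
    return None            # ray left the screen: no reflection data
```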

My eyes are set on Candela SSRR, although it's a bit pricey. Might be better off waiting until Unity 5 to see what features come packed in that updated version of the software before purchasing more assets.

In the end I might need to try and use all 3 methods cobbled together to get the right look.


In the meantime I've been honing my ability to organize & bake 3dsmax scenes for importing into Unity. One tool I've been demoing that I think I might pick up is UV-Packer; I've been quite impressed with how quickly, easily, and tightly it packs a UV space. Granted, you still need to do a half-decent job unwrapping. But I dig the packing skills of this plugin.

Workflow Exploration; 3dsmax to Unity3d

I haven't posted in a bit, since everything is still a mess. I've also been doing experiments with projects I can't share, which is another reason. But I thought I'd catch up on what I've been trying out so far - what has and hasn't worked.

So I'm currently trying to devise a workflow that allows me to bring scenes from 3dsmax, with all the glorious lighting from V-Ray baked into them.

Here's the current method:

1. Optimize all geometry as much as possible. Reducing turbosmooths, reducing curve interpolation on rendered splines and/or sweeps. Some things I've just applied an optimize modifier to reduce.

2. Unwrap. I usually do a shoddy job of just putting simple box mapping on objects. So before I start combining meshes, I unwrap them as properly as possible. I'm still trying to find a better workflow that doesn't damage the existing mapping until after baking.

3. Combine. At this point I combine the entire scene into a single object, simply so that I can explode it back into pieces based on material. If two objects share the same material and bake to two different textures, the material is confused as to which baked texture to use.
detach by material name
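Why the split matters can be sketched abstractly. A hypothetical helper (the _bake.exr naming is purely illustrative) that flags materials shared by more than one baked object:

```python
# Sketch of the conflict described above: Render To Texture produces one
# baked texture per object, but a material has a single diffuse slot, so a
# material shared by two baked objects can't hold both results.
def find_bake_conflicts(objects):
    """objects: list of (object_name, material_name) pairs."""
    baked, conflicts = {}, []
    for obj, mat in objects:
        tex = obj + "_bake.exr"   # one bake result per object (illustrative name)
        if mat in baked:
            conflicts.append(mat) # a second texture wants the same diffuse slot
        else:
            baked[mat] = tex
    return conflicts
```

Detaching by material name sidesteps this: each resulting piece gets its own material copy and its own bake.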

4. Organize. Now each mesh should have a jumble of UVs, so it needs to be packed more cleanly. Currently I'm simply using the 3dsmax pack uv function, but I'm looking into other plugins to do this better.

5. Bake! Everything should be ready to go. A couple of important things: make sure you're using regular cameras to preview the lighting (you can still use a V-Ray exposure control to match an exposure so that it works with Render To Texture), and make sure you're baking your gamma into the render - for me this is simply changing my color mapping mode to 'color mapping & gamma'. I also have the baked texture replace the existing diffuse slot texture, so that I can convert the material to standard and export it with the geometry without issue.

In the Render to Texture panel, check 'color mapping' & select the desired elements to bake:

VrayCompleteMap - everything
VrayDiffuseFilterMap - just diffuse
VrayRawLightingMap - just lighting/shadows
VrayMtlReflectHighlightGlossinessBake - just the texture used in the reflection
VrayBumpNormalsMap - normals from bump material (if any)
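As a rough sanity check on how these elements relate (hedged - this ignores the reflection and specular contributions, which add on top): for a purely diffuse surface, the complete map is approximately the diffuse filter multiplied by the raw lighting, per pixel:

```python
# Sketch: recombining bake elements per-pixel. For purely diffuse surfaces,
# complete ~= diffuse_filter * raw_lighting; reflection/specular elements
# would be added on top of this in a full V-Ray composite.
def recombine_diffuse(diffuse_filter, raw_lighting):
    return [d * l for d, l in zip(diffuse_filter, raw_lighting)]
```

This is also why baking the filter and raw lighting separately is handy: you can retint the diffuse later without rebaking the lighting.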

I've hit a lot of issues during this step whenever a material is anything besides a regular VRayMtl. Blend, 2-sided, and light materials all cause problems: Render To Texture can't find the diffuse slot of those materials to place the baked texture into.

6. Export. Once everything has baked, and you've set the baked textures to load into the diffuse slots, convert all the materials to standard. I use a script for this.

At this point I also use a script to load the diffuse textures into the viewport so they're visible. I can't find the exact script but there are multiple ones on scriptspot.

Now you can export everything as an FBX; I embed media, and for unit scale I uncheck automatic and set it to centimeters.

7. Import. The FBX should load automatically if you exported it into your Assets folder somewhere. The model should also have all the baked textures on all its materials. You can now go through them, change them to more complex materials, and load the other baked maps if needed (bump & reflection). I'm still struggling to find the best methods for reflections, since all my materials usually have some sort of subtle reflective property w/ fresnel. I'll probably wait for Unity 5 before I purchase any more assets - but I have been eyeing something like Candela SSRR or Screen Space Reflections (SSR). I still need to figure out and make use of the Shader Forge asset I purchased :)

I'll leave you with this, a little test I did yesterday. There's this triangle-shaped building near where I live that has had the lower level up for lease, and I've been wanting to imagine some use for this space. So yesterday I went and took about 150 pictures of the facade from all angles and processed them in PhotoScan - I got a pretty decent mesh that I'll use for reference when I model it properly. But for now I did a ProOptimize on it and threw it into Unity to view :)