Vive | Setting up basic teleportation in Unity3d

The best solution at the moment is following Theston's work: an all-in-one script that you drop on the controllers. It handles a laser pointer, teleportation, and even some ability to grab objects. See what he's doing here and here.

Download the assets here. These are the steps I go through to integrate the files into my projects:

  • Import the latest SteamVR Plugin
  • Download the zip of the SteamVR Unity Toolkit from Github
  • Copy SteamVR_Unity_Toolkit folder into your Unity project's Assets folder
  • Edit / Project Settings / Graphics, add a new included shader - choose 'Unlit/TransparentColor'
  • Drop the CameraRig prefab into the scene from Assets / SteamVR_Unity_Toolkit/Prefabs
  • Put the SteamVR_BasicTeleport script on the CameraRig (found in the Assets / SteamVR_Unity_Toolkit/Scripts folder)
  • Put the SteamVR_ControllerEvents & SteamVR_SimplePointer scripts on both controllers.
  • Have the floor be a physics collider. I break my floor geometry off from the rest and put a Mesh Collider on it. The teleporter will now only work when pointed at the floor.
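If you'd rather wire this up in code than in the inspector, the steps above can be sketched as a bootstrap script. This is a sketch only - it assumes the toolkit's component class names match the script names listed above, which you should verify in your copy of the toolkit:

```csharp
using UnityEngine;

// Hypothetical bootstrap mirroring the manual inspector steps above.
// Assumes SteamVR_BasicTeleport, SteamVR_ControllerEvents, and
// SteamVR_SimplePointer are the class names inside the toolkit scripts.
public class TeleportSetup : MonoBehaviour
{
    public GameObject cameraRig;       // the CameraRig prefab instance
    public GameObject leftController;
    public GameObject rightController;
    public GameObject floor;           // floor geometry split off from the rest

    void Awake()
    {
        cameraRig.AddComponent<SteamVR_BasicTeleport>();

        foreach (var controller in new[] { leftController, rightController })
        {
            controller.AddComponent<SteamVR_ControllerEvents>();
            controller.AddComponent<SteamVR_SimplePointer>();
        }

        // The pointer only teleports onto colliders, so the floor needs one.
        floor.AddComponent<MeshCollider>();
    }
}
```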

Linear EXR Texture Baking

Texture Baking (3dsmax/Vray -> Unity3d)

I wanted to post this up for future reference. Somehow I've managed to bake textures as .tga all this time without complaint. The problem is that I wanted to try baking my textures as .exr, but when baking more than one object you aren't able to choose the filetype for the exported baked textures, so they default to .tga.

While chatting with a friend about making a baking script for me, we stumbled across this post online, where Michiel Quist lays out the steps needed to change the default filetype used by the Render To Texture dialog.

  • Find your 3dsmax folder and go into this subfolder ...\UI\MacroScripts
  • Edit 'Macro_BakeTextures.mcr' in a text editing program
  • Find this line: local defaultFileType = ".tga" -- default bitmap file extension

I replaced .tga with .exr to try and take advantage of rendering textures in linear color space, since Unity3d has settings for working in linear color space.
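After the edit, the relevant line in Macro_BakeTextures.mcr reads:

```maxscript
-- was: local defaultFileType = ".tga" -- default bitmap file extension
local defaultFileType = ".exr" -- default bitmap file extension
```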

So far in Unity3d I've done this --

  • Change the imported texture settings to 'advanced' texture type
    - Bypass sRGB Sampling, tick On
    - Encode as RGBM, tick Off
    - Generate Mip Maps, tick Off
    - Wrap Mode, Clamp
    - Format, Automatic Truecolor
  • Camera has HDR ticked. Edit: this has been problematic. One issue is that HDR cameras do not support MSAA. Another is that Unity does not support post corrections on linear .exr textures; you'd need to invest in Amplify Textures 2 if you wanted full control. But even then, sacrificing MSAA doesn't seem worth it at the moment.

  • Batch convert materials to Unlit/Texture, then selectively go around and switch various textures/shaders to a standard material to give reflective properties
  • Extra camera screen space adjustments
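The import settings in the list above can also be applied automatically with an AssetPostprocessor. A sketch for the Unity API of this era (property names like linearTexture differ between Unity versions, and the "_baked" naming convention is my own, hypothetical one):

```csharp
using UnityEditor;

// Editor script: applies the linear-EXR import settings from the list above
// to any texture whose path contains "_baked" (a hypothetical convention).
public class LinearBakePostprocessor : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        if (!assetPath.Contains("_baked")) return;

        TextureImporter importer = (TextureImporter)assetImporter;
        importer.textureType   = TextureImporterType.Advanced;
        importer.linearTexture = true;   // "Bypass sRGB Sampling" in the inspector
        importer.mipmapEnabled = false;  // Generate Mip Maps off
        importer.wrapMode      = TextureWrapMode.Clamp;
        importer.textureFormat = TextureImporterFormat.AutomaticTruecolor;
    }
}
```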

I'll try and put together a personal 3d scene at home that I can use to share this new process.

Workflow Exploration: 3dsmax to Unity3d, updated!

Here's my latest method. I've also included some new tools, linked below. This post is pretty text heavy, but I hope it might help some people. I think this method is a great compromise: it gets a space VR-ready while still delivering the still renderings that clients ask for.

My work is done on a 2013 Retina Macbook Pro running Windows 7, which is my benchmark for performance. If things run decently on my setup, I assume most others should have a buttery smooth experience. (Specs: 2.8GHz, 16GB memory, GeForce GT 650M.)


1. First step: optimize all geometry. Reduce turbosmooths, and reduce curve interpolation on rendered splines and/or sweeps. Sometimes I'll just apply an Optimize modifier to reduce poly count. I try to keep the total scene poly count to 500k - 1mil.


2. Check for flipped normals. I've been running into this problem when dealing with real-world projects. Sometimes objects have been mirrored and instanced, or just built too quickly without care taken to double-check normals. Sometimes you can get away with this within 3dsmax, since it can render back-faces. Unfortunately, in Unity you'll see right through them :)


3. Combine. At this point I combine the entire scene into a single object. This is done so that the elements can be broken up by material name, which also gives you a multi/sub-object material for reviewing materials, renaming, etc. Combining keeps the number of objects down and makes for an easier shader setup in Unity. You can break the objects back up once the baked textures are created. In addition, it's also a good place to check for sub-object selections and deselect them!

4. Unwrap. I have a maxscript tool that a friend built for me to speed up this step; download here. It does a beautiful job. One thing to note: objects cannot have sub-object selections. These will cause issues in the unwrap (since only the selected polys get unwrapped), so be sure to check for selections once the scene is combined into a single mesh. I then do a basic UVW Unwrap 'flatten' in texture channel 5 or 10, then run UV-Packer on the same channel, which does the best job of optimizing UV space and packing those flattened UVs. Finally, I collapse everything.


5. Bake! Everything should be ready to go. A couple of important things to note: make sure you are using regular cameras to preview the lighting. You can use vray exposure control to match a vray cam's exposure so that it works with Render To Texture. Also make sure you're baking your gamma into the render; for me, this is simply changing the color mapping mode to 'color mapping & gamma'. Also make sure output gamma is 2.2. (These settings might be different depending on how you render, so test with a single object until the bake matches a vray render.)

In the Render To Texture panel, check 'color mapping' & select the desired elements to bake. Currently I only bake with the VrayCompleteMap.

Note: vraydisplacement won't bake out, so you'll need to place the texture being used into an actual Displace modifier, subdivide the geometry, then collapse & optimize it to be suitable for baking.

6. Export. Prep the file for export once everything has baked. If I net render the baking, I'll run Render To Texture on my local machine with 'skip existing files' selected. At that point, I'll also choose 'Save Source/Create New Baked (Standard:Blinn)' under Baked Material, and set the target map slot to diffuse color. (Note: setting the wire color of all the objects to white before doing this sets the ambient color of the Blinn material to white, which makes things easier in Unity when imported.)

Run the bake again. It will skip all the objects since they've already baked, but it will put the textures into standard materials. I'll also click 'Update Baked Materials' for good measure. Then select 'keep baked materials' and clear the shell. Now you're left with a scene of all standard materials, with white diffuse color and the baked texture.

Optional: At this point I also use a script to load the diffuse textures into the viewport so they're visible.

Shuffle the baked UVs (channel 10, or whatever channel you used) into channel 1. Channel 1 is the UV channel Unity uses.
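One way to do the channel shuffle in MaxScript is via the channelInfo interface. This is a sketch from memory - the channel-type argument and exact CopyChannel/PasteChannel signatures should be verified against the MaxScript reference before trusting it on a real scene:

```maxscript
-- Copy the baked map channel (10 here) into channel 1 for each selected object.
for obj in selection do
(
    channelInfo.CopyChannel obj 3 10   -- 3 = map channel type (assumed)
    channelInfo.PasteChannel obj 3 1
    collapseStack obj                  -- bake the channel edit into the mesh
)
```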

Now you can export everything as an FBX.  Note: I embed media, and for units scale I uncheck automatic and set it to centimeters.


7. Import into Unity! The FBX should load automatically if you export it into your Assets folder. The model should also have all the baked textures on its materials. The shaders will be default Unity diffuse shaders. I use this script that my friend made to select all shaders and switch them to Unlit/Texture: UnlitDefaultShaderPostProcessorPlusShaderConverterTool_1.1.unitypackage, download here. There's also a post processor in there that's supposed to automatically convert all shaders to Unlit/Texture; you can delete it if you wish. You should now be in a pretty decent starting place to preview the space in Unity.
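The linked converter tool does something along these lines; here's a minimal editor sketch of the same idea using only standard Unity API (not the tool's actual code):

```csharp
using UnityEngine;
using UnityEditor;

// Editor menu item: switches every material in the open scene to Unlit/Texture.
public class UnlitSwitcher
{
    [MenuItem("Tools/Convert All Materials To Unlit-Texture")]
    static void ConvertAll()
    {
        Shader unlit = Shader.Find("Unlit/Texture");
        foreach (Renderer r in Object.FindObjectsOfType<Renderer>())
            foreach (Material m in r.sharedMaterials)
                if (m != null)
                    m.shader = unlit;   // the baked texture stays in _MainTex
    }
}
```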

The next steps that I take are to isolate and work with materials that should be reflective. These need special shaders that will accept cubemaps, bump maps, etc. You might then go back into 3dsmax and isolate the objects with these materials so that they can be separate geometry for use with the correct probes. You might also re-bake those specific objects to obtain textures for bump/spec/gloss/whatever. I recommend looking at Cubemapper and Reflection Manager. Unity 5 should also include a great set of reflection tools, so look out for that!

I'm also eyeing Amplify Texture as a way to bring in larger baked maps; however, I haven't seen performance hits from 2k textures yet, so I've been slow to adopt it into my process.

Let me know if you have any questions, either by emailing or leaving a comment. Have fun!

Setting up the DK2 on a RMBP


[Note: all of this exploration was done with a Retina Macbook Pro & DK2]

So this weekend I worked on getting my demo functioning with the DK2. It took quite a bit of fiddling now that there are even more options & toggles to play with. At first I had settled for an experience with some pretty severe stuttering, but then I found this FPS Display made by merire, which let me drop an FPS counter into my scene to watch the frames per second. I saw numbers ranging from 30, to 60, to finally 75 fps, all through changes to the Rift setup alone and no scene optimization.
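A counter along the same lines as merire's can be written in a few lines (this is my own minimal sketch, not his code; the half-second smoothing window is an arbitrary choice):

```csharp
using UnityEngine;

// Drop this on any GameObject to see a smoothed frames-per-second readout.
public class FPSDisplay : MonoBehaviour
{
    float accumTime;
    int   frames;
    float fps;

    void Update()
    {
        accumTime += Time.deltaTime;
        frames++;
        if (accumTime >= 0.5f)          // refresh the readout twice a second
        {
            fps = frames / accumTime;
            accumTime = 0f;
            frames = 0;
        }
    }

    void OnGUI()
    {
        GUI.Label(new Rect(10, 10, 120, 25), fps.ToString("F1") + " fps");
    }
}
```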

At first I was playing around with the Rift Display Mode. 'Extend Desktop to the HMD' was where I started, but you can't see or adjust the refresh rate within Windows display options, and my scene seemed to be locked at 60 fps, matching the 60Hz of my laptop display.

I flipped over to 'Direct HMD Access from Apps'.

In Unity, I'm used to running my demos without DX11, but it seems to be essential for getting the frame rate to max out.

I had already set my quality settings to match what was suggested in the Oculus Unity Integration guide (page 10 & 11).

For the player controller, I tweaked the OVRCameraController.

Turning off Mirror to Display was important, as I was seeing around 38fps with it on. As soon as I turned it off, I saw a solid 75fps.

Another thing: I can only get a visual through _DirectToRift.exe, which is fine; I just can't build & run directly from Unity. So I build, then open the project folder outside Unity and run the Direct to Rift executable, which displays a black window on my laptop screen. When I run the regular exe (at least with my current project) I get a crazy effect, a bright white glow that grows across the game window on my laptop screen, while the DK2 shows a black screen. Otherwise I'm happy to have found something that works with my RMBP, at least for my fairly simple projects! Stay tuned, as I hope to release my little scene to share.

Reflections! - an overview of reflections in Unity3d

Been doing some research into what options there are for reflections in Unity3d.

The first is simply using cubemaps. Generate them with whatever method you prefer: there are various assets that will make them, and scripts you can put together to make them. I purchased Cubemapper, which I think has a great interface for quickly capturing cubemaps, and also for updating them as your scene changes.

Cubemaps seem to work great for small objects within a space, so for furniture within a room you might get away with using a cubemap, especially if the reflection will be blurred and broken up by a rough texture. So they work especially well when they can be abstracted.

my current Shader Forge shader that allows for a baked complete map & cubemap reflection w/fresnel control

But what if you have a reflective floor or wall? A cubemap doesn't project the reflection accurately: as you move and view the surface, the reflection appears detached from it. A couple of solutions seem possible. The first is box projected cubemap environment mapping (BPCEM), which projects the cubemap onto a box, giving the reflection more structure.

my attempt with the box mapped reflection, notice the issue around the corner of that opening

In the second link, it looks like these guys plan to make a Unity asset built around the same idea: mapping reflections onto simpler geometry, in the style of box projection mapping. That could be quite cool. If they can get past the limitation of reflections only being box mapped, it could be a great solution.

Lastly, there's screen space reflections (SSR), which takes whatever is currently on screen and uses it to 'render' reflections onto other objects. I'm not sure about the exact technical details, but it seems to work well enough as long as the reflections you need aren't coming from behind the camera. So while a sphere in front of you can reflect into the ground you see below it, this method can't reflect objects behind you onto the surface of the sphere. It definitely won't work for mirrors on a wall. Make sense?

My eyes are set on Candela SSRR, although it's a bit pricey. I might be better off waiting to see what features come packed into Unity 5 before purchasing more assets.

In the end I might need to cobble all three methods together to get the right look.


In the meantime I've been honing my ability to organize & bake 3dsmax scenes for import into Unity. One tool I've been demoing, and think I might pick up, is UV-Packer; I've been quite impressed with how quickly, easily, and tightly it packs a UV space. Granted, you still need to do a half-decent job unwrapping, but I dig the packing skills of this plugin.