An A–Z guide to making a VR environment
Guide - 2018
Scott Liang
Hello friends! This guide is meant to introduce you to the entire process of making a VR environment.
I've seen a lot of (fantastic) tutorials on each individual step, such as 3D modeling in Blender and lighting in Unity, but none about the overall workflow. If you're already familiar with some of the applications, this guide will hopefully shed some light on how they fit together. If you're completely new to the domain, this guide will serve as a handy outline from which you can structure your learning.
To follow along, all you'll need are a computer, an iPhone, and a Google Cardboard-style headset (the kind with a button).
On your laptop, you'll need Blender, Unity, and Xcode installed. On your iPhone, go ahead and install Unity Remote. All are completely free! I also touch upon Rhinoceros (3-month free trial) and Photoshop (~$10/mo), but these are optional.
The image below gives an overview of the workflow:
Blender is an insanely powerful, open-source 3D graphics program with which you'll build and texture-map your 3D models. It's well-suited for VR environments because it focuses on mesh geometries (which are what game engines use) and provides a UV mapping setup, among other things. Other popular programs for these tasks are Maya and 3ds Max, but they're prohibitively expensive.
Once you've completed your models and texture maps, you'll take them into Unity, which is a highly popular game development engine. Here you will do just about everything that is not related to making models. It may be helpful to think of Unity as the main program, and Blender as an awesome "modeling companion" for it.
When you're happy with your environment, you'll install your VR app on your iPhone via Xcode, then into your VR world you go!
OK! Let's get started.
If you're just starting out with Blender, I highly recommend the tutorials provided by Blender Guru and CG Geek. Their beginner series will get you up and running in a matter of days. I've also compiled a list of the most useful Blender shortcuts, which you can find here.
The first step of the workflow is to create the mesh geometries of your environment. A few key tips for modeling for VR:
When naming the objects, I usually go ahead and create/assign materials to them as well (if you can, reuse the same material for objects with identical appearances). However, don't go overboard with material settings in Blender: Blender and Unity use fundamentally different rendering methods (offline ray tracing vs. real-time rendering), so very few properties will transfer over.
When you've completed your models, make sure to apply your transforms (Ctrl+A) before UV mapping.
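If it helps to see why this matters: applying a transform bakes the object-level scale/rotation/translation into the vertex data itself, so exporters and UV tools see the "real" geometry. The sketch below illustrates the idea in plain Python; the names are made up for illustration and are not Blender's API.

```python
# Sketch: what "apply transforms" (Ctrl+A in Blender) does conceptually.
# The object's transform is baked into the vertex coordinates, and the
# object-level transform is reset to identity.
# (Illustrative only; these names are not Blender's actual API.)

def apply_transform(vertices, scale, translation):
    """Bake a uniform scale + translation into the mesh data."""
    baked = [(x * scale + translation[0],
              y * scale + translation[1],
              z * scale + translation[2]) for (x, y, z) in vertices]
    # After baking, the object-level transform is identity again.
    identity = {"scale": 1.0, "translation": (0.0, 0.0, 0.0)}
    return baked, identity

verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
baked, xform = apply_transform(verts, scale=2.0, translation=(5.0, 0.0, 0.0))
print(baked)  # [(5.0, 0.0, 0.0), (7.0, 0.0, 0.0), (5.0, 2.0, 0.0)]
```

If you skip this step, tools that read the raw mesh data (UV unwrapping, exporters) operate on the un-scaled geometry, which is why textures can come out stretched.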
Before exporting, you will also want to make sure that the origins of your models are sensible. This is easily done by selecting an option from Set Origin in the Tools Panel (Object Mode). For more precision, first set your cursor to a specific vertex, edge, or face in Edit Mode and select Origin to 3D Cursor.
For the hexagonal screens, I found it (much) easier to use the Grasshopper plugin for Rhino, which is a parametric modeler that allows you to generate geometries through various inputs. Below, notice how by simply defining a boundary curve, then dragging a few number sliders, Grasshopper creates the exact object that I need. I've uploaded a copy of the script here.
Once generated, the hexagon screens can be exported as .fbx or .3dm files and brought into Blender.
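To give a flavour of what parametric modeling means here: a few "number slider" inputs drive a rule that generates the whole geometry. The tiny sketch below shows the idea in Python by computing staggered hexagon-grid centres from slider-style parameters; it's only an analogy, not a translation of the actual Grasshopper script (which works from a boundary curve).

```python
import math

# Parametric modeling in miniature: slider-style inputs (cols, rows,
# radius) drive a rule that generates geometry -- here, the centres of
# flat-top hexagons packed in a staggered grid.
# (An analogy for the Grasshopper workflow, not the actual script.)

def hex_grid_centers(cols, rows, radius):
    """Return (x, y) centres for a staggered grid of flat-top hexagons."""
    dx = 1.5 * radius            # horizontal spacing between columns
    dy = math.sqrt(3) * radius   # vertical spacing between rows
    centers = []
    for c in range(cols):
        for r in range(rows):
            # Odd columns are shifted up by half a row to interlock.
            y_offset = dy / 2 if c % 2 else 0.0
            centers.append((c * dx, r * dy + y_offset))
    return centers

print(len(hex_grid_centers(4, 3, 1.0)))  # 12
```

Change a "slider" value and the whole grid regenerates, which is exactly the convenience Grasshopper provides for geometry like the hexagonal screens.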
If you want any semblance of realism, you'll have to add textures to most of your objects. Some excellent sources of free textures are Poliigon, FreePBR, and CC0 Textures.
Each textured object will need to be UV mapped. UV coordinates define where on the surface of a 3D mesh a two-dimensional texture is projected. The process of mapping can be rather tedious: you'll have to either manually Mark Seams and Unwrap each object or, if you're extra lazy, use Smart UV Project and tweak the UV map in the UV/Image Editor.
This Smart UV Projected map required rotating and adjusting X & Y texture scaling to achieve a usable result.
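To make the UV idea concrete: a (u, v) pair in the 0-to-1 range picks out a position in the 2D texture. The minimal sketch below does a nearest-texel lookup; real engines filter and interpolate, but the mapping concept is the same.

```python
# Minimal illustration of UV coordinates: (u, v) in [0, 1] picks out a
# texel in a 2D texture. (Nearest-neighbour lookup for simplicity;
# engines normally interpolate between texels.)

def sample_texture(texture, u, v):
    """texture: 2D list of texel values, indexed [row][col]."""
    height, width = len(texture), len(texture[0])
    # Clamp to [0, 1], then scale into integer texel indices.
    col = min(int(max(0.0, min(1.0, u)) * width), width - 1)
    row = min(int(max(0.0, min(1.0, v)) * height), height - 1)
    return texture[row][col]

checker = [[0, 1], [1, 0]]                # a 2x2 checkerboard "texture"
print(sample_texture(checker, 0.1, 0.1))  # 0 (top-left texel)
print(sample_texture(checker, 0.9, 0.1))  # 1 (top-right texel)
```

Every vertex in your mesh stores one of these (u, v) pairs; unwrapping is just the process of assigning them sensibly.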
Sometimes you will encounter the need to tweak a texture or create an entirely new one. Substance Painter is the uber-application for this, but in most cases, Photoshop will do the trick. Hit Export UV Layout from the UVs menu in the UV/Image Editor, bring it into a Photoshop layer for guidance, and texture away.
One super handy thing about Photoshop is its ability to automagically create normal maps from images. Simply go to Filter > 3D > Generate Normal Map.
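Under the hood, this kind of filter treats the image as a heightfield and derives a normal from the local intensity gradient. The sketch below shows the general finite-difference technique; it's not Photoshop's exact filter, just the standard math behind it.

```python
import math

# Sketch of deriving a normal map from a greyscale height image:
# finite differences give the slope at each pixel, and the normal is
# the normalised vector (-dh/dx, -dh/dy, 1).
# (The general technique, not Photoshop's exact implementation.)

def normal_from_height(height, x, y, strength=1.0):
    h, w = len(height), len(height[0])
    # Central differences, clamped at the image border.
    dhdx = (height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]) / 2.0
    dhdy = (height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]) / 2.0
    nx = 0.0 - dhdx * strength
    ny = 0.0 - dhdy * strength
    nz = 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

flat = [[0.5] * 4 for _ in range(4)]
print(normal_from_height(flat, 1, 1))  # (0.0, 0.0, 1.0): flat surface points straight up
```

A normal-map image then just stores these (x, y, z) vectors remapped into RGB, which is why flat areas come out that characteristic lavender-blue.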
Congrats, you made it! With all the models and UV maps completed, it's now time to bid adieu to Blender and move everything over to Unity. There are three ways of doing this, each with its own pros and cons:
For the sake of brevity, I went with option 2: sticking the entire .blend file in Assets.
Bonjour, Unity! Here you will:
(As far as learning resources go, I've found Brackeys, SpeedTutor, and Unity's official documents to be excellent.)
Once you've followed one of the export options above, you should notice your exports as prefabs in your Project window. Drag them into your Hierarchy window to view them in your scene. Lookin' good!
You might notice that the materials nested under the prefabs are greyed out. This is due to a recent change in how imported materials are treated in Unity. To fix this, simply click on the prefab(s), go to the Inspector, and click Extract Materials (make a new Assets > Materials folder for them). This will create editable copies of the materials in Unity and automatically remap them to the prefabs.
Next: Create an Assets > Textures folder and drag all of the scene's texture files into it. Then go down your extracted materials and tweak their shader parameters to what you want. If they have textures associated with them, simply drag the textures to their appropriate Albedo (the equivalent of Diffuse in Blender), Normal, etc. fields. You'll notice your objects update with textures, UV-mapped and all!
There are a boatload of things that can be done here, but in general, I go through the following steps:
I'd be happy to go into more detail about this. Feel free to drop me an email.
More complex environments will require a way to navigate them. Since my target device was an iPhone + Google Cardboard setup, I opted for a simple raycasting navigation system. Think of it like having a laser pointer glued to the top of your head—just turn your head to point it where you want to go and press the button.
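The core of that "laser pointer" is a ray-plane intersection: cast a ray from the head along the view direction and find where it meets the floor. In Unity you'd use Physics.Raycast for this; the Python sketch below is just the underlying math.

```python
# Gaze-based navigation boils down to a ray-plane intersection: where
# does the head's forward ray hit the floor plane (y = floor_y)?
# Unity's Physics.Raycast handles this for you; this is just the math.

def gaze_hit_floor(origin, direction, floor_y=0.0):
    """Return the (x, y, z) point where the gaze ray meets the floor,
    or None if the ray points level or upward."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0:               # looking level or up: the ray never lands
        return None
    t = (floor_y - oy) / dy   # ray parameter at the intersection
    return (ox + t * dx, floor_y, oz + t * dz)

# Standing with head height 1.7 m, looking forward and 45 degrees down:
print(gaze_hit_floor((0.0, 1.7, 0.0), (0.0, -1.0, 1.0)))  # (0.0, 0.0, 1.7)
```

That hit point becomes the teleport/walk target when the Cardboard button is pressed.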
For this scene, I used a few scripts provided by Fred Moreau, who has a great Unity VR course on Udemy.
It's very simple to set up:
Next, select the floors you want to walk on, go to the Navigation panel, and mark them as Navigation Static. Make sure they're set as Walkable, then hit Bake to generate the NavMesh.
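In case baking feels like magic: all it produces is a precomputed map of which floor areas are walkable, so movement can be constrained to them at runtime. The toy sketch below stands in a grid for the NavMesh and uses a breadth-first search to check reachability between two spots; Unity's actual NavMesh is a triangle mesh, not a grid, so this is only the concept.

```python
from collections import deque

# What "baking" navigation buys you, in miniature: a precomputed map of
# walkable floor (1 = walkable, 0 = not), queried at runtime.
# (Unity's real NavMesh is a triangle mesh; a grid is just simpler.)

def reachable(grid, start, goal):
    """Breadth-first search over walkable cells, 4-connected."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            return True
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 1 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

floor = [[1, 1, 0],
         [0, 1, 0],
         [0, 1, 1]]
print(reachable(floor, (0, 0), (2, 2)))  # True
```

This is why only surfaces marked Navigation Static and Walkable end up reachable in the headset: anything else simply isn't in the baked data.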
Lastly, go to Edit > Project Settings > Editor > Unity Remote and select Any iOS Device. Install Unity Remote on your iPhone, plug it in, hit Play, and you should be good to go.
You are SO CLOSE! Mmmmm the hearty smell of finish line. Once you're happy with your scene, go to File > Build Settings in Unity:
Apply iPhone to headset. Apply headset to face...
And enjoy your new VR world!
Thanks so much for reading my guide—I hope you found something useful! If you'd like clarification about anything or would like to suggest improvements, I'd love to hear from you.