Guide: Making a VR World
virtual reality
architecture
landscape architecture

An A–Z guide to making a VR environment

context /

Guide - 2018

team /

Scott Liang

A Primer

Hello friends! This guide is meant to introduce you to the entire process of making a VR environment.

 

I've seen a lot of (fantastic) tutorials on each individual step, such as 3D modeling in Blender and lighting in Unity, but none about the overall workflow. If you're already familiar with some of the applications, this guide will hopefully shed some light on how they fit together. If you're completely new to the domain, this guide will serve as a handy outline from which you can structure your learning.

To follow along, all you'll need are a computer, an iPhone, and a Google Cardboard-style headset (the kind with a button).

No need for a crazy VR rig

On your laptop, you will need Blender, Unity, and Xcode installed. On your iPhone, go ahead and install Unity Remote. All are completely freeee! I also touch upon Rhinoceros (3mo free trial) and Photoshop (~$10/mo), but these are optional.

The image below gives an overview of the workflow:

Blender is an insanely powerful, open-source 3D graphics software with which you'll build and texture-map your 3D models. It's well-suited for VR environments because it focuses on mesh geometries (which are used by game engines) and provides a UV mapping setup, among other things. Other popular programs for these tasks are Maya and 3D Studio Max, but they're eye-wateringly expensive.

Once you've completed your models and texture maps, you'll take them into Unity, which is a highly popular game development engine. Here you will do just about everything that is not related to making models. It may be helpful to think of Unity as the main program, and Blender as an awesome "modeling companion" for it.

When you're happy with your environment, you'll install your VR app on your iPhone via Xcode, then into your VR world you go!

OK! Let's get started.

Blender: Geometries and Textures

If you're just starting out with Blender, I highly recommend the tutorials provided by Blender Guru and CG Geek. Their beginner series will get you up and running in a matter of days. I've also compiled a list of the most useful Blender shortcuts, which you can find here.

Use Blender for 3D modeling and UV mapping

Making the Models

The first step of the workflow is to create the mesh geometries of your environment. A few key tips for modeling for VR:

  • Build everything to scale, with dimensions that approximate the real world. 1 unit = 1 meter.
  • Try to keep your polygon count low! VR is particularly reliant on high framerates (60–90fps is ideal) and everything is rendered twice, so performance optimization is crucial. You can make up for detail loss via high-quality textures.
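If you want a quick read on how heavy the scene is getting, Blender's Python console can tally it for you. A minimal sketch that should work in both 2.7x and 2.8+ (it counts faces, so treat it as a rough lower bound on triangles):

```python
# Print a face count for every mesh object, plus a scene total.
# Quads and n-gons triangulate at render time, so the true
# triangle count will be a bit higher than this.
import bpy

total = 0
for obj in bpy.context.scene.objects:
    if obj.type == 'MESH':
        faces = len(obj.data.polygons)
        total += faces
        print("{}: {} faces".format(obj.name, faces))
print("Scene total: {} faces".format(total))
```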

When naming the objects, I usually go ahead and create/assign materials to them as well (if you can, reuse the same material for objects with identical appearances). However, don't go overboard with material settings in Blender: Blender and Unity render in fundamentally different ways (offline ray tracing vs. real-time, BRDF-based rasterization), so very few properties will transfer over.
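If you're feeling scripty, sharing one material across a bunch of identical-looking objects is quick to do from the Python console. A tiny sketch, where the material and object names are made-up placeholders:

```python
# Create (or fetch) one shared material and assign it to several
# objects that should look the same. All names are placeholders.
import bpy

mat = bpy.data.materials.get("M_Concrete") or bpy.data.materials.new("M_Concrete")

for name in ("Floor", "Wall_North", "Wall_South"):
    obj = bpy.data.objects.get(name)
    if obj is not None and obj.type == 'MESH':
        obj.data.materials.clear()      # drop any existing slots
        obj.data.materials.append(mat)  # everyone shares M_Concrete
```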

When you've completed your models, make sure to apply your transforms via ctrl+a before UV mapping.

Before exporting, you will also want to make sure that the origins of your models are sensible. This is easily done by selecting an option from Set Origin in the Tools Panel (Object Mode). For more precision, first set your cursor to a specific vertex, edge, or face in Edit Mode and select Origin to 3D Cursor.
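Both of these tidy-up steps can be done in one pass from a script if you have a lot of objects. A rough sketch, assuming Blender 2.8+ and Object Mode (the 2.7x selection calls differ slightly, as noted in the comments):

```python
# Apply location/rotation/scale and set each origin to the geometry
# center for every mesh in the scene. Run while in Object Mode.
import bpy

for obj in bpy.context.scene.objects:
    if obj.type != 'MESH':
        continue
    bpy.ops.object.select_all(action='DESELECT')
    obj.select_set(True)                          # 2.7x: obj.select = True
    bpy.context.view_layer.objects.active = obj   # 2.7x: bpy.context.scene.objects.active = obj
    bpy.ops.object.transform_apply(location=True, rotation=True, scale=True)
    bpy.ops.object.origin_set(type='ORIGIN_GEOMETRY', center='MEDIAN')
```

Swap in type='ORIGIN_CURSOR' if you've placed the 3D cursor where you want a particular origin to go.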

Optional: Parametric Geometries

For the hexagonal screens, I found it (much) easier to use Grasshopper, a parametric modeling plugin for Rhino that generates geometry from various inputs. Below, notice how, by simply defining a boundary curve and then dragging a few number sliders, Grasshopper creates the exact object that I need. I've uploaded a copy of the script here.

A simple script in Grasshopper

Once generated, the hexagon screens can be exported as .fbx or .3dm files and brought into Blender.

UV Mapping

If you want any semblance of realism, you'll have to add textures to most of your objects. Some excellent sources of free textures are Poliigon, Free PBR, and CC0 Textures.

Each textured object will need to be UV mapped. UV coordinates define where on the surface of a 3D mesh a two-dimensional texture is projected. The process of mapping can be rather tedious: you'll either have to manually Mark Seams and Unwrap each object or, if you're extra lazy, use Smart UV Project and tweak the UV map in the UV/Image Editor.

Unwrapping the UV maps

This Smart UV Projected map required rotating and adjusting X & Y texture scaling to achieve a usable result.

Tweaking the UV maps
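By the way, the lazy route can be batched too: if you have a pile of background objects that don't deserve hand-marked seams, a short script can Smart UV Project all of them at once. A rough sketch, assuming Blender 2.8+ and that the objects are selected:

```python
# Run Smart UV Project on every selected mesh object.
# The island margin leaves a little padding between UV islands
# so textures don't bleed across seams.
import bpy

for obj in list(bpy.context.selected_objects):
    if obj.type != 'MESH':
        continue
    bpy.context.view_layer.objects.active = obj   # 2.7x: bpy.context.scene.objects.active = obj
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.uv.smart_project(island_margin=0.02)
    bpy.ops.object.mode_set(mode='OBJECT')
```

You'll still want to eyeball (and probably nudge) the results in the UV/Image Editor, exactly as above.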

Optional: Custom Textures and Creating Normal Maps

Sometimes you will encounter the need to tweak a texture or create an entirely new one. Substance Painter is the uber-application for this, but in most cases, Photoshop will do the trick. Hit Export UV Layout from the UVs menu in the UV/Image Editor, bring it into a Photoshop layer for guidance, and texture away.

One super handy thing about Photoshop is its ability to automagically create normal maps from images. Simply go to Filter > 3D > Generate Normal Map.

You get a normal map! You get a normal map!

Exporting to Unity

Congrats, you made it! With all the models and UV maps completed, it's now time to bid adieu to Blender and move everything over to Unity. There are three ways of doing this, each with its own pros and cons:

  1. Export each object as a .fbx to your Unity project's Assets folder: File > Export > FBX > make sure [x] Selected Objects is checked. Tedious and requires a fresh export each time you update an object's geometry, but safe (a script that takes the tedium out of this is sketched a little further down).
  2. Place the entire .blend file in the Assets folder: Unity reads .blend files directly, so if you make an update in Blender it will automatically update in Unity. Amazeballs! However, you'll have to keep your .blend file very clean and keep track of all the transforms you make in Unity.
  3. Save each object as its own .blend file and move them all into the Assets folder: Not being able to make all the geometries together in a single file is a pain, but this structure may have merit with collaborators.

For the sake of brevity, I went with option 2: sticking the entire .blend file in Assets.
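That said, if option 1 appeals to you and the per-object clicking doesn't, the exports are easy to script. A minimal sketch, assuming Blender 2.8+; the Assets path is a placeholder you'd swap for your own project's:

```python
# Export every selected object as its own .fbx straight into the
# Unity project's Assets folder. ASSETS_DIR is a placeholder path.
import os
import bpy

ASSETS_DIR = "/path/to/MyUnityProject/Assets/Models"

for obj in list(bpy.context.selected_objects):
    bpy.ops.object.select_all(action='DESELECT')
    obj.select_set(True)    # 2.7x: obj.select = True
    bpy.ops.export_scene.fbx(
        filepath=os.path.join(ASSETS_DIR, obj.name + ".fbx"),
        use_selection=True,      # export only the selected object
        apply_unit_scale=True,   # keep the 1 unit = 1 meter convention intact
    )
```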

Unity: Everything Else

Bonjour, Unity! Here you will:

  1. Configure materials
  2. Configure scene
  3. Set up VR navigation scripts (if applicable)
  4. Set up build settings for iOS & XR
  5. Build the VR application
  6. Install with Xcode

(As far as learning resources go, I've found Brackeys, SpeedTutor, and Unity's official documentation to be excellent.)

Configuring Materials

Once you've followed one of the export options above, you should notice your exports as prefabs in your Project window. Drag them into your Hierarchy window to view them in your scene. Lookin' good!

You might notice that the materials nested under the prefabs are greyed out. This is due to a recent change in how imported materials are treated in Unity. To fix this, simply click on the prefab(s), go to the Inspector, and click Extract Materials (make a new Assets > Materials folder for them). This will create editable copies of the materials in Unity and automatically remap them to the prefabs.

Recent Unity builds treat imported materials differently

Next: Create an Assets > Textures folder and drag all of the scene's texture files into it. Then go through your extracted materials and tweak their shader parameters to your liking. If they have textures associated with them, simply drag the textures to their appropriate Albedo (the equivalent of Diffuse in Blender), Normal, etc. fields. You'll notice your objects update with textures, UV-mapped and all!

Configuring the Scene

There are a boatload of things that can be done here, but in general, I go through the following steps:

  1. Change the Color Space to Linear under Project Settings
  2. Make sure the camera has Allow HDR checked
  3. Add a custom Skybox (HDRI Haven, HDR Labs, and the Unity Asset Store are good sources)
  4. Tweak environmental lighting in the Lighting panel
  5. Set up direct lights (note: point lights and realtime lighting are performance-intensive)
  6. Add reflection probes and light probes, if necessary
  7. Add the Post Processing Stack from the Asset Store and tweak parameters
  8. Set the appropriate objects to Lightmap Static / Reflection Probe Static and let 'em bake

I'd be happy to go into more detail about this. Feel free to drop me an email.

The effects of the reflection probe can be seen on the pot

Optional: Setting up Navigation Scripts

More complex environments will require a way to navigate them. Since my target device was an iPhone + Google Cardboard setup, I opted for a simple raycasting navigation system. Think of it like having a laser pointer glued to the top of your head—just turn your head to point it where you want to go and press the button.

For this scene, I used a few scripts provided by Fred Moreau, who has a great Unity VR course on Udemy.

It's very simple to set up:

  1. Create a simple reticle object—anything you like, could just be a tiny glowing cylinder
  2. Create a sphere, name it Agent, and add a Nav Mesh Agent Component to it
  3. Nest the Main Camera inside the Agent
  4. Create an empty object named Navigator, and nest it inside the Main Camera
  5. Create an Assets > Scripts folder and add these scripts to it (thanks Mr. Moreau!)
  6. Then follow the instructions in the image below:

Setting up navigation scripts

Next, select the floors you want to walk on, go to the Navigation window, and mark them as Navigation Static. Make sure they're set as Walkable, then bake the NavMesh.

Baking the navigation meshes

Lastly, go to Edit > Project Settings > Editor > Unity Remote and select Any iOS Device. Install Unity Remote on your iPhone, plug it in, hit Play, and you should be good to go.

Finishing Touches: Build Settings and Installation

You are SO CLOSE! Mmmmm, the hearty smell of the finish line. Once you're happy with your scene, go to File > Build Settings in Unity:

  1. Select iOS as your platform, hit Switch Platform, wait for it to finish, then hit Player Settings:
  2. In Player Settings, go to Other Settings and set Color Space: Linear, Target Device: iPhone Only, and Target minimum iOS Version: 8.0.
  3. Important: Define your Bundle Identifier in a way that will match what you enter in Xcode (see below).
  4. Scroll down to XR Settings, check Virtual Reality Supported [x], and select the Cardboard SDK.
  5. Create a new Builds folder in your Unity project's root folder, click Build, and save it there.
  6. Find the new build on your Mac and double-click the .xcworkspace file.
  7. In Xcode, match the Bundle Identifier with what you defined in Unity, select your Team to sign with...
  8. Plug in your iPhone, select it from the device drop-down, and hit Run.
  9. ERMAHGERD!!

Match the Bundle Identifiers in Unity and Xcode

Apply iPhone to headset. Apply headset to face...

And enjoy your new VR world!

 

Finishing Points

Thanks so much for reading my guide—I hope you found something useful! If you'd like clarification about anything or would like to suggest improvements, I'd love to hear from you.