Unity 2020 Virtual Reality Projects

Enabling virtual reality for your platform

The Diorama scene we created in the previous chapter was a 3D scene using the Unity default Main Camera. As we saw, when you pressed Play in the Unity Editor, the scene ran in the Game window on your 2D computer monitor. We will now enable the project and scene so that they run in virtual reality. These first few tasks are similar, regardless of which device you are targeting:

  • Setting the target platform for your project builds
  • Installing the XR plugin for our device
  • Installing the XRI package
  • Creating a VR-enabled XR Rig camera rig

We'll take care of these details now. After doing this, you'll need to set up your Unity project and development system software, depending on your specific target. This will include completing the following tasks, which we'll cover on a case-by-case basis in the rest of this chapter:

  • Ensuring your VR device is connected and you can normally use its home environment.
  • Ensuring your device's development mode is enabled.
  • Installing any system software on your development machine that's required for you to build your target platform.
  • Optionally, importing the device toolkit for your target device into your Unity project and using the provided camera rig instead of the XR Rig.

Now, let's configure the project for your specific VR headset.

As you know, installation and setup details are subject to change. We recommend that you double-check with the current Unity Manual pages and your device's Getting started documentation for the latest instructions.

Setting your target platform

New Unity projects normally default to targeting standalone desktop platforms. If this works for you, you do not need to change anything. Let's take a look at this now:

  1. Open the Build Settings window (File | Build Settings…) and review the Platform list.
  2. Choose your target platform. For example:
    • If you're building for Oculus Rift or HTC Vive, choose PC, Mac & Linux Standalone.
    • If you're building for Windows MR, choose Universal Windows Platform.
    • If you're building for Oculus Quest, or Google Cardboard on Android, choose Android.
    • If you're building for Google Cardboard on iOS, choose iOS.
  3. Then, click Switch Platform.

Unity may need to reimport your assets for a different platform. This will happen automatically and may take a while, depending on the size of your project. The editor will be locked out until the process is completed.
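By the way, if you ever want to automate this step (for example, as part of a build script), the Editor API can switch platforms too. The following is a minimal editor-only sketch, with Android used purely as an example; the menu path and class name are mine:

// Editor-only sketch: switch the active build target from a script.
// Place this file in an Editor folder; Android is used here only as an example.
using UnityEditor;

public static class PlatformSwitcher
{
    [MenuItem("Tools/Switch Platform To Android")]
    public static void SwitchToAndroid()
    {
        // Equivalent to selecting Android in Build Settings and clicking Switch Platform.
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.Android, BuildTarget.Android);
    }
}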

If the platform you require is not present in the list, you may need to install the corresponding module. To do this, you need to go back into Unity Hub, as follows. These steps are similar to the ones described in the previous section, Installing the Unity Editor:

  1. Quit the current Unity session (File | Exit).
  2. Open Unity Hub on your computer desktop.
  3. From the left-hand side menu, choose Installs.
  4. Find the version of Unity that you're using in this project.
  5. Use the three-dot menu to choose Add Modules.
  6. Review the list and check the checkbox for each of the modules you require. The module may have additional tools. Be sure to unfold the module name and check any sub-options. In particular, the Android Build Support module has child Android SDK & NDK Tools and OpenJDK sub-modules, which should also be checked.
  7. Press DONE to install the additional modules.
  8. Reopen your Unity project. The added platform should now be included in the Build Settings' Platform list.

Next, we'll enable VR for the project.

Installing XR Plugin Management

When VR is enabled in your Unity project, it renders stereoscopic camera views and runs on a VR headset. Unity 2019.3 and later sports a new XR Plugin Management tool, which you'll use to install and manage VR device SDK plugins. Install it as follows:

  1. Open the Project Settings window (Edit | Project Settings).
  2. Select XR Plugin Management from the left tab menu.
  3. If necessary, click Install XR Plugin Management, as shown in the following screenshot:

Next, you'll need to install the loaders for the specific device(s) you plan to target with this project. We'll go through the details in the device-specific topics later in this chapter. Meanwhile, you may see the following warning:

Then, once one or more of the desired plugins have been installed, each will be added to the list in the left-hand menu, under XR Plug-in Management. In the following screenshot, for example, I've installed the Oculus plugin and you can see that the Oculus settings are now available:

When you install XR Plug-in Management, other dependency packages may also be installed. In fact, the Install XR Plugin Management button we used previously is a shortcut for doing the same through Package Manager (Window | Package Manager). In the following screenshot, you can see the package in Package Manager where I've selected Show Dependencies from the Advanced drop-down menu, so you can see that two additional (hidden) required packages are also installed: XR Legacy Input Helpers and Subsystem Registration:
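If you'd like to confirm from code that a plugin actually loads at runtime, here is a minimal sketch using the XR Plug-in Management API; the script name is mine, and you can attach it to any object in the scene:

// Runtime sketch: log whether an XR loader (for example, the Oculus plugin)
// was initialized by XR Plug-in Management at startup.
using UnityEngine;
using UnityEngine.XR.Management;

public class XRLoaderCheck : MonoBehaviour
{
    void Start()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager != null && manager.activeLoader != null)
            Debug.Log("XR loader initialized: " + manager.activeLoader.name);
        else
            Debug.LogWarning("No XR loader is active - check XR Plug-in Management settings.");
    }
}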

Good. At this point, the project is VR-ready, since we've set the target platform and installed the XR plugins for the target device. If you want to develop a project directly using the native XR SDK, you could stop here. However, if you want to develop a project for one specific platform, you can use an optional device toolkit, such as Oculus Integration (OVR), SteamVR Interaction Toolkit, or Windows Mixed Reality Toolkit (MRTK). In this book, we will focus on the Unity XR Interaction Toolkit. We'll install this next.

Installing the XR Interaction Toolkit

We are going to be using Unity's new XR Interaction Toolkit (XRI) in the projects throughout this book. XRI provides higher-level components for interaction in VR and AR projects, including cross-platform hand controller input for grabbing objects. It also provides components for setting up a VR camera rig that handles stationary and room-scale VR experiences.

The XR Interaction Toolkit can be installed using Package Manager. Follow these steps to install it in your project:

  1. Open Package Manager (Window | Package Manager).
  2. Filter the list to All Packages (use the drop-down list at the top-left of the window).
  3. At the time of writing, XRI is still in preview, so you may also need to select Show Preview Packages from the Advanced dropdown menu.
  4. Type xr interaction in the search area and, with the package selected, press Install.

The XR Interaction Toolkit package is shown in the following screenshot:
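As an aside, packages can also be added from an editor script via the Package Manager API. The following sketch installs the toolkit by its package name, com.unity.xr.interaction.toolkit; while XRI is in preview, you may need to append an explicit preview version to the name. The class name and menu path are mine:

// Editor-only sketch: add the XR Interaction Toolkit package from script.
// Place this file in an Editor folder.
using UnityEditor;
using UnityEditor.PackageManager;
using UnityEditor.PackageManager.Requests;
using UnityEngine;

public static class XRIInstaller
{
    static AddRequest request;

    [MenuItem("Tools/Install XR Interaction Toolkit")]
    public static void Install()
    {
        request = Client.Add("com.unity.xr.interaction.toolkit");
        EditorApplication.update += Progress;
    }

    static void Progress()
    {
        if (!request.IsCompleted) return;
        if (request.Status == StatusCode.Success)
            Debug.Log("Installed: " + request.Result.packageId);
        else
            Debug.LogWarning("Install failed: " + request.Error.message);
        EditorApplication.update -= Progress;
    }
}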

With XRI installed, you now have access to its components through the main menu bar, that is, Component | XR. As shown in the following screenshot of the menu, these include XR Rig, XR Controller, XR Interaction Manager, and XR Simple Interactable:

There are some pre-made game objects you can readily add to your scenes through the main GameObject | XR menu, as shown in the following screenshot:

For example, here, you can find the Room-Scale XR Rig and Stationary XR Rig game objects, both of which use the XR Rig component. We're going to add one of these to our scene next. At this point, you should have set up the target platform and installed an XR Plugin and the XRI Toolkit. Now, we'll replace the default main camera with the XR camera rig in the scene.

Adding the XR camera rig

The XR camera rig places a camera object in the scene that tracks the position and pose of the physical VR HMD. It also includes game objects that track the player's hands. There are two versions you can readily add to your project – one for room-scale VR and one for stationary VR. Room-scale mode enables the player to walk around within a configured play area (for example, Guardian on Oculus or Chaperone on SteamVR). In stationary mode, the player is assumed to be seated or standing in place; they can still move their head to look around the scene, but the player rig is stationary.

Please decide which mode you would like to use in your project. Don't worry if you change your mind; they both use the same XRI components, but with different default settings that you can change as needed later on.

To replace the default Main Camera with a pre-configured XR camera rig of your choice, use the following steps. We'll also adjust its initial position so that it's a few meters back from the center of our ground plane:

  1. From the main menu, select GameObject | XR | Stationary XR Rig or Room-Scale XR Rig.
  2. Position XR Rig at Z=-3 (select XR Rig in the Hierarchy, then in the Inspector set Transform Position Z=-3).

This adds two new objects to the root of your scene hierarchy: XR Interaction Manager and XR Rig. It also removes the default Main Camera from the Hierarchy, as XR Rig has its own Main Camera child object. At this point, if you have a VR device that's supported in Editor Play mode, when you press Play, you can enjoy your scene in VR!
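If you'd rather set this starting position from code (for example, when a scene loads), here's a minimal sketch; it assumes the rig keeps its default name, XR Rig, and the script name is mine:

// Sketch: position the camera rig a few meters back from the scene origin.
// Assumes the rig object keeps its default name, "XR Rig".
using UnityEngine;

public class PlaceRig : MonoBehaviour
{
    void Start()
    {
        GameObject rig = GameObject.Find("XR Rig");
        if (rig != null)
            rig.transform.position = new Vector3(0f, 0f, -3f);  // 3 meters back
        else
            Debug.LogWarning("XR Rig not found in the scene.");
    }
}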

Let's take a moment to explore what this XR Rig is made of.

Exploring the XR Rig objects and components

Using the GameObject | XR menu to add an XR Rig to the scene adds the XR Rig and XR Interaction Manager objects to the scene. Let's take a closer look.

Select the XR Interaction Manager object. In the Inspector window, you can see that it has a corresponding XR Interaction Manager component. XRI requires one XR Interaction Manager component in the scene to manage communication between the interactors and interactables in the scene (we'll go into more depth about these and start using them in Chapter 5, Interacting with Your Hands).
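If you assemble scenes from scripts or prefabs, you can guard against a missing manager with a sketch like the following; the helper class name is mine, while XRInteractionManager is the toolkit's component:

// Sketch: make sure the scene has an XR Interaction Manager, creating one if missing.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class EnsureInteractionManager : MonoBehaviour
{
    void Awake()
    {
        if (FindObjectOfType<XRInteractionManager>() == null)
            new GameObject("XR Interaction Manager").AddComponent<XRInteractionManager>();
    }
}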

The scene now also includes an XR Rig object, which has a child hierarchy of GameObjects. As shown in the following screenshot, it has an XR Rig component, which includes the option to set Tracking Origin Mode, for example, Floor for room-scale or Device for stationary. It also has a Camera Y Offset. This defines the camera's (player's head) default height above the floor and is needed when we're not using Floor as the tracking origin:

Camera Y Offset is pertinent for stationary tracking and is applied to the rig's child Camera Offset transform. In stationary mode, the Y offset sets the child Camera Offset object to this fixed Y position. Conversely, in room-scale mode, the camera Y offset is automatically set to zero and the device's runtime tracking of the HMD determines the eye level. The Camera Offset object is shown in the following screenshot:

The height of an average seated person is about 1.3 meters. The average adult female is about 1.6 meters tall and the average adult male is about 1.75 meters. The actual averages vary by ethnicity/locale (https://en.wikipedia.org/wiki/Average_human_height_by_country).
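If you're curious how the Floor versus Device distinction surfaces at the device level, the following sketch queries the XR input subsystem's tracking origin mode directly; this is lower-level than the XR Rig component and is shown only for illustration:

// Sketch: log the tracking origin mode reported by the XR input subsystem.
// Floor corresponds to room-scale tracking; Device corresponds to stationary tracking.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class TrackingOriginReport : MonoBehaviour
{
    void Start()
    {
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetInstances(subsystems);
        foreach (var subsystem in subsystems)
            Debug.Log("Tracking origin mode: " + subsystem.GetTrackingOriginMode());
    }
}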

The children of Camera Offset are the Main Camera, LeftHand Controller, and RightHand Controller objects. Like any camera in a Unity scene, the XR Rig's Main Camera has a Camera component, with default settings. But it also has a Tracked Pose Driver component configured as a Center Eye - HMD Reference pose source. Tracked Pose Driver, as its name suggests, tracks physical devices in 3D space, and in this case, it'll be tracking the head-mounted display device. The Main Camera's Tracked Pose Driver is shown in the following screenshot:

While we're here, you may want to set the near clipping plane of the camera to 0.01 (the smallest value Unity allows). This option can be found in Camera | Projection | Clipping Planes | Near, in the Inspector window. The camera's clipping planes define the distances from the camera where rendering starts and stops. Any objects in front of the near plane or beyond the far plane are not drawn in the view, and any objects partially crossing a plane are clipped. The default value may be fine for conventional apps, but in VR, you really put your face into the scene and may bring your rendered hands close to your face, so we should set the near clipping plane to be as small as possible.
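Alternatively, the same setting can be applied from a script attached to the rig's Main Camera; here's a minimal sketch (the script name is mine):

// Sketch: set the camera's near clipping plane at startup.
// Attach to the XR Rig's Main Camera object.
using UnityEngine;

public class NearClipSetup : MonoBehaviour
{
    void Start()
    {
        GetComponent<Camera>().nearClipPlane = 0.01f;  // smallest value Unity allows
    }
}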

In the Hierarchy window, you can see siblings of Main Camera, which are the left- and right-hand controller objects. Each of these is a more complex combination of XR components, including the following:

  • XR Controller: Interprets Input System events as XRI interactor position, rotation, and interaction states.
  • XR Ray Interactor: Used for interacting with objects at a distance using raycasts.
  • XR Interactor Line Visual: A helper component that gets line point and hit point information for rendering.
  • Line Renderer: Renders the visual line that's cast from the controller when the user is in VR.

An interactor is an object in the scene that can select or move another object. By default, this XR Rig uses a Ray Interactor, which selects objects from a distance using a raycast in the direction the user is pointing. Another type is the Direct Interactor, which selects objects when the controller is directly touching the target interactable object. An interactor requires an XR Controller component.
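The other half of this relationship is an interactable, a scene object the interactors can act on. We'll cover interactables properly in Chapter 5, but as a quick sketch, the following script makes an existing object (say, a cube in the Diorama scene) grabbable by the rig's interactors using the toolkit's XR Grab Interactable component; the script name is mine:

// Sketch: make the object this script is attached to grabbable by XRI interactors.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class MakeGrabbable : MonoBehaviour
{
    void Start()
    {
        // Interactables need a Collider so interactors can hit them.
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        // XRGrabInteractable also requires a Rigidbody; Unity adds one automatically if missing.
        gameObject.AddComponent<XRGrabInteractable>();
    }
}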

The XR Controller component converts physical device controller input, such as button presses, into interaction events, including Hover, Select, Activate, and UI button presses. You can also specify the model prefab for the hand to show in VR. Details of the XR Controller component are shown in the following screenshot, with the Model option highlighted:

You can also see that, currently, the controller Grip button is used for selection, while the Trigger button is used to activate an object or interact with UI elements.
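For reference, you can also watch these same physical buttons directly through the lower-level UnityEngine.XR input API; the following sketch simply logs grip and trigger presses on the right-hand controller (the script name is mine):

// Sketch: read the grip and trigger buttons of the right-hand controller directly.
using UnityEngine;
using UnityEngine.XR;

public class ButtonLogger : MonoBehaviour
{
    void Update()
    {
        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        if (right.TryGetFeatureValue(CommonUsages.gripButton, out bool grip) && grip)
            Debug.Log("Grip pressed (Select)");

        if (right.TryGetFeatureValue(CommonUsages.triggerButton, out bool trigger) && trigger)
            Debug.Log("Trigger pressed (Activate)");
    }
}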

The XR Interactor Line Visual and Line Renderer components are used in conjunction with Ray Interactor to display laser beam-like lines within VR so that the user knows where they're pointing. You may want to explore the many parameters that control the look of the rays.

Your scene is now set up with a camera rig that enables viewing in virtual reality. Depending on the VR headset you're using (for example, ones tethered to your PC), you may be able to press Play now and review the scene in VR. However, you should make sure you can build and run the project on your physical device too. Let's do that now.