Creating a game that truly connects with the player has been important to us from the start of BurntOut. We want players to come away feeling more prepared to take on the issue of burnout in medical school, and we decided we could evoke a stronger connection in VR, where the player feels they are actually in the world. Since the last blog post, most of our energy has gone into moving development to VR.

Oculus Go

Due to its low price point, we decided the best platform to develop for was the Oculus Go. It delivers an interactive VR experience without requiring a gaming console or a gaming computer. However, it brings its own limitations.

  • The positions of the head and hand are not tracked, only their orientations. This makes the Oculus Go much better suited to experiences played while stationary, which still fits our BurntOut framework.
  • The Oculus Go also can’t handle mesh colliders on objects with any degree of complexity. Even inflating the convex mesh to the point where the collider is essentially a cube results in the Go throwing a flood of errors. This forces us to either use a simpler mesh for collision or build the shape out of primitive colliders (see the sketch after this list).
  • We’re also planning a transition to text-to-speech for dialogue. Unfortunately, the Oculus Go has only one built-in voice, and it sounds very robotic. Unlike other Android devices, there does not seem to be a good way to install additional text-to-speech voices, as the options on the device are fairly limited. We may end up prerecording dialogue if we cannot find a way to ship more voices with our game.
  • The Oculus Go also runs on Android, rather than through your computer, which makes debugging much more complicated. Every time we want to test code on the Go, we have to build and deploy the project, which takes a while. The print statements also come from a compiled build running on the Go, so we lose useful stack information when viewing the log. To simplify development, we do most of our work against the Rift, since both headsets use the same Oculus SDK. This lets us test directly in Unity without waiting for a new build each time, and it leaves us the more helpful console logs.
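As a quick illustration of the collider workaround, here is a minimal sketch that swaps a complex MeshCollider for a primitive BoxCollider sized from the mesh's local bounds. The component name is hypothetical, and a single box is the simplest case; in practice we fit a handful of primitives to the shape.

```csharp
using UnityEngine;

// Replaces a complex MeshCollider with a BoxCollider that matches the
// mesh's local-space bounding box, keeping physics cheap on the Go.
public class SimplifyCollider : MonoBehaviour
{
    private void Awake()
    {
        MeshCollider meshCollider = GetComponent<MeshCollider>();
        if (meshCollider == null || meshCollider.sharedMesh == null)
            return;

        // Mesh.bounds is an axis-aligned box in the mesh's local space,
        // which is the same space BoxCollider.center and .size use.
        Bounds bounds = meshCollider.sharedMesh.bounds;
        Destroy(meshCollider);

        BoxCollider box = gameObject.AddComponent<BoxCollider>();
        box.center = bounds.center;
        box.size = bounds.size;
    }
}
```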

VR Mechanics

With the move to VR, our Visual Novel mechanics are being heavily reworked. Previously, when a character was selected, we put their sprite in the foreground along with their dialogue. With no real “foreground” in VR, we’re changing it so that the character speaks their lines aloud, with the dialogue UI drawn in front of the character. Speaking characters will also be highlighted, so the player can easily see who is talking.
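As a rough sketch of how that dialogue UI might be placed, the component below positions a world-space canvas between the speaking character and the player and turns it to face the player. The names here are illustrative, not our actual implementation.

```csharp
using UnityEngine;

// Positions a world-space dialogue canvas in front of the speaking
// character, rotated so its text reads correctly from the player's view.
public class DialogueBillboard : MonoBehaviour
{
    [SerializeField] private Transform player;               // the camera rig
    [SerializeField] private float offsetTowardPlayer = 0.5f;

    public void ShowFor(Transform speaker)
    {
        Vector3 towardPlayer = (player.position - speaker.position).normalized;
        transform.position = speaker.position + towardPlayer * offsetTowardPlayer;

        // A world-space canvas reads correctly when its forward axis points
        // away from the viewer, so look "through" the canvas from the player.
        transform.rotation = Quaternion.LookRotation(transform.position - player.position);
    }
}
```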

We’ve tried to keep the point-and-click mechanics fairly similar. Rather than use a particle effect, we add an emission to the highlighted character’s sprite to show when they are selected. That took a bit of work to set up: while the default Oculus pointer can collide with objects, it doesn’t trigger the “OnMouse” events. We changed the default Oculus pointer to send the mouse events we would typically expect, which fixed our issues.
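For the highlight itself, here is a minimal sketch of toggling emission on a character's material. It assumes the sprite's material uses a shader with the Standard shader's _EMISSION keyword and _EmissionColor property; the component name and color are placeholders.

```csharp
using UnityEngine;

// Tints a character's sprite with an emissive glow while it is selected.
public class SelectionHighlight : MonoBehaviour
{
    [SerializeField] private Color highlightColor = new Color(0.4f, 0.4f, 0.2f);
    private Material material;

    private void Awake()
    {
        // .material gives this renderer its own material instance, so the
        // glow doesn't leak onto other characters sharing the material.
        material = GetComponent<Renderer>().material;
    }

    public void SetSelected(bool selected)
    {
        if (selected)
        {
            material.EnableKeyword("_EMISSION");
            material.SetColor("_EmissionColor", highlightColor);
        }
        else
        {
            material.SetColor("_EmissionColor", Color.black);
        }
    }
}
```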
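And here is a sketch of the idea behind the pointer change: raycast along the controller's pointer each frame and forward hits as the “OnMouse” messages our existing scripts listen for. This illustrates the approach rather than the actual modification we made to the Oculus pointer; the ray and trigger state are assumed to come from the controller.

```csharp
using UnityEngine;

// Re-creates Unity's "OnMouse" callbacks for a VR pointer by raycasting
// and sending the equivalent messages to whatever the pointer hits.
public class PointerMouseEvents : MonoBehaviour
{
    private GameObject hovered;

    // Call once per frame with the controller's pointer ray and trigger state.
    public void UpdatePointer(Ray pointerRay, bool triggerPressed)
    {
        RaycastHit info;
        GameObject hit = Physics.Raycast(pointerRay, out info)
            ? info.collider.gameObject
            : null;

        if (hit != hovered)
        {
            if (hovered != null)
                hovered.SendMessage("OnMouseExit", SendMessageOptions.DontRequireReceiver);
            if (hit != null)
                hit.SendMessage("OnMouseEnter", SendMessageOptions.DontRequireReceiver);
            hovered = hit;
        }

        if (hovered != null)
        {
            hovered.SendMessage("OnMouseOver", SendMessageOptions.DontRequireReceiver);
            if (triggerPressed)
                hovered.SendMessage("OnMouseDown", SendMessageOptions.DontRequireReceiver);
        }
    }
}
```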

Due to the major shift to VR, much of what was done before has to be redone or thoroughly modified to work in the new version. However, now that we’ve grown accustomed to working in the VR environment, we’ve been making a lot of progress and should be able to catch up to where we were fairly soon.

You can read more about the BurntOut game at its website and follow along with our development on the blogs.