PROJECT STARDUST – X-Wing VR

I spent my Fall 2018 semester at the University of Utah developing a VR experience for a Virtual Reality Sickness (VRS) study, which turned into my first solo-developed video game. Project Stardust is a rudimentary study of the correlation between the number of reference points in a virtual environment and VRS symptoms. The project was completed by a two-man team: while I handled the entirety of the development work, the research and design portion was a partnered effort mostly driven by Daxton Wilson.

For those of you who want the TLDR, you can download the latest version of the experience HERE. Also, here is a video of the first prototype (version 0.4):

 

The Reason

The hypothesis driving this project was simple: increasing the number of reference points in a virtual environment, combined with a player's high rate of speed relative to them, would increase the user's VRS symptoms. To test this supposition we needed an effectively infinite environment to which we could incrementally add reference points while periodically measuring a player's VRS symptoms throughout a high-speed experience. The environment also had to constantly engage the user in activity and be convincing enough to immerse them, so that the VRS symptom measurements would be more indicative. Since Daxton and I are both BIG Star Wars nerds, a VR space battle seemed like the only logical option. Conveniently, objects in such an environment can be scaled and added incrementally without feeling unnatural, which is very useful for the study: we could start with a very small number of insignificant reference points, such as distant starlight, and scale up to larger objects such as planets and the Death Star… you get the picture.

The Design

 

Daxton and I agreed that with our time constraint of around seven weeks we had to borrow and re-imagine a lot of the elements in our project, adding our own creative elements only if time permitted. We decided on a plan to re-imagine an old arcade game: ATARI's Star Wars, designed by Mike Hally. Here is a gameplay video of the original arcade game we sought to recreate:

 

I played this game at an arcade near my house as a kid and distinctly remember the smell of the interior of the cabinet's cockpit. You would climb inside the blue cabinet with Darth Vader painted down the side, and a very distinctive dual-handed joystick steered your X-Wing.

 

The original game consisted of three stages, each incrementally adding reference points to the player's environment, which made it well suited to port for our experiment.

Departures From the Original

 

While trying to stay true to the original feel of the game, we made design decisions that alter some aspects of the gameplay and environment to better fit the requirements of our project and the VR design space.

The design decision our users seemed most aware of is head-tracked aiming with the HMD, which lets the player control the aiming reticle through head movements and fire in whatever direction they look (a minimal sketch of the logic follows the list below). This turned out to be the most controversial design decision, and users have argued against it since it 1) made the game significantly easier than a fixed-reticle design and 2) went against Star Wars lore, allowing an X-Wing to fire in directions other than directly in front of the craft. I made the executive decision to include this as the only aiming mode in the prototype since it accomplished some very important things for our study:

  • HMD aiming caused a significant uptick in the amount of visual interaction with reference points in the virtual environment
  • Looking up, down, or to the side while in motion changes the perceived rate of speed of reference points, which can again be tied to VRS symptoms
  • It allowed users to easily clear all three levels within a timed research-study session
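
The logic itself reduces to a ray from the headset: place the reticle along the HMD's forward vector and make the lasers converge on it. Here is a minimal sketch in Python (the real implementation lived in Unity; the function names and the reticle distance are illustrative, not the project's values):

```python
import numpy as np

# Hypothetical fixed distance at which the reticle floats in front of the pilot.
RETICLE_DISTANCE = 200.0

def update_reticle(head_position: np.ndarray, head_forward: np.ndarray) -> np.ndarray:
    """Project the aiming reticle along the HMD's forward vector each frame."""
    forward = head_forward / np.linalg.norm(head_forward)
    return head_position + forward * RETICLE_DISTANCE

def fire_direction(muzzle_position: np.ndarray, reticle_position: np.ndarray) -> np.ndarray:
    """Lasers converge on the reticle rather than firing straight ahead,
    which is the lore-breaking part users objected to."""
    direction = reticle_position - muzzle_position
    return direction / np.linalg.norm(direction)
```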

Ideally this would have been the only departure from the original arcade game, but it soon became clear that a predefined-track movement system, where the user controls nothing besides aiming, was not an option. The experience needed full control over pitch, yaw, and roll in order to feel like a realistic space flight simulator. Putting the player on a predefined track would also have induced vection, a sensory illusion of self-movement that can cause VR sickness when the user is not in control of the motion.

Since the player was granted full three-dimensional freedom, there were points while testing builds when I found it necessary to see behind me, because the back of the cockpit limited the player's field of vision. My rudimentary AI TIE fighters commonly got behind the player, so I included a rear-facing camera displayed on a cockpit screen. While not strictly tied to the VRS study, I also wanted cockpit interaction that would let the user control aspects of the X-Wing and HUD.

Our vision was to immerse the user in the era, lore, and environment, and no trench run would be complete without the actual music, dialogue, and sounds from the original movie. This was another controversial decision since it raises the issue of copyright infringement. I claimed fair use for the exhibition of the music and dialogue since it is not only a creative rendition but is also being used for research.

Phase 1

Approach

 


Stage 1 (Approach) – Star Wars – ATARI, 1983

In the first phase of the game the user is in deep space on approach to the Death Star. Along the way, the player and their allies are intercepted by waves of enemy TIE fighters which must be destroyed or survived to proceed to the next stage. The player has a shield indicator, allowing them to absorb multiple laser blasts before being blown to smithereens. I also found it very satisfying that enemies explode and send debris hurtling outward when destroyed. The environment contains few reference points while the user experiences high speeds, testing user susceptibility to VRS-related symptoms in an environment with a single point of reference: the Death Star.

Thanks to meshes I found on scifi3d.com and my own modeling work, I was able to complete the player X-Wing and enemy TIE fighters in the first few weeks and start testing. I used Blender, a 3D modeling program, to import assets found on the internet, modify the existing meshes (mostly decimating to reduce poly count), and add my own components such as individually labeled buttons and lighted areas. I would then create a UV map from the newly created mesh and import it into Substance Painter (the 30-day free Student Edition was a godsend) to texture with PBR smart materials, which did a lot of the artistic heavy lifting. The University of Utah VR lab had four gaming laptops running in tandem to render separate portions of the 4K and downscaled textures. Finding the right process to export from one freeware program to the next was a neck-breaking process that took far longer than it needed to; in retrospect I would rather have paid for something like a 3ds Max license than struggle through some of the Blender issues encountered at the beginning of the project.
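
As an aside, the decimation step itself can be scripted instead of clicked through. Here is a minimal sketch using Blender's Python API (bpy), assuming the imported mesh is the active object; the 0.3 ratio is illustrative, not the value I used:

```python
import bpy

# Assumes the downloaded mesh is the active object in the scene.
obj = bpy.context.active_object

# Add a Decimate modifier to collapse the poly count; 0.3 keeps roughly 30% of faces.
mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
mod.ratio = 0.3

# Apply the modifier so the reduced mesh is what gets UV-unwrapped and exported.
bpy.ops.object.modifier_apply(modifier=mod.name)

# Smart UV Project gives a quick-and-dirty UV map to take into Substance Painter.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.smart_project()
bpy.ops.object.mode_set(mode='OBJECT')
```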

With the art somewhat satisfactory, I set about the task of coding the flight controls. This. Was. My. Bane. I had taken courses on statics and kinematics, but replicating a nice-feeling spaceflight control scheme turned out to be a really difficult task. I first tried straight zero-gravity rules: no drag, no gravity, nothing to hinder angular velocity. I placed the thrust vector near the back of the X-Wing, centered about the engine housing, and let the user squeeze the Oculus controller grip to ramp the acceleration. The results were a bit disastrous, leaving both me and my partner green-faced as we tumbled end over end into the endless void.
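
For the curious, that first scheme amounted to roughly the following sketch (a toy body with scalar inertia standing in for Unity's rigidbody; the names are mine). Nothing ever damps velocity or spin, which is exactly why we tumbled:

```python
import numpy as np

class ShipBody:
    """Toy rigid body: scalar inertia stands in for Unity's inertia tensor."""
    def __init__(self, mass=1.0, inertia=1.0):
        self.mass, self.inertia = mass, inertia
        self.velocity = np.zeros(3)
        self.angular_velocity = np.zeros(3)

def apply_thrust(body, thrust_dir, grip, max_thrust, offset, dt):
    """Pure zero-g rules: no drag, no gravity, nothing to hinder angular
    velocity. Squeezing the grip ramps the thrust; because the thrust is
    applied at an offset behind the center of mass, any misalignment also
    adds torque that nothing ever cancels -- hence the end-over-end tumbling."""
    force = thrust_dir * grip * max_thrust
    body.velocity += (force / body.mass) * dt
    body.angular_velocity += np.cross(offset, force) / body.inertia * dt
```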

Pretty soon I realized that the flight mechanics needed to borrow some of the same principles as airborne flight; at least, that is what a lot of newcomers expected when they were first introduced to the game. People would often roll right or left and expect the vehicle to bank into a right- or left-hand turn. If pitching up or down was not accompanied by acceleration, the preceding velocity would still carry the craft, though this was not apparent without a point of reference as prominent as the surface in Phase 2.
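
What I moved toward looks roughly like this sketch: roll feeds a yaw rate so the craft banks into the turn newcomers expected, and damping bleeds off leftover spin. The constants are illustrative tuning values, not the ones that shipped:

```python
import numpy as np

TURN_RATE = 1.2         # how strongly a bank converts into a turn (tuned by feel)
ANGULAR_DAMPING = 0.5   # per-second decay that settles rotation like air resistance
REFERENCE_SPEED = 50.0  # illustrative speed at which turns reach full authority

def bank_to_turn(roll_angle, speed, angular_velocity, dt):
    """Aircraft-style coupling: rolling left or right feeds a yaw rate,
    so the craft banks and follows a left- or right-hand turn."""
    yaw_rate = -np.sin(roll_angle) * TURN_RATE * min(speed / REFERENCE_SPEED, 1.0)
    angular_velocity[1] += yaw_rate * dt         # roll feeds into yaw
    angular_velocity *= ANGULAR_DAMPING ** dt    # exponential damping tames tumbling
    return angular_velocity
```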

There were too many design hurdles to describe exhaustively here, but I will say that my favorite implementation was the destruction mechanism. I took the meshes created in the earlier steps and applied an algorithmic 'slicing' in Blender to section each mesh into exploded fragments. These re-textured fragments would replace the live TIE fighters, Y-Wings, X-Wings, and turrets in the event of death, and a random force vector would be applied to each fragment, hurtling them in separate directions. Thanks to some free particle effect packs found on Unity's Asset Store, I was able to assemble some convincing enough explosion animations. Voilà, my first ever video game has explosions and debris!
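
The swap itself is simple. A minimal sketch, with a toy Fragment type standing in for the pre-sliced Unity rigidbodies:

```python
import random
import numpy as np
from dataclasses import dataclass, field

@dataclass
class Fragment:
    """Stand-in for one pre-sliced, re-textured chunk of a fighter."""
    position: np.ndarray = field(default_factory=lambda: np.zeros(3))
    velocity: np.ndarray = field(default_factory=lambda: np.zeros(3))

def random_unit_vector() -> np.ndarray:
    v = np.random.normal(size=3)   # Gaussian samples give a uniform direction
    return v / np.linalg.norm(v)

def explode(craft_position: np.ndarray, fragments: list[Fragment],
            impulse=(2.0, 8.0)) -> None:
    """Swap the live fighter for its fragments and hurl each one along its
    own random force vector, as in the in-game destruction."""
    for frag in fragments:
        frag.position = craft_position.copy()
        frag.velocity = random_unit_vector() * random.uniform(*impulse)
```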

 

Phase 2

Surface


Stage 2 (Surface) – Star Wars – ATARI, 1983

The second stage of the game requires the user to destroy towers and incoming fighters while skimming along the surface of the Death Star. The environment contains a significant number of reference points while the user experiences high speeds, testing user susceptibility to VRS-related symptoms in an environment with a planar point of reference: the surface of the Death Star.

The enemy AI in this level posed some interesting design challenges. The turret models I created had horizontal freedom in their heads and vertical freedom in their barrels. Getting the barrel to point at a desirable firing point around the player's craft was not the issue; hitting a player moving at high velocity was. In a way, our solution cheated a bit. We parented empty GameObjects to the player's craft and placed them on random lead trajectories in front of the player. When a turret's firing cooldown expired and it had acquired an active target, it would fire a laser toward one of the randomly chosen leading GameObjects. This kept the player from being hit at an unrealistic rate by some path-prediction algorithm, and it creates the chance of grazing fire passing through the player's viewport.
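
A minimal sketch of that lead-target trick, with the empty GameObjects reduced to offset points riding in the player's local frame (the offsets here are illustrative):

```python
import random
import numpy as np

# Hypothetical lead offsets in the player's local frame: a cluster of points
# ahead of the craft that the turrets treat as aim targets. Because they are
# "parented" to the player, they automatically lead a fast-moving target.
LEAD_OFFSETS = [np.array([x, y, z]) for x in (-4.0, 0.0, 4.0)
                                    for y in (-2.0, 2.0)
                                    for z in (25.0, 40.0)]

def pick_fire_target(player_pos, player_forward, player_right, player_up):
    """Pick one of the lead points at random when the cooldown expires.
    A good draw leads the player plausibly; a bad draw becomes grazing fire
    instead of an inhumanly precise path-predicted hit."""
    offset = random.choice(LEAD_OFFSETS)
    return (player_pos
            + player_right * offset[0]
            + player_up * offset[1]
            + player_forward * offset[2])
```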

This level could never have realistically been finished without the Plating Generator and Greebles Blender plugins by Mark Kingsnorth. They allowed me to quickly create cold metal plateaus covered in raised grid patterns and mini-trenches filled with electronic greebles. My artistic skills need some improvement, so I cheated again by using Substance Painter's smart materials to get something that at least somewhat resembles the aesthetic of a large battle station's surface.

The flight controls were very noticeably off in the first few iterations: too slow, too fast, not responsive enough, too responsive, and so on, and adjusting the parameters was a tedious process. It made me wonder how large game companies establish thresholds and cutoff values for configurable flight-control settings. In the end, I am not entirely happy with the current state of the flight controls; they could still use massive polishing to feel more like airborne flight.

 

Phase 3

Trench Run

“Cover me, Porkins!” The climax of the experience comes in the final level, when the player must navigate the treacherous trench obstacles, towers, and TIE fighters. The environment contains the maximum number of reference points while the user experiences high speeds, testing user susceptibility to VRS-related symptoms in an environment with encompassing reference points: the Death Star trench and its obstacles.

This was by far the most rewarding and challenging level to implement. It required a combination of procedural obstacle placement and an infinitely moving track to give the appearance that the user is rocketing down a trench. One of the main constraints was Unity's floating-point number representation: since the scale was set to 1 from the beginning of development, translating at a high rate of speed raises the question of what happens when the user gets too far from the origin. Is there an end to a Unity level?

 

I found out pretty quickly that the level does not in fact end; instead, float arithmetic starts doing wildly unpredictable things far from the origin. Physics breaks down, and the symptoms show up as a wildly shaking cockpit and tearing artifacts. To be honest, it kind of felt like there would be an inflection point where physics would fundamentally shatter and a tear in spacetime would suck me and my consciousness into an eternal crushing darkness. But alas, we made a design decision to stay as close to the origin as possible. In fact, we fixed the player at the origin, allowing only roll and yaw controls. This meant that the trench and all surrounding terrain had to translate around the player to produce the illusion of movement. It also allowed track to be spawned ahead of, and deleted behind, the player's view frustum.
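
A minimal sketch of both the failure mode and the fix, with positions as plain numpy arrays where the real version moved Unity transforms:

```python
import numpy as np

# Why the cockpit shakes: at 100 km from the origin a 32-bit float cannot even
# represent a millimetre nudge, so positions snap between coarse steps.
far = np.float32(100_000.0)
print(far + np.float32(0.001) == far)          # True -- the nudge vanishes entirely

def scroll_world(world_positions, player_velocity, dt):
    """Floating-origin trick: the player stays pinned at (0, 0, 0) and the
    trench translates past them instead. The apparent motion is identical,
    but every coordinate stays small enough for floats to remain precise."""
    for pos in world_positions:
        pos -= player_velocity * dt            # world moves; player doesn't
```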

Colliders attached to the terrain tracked the player's progress along the trench, allowing objects to be queued and spawned ahead of the player. The obstacles required a heuristic to disallow adjacent placements that would block passage, so they were broken into a dozen groups of non-adjacent placements, and the spawner would randomly select from them whenever a preceding collider was triggered.
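
A sketch of that heuristic, with the prefab groups reduced to lane indices (the groupings here are illustrative; the real spawner had about a dozen):

```python
import random

# Hand-vetted placement groups: within a group no two obstacles are adjacent,
# so every group leaves at least one open path through the trench.
OBSTACLE_GROUPS = [
    {0, 1},   # left wall + low bar  -> right side stays open
    {3, 4},   # right-side towers    -> left side stays open
    {1, 3},   # staggered pair       -> weave through the centre
]

def on_progress_trigger():
    """Called when the player's craft passes a progress collider: pick one
    guaranteed-passable group at random to spawn ahead of the view frustum."""
    return random.choice(OBSTACLE_GROUPS)
```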

Some visual effects had to be left out to accommodate the tight deadline, including but not limited to the Death Star explosion. Really, once we had translation and control working in all three phases we could have stopped there and started user testing. But my inner nerd couldn't handle a half-finished project, so I spent an extra week and a half mixing sound, quotes, and music into the scenes to complete the immersion.

Conclusion

For my first full-fledged video game, I would say it turned out better than I expected. However, I severely underestimated the amount of work entailed in bringing a virtual reality experience to fruition. I don't believe the process would have been much different had I been working without virtual reality compatibility, and I may get around to porting a non-VR version in the near future.

We gathered a lot of data from users in the form of verbal surveys and online screening surveys describing the virtual reality sickness (VRS) symptoms they experienced in each stage on a scale of 1 to 5, with 1 denoting no symptoms experienced and 5 denoting symptoms experienced greatly. The initial population was made up of 122 males and 134 females, for a total of 256 subjects tested. An online version of the verbal survey was made available for further study after the project's completion, adding another 40 males and 12 females for a total of 308 subjects.

The results showed increased vertigo and nausea in Phase 2 (Death Star Surface), which many users attributed to the flight controls not responding as they expected. Surprisingly, Phase 3 (Trench Run) produced the weakest VRS symptom response, even with its encompassing environment and many reference points. This showed our hypothesis to be not just incorrect but backwards: the encompassing environment seemed to reduce VRS symptoms compared to the other stages.

Our study is ongoing, and the experience itself could still use some work and possibly a few more features 🙂 But hopefully we were able to give a nod to an old classic in a new format (and hopefully without attracting a lawsuit in the process).