A collection of works // MTEC 3230 Spring 2021

Author: Gregor Erdman

In Order Update #6

For the final steps of production, I attempted to create a starting UI that could be accessed through user input. I was unsuccessful. I had an OVR Manager in my scene and made the calls the documentation describes, but OVRInput refused to output any information, making it impossible to build the UI as originally planned. Instead, I set up an animated camera in the scene and destroyed it after 25 seconds using Invoke(). This let me create a pan down to an in-world Raw Image, which then transitions to the main scene: destroying the Canvas and camera objects causes the view to reset to the OVR Player Controller. I also had to script the audio sources to switch places so that I could transition from one sound (birds chirping) to the other (the music for the game portion).
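For reference, here is a minimal sketch of how that intro hand-off might be scripted. The class and field names (IntroSequence, introCamera, introCanvas, ambientAudio, gameplayAudio) are placeholders I've made up; only Invoke(), the 25-second delay, and the destroy-then-fall-back-to-the-OVR-camera idea come from the post.

```csharp
using UnityEngine;

// Sketch of the intro hand-off: after a fixed delay, tear down the animated
// intro camera and Canvas so the view falls back to the OVR Player
// Controller's camera, and swap the ambient audio for the gameplay music.
public class IntroSequence : MonoBehaviour
{
    [SerializeField] private GameObject introCamera;     // animated pan-down camera
    [SerializeField] private GameObject introCanvas;     // holds the in-world Raw Image
    [SerializeField] private AudioSource ambientAudio;   // birds chirping
    [SerializeField] private AudioSource gameplayAudio;  // music for the game portion

    private void Start()
    {
        // End the intro 25 seconds after the scene loads.
        Invoke(nameof(EndIntro), 25f);
    }

    private void EndIntro()
    {
        // With the intro camera gone, Unity renders from the remaining
        // camera on the OVR Player Controller.
        Destroy(introCanvas);
        Destroy(introCamera);

        ambientAudio.Stop();
        gameplayAudio.Play();
    }
}
```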

Next, and last before building, I had to solve an issue where sound effects kept playing every time the player or an object got near them, producing overlapping sounds that didn't behave as expected. To solve this I set up a boolean called hasPlayed that flips at the same time the snapping mechanic fires, and then destroyed the sound one second later, so it plays once and never again. It's an imperfect solution, but it suits the current gameplay.
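Here is a minimal sketch of that play-once guard, assuming the snapping code calls into it at the moment the object locks in; the names (SnapSound, PlayOnce, snapAudio) are illustrative, not the project's actual script.

```csharp
using UnityEngine;

// Play-once guard: the click fires when the snap happens, then the
// AudioSource is destroyed a second later so re-entering the trigger can
// never replay it.
public class SnapSound : MonoBehaviour
{
    [SerializeField] private AudioSource snapAudio;
    private bool hasPlayed;

    // Called from the snapping mechanic at the moment the object locks in.
    public void PlayOnce()
    {
        if (hasPlayed) return;
        hasPlayed = true;

        snapAudio.Play();
        Destroy(snapAudio, 1f); // remove the source one second later
    }
}
```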

Finally I built the game and tested it. It worked! I then pushed it to git and called the project complete.

In Order Update #5

In this update I added terrain outside the window to make the work more visually appealing. I also added the books David made to the shelf and made them snap into place in the same manner, so the player can add books to the shelf after setting it up. I buried the books under the other objects so that the player is more likely to grab the shelf first; otherwise you get the awkward sight of books floating in midair. Finally, I added the music David found using an audio source set to play on awake and loop, and scripted in a click sound David found so that it triggers at the same time as the snapping mechanic.
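The music itself only needs an AudioSource with Play On Awake and Loop checked in the inspector, which is how it was set up here; the snippet below just shows the same settings applied in code, for reference.

```csharp
using UnityEngine;

// Equivalent of ticking Play On Awake and Loop on the music's AudioSource.
[RequireComponent(typeof(AudioSource))]
public class BackgroundMusic : MonoBehaviour
{
    private void Awake()
    {
        AudioSource music = GetComponent<AudioSource>();
        music.loop = true;        // keep the track repeating for the game portion
        music.playOnAwake = true;
        if (!music.isPlaying) music.Play(); // start as soon as the scene loads
    }
}
```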

The next steps are to apply sound to all objects, update the terrain, and replace the placeholder boxes with the hand prefabs packaged with the Oculus Integration Kit. After that, we only need a UI and an introductory menu to be largely done with the proof of concept.

In Order Update #4

The game is now set up with the core mechanic integrated into the environment. The player grabs the furniture piled up on the bed and places it around the room; once a piece enters the trigger zone and the grab ends, the object snaps into place. I struggled a bit with getting the rotation right, since setting a specific orientation requires a Quaternion, but I eventually found Quaternion.Euler() and was able to control the rotation of each object.
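Here is a rough sketch of the snap, assuming the script lives on the grabbed furniture piece and the target pose is typed in by hand. The OVRGrabbable.isGrabbed check is one plausible way to detect the grab ending, not necessarily the exact approach used in the project.

```csharp
using UnityEngine;

// Snaps this object to a fixed pose once the player releases it inside the
// trigger zone. (The project also matches the trigger's tag to the object's
// name; that check is shown in the next update's sketch.)
public class SnapToPlace : MonoBehaviour
{
    [SerializeField] private Vector3 snapPosition;     // target position in the room
    [SerializeField] private Vector3 snapEulerAngles;  // target rotation in degrees

    private OVRGrabbable grabbable;

    private void Awake()
    {
        grabbable = GetComponent<OVRGrabbable>();
    }

    private void OnTriggerStay(Collider other)
    {
        // Only snap once the player has released the object inside the zone.
        if (grabbable != null && grabbable.isGrabbed) return;

        transform.position = snapPosition;
        // Quaternion.Euler converts plain degree angles into the Quaternion
        // that Transform.rotation expects.
        transform.rotation = Quaternion.Euler(snapEulerAngles);
    }
}
```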

In the Unity Editor, I attached the script to each object and then hooked a light and a particle system up to it. I also gave an empty object with a box collider a tag matching the name of the object it would be colliding with, e.g. “desk” or “shelf”. With everything in place, I simply needed to repeat this for every item in the space. The most tedious part was entering all the item location coordinates by hand, but as I said in a previous post, I’m not sure of a way around it.
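For illustration, the inspector-facing side of such a script might expose the light, particle system, and tag like this. What the light and particle system actually do when an object snaps isn't described above, so the behavior in OnSnapped() is only a guess, and all the names are made up.

```csharp
using UnityEngine;

// Inspector-facing references that get wired up by hand for each item.
public class PlacementEffects : MonoBehaviour
{
    [SerializeField] private Light placementLight;           // light attached to the script
    [SerializeField] private ParticleSystem placementBurst;  // particle system attached to the script
    [SerializeField] private string zoneTag = "shelf";       // tag on the empty box-collider object

    // Returns true when the collider we touched is this item's snap zone.
    public bool IsMyZone(Collider other) => other.CompareTag(zoneTag);

    // Called when the snap fires (see the snap sketch in the previous update).
    public void OnSnapped()
    {
        if (placementLight != null) placementLight.enabled = false; // hypothetical: turn off the hint light
        if (placementBurst != null) placementBurst.Play();          // hypothetical: play a burst on placement
    }
}
```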

The next step is to add more details to make the game more exciting and immersive. I’m not sure yet what that entails, but we’ll see what develops.
