Ekemini Nkanta

Artist. Developer. Emerging Media Technology major.

Instructions for setting up Unity for the Oculus Quest

~~~~ creating the project + importing oculus VR ~~~~

  1. create a new 3D project in Unity 2019.1.8f1
  2. switch platform to android, texture compression = ASTC
  3. import integration 1.38 from the archives (https://developer.oculus.com/downloads/package/unity-integration-archive/1.38.0/)
  4. select yes & restart when asked to update the OVR Plugin and Audio Spatializer.
  5. you must add an App ID in order to use hands/controllers!
    create an app on the oculus dashboard
    copy-paste the ID into both oculus > avatars > edit settings AND oculus > platform > edit settings

~~~~ configuring project settings ~~~~

  1. project settings > player > XR settings > check virtual reality supported, then add oculus to the virtual reality SDKs list
  2. change company name and product name, then go to other settings > change package name
  3. remove vulkan from graphics APIs
  4. minimum API level = lollipop (21)
  5. change quality settings for best performance. i combined oculus’ standard quest setup with recommendations i found online – see screenshot!
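the same settings from steps 1–4 can also be applied with a small editor script, which is handy if you set up quest projects often. this is just a sketch against unity 2019.1's editor API – the menu path, class name, and package name are placeholders, and the VREditor call is semi-internal API, so checking the boxes by hand is the safer route:

```csharp
// Editor/QuestSetup.cs – must live inside an "Editor" folder.
using UnityEditor;
using UnityEngine.Rendering;

public static class QuestSetup
{
    [MenuItem("Tools/Apply Quest Settings")]
    public static void Apply()
    {
        // switch platform to android, texture compression = ASTC
        EditorUserBuildSettings.androidBuildSubtarget = MobileTextureSubtarget.ASTC;
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.Android, BuildTarget.Android);

        // 1. VR supported + oculus in the SDK list (semi-internal API)
        PlayerSettings.virtualRealitySupported = true;
        UnityEditorInternal.VR.VREditor.SetVREnabledDevicesOnTargetGroup(
            BuildTargetGroup.Android, new[] { "Oculus" });

        // 2. package name (placeholder – use your own reverse-domain id)
        PlayerSettings.SetApplicationIdentifier(
            BuildTargetGroup.Android, "com.mycompany.myquestapp");

        // 3. remove vulkan – leave only GLES3
        PlayerSettings.SetGraphicsAPIs(
            BuildTarget.Android, new[] { GraphicsDeviceType.OpenGLES3 });

        // 4. minimum API level = lollipop (21)
        PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel21;
    }
}
```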

~~~~ adding VR functionality ~~~~

  1. delete the main camera in your sample scene. we won’t be needing this anymore
  2. drag OVRPlayerController prefab into scene. now we can move (character controller component) & look around in VR (OVRCameraRig)!
  3. on the OVR Manager component:
    target device = quest
    tracking origin type = floor level
    uncheck Allow Recenter if your game features locomotion (rather than a stationary position, such as a cockpit)
  4. for hands, drag CustomHandLeft/Right into the scene.
    most tutorials ask you to place it under the left/right hand anchors, but when i tried that – at least for this version of oculus integration – my hands ended up looking like this:

i’ve also heard that, in general, parenting the hands directly to the camera rig creates lots of jitter and lag. it seems like the best thing to do is make them their own separate game objects, and then…

  1. on each hand’s OVR Grabber script:
    drag TrackingSpace (!!) into each hand’s Parent Transform. (not OVRCameraRig, not OVRPlayerController.)
  2. for even more optimization, you can check “parent held object.” this parents any object you’re holding directly to your hand, which eliminates the jitter caused by FixedUpdate – BUT can mess up some intricate physics.
    (in my case, we’re not stacking blocks or anything – we’re just gonna be holding a camera and a journal. but later on when i add climbing, i might have to turn that off.)
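to see why “parent held object” kills the jitter: without it, the grabber drags the held rigidbody along at the physics rate (FixedUpdate) while the hand moves at render rate, and that mismatch shows up as stutter. parenting hands the job to the transform hierarchy instead. a rough illustration (not OVRGrabber’s actual code – the method and parameter names here are made up):

```csharp
using UnityEngine;

public static class GrabSketch
{
    // rough sketch of what "parent held object" amounts to
    public static void Grab(Transform heldObject, Rigidbody heldBody, Transform hand)
    {
        heldBody.isKinematic = true;       // physics stops fighting the hand...
        heldObject.SetParent(hand, true);  // ...and the hierarchy moves the object
    }
}
```

the trade-off is exactly the one mentioned above: a kinematic, parented object no longer reacts to collisions properly, so stacking or climbing physics can break.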

~~~~ setting up your environment ~~~~

  1. create or import your room environment
  2. to make an object grabbable: add the OVR Grabbable component to it (+ a collider if there isn’t one already)
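step 2 can also be done from a script if you have lots of props. a sketch assuming the OVRGrabbable component from oculus integration 1.38 (the class name `MakeGrabbable` is mine):

```csharp
using UnityEngine;

// drop this on any prop to make sure it has everything a grab needs
public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // a grabbable needs a collider (and a rigidbody) to be picked up
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();
        if (GetComponent<Rigidbody>() == null)
            gameObject.AddComponent<Rigidbody>();
        if (GetComponent<OVRGrabbable>() == null)
            gameObject.AddComponent<OVRGrabbable>();
    }
}
```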

~~~~ some tweaks ~~~~

  1. right now, bringing a grabbable close to your head dramatically pushes you around. we can fix this :)
  2. head to layers > make a “Player” layer and a “Grabbable” layer
  3. in project settings > physics, uncheck the box between the two in the layer collision matrix.
  4. assign “Player” to OVRPlayerController (only on the top level, not recursively) and “Grabbable” to all grabbable objects
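the layer-collision tweak can also be done at runtime, which is nice if you don’t want to touch the physics matrix by hand. a sketch assuming the “Player” and “Grabbable” layers from the steps above already exist:

```csharp
using UnityEngine;

public class IgnorePlayerGrabbableCollision : MonoBehaviour
{
    void Awake()
    {
        int player    = LayerMask.NameToLayer("Player");
        int grabbable = LayerMask.NameToLayer("Grabbable");
        // same effect as unchecking their box in the layer collision matrix
        Physics.IgnoreLayerCollision(player, grabbable, true);
    }
}
```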

Photogrammetry

Homework: Import two 3D scans into your custom VRChat room.

I tried to use display.land (a free iOS app I randomly got a week ago because it looked cool) instead of Metashape, which ended up backfiring. I successfully scanned my potted plant… along with the entire room.

Luckily the app lets me edit the scan after it’s done processing everything, so I cropped it down to just the plant:

But when I downloaded the mesh, it gave me the old version :(

So it’s technically in my room, but nothing like what I expected.

© 2020 Ekemini Nkanta
