03/08/19: Video Update on Phase 1

This is a relatively short update on how far Phase 1 has progressed in the last few days, but it finally includes some video footage of the scene working, along with some of the tools I’ve been brushing up on to apply this week.

The above video is a quick demo of the teleport point placement and scaling in the scene.

What was most surprising for me was just how long the sidewalk actually became. It felt like our last prototype was dealing with issues of time because the walk down the sidewalk was too short or the walking motion was too fast. From the height of a child, the building itself becomes this mammoth, imposing object rather than just a set piece or a destination. The teleporting really emphasizes the distance too: all of the points are just at the edge of the teleport curve. I think I got lucky there. Overall this layout feels smoother and I’m excited to start putting in the other scene elements.

On some technical notes:

  • During our demo on Thursday it was pointed out that some objects aren’t keeping scale with the ground or street planes. In the video I can definitely see the lamp posts hovering off the ground - this may just be a matter of making sure the final assets in the scene are combined into one set object. Still experimenting with that.

  • I found in this scene that the teleport point on top of the stairs was actually really hard to get to - you can see me struggling with it in the video. I underestimated how large the stairs would become at that height.

  • Which leads me to the suspicion that this height ratio isn’t quite right. I recorded this experience while seated, so at first I thought my posture was the problem rather than the math. I repeated the same thing while standing and had the same issue, so I can play with some numbers to get that ratio right.

  • This was my first time testing SteamVR with a headset other than the Vive. Up until now all of my development has been using the Vive headset and controllers. Oculus is what’s available to me at the moment, so I took the opportunity - it connected no problem! Teleport was already mapped to the joystick on the Oculus Rift controller. Cue my sigh of relief for a more versatile development process.
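For reference, the height-ratio tuning mostly boils down to a single number. Here’s a minimal sketch of the math (Python standing in for engine code, and the eye heights are just placeholder values): to make an adult-height viewpoint read as a child’s, the environment can be scaled up by the ratio of the two eye heights.

```python
# Sketch of the height-ratio math (names and numbers are placeholders).
# Scaling the whole environment up makes the player's real eye height
# read as a proportionally shorter viewpoint inside the scene.

def world_scale(adult_eye_height_m: float, child_eye_height_m: float) -> float:
    """Uniform scale factor to apply to the environment root."""
    return adult_eye_height_m / child_eye_height_m

# Example: a 1.70 m adult viewpoint standing in for a 1.10 m child -
# stairs, lamp posts, and the building all grow by this factor.
scale = world_scale(1.70, 1.10)
print(round(scale, 2))
```

This is also why the stairs balloon so dramatically: everything in the scene grows by the same factor, so objects that were already tall get tall fast.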

I have begun working with the car animation, starting with placing the user.


I made the loosest possible version of a block car in Maya with separate doors and brought it in just to have something to prototype with. This is where the user’s location in space is going to become an issue - I have to make sure they’re aligned with the driver’s seat. We’re going to have the user sitting in the demo anyway, so we might be able to just calibrate the seat with the environment and have the user sit on the bench.
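The seat alignment itself is really just an offset: move the whole tracking rig so the tracked head lands on an anchor placed where the driver’s head should be. A quick sketch of that vector math (Python standing in for engine code; all of the positions are made up):

```python
# Sketch of seat alignment (all names and positions are hypothetical).
# Idea: shift the tracking-space rig so the user's tracked head
# coincides with a "driver head" anchor placed above the seat.

def align_rig_to_seat(rig_pos, head_pos, seat_anchor):
    """Return the new rig position that puts the head on the seat anchor."""
    offset = tuple(s - h for s, h in zip(seat_anchor, head_pos))
    return tuple(r + o for r, o in zip(rig_pos, offset))

rig = (0.0, 0.0, 0.0)      # tracking-space origin in the world
head = (0.3, 1.2, -0.1)    # seated head position, in world space
anchor = (2.0, 1.2, 4.0)   # where the driver's head should be
new_rig = align_rig_to_seat(rig, head, anchor)
print(new_rig)  # rig shifted so the head lands on the anchor
```

Because the user stays seated for the demo, this offset can probably be computed once at calibration rather than continuously.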

Working on a GRA assignment this week, I also learned how to use the Audio Mixer in Unity. Turns out I can group all my different audio tracks together and transition between various parameter states. Who knew!

Apparently not me. I suspect this is going to fix A LOT of the audio issues we were having in the last prototype, especially the ones having to do with consistency - some of the volume levels were… jarring, and not in the intentional-design type of way.
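On the “jarring” volume jumps specifically: one common culprit is crossfading two tracks with straight linear weights, which dips in perceived loudness at the midpoint, whereas an equal-power curve keeps the combined power constant. A small sketch of just the weighting math (not the Audio Mixer API itself):

```python
import math

# Crossfade weight curves (a sketch, not Unity code).
# t runs from 0.0 (all track A) to 1.0 (all track B).

def linear(t):
    """Straight amplitude crossfade: sums to 1.0, but power dips mid-fade."""
    return 1.0 - t, t

def equal_power(t):
    """Equal-power crossfade: a^2 + b^2 stays at 1.0 for every t."""
    return math.cos(t * math.pi / 2), math.sin(t * math.pi / 2)

a, b = equal_power(0.5)
print(round(a * a + b * b, 6))  # combined power stays constant
```

Whether the mixer transitions end up using this exact curve or not, it’s a useful lens for diagnosing why some of the old fades felt uneven.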


In class, I think I opened up the wrong version of the project, because all of the environmental objects were scaling without the teleport points attached. When I got home I realized that it was all fixed on my current version! One less thing to tackle.

Going away from the technical for a moment, Taylor posed an interesting question to me: how do we categorize this experience? I realize I’ve just been using the word “experience” but we’ve also discussed “simulation”. Adding that to the long list of queries for this open week ahead of me - confirming a proper term for what we’re working on, and justifying that definition.

What’s Next

  • Car animation

  • Composing Crowds

  • Connecting theory with my actions

  • Resuming my lineup of VR experiences