01/18/20: Spring Semester Goals

The next few weeks are the final weeks of production for this case study - in two weeks, Tori and I are flying to Florida to demo at the Zora Neale Hurston Festival. We then have the month of February and the first week of March to make any modifications. At this point, production is mostly cleanup of the main case study experience, plus small prototype experiments on the user's proximity to other avatars and on how to fully construct the environment.

Reaching the end of last semester, I had a rough draft of what our final experience was going to look like and a head start on the writing portion of my thesis. I created a priority list addressing our deadlines for the Zora festival in two weeks and for the end of March:

Priority List for final production of Ruby Bridges case study.

Project Process

Based on the priority list, I wanted to line up my goals for the Zora festival and for the end of the semester so I could work on them simultaneously. The first thing I've been working on is maintaining consistent proximity during the walk between the user, Lucille, and the two federal marshals. At the end of December, all three characters would reach the end with the user, but they would often feel too far away during the walk itself. It didn't seem natural to have Lucille so far away from the user in a hostile situation, or to have the marshals walking three meters up the sidewalk while the mob avatars are crowding in.

In my test navigation scene, I set up the three avatars and put in a base user animation with some speed variations.

Group Walk Test using test prototyping space.

One of the biggest problems was getting the speed and animation adjustments right regardless of what the user is doing. From the user's perspective, slowing down means they're not looking straight ahead down the sidewalk, which gives me a little bit of leeway in the adjustments I make: slowing the avatars to an unrealistic speed (they often look like they're moving through water) matters less, because they speed back up as soon as the user looks directly ahead again.
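As a rough illustration of that idea - not the project's actual script - a companion avatar could scale its speed by its distance to the user, and drop to a crawl only while the user is looking away; all names here (CompanionPacing, targetDistance, and so on) are hypothetical:

```csharp
using UnityEngine;

// Sketch: pace a companion avatar against the user's position, and only
// allow "unrealistic" slowdowns while the user is looking away.
// All field and class names are illustrative, not from the project.
public class CompanionPacing : MonoBehaviour
{
    public Transform user;              // the player's head/rig
    public float baseSpeed = 1.2f;      // normal walking speed (m/s)
    public float targetDistance = 1.5f; // how close the avatar should stay

    void Update()
    {
        float distance = Vector3.Distance(transform.position, user.position);

        // Is the user looking roughly straight down the sidewalk?
        bool lookingAhead = Vector3.Dot(user.forward, transform.forward) > 0.8f;

        // Speed up when falling behind, slow down when too close.
        float speed = baseSpeed * Mathf.Clamp(distance / targetDistance, 0.25f, 2f);

        // While the user looks away, an exaggerated slowdown goes unnoticed;
        // it snaps back to a natural pace once they look ahead again.
        if (!lookingAhead && distance < targetDistance)
            speed = Mathf.Min(speed, baseSpeed * 0.3f);

        transform.position += transform.forward * speed * Time.deltaTime;
    }
}
```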

Implementing this in the main project scene was going to require reorganizing how the main control script was running the scene. Initially, that script controlled all of the dialogue, audio, and motion triggers, which got messy and difficult to debug. Using the original control script as a template, I created Scene1_MainControl.cs to house the booleans that indicate scene status and to run the timing for each phase of the experience. From that, I created separate scripts to control the motion for all of the avatars in the scene (including the user) and the audio/dialogue. With that separation I'm able to get a better handle on debugging down the road.
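The shape of that separation might look something like the following sketch, where one script owns the state booleans and phase timing and the others poll it; the specific booleans and field names are assumptions, not the project's actual API:

```csharp
using UnityEngine;

// Sketch of the split control setup: Scene1_MainControl owns scene-status
// booleans and phase timing; motion and audio scripts read from it.
// The specific fields here are illustrative placeholders.
public class Scene1_MainControl : MonoBehaviour
{
    public bool carArrived;
    public bool walkStarted;
    public bool walkComplete;

    public float walkPhaseDelay = 5f; // seconds between car exit and walk start
    float phaseTimer;

    void Update()
    {
        phaseTimer += Time.deltaTime;
        if (carArrived && !walkStarted && phaseTimer > walkPhaseDelay)
            walkStarted = true;
    }
}

// Avatar motion lives in its own script and only consults the shared state,
// so motion bugs can be debugged without touching dialogue or audio logic.
public class AvatarMotionControl : MonoBehaviour
{
    public Scene1_MainControl main;

    void Update()
    {
        if (main.walkStarted && !main.walkComplete)
        {
            // drive the user, Lucille, and the marshals along the sidewalk
        }
    }
}
```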

New control script setup.

The audio script also took some prototyping to get right. Last semester I was having problems with all of the mob members playing at once instead of after randomized wait times. Distributing the AudioSources in the scene and layering these sounds still needs a lot of work, which Tori and I have already reached out for help with. For now, I focused strictly on timing the audio and ensuring that the mob chants are appropriately randomized from a set number of clips.
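One way to get that staggering and randomization - a sketch under my own assumptions rather than the project's actual audio script - is a per-mob-member coroutine with a random initial delay and a clip picker that avoids immediate repeats:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: each mob member waits a random interval before its first chant
// (so they don't all fire at once), then plays randomized clips from a
// fixed set, avoiding back-to-back repeats. Names are illustrative.
public class MobVoice : MonoBehaviour
{
    public AudioSource source;
    public AudioClip[] chants;                       // the fixed clip set
    public Vector2 waitRange = new Vector2(1f, 6f);  // min/max wait (seconds)

    int lastClip = -1;

    void Start()
    {
        StartCoroutine(ChantLoop());
    }

    IEnumerator ChantLoop()
    {
        // Random initial offset keeps mob members from starting together.
        yield return new WaitForSeconds(Random.Range(waitRange.x, waitRange.y));

        while (true)
        {
            // Pick a clip, rejecting an immediate repeat of the last one.
            int next;
            do { next = Random.Range(0, chants.Length); }
            while (chants.Length > 1 && next == lastClip);
            lastClip = next;

            source.PlayOneShot(chants[next]);
            yield return new WaitForSeconds(
                chants[next].length + Random.Range(waitRange.x, waitRange.y));
        }
    }
}
```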

Next Steps

Those scripts are currently in the scene and functional, so in the next two days I will be turning my attention to the environmental setup of the scene. This is where my Zora goals and end-of-semester goals overlap - I'll be using a separate prototyping scene in Unity to place prototyped blocks and houses in order to determine the best placement for these assets in the world, and to explore different configurations for the mob. I thought about using Tilt Brush or Maquette for this, but I found it's much more efficient to use Unity because I can mix and match the assets I already have. I have already finished assigning all the textures and materials in the scene itself, and will continue to add environment assets in between setting up the houses and cars. I will also need to time the door animations for the user's exit from the car, and clean up that exit sequence itself.
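For the door timing, one simple approach - sketched here with placeholder names and an assumed animator trigger, not the project's actual setup - is a coroutine that plays the door-open animation, waits out the clip, and only then hands movement back to the user:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: sequence the car-door animation against the user's exit.
// "Open" is an assumed animator trigger; doorOpenLength would be set
// to the length of the door-open clip. Names are placeholders.
public class CarExitSequence : MonoBehaviour
{
    public Animator doorAnimator;
    public float doorOpenLength = 1.5f;

    public IEnumerator ExitCar()
    {
        doorAnimator.SetTrigger("Open");
        yield return new WaitForSeconds(doorOpenLength);
        // Door is fully open: release the user from the car and
        // signal the main control script to begin the walk phase.
    }
}
```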

Next week I will have documented the prototyping scene and the resulting changes to the environment, as well as a runthrough of the experience with the separated script setup. Tori and I will be taking an Oculus Rift with us to demo the project in Florida, so we will be conducting these tests on both an Oculus Rift and a Vive to check for any other issues.