The majority of this week was spent gathering footage and replacing all of the storyboards in my animatic. I went back into each project and did screen recordings along with footage of the players, from the HTC Vive to the Google Cardboard. Syncing up the footage was actually easier than expected thanks to context clues on screen. I also added background music and an intro sequence. As of right now, Explainer Video 1 is about 90% complete. The only things missing are the credits and some tweaks to the sound and text.
Tori and I submitted our project proposals for the next four weeks, and that meant making decisions on what exactly I wanted to investigate for my portion of the project. We discussed working on technical exercises, trying to nail down the pipeline and techniques we might use for future development. My current plan for the next four weeks is to focus on:
- Navigation: How does the user move about the scene? I will use VRTK in Unity to experiment with teleportation, walking, and top-down maps as ways for the user to explore a given area. I've used the teleportation tools before in my Hurricane Prep project, so this will be a familiar area to start with.
- Object Interaction: How can the UI tools in VRTK and Unity be used to convey information to the user? There are a variety of methods for pop-ups and object selection. I will set up different objects throughout the scene and apply the different menu types and functions to them to see how each works in practice.
- Time: Tori is working on the motion capture pipeline, from capturing the data to bringing it into Unity. Once those animations are present, I would like users to be able to pause the action in the scene and explore a frozen moment in time at will. I will test this technique by importing a simple animated object in place of the mocap data, then applying the same approach to the figures once they are in the scene.
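As a starting point for that last bullet, here's a minimal sketch of how the pause might work in Unity, assuming the mocap comes in as standard Animator clips. The `MomentFreezer` class and `ToggleFreeze` method are names I've made up for illustration; the real script will depend on how the rig and animations are set up. Zeroing each `Animator`'s speed (rather than `Time.timeScale`) should keep head tracking and locomotion responsive while the animated figures hold still.

```csharp
using UnityEngine;

// Hypothetical sketch: freeze/resume every Animator in the scene so the
// user can explore a held moment. Animator.speed is used instead of
// Time.timeScale so the VR rig itself keeps updating normally.
public class MomentFreezer : MonoBehaviour
{
    private bool frozen;

    public void ToggleFreeze()
    {
        frozen = !frozen;

        // Find all animated objects currently in the scene and
        // pause (speed = 0) or resume (speed = 1) their playback.
        foreach (Animator animator in FindObjectsOfType<Animator>())
        {
            animator.speed = frozen ? 0f : 1f;
        }
    }
}
```

Wiring `ToggleFreeze` to a controller button press would then let the user freeze and resume the moment at will.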
I gathered a ton of relevant sources this week, spanning the different areas we're investigating.
I found an article titled "In Their Shoes: 10 Empathetic VR Experiences" that features VR projects covering a vast span of topics, from refugee camps to solitary confinement. One that stood out is a project from Derek Ham (NC State University) titled "I Am A Man", part of an exhibition that will be on display at the National Civil Rights Museum. The exhibit is about the 1968 Memphis Sanitation Strike, and his experience takes you back to that scene. Ham documents the production on his LinkedIn page, which has provided valuable insight into his process. One entry in particular detailed his thought process on whether the project should be a documentary-like experience or a fictional narrative, and how the user's presence in the scene as themselves automatically alters the accuracy of the historical retelling. I've linked that page here, and placed the trailer for his experience below:
On another note, I was recently linked to the IEEE Conference on Virtual Reality through an educational AR/VR Facebook group. While this year's conference is unfortunately in Germany (unrealistic for us), the site lists papers from past conferences. I picked up several papers on using VR and immersive technologies in schools and added those to my reading list. Maria also linked Tori and me to a few readings on educational theory, and sent us more looking specifically at how elementary-age children learn.
CURRENT QUESTIONS/NEEDS RAISED
As I start on this four-week project, my questions are going to be technical in nature. I'll need to begin working with VRTK again and dive into some tools that I only understood at a surface level for the hurricane project. Through these readings I'll be gathering a picture of the current opinion on VR in classrooms, and how empathy plays in. The biggest need raised this week is simply the time to read all of these sources.
LIKELY NEXT STEPS
This week I will be finishing up my Explainer Video 1 and posting it on my site for viewing. I'll also be creating a base Unity file for our four week project and getting the teleportation tool functional, hopefully by mid-week. Tori and I will be getting motion capture data on Wednesday and learning how to live-stream the data into Unity. I'll be documenting the process and some of that footage will probably be in the blog post next week, along with some Unity snapshots.