Finalized Proposal and Work Documentation

APPROACH

The majority of this week was spent gathering footage and replacing all of the storyboards in my animatic. I went back into each project and captured screen recordings along with footage of the players, from the HTC Vive to the Google Cardboard. Syncing up the footage was actually easier than expected using on-screen context clues. I also added background music and an intro sequence. As of right now, Explainer Video 1 is about 90% complete; the only things missing are the credits and some tweaks to the sound and text.

Still of title sequence from Explainer Video 1

Screenshot of work in After Effects

Screenshot of work in After Effects

CHOICES MADE

Tori and I submitted our project proposals for the next four weeks, and that meant making decisions on what exactly I wanted to investigate for my portion of the project. We discussed working on technical exercises, trying to nail down the pipeline and techniques we might use for future development. My current plan for the next four weeks is to focus on: 

  • Navigation: How does the user move about the scene? I will use VRTK in Unity to experiment with teleportation, walking, and top-down maps as ways for the user to explore a given area. I've used the teleportation tools before in my Hurricane Prep project, so this will be a familiar place to start. (A rough sketch of the point-and-teleport idea follows this list.)
  • Object Interaction: How can the UI tools in VRTK and Unity be used to convey information to the user? There are a variety of methods for pop-ups and object selection. I will set up different objects throughout the scene and apply these different menu types/functions to them in order to test them out. (See the pop-up sketch below.)
  • Time: Tori is working on getting the motion capture pipeline down, from capturing the data to bringing it into Unity. Once those animations are present, I would like users to be able to pause the action in the scene and explore a frozen moment in time at will. I will test this technique by importing a simple animated object in place of the mocap data, then apply it to the figures once they are in the scene. (A sketch of this pause mechanic is also below.)
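Since navigation is where I'll start, here is a minimal point-and-teleport sketch in plain Unity C#. This is not VRTK's implementation (VRTK handles this through its own pointer and teleport components); it just illustrates the core logic I'll be experimenting with. The playArea reference, the teleportSurfaces layer mask, and the keyboard trigger are all placeholders of mine.

```csharp
using UnityEngine;

// Minimal point-and-teleport sketch (not the VRTK implementation).
public class SimpleTeleport : MonoBehaviour
{
    public Transform playArea;          // camera rig root that should be moved
    public LayerMask teleportSurfaces;  // colliders that count as valid destinations
    public float maxDistance = 15f;

    void Update()
    {
        // Stand-in trigger: a key here, a controller button in the real project.
        if (!Input.GetKeyDown(KeyCode.Space)) return;

        // Cast forward from this object (a controller or the camera) and move the
        // rig to the hit point, keeping its current height.
        RaycastHit hit;
        Ray ray = new Ray(transform.position, transform.forward);
        if (Physics.Raycast(ray, out hit, maxDistance, teleportSurfaces))
        {
            playArea.position = new Vector3(hit.point.x, playArea.position.y, hit.point.z);
        }
    }
}
```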
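For the object-interaction tests, one simple pattern is a world-space canvas that stays hidden until its object is selected, then turns to face the player. This is only a sketch under my own assumptions; in practice the toggle would be driven by the VR interaction events rather than a mouse click, and infoPanel/lookTarget are placeholder names.

```csharp
using UnityEngine;

// Sketch of a hidden world-space info panel that toggles on selection.
public class InfoPopup : MonoBehaviour
{
    public GameObject infoPanel;  // world-space Canvas (or panel) describing this object
    public Transform lookTarget;  // usually the player camera

    void Start()
    {
        infoPanel.SetActive(false);
    }

    // Stand-in selection event; a controller trigger would call this in VR.
    void OnMouseDown()
    {
        infoPanel.SetActive(!infoPanel.activeSelf);
    }

    void LateUpdate()
    {
        if (infoPanel.activeSelf && lookTarget != null)
        {
            // Simple billboard so the panel stays readable from the player's position.
            infoPanel.transform.rotation =
                Quaternion.LookRotation(infoPanel.transform.position - lookTarget.position);
        }
    }
}
```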
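And for the frozen-moment idea, the simplest version I can think of is pausing every Animator in the scene rather than setting Time.timeScale to zero, so the user can still move and look around while the figures hold their pose. Again, this is just a sketch with a keyboard stand-in for the eventual controller input.

```csharp
using UnityEngine;

// Sketch of the "frozen moment" toggle: pause all Animators, leave locomotion running.
public class FreezeMoment : MonoBehaviour
{
    private bool frozen;

    void Update()
    {
        // Stand-in trigger: a key here, a controller button in the real project.
        if (Input.GetKeyDown(KeyCode.P))
        {
            frozen = !frozen;
            foreach (Animator animator in FindObjectsOfType<Animator>())
            {
                animator.speed = frozen ? 0f : 1f;
            }
        }
    }
}
```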

RELEVANT SOURCES/INSPIRATION

I gathered a ton of relevant sources this week, all from different areas that we're investigating. 

I found an article titled "In Their Shoes: 10 Empathetic VR Experiences" that features VR projects covering a vast span of topics, from refugee camps to solitary confinement. One that stood out is a project from Derek Ham (NC State University) titled "I Am A Man", part of an exhibition that will be on display at the National Civil Rights Museum. The exhibit is about the 1968 Memphis Sanitation Strike, and his experience takes you back to that scene. Ham documents the production on his LinkedIn page, which has provided valuable insight into his process. One entry in particular detailed his thought process on whether the project should be a documentary-like experience or a fictional narrative, and how the presence of the user in the scene as themselves automatically alters the accuracy of the historical retelling. I've linked that page here, and placed the trailer for his experience below:

On another note, I was recently linked to the IEEE Conference on Virtual Reality through an educational AR/VR Facebook group. While this year's conference is unfortunately in Germany (unrealistic for me to attend), the site lists papers from past conferences. I picked up several papers on using VR and immersive technologies in schools and have added those to my reading list. Maria also linked Tori and me to a few readings on educational theory, and sent us more looking specifically at how elementary-age children learn.

CURRENT QUESTIONS/NEEDS RAISED

As I start on this four-week project, my questions are going to be technical. I'll need to begin working with VRTK again and dive into some tools that I only understood at a surface level for the hurricane project. Through these readings I'll be gathering information on the current opinions about VR in classrooms and how empathy plays in. The biggest need raised this week is simply the time to read all of these sources.

LIKELY NEXT STEPS

This week I will be finishing up my Explainer Video 1 and posting it on my site for viewing. I'll also be creating a base Unity file for our four week project and getting the teleportation tool functional, hopefully by mid-week. Tori and I will be getting motion capture data on Wednesday and learning how to live-stream the data into Unity. I'll be documenting the process and some of that footage will probably be in the blog post next week, along with some Unity snapshots. 

Explainer Video 1 Progress

APPROACH

Having organized all of my assets last week, I was able to use this week to finalize my script and storyboards. I focused on three main points: my technical exploration of VR and AR, my research work, and my collaborative experience. From there, the video briefly details my direction for next semester. After a few drafts, I felt comfortable with the script and began rearranging the storyboards into a formal template.

Final storyboard template, with timing and narration. 

While it was recommended that we work in Premiere, I have more experience in After Effects and chose to use it for this video. The most difficult part of this week was recording the narration. I did a few test recordings using old script drafts, then a new recording with the final script. The pace of my speech would vary, and I learned that there are certain sounds that are very difficult for me to say clearly. The animatic I produced at the end of the week still uses rough audio that needs to be edited for timing, but I was able to begin dropping some of the footage I already have into the composition.

Screenshot of Animatic work in After Effects

Screenshot of Animatic work in After Effects

CHOICES MADE

The Explainer Video work was already planned out for this week, and my choices there were made early on, during content organization and script editing. Tori and I have chosen to meet every Tuesday morning to discuss research findings and thoughts on development of the potential Ruby Bridges project, though we still communicate about it frequently at other points in the week.

RELEVANT SOURCES/INSPIRATION 

I was sent several relevant sources this week. Joe pointed me in the direction of the VR/AR Association Online Conference, taking place January 16-30. Talks are being given in a variety of tracks, including Education and Storytelling, and they are recorded for viewing at any point. I also found that there are committees for each track with links to relevant articles. Several talks in the Education track are on my list for this week, one specifically being "VR in Education: from Perception to Immersion" by Steve Barnbury. (Linked HERE)

Maria also sent Tori and me several relevant sources throughout the week addressing some of the questions I mentioned in my last post. While we're not entirely sure that we're going with the Ruby Bridges story, part of our conversation this week was about how to figure out which books students are reading and how to narrow down that search. An article titled "The Confounding Science of Children's Literature" tells us that nobody can agree on why or how kids pick certain books to read. Some subjects and genres are overwhelmingly more popular than others, and books with narratives are generally preferred, but the article also mentions that kids pick books that can be part of a social experience, something they can talk to their friends about. There seems to be a small body of research in this area to dig into, and this will likely be a talking point for Tori and me this week.

The Blue Eyes/Brown Eyes Exercise by Jane Elliott has also been part of the discussion this week. Our project ultimately centers on fostering empathy in students, and the exercise run by Elliott gives a classroom of students the experience of being a minority. The documentary for this exercise is linked below. While the two do have some crossover, I was thinking about how she uses the social dynamics of a classroom to immerse students in the experience, and whether virtual reality (an individual experience) is able to create the same impact.

On the topic of empathy, another article sent by Maria actually argues the other side: that VR experiences can be misleading and misrepresent their subjects. Full immersion is interrupted by factors such as the player's sense of safety and the short-term nature of these experiences. The article ("It's Ridiculous to use Virtual Reality to Empathize with Refugees") discusses VR in terms of disability simulations and refugee situations, but it does make a good point about the factors of time and player awareness. If the player is aware that this is a simulation, will that lessen the impact because the element of fear no longer exists? I feel that fear can be simulated to some degree in VR, but so much of what these situations involve comes from fear experienced over an extended period of time. Those feelings cannot be replicated, and that is a point worth remembering.

CURRENT QUESTIONS/NEEDS RAISED

Last week I was thinking about the broader implications of the project, and most of those questions have not been answered, so they still stand. The needs raised this week have to do with information: I need to sit down and gather all the information on the topics being discussed (empathy in VR, VR in education, narratives in VR), and from there see which questions are left or need to be reframed.

LIKELY NEXT STEPS

This week I will be completing Explainer Video 1. This will involve editing audio, recording gameplay, and creating a rough cut of the video for Tuesday.

The rest of my work for my project with Tori will be reading and gathering information. I'm still reading Flow, though this week's sections were not really relevant to my work. I have the AR/VR Conference videos about VR in the classroom to sort through, as well as a TED talk about empathy in VR linked in the source above.

Start of Spring: Explainer Video 1

APPROACH

I spent this week sorting and analyzing my work from the previous semester to see where the common threads were and to articulate a direction for my research. I made a list of all the projects and experiences I had and what I gained from each. The majority of these projects were intended to be technical explorations; I wanted to gain more experience working in virtual and augmented reality to understand the mediums and become more comfortable with Unity. This came with good results: I gained more experience in C#, developed a better collaborative workflow by combining multiple projects into one game, and was even able to take a step toward mobile development.

While organizing these projects, I noticed many of them had to do with player interaction and how players move through a space. In some cases, like the Hurricane Preparedness prototype, the player can move through a space and interact with objects based on the goal of the level. The VR MindMap project, by contrast, was a purely passive experience with no user interaction. I chose to make this the focus of my video: examining how the interactive nature of VR and AR technologies can be used in an educational environment. From there, I sketched out thumbnails for my storyboards and wrote a draft script detailing the connections between these projects and my path forward.

CHOICES MADE

Once I decided on the direction of my work, I had a conversation with my classmate Tori about a potential project. She proposed the idea of creating an immersive virtual reality experience for students in elementary/middle school that would recreate a scene from "The Story of Ruby Bridges", the book about the first African American student to integrate an all-white school in New Orleans. The scene in question stems from the photos of Ruby walking up to the doors of the school with protestors shouting at her from across the street. While we're still examining other impactful books that students are reading today, I have decided to join this project and work with Tori to create an immersive experience that gives students the option to move through these scenes at their own pace, exploring the world and gaining more information.

RELEVANT SOURCES/INSPIRATION

After discussing this project with Tori, we brought it to Maria, who recommended looking into studies on VR immersion and emotion. I have started collecting several studies and books on VR interaction and narrative, one in particular titled "Advances in Interaction with 3D Environments". It discusses different methods for wayfinding and navigation through a 3D space, as well as the efficiency of different manipulation techniques for 3D objects.


I also began reading "Flow" by Mihaly Csikszentmihalyi, which discusses the psychological state of flow. I had only ever heard of this concept in the context of game design, and did not realize it was a much broader theory. The book itself is written to help the reader understand how to achieve happiness. Flow is defined as "...the state in which people are so involved in an activity that nothing else seems to matter...", and is often manipulated in games to create emotional impact in between high-action moments. This feels especially relevant for the Ruby Bridges project; if the intended goal is to create an educational experience for the student through emotion, it's important to consider how interactivity may interrupt that flow or enhance it.

CURRENT QUESTIONS/NEEDS RAISED

I'm starting to narrow down what part of "interactive" I'm choosing to focus on, but most of my questions from this week have to do with further defining this in the context of the Ruby Bridges project. 

  • What degree of realism should be achieved for the emotional impact we're seeking? 
  • Would allowing the students to interact with the scene decrease this impact, or draw them away from the narrative? 
  • What specific mechanics would I want to focus on for the scene, and are they appropriate for the age of the students? 
  • What form of hardware would the students be using to experience the scene? A Google Cardboard or a full headset? 
  • If this is the narrative we choose to pursue, which individuals or organizations should be involved to ensure a respectful, accurate portrayal? 

LIKELY NEXT STEPS

As far as the needs of the video go, I will be working on recordings of gameplay from my projects and getting footage of the Hurricane Preparedness prototype being played by others on the Vive. I received permission to show footage from the app used for the VR Physics Education Study, so I'll be recording a section of that as well. I plan to solidify the storyboards and script this weekend, then do another run-through in the sound room to start putting my animatic together.

I will continue reading Flow and searching for more sources on emotion, narrative, and immersion in virtual reality. Some of these sources need to be ordered through the library, so I'll be taking care of that and adding them to my reading list. Tori and I will also be meeting on Tuesdays to work through some of our research and discuss further details on the project.