As of today, our Unity file is set up to operate as a good testing ground for interaction! Last week was really focused on honing our test ideas and getting the basic scene set up. This week, I focused on implementing navigation, interaction, and a functional menu to control our testing in the future.
On Monday I went ahead and tested the basic navigation and interaction functions I set up last week in the Vive. I set up the teleportation using the straight line pointer at first, and realized that this can prove difficult when moving around a large space. The straight line renderer in VRTK makes it hard to gauge distances, though it works well for UI selections. Keeping this in mind, I changed the teleportation to a bezier pointer. The arced line is easier to see in the space, especially when navigating elevated elements such as stairs or ramps.
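For anyone curious what that swap looks like in code: in VRTK 3, the pointer and its renderer are separate components, so switching from a straight line to an arc is mostly a matter of wiring a different renderer into the pointer. A rough sketch of how I could do it from a script attached to the controller (the max-length value is just illustrative, not what I actually used):

```csharp
using UnityEngine;
using VRTK;

// Sketch: swap the controller's pointer renderer from straight to bezier.
// Attach alongside the VRTK_Pointer component on a controller alias.
public class BezierTeleportSetup : MonoBehaviour
{
    void Start()
    {
        VRTK_Pointer pointer = GetComponent<VRTK_Pointer>();

        // Add a bezier renderer and hand it to the pointer; the arc makes
        // landing spots much easier to judge than a straight beam.
        VRTK_BezierPointerRenderer bezier = gameObject.AddComponent<VRTK_BezierPointerRenderer>();
        pointer.pointerRenderer = bezier;
    }
}
```

In practice it's usually easier to just swap the renderer component in the Inspector, but doing it in code is handy if we end up toggling navigation styles per test scene.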
I also tested out the interactive cubes. After some debugging, all the different highlighting settings worked. Some have outlines, some turn to solid colors, and they can be grabbed and tossed around the scene. I did run into one issue where the cubes could not be retrieved from the ground plane: it turns out the player was floating 0.5 units above the ground. Adjusting this value fixed the problem pretty quickly.
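The cube setup itself is pretty lightweight in VRTK. A minimal sketch of the solid-color-highlight, grabbable variety (the highlight color here is arbitrary; the outline variants use a separate highlighter component instead):

```csharp
using UnityEngine;
using VRTK;
using VRTK.GrabAttachMechanics;

// Sketch: configure a cube so it highlights on touch and can be
// grabbed and tossed. Attach to a cube GameObject with a collider.
public class GrabbableCube : MonoBehaviour
{
    void Start()
    {
        // Rigidbody lets the cube be thrown and fall back to the ground.
        gameObject.AddComponent<Rigidbody>();

        VRTK_InteractableObject interactable = gameObject.AddComponent<VRTK_InteractableObject>();
        interactable.isGrabbable = true;
        interactable.touchHighlightColor = Color.yellow; // solid-color highlight on touch

        // A fixed-joint grab attach holds the cube rigidly in the hand
        // until released, at which point physics takes over.
        gameObject.AddComponent<VRTK_FixedJointGrabAttach>();
    }
}
```

This is the "turns to a solid color" variant; the outlined cubes swap the touch highlight for an outline highlighter on the same object.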
The Simulator now works in this project file, so I can roughly test functions without having to load into the Vive every time. In the past, troubleshooting often took a long time because I would playtest only after making fifty changes, which made it difficult to trace issues back to their source. It's been a conscious effort this time around to test every major implementation, and it's paying off.
I switched gears for a bit to work on UI elements in the scene. Because Tori and I are going to be testing a variety of interactive properties (some variations on each other, others contradicting each other completely), I made a main menu for the whole project so that we can easily switch between scenes. It also includes toggles for the interactive objects if we don't need them in a scene. In the future, I will be adding controls for the motion capture data and animations. While I've made menus in the past for Unity, this one is attached to the headset and moves around with the player. Following the VRTK tutorial for a Headset Menu gave me a great start on the format, and then I adjusted it to fit our purposes.
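The core trick behind a headset-following menu is just parenting a world-space canvas to the headset camera so it rides along with the player's view. A stripped-down sketch of that idea (the field names and offset values here are my own placeholders, not taken from the VRTK tutorial):

```csharp
using UnityEngine;

// Sketch: attach a world-space menu canvas to the headset camera so it
// follows the player, floating slightly forward and below eye level.
public class HeadsetMenu : MonoBehaviour
{
    public Transform headset;  // the VR camera transform
    public Canvas menuCanvas;  // a world-space UI canvas

    void Start()
    {
        menuCanvas.renderMode = RenderMode.WorldSpace;
        menuCanvas.transform.SetParent(headset, false);

        // Offset so the menu sits in front of the player without
        // blocking the center of their view.
        menuCanvas.transform.localPosition = new Vector3(0f, -0.2f, 1.5f);
        menuCanvas.transform.localRotation = Quaternion.identity;
    }
}
```

The VRTK tutorial adds a fair amount on top of this (toggling the menu, button wiring), but this is the basic structure I started from.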
Tori and I did a playtest together of the first basic teleportation level, just to make sure the buttons work and debug a few issues from Tuesday. Below is a video of our test and working through the ground plane issues.
Once the project framework was in place, I nailed down exactly what the navigation functions were going to look like and looked for any holes in my logic: thinking through what the player impact would be, what level of control they would have, and what purpose these changes would serve. All four of these options address very different concerns in the environment and are viable options to explore for potential user interaction. I have a pretty good idea of how to set these up in Unity, but I'm going to spend this weekend actually doing that. Next week will have more information on the development and results. On Tuesday, Tori and I will do another playtest to determine whether we need further exploration in each scene.
On the theoretical side, we had a conversation today about what our research question is looking like for this project as a whole. Tori is exploring the immersion side of the environment: telling the story through acting and environment. I'm exploring the interaction: how users experience and navigate through this story. We're both still pretty new to writing proper research questions, and the results of the next four weeks are going to determine a lot about where we go moving forward. We just needed to start putting language to our work and determine how it all fits together in the big picture. The picture below is our start on this conversation, though it will keep developing over the next few weeks as we write our own questions and bring them together.
I mentioned using Mendeley last week for organizing my research, and the reading has officially begun. I downloaded the desktop app and started adding all the current studies that Tori and I have gathered. While waiting for the uploads to complete, I read about the Virtual Human Interaction Lab (VHIL) at Stanford University, which studies the impact of human interaction in virtual reality and its larger societal effects. Their website has a large archive of research papers from the past ten years, and I left with 16 studies on topics ranging from interaction to children's education to racial bias.
Tori is pretty far ahead of me on readings right now, but I've started prioritizing them and working my way through the list based on those most relevant to my development in Unity. So far I've read:
The cybersickness reading really tied in with the navigation issues I'm currently tackling. Because the user will be navigating a large space (and in the endgame, the user will be a younger student), our tests should address comfort in navigation as well as functionality. Maria mentioned thinking about how these navigation forms could influence the story itself: for example, having limited movement when playing as Ruby, or losing the ability to navigate the space altogether. This really emphasizes the role of the child, Ruby in particular, as one with minimal control over their world. Although I wonder if that effect would be as prominent if the user is already a child who may experience this in their own life. Unless it's emphasized to a new degree? Food for thought.
"How to Do Things With Videogames", by Ian Bogost, explores the variety of uses games have been put to. Some of these uses are unrelated to us at the moment; the debate over whether video games qualify as art is interesting, but not really what we're exploring. But there are chapters on empathy, reverence, and work that give great examples of games to look at and how they deal with these topics. The empathy chapter discusses two games made by USC graduate students, "Darfur is Dying" and "Hush", both dealing with genocide and fostering empathy for the people trying to survive in these situations.
It also introduced the concept of the vignette in games: giving an impression of an experience rather than advancing a narrative. Bogost also wrote an article on Gamasutra explaining his thoughts on video game vignettes. Our experience does not focus on one particular aspect of Ruby's walk to the school but would highlight multiple things she would face: confusion, loud crowds, angry faces, lack of control. Because of this I do not believe we could call this experience a vignette, but it's a good reminder to consider breaking down each aspect of her experience and how we portray it to students.
I've started making progress on the Vygotsky paper Maria sent us about imagination in childhood; I'm about 11 pages in (out of 92). So far he's discussing what imagination actually is, its origins in childhood, and imagination's basis in reality. I'm interested to see where this goes as far as the perception of reality in VR, and how that ties in with the other papers I downloaded from VHIL.
NEXT:
From here on I'll be developing the other three teleportation techniques and making progress on the readings. In working with the teleportation, I'll be learning more about UI and setting up the controllers for specific functions, thus knocking out two of my three goals.
On Sunday, Tori and I have three actors from the Theater Department coming into the motion capture lab so we can capture data to potentially interact with in the scene. I should be able to teleport and move around them while they act. This will serve as a good test of scale in the environment, and we'll be able to run through the procedures we learned last week. Once Tori has this data ready to go, we can bring it into the scene and I'll start playing with user control over animations and time.