The Extended Reality and Games Lab at the University of Arizona is engaged in a number of projects that blur the lines between the physical and virtual worlds.
In Netflix's popular sci-fi techno-thriller series "Black Mirror," the lines between the physical and virtual worlds are often blurred. The results are thought-provoking – and sometimes terrifying.
But how close are we, really, to living in a world in which virtual and actual reality blend seamlessly into one?
UArizona News visited the Extended Reality and Games Lab, also known as the XRG Lab, in the University of Arizona School of Information to see what kind of research is going on in the area and how it might benefit people, despite what horror movie plotlines may have us fear.
The lab, housed in the College of Social and Behavioral Sciences, is co-directed by assistant professors Lila and Ren Bozgeyikli. They are interested in how best to leverage virtual reality, or VR, and spatial augmented reality – which incorporates both real and virtual elements – for beneficial uses, such as training first responders to navigate hazardous environments or teaching employees how to perform certain tasks in the workplace.
"Virtual reality enables us to give experiences that would be very difficult or hazardous to do in real life. This means that we can train individuals beforehand for such high-stakes scenarios," Ren Bozgeyikli said.
The researchers ultimately aim to narrow the boundary between the real and virtual worlds and make VR experiences more effective and engaging for users, whether for educational or entertainment purposes.
"The overall goal in all of our studies is to increase sense of presence – to immerse users more, and to trick them into feeling like they're experiencing a real situation versus a virtual situation," Lila Bozgeyikli said. "When we increase that sense, it yields lots of benefits. It improves learning if it's a learning experience, it improves training, it improves enjoyment. It spans lots of different user-experience aspects."
The researchers talked about three projects that are currently underway in the lab.
I Spy, With My Googly Eyes…
Donning a virtual reality headset can be an isolating experience. The moment you put it on, you're immersed in a virtual world, no longer able to see anyone physically in the room with you. They can't see you either – at least not your eyes, which are typically shielded from view by solid-colored goggles.
The "Googly Eyes" project aims to help break down the barrier between headset wearers and non-wearers, using a pair of cartoon eyeballs.
The virtual reality headsets used in the XRG Lab come with built-in eye-tracking software.
Ren and Victor Gomes, a senior majoring in computer science, developed a smartphone app that communicates with that software, as well as a tablet computer game that allows VR users and non-users to collaborate on a shared task.
The smartphone is mounted on the front of the VR headset, and through the app, a set of large animated eyes appears on the phone's screen. The "googly eyes" mirror the eye movements of the person wearing the headset using eye-tracking data – looking left, right, up, down; blinking, winking, even rolling.
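The mirroring described above amounts to mapping the headset's gaze readings onto pupil positions in the cartoon eyes. The sketch below is a hypothetical illustration of that mapping, assuming a gaze API that reports a normalized direction in [-1, 1]; the function and parameter names are assumptions, not the lab's actual code.

```python
# Hypothetical sketch: map normalized gaze data from a headset's
# eye tracker to pupil positions on the phone screen.

def gaze_to_screen(gaze_x, gaze_y, eye_center, radius):
    """Map a normalized gaze direction in [-1, 1] to a pupil
    position inside a cartoon eye of the given pixel radius."""
    cx, cy = eye_center
    # Clamp so the pupil never leaves the white of the eye.
    gx = max(-1.0, min(1.0, gaze_x))
    gy = max(-1.0, min(1.0, gaze_y))
    return (cx + gx * radius, cy - gy * radius)  # screen y grows downward

# Example: looking up and to the right within a 60-pixel-radius eye.
left_pupil = gaze_to_screen(0.5, 0.5, eye_center=(100, 100), radius=60)
```

Blinks, winks and eye rolls would be layered on top as drawing states, with the same per-frame gaze update driving where the pupils sit.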
"When a user wears a virtual reality headset, they are disconnected from the real world, and their friends are disconnected from them, too," Ren said. "We thought, what if there's kind of a little window to inside the headset that can try to decrease the boundary between the real world and the user? And what if we show the user's eye movements and facial gestures that are important in communication?"
During googly eyes experiments, which will begin soon, the researchers will ask VR users and non-users to collaborate on a task. They'll then measure whether the eyes improve communication or make the task more socially enjoyable.
"We expect to find an increase in social communication and social presence, which would help with making virtual experiences more blended into the daily lives of users," Ren said.
Teamwork Makes the Dream Work – With Help From an Avatar
The "Give Me a Hand?" project, funded by a recent grant from the National Science Foundation, aims to get humans and virtual characters working together in a way that feels as real as possible.
Lila is developing a series of experiments in which a person will cooperate on physical tasks with a virtual character, or avatar. To make the experience feel more real, study participants will interact with tangible items that they "share" with their avatar companion.
In one experiment, for example, the study participant stands in front of a projection screen with a hole in it. Protruding through the screen is one half of a steering wheel. Projected on the screen is a virtual environment, which includes an avatar that appears to be steering the other half of the wheel.
The human participant can physically touch and turn the wheel, while the avatar appears to move its half of the wheel in response.
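The coupling between the two halves of the wheel can be sketched as a simple shared state: the tangible half reports its rotation, and the avatar's projected half is driven to match. The sensor and rendering interfaces below are assumptions for illustration, not the project's implementation.

```python
# Hypothetical sketch of the shared-wheel coupling: the physical half
# reports its angle, and the avatar's virtual half mirrors it so both
# appear to turn one continuous wheel.

class SharedWheel:
    def __init__(self):
        self.angle_deg = 0.0  # current wheel rotation

    def on_physical_turn(self, measured_angle_deg):
        """Called when the encoder on the tangible half reports a new angle."""
        self.angle_deg = measured_angle_deg

    def avatar_hand_angle(self):
        """The avatar grips the opposite side of the rim, 180 degrees around."""
        return (self.angle_deg + 180.0) % 360.0

wheel = SharedWheel()
wheel.on_physical_turn(30.0)  # participant turns the tangible half
```

In practice the avatar could also lead the interaction, but driving the virtual half from the physical sensor keeps the two halves consistent with what the participant's hands actually feel.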
"We know tangible interaction improves user experience and user performance, and we know shared experiences, like collaboration with a virtual character, improve user experience," Lila said. "This combines the two."
The researchers are using 3D printers to create prototypes of the steering wheel and other equipment they'll use in the two-year project. Human experiments are expected to begin next summer. Jack Clark, a doctoral student in the School of Information, is helping with the experiment design and will carry out the user studies.
"We expect to find an increase in task performance, level of engagement and presence," Lila said.
Getting a Kick Out of Virtual Reality
The "Tangiball" project, which the Bozgeyiklis recently completed, similarly integrates a dynamic real object into a virtual environment.
Researchers asked VR headset-wearing study participants to kick a virtual ball, with the goal of getting it as close as possible to a series of virtual targets. The participants completed the task with and without the presence of a real, tangible ball. The physical ball was equipped with sensors that allowed it to be rendered digitally in the virtual environment.
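Keeping the virtual ball aligned with its tangible counterpart comes down to converting the sensors' room-space readings into the virtual scene's coordinate frame each frame. The sketch below illustrates that conversion under assumed tracking and scene APIs; it is not the Tangiball implementation.

```python
# Hypothetical sketch: sensor readings of the real ball's position
# update the virtual ball's transform so a kick in the room moves
# the ball in the headset view along the same path.

def update_virtual_ball(sensor_position, room_origin, scale=1.0):
    """Convert a tracked position in room coordinates (meters)
    to the virtual scene's coordinate frame."""
    x, y, z = sensor_position
    ox, oy, oz = room_origin
    return ((x - ox) * scale, (y - oy) * scale, (z - oz) * scale)

# The real ball sits 1 m right and 0.5 m forward of the room origin.
pos = update_virtual_ball((2.0, 0.1, 3.5), room_origin=(1.0, 0.0, 3.0))
```

A real system would also track the ball's velocity and orientation and smooth sensor noise, but the core idea is the same one-to-one mapping between physical and virtual motion.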
"Virtual reality interactions are usually more gesture-based or through controllers, but it's known from learning theory that if you include familiar objects that are tangible, it helps with learning and spatial reasoning," Lila said. "We're interested in how we can take familiar objects, like a ball, and incorporate them dynamically into the virtual world so that users can have a more intuitive and familiar experience."
The user studies were carried out with the help of Samarth Puri, a School of Information graduate. Participants reported that they preferred the experience when a real ball was incorporated, and they also hit their targets with greater accuracy. The researchers are preparing to submit the "Tangiball" findings for publication.
Lab Offers Students Hands-on Experience
The "Googly Eyes," "Tangiball" and "Give Me a Hand?" projects are just a few of the research efforts underway in the XRG Lab. Another project, in development by Lila and School of Information alumnus Christopher Schnell, will explore the effects of mirrored virtual interactions on users' spatial abilities, with experiments scheduled to begin next month.
The work done in the XRG Lab gives students in the School of Information an opportunity to practice what they learn in class.
"The School of Information offers a wide array of courses, from Virtual Reality to Game Development and Human-Computer Interaction, where we train students on the design and development of such experiences, and then recruit some for working in the lab on these research projects to help them gain experience on real-world research applications of the learned topics," Ren said.
The Bozgeyiklis say that when it comes to using virtual reality for training or entertainment purposes, the more realistic it feels, the better.
"The strongest selling point of VR is tricking the users' brains into thinking they're experiencing something real," Lila said. "If you can't get users convinced that they're experiencing a real situation in VR, then you lose the power of it. It's not much different than playing a computer game on a flat-screen monitor, then."
Asked whether blurring the lines between virtual and actual reality might have the potential to get a little creepy, Lila said it's possible, but "we are not there yet."
Written by Alexis Blue, University Communications. Story "'Googly Eyes' Bridge Gap Between Virtual and Actual Reality" first published on UArizona News.