Collaboration in Mixed Reality

Details are updated when publicly available. Contact me for more information.

(Ongoing)

Mixed Reality has been shown to enhance remote guidance and is especially well-suited for physical tasks. This project explores how certain properties of Mixed Reality can be exploited to offload communication effort from the collaborator to the system, and how alternate representations of relevant information can further reduce that effort.

The first phase of the project focuses on referencing, the ability to refer to an object in a way that is understood by others, a crucial process that warrants explicit support in collaborative Mixed Reality systems. We conducted a 2x2 mixed factorial experiment that explored the effects of providing spatial information and system-generated guidance to task objects, and investigated how such guidance affects the remote collaborator's need for spatial information. Our results show that guidance increases performance and communication efficiency while reducing the need for spatial information, especially in unfamiliar environments. They also demonstrate a reduced need for remote experts to work in immersive environments, making guidance more scalable and expertise more accessible.


Publications:

Janet G. Johnson, Danilo Gasques, Tommy Sharkey, Evan Schmitz, and Nadir Weibel. Do You Really Need to Know Where "That" Is? Enhancing Support for Referencing in Collaborative Mixed Reality Environments. CHI '21, Yokohama, Japan. [Video]


Collaborators:

Danilo Rodrigues Gasques, Tommy Sharkey, Evan Schmitz, Wanze Xie, Nadir Weibel