The Experts below are selected from a list of 32757 Experts worldwide ranked by the ideXlab platform.
Anthony Steed - One of the best experts on this subject based on the ideXlab platform.
Real-Time Collaboration Between Mixed Reality Users in a Geo-Referenced Virtual Environment
arXiv: Human-Computer Interaction, 2020
Co-Authors: Shubham Singh, Daniele Giunchi, Anthony Steed
Abstract: Collaboration using Mixed Reality technology is an active area of research, with significant work devoted to virtually bridging physical distances. A diverse set of platforms and devices can be used for Mixed Reality collaboration, but most work focuses on indoor scenarios, where stable tracking can be assumed. We focus on supporting collaboration between VR and AR users, where the AR user is mobile outdoors and the VR user is immersed in a true-sized digital twin. This cross-platform solution requires new user experiences for interaction, accurate modelling of the real world, and working with noisy outdoor tracking sensors such as GPS. In this paper, we present our results and observations of real-time collaboration between cross-platform users in the context of a geo-referenced virtual environment. We propose a solution that uses GPS measurements in VSLAM to localize the AR user in an outdoor environment. The client applications enable VR and AR users to collaborate seamlessly across the heterogeneous platforms. Users can place or load dynamic content tagged to a geolocation and share their experience with remote users in real time.
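The paper does not publish its fusion code, but a common first step when combining GPS fixes with a VSLAM trajectory in a geo-referenced environment is converting WGS-84 latitude/longitude into a local metric frame around a chosen origin. The sketch below is a hypothetical helper using a small-area equirectangular approximation, not the authors' implementation.

```python
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius

def gps_to_enu(lat, lon, alt, origin_lat, origin_lon, origin_alt):
    """Convert a GPS fix (degrees, metres) to east/north/up metres
    about a local origin. Valid for small areas, where a flat-Earth
    approximation holds; noisy fixes expressed in metres can then be
    compared against, or fused with, VSLAM positions."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    north = d_lat * EARTH_RADIUS_M
    up = alt - origin_alt
    return east, north, up
```

A fix 0.001 degrees north of the origin maps to roughly 111.3 m in the north direction, which is the scale at which geolocated content would be placed in the shared scene.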
Mark Billinghurst
Exploring Enhancements for Remote Mixed Reality Collaboration
International Conference on Computer Graphics and Interactive Techniques, 2017
Co-Authors: Thammathip Piumsomboon, Arindam Day, Barrett Ens, Youngho Lee, Gun A Lee, Mark Billinghurst
Abstract: In this paper, we explore techniques for enhancing remote Mixed Reality (MR) collaboration in terms of communication and interaction. We created CoVAR, an MR system for remote collaboration between Augmented Reality (AR) and Augmented Virtuality (AV) users. Awareness cues and an AV-Snap-to-AR interface were proposed for enhancing communication; collaborative natural interaction and AV-User-Body-Scaling were implemented for enhancing interaction. We conducted an exploratory study examining the awareness cues and collaborative gaze, and the results showed the benefits of the proposed techniques for enhancing communication and interaction.
The Design of a Mixed Reality Book: Is It Still a Real Book?
International Symposium on Mixed and Augmented Reality, 2008
Co-Authors: Raphael Grasset, Andreas Dunser, Mark Billinghurst
Abstract: In this paper we present the results of our long-term development of a Mixed Reality book. Most previous work in the area has focused on the technology of Augmented Reality books, such as providing registration using fiducial markers. In this work, however, we focused on exploring the design and development process of this type of application in a broader sense. We studied the semantics of a Mixed Reality book, the design space, and the user experience with this type of interface.
Game-City: A Ubiquitous Large-Area Multi-Interface Mixed Reality Game Space for Wearable Computers
International Symposium on Wearable Computers, 2002
Co-Authors: Adrian David Cheok, Mark Billinghurst, Fong Siew Wan, Xubo Yang, Wang Weihua, Lee Men Huang, Hirokazu Kato
Abstract: This paper presents a novel wearable-computer interaction and entertainment system that provides an interactive physical and Mixed Reality computing environment spanning large, multi-user areas, and can be extended to a whole city. Our system, called "Game-City", is an embodied (ubiquitous, tangible, and social) wearable-computing-based Mixed Reality (MR) game space that regains the social aspects of traditional game play while also maintaining the exciting fantasy features of traditional computer entertainment.
Waqas Ahmad
An Electrical Muscle Stimulation Haptic Feedback for a Mixed Reality Tennis Game
International Conference on Computer Graphics and Interactive Techniques, 2007
Co-Authors: Farzam Farbiz, Corey Manders, Waqas Ahmad
Abstract: We have developed a novel haptic interface for Mixed Reality applications. Specifically, we have constructed an electrical muscle stimulation system that is wirelessly controlled by a computer and generates electrical pulses with controlled amplitude, timing, and frequency. The characteristics of the pulses are similar to those of commercial electrical muscle stimulators used in medical applications. We apply these pulses through two electrode pads attached to the user's forearms, so the user feels muscle contractions when the pulses are activated.
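The abstract describes pulses with controlled amplitude, timing, and frequency. As a minimal sketch of what such host-side control logic might compute (the function name and parameters are hypothetical, not from the paper), a rectangular pulse train can be scheduled from those three parameters:

```python
def pulse_train(amplitude_ma, freq_hz, pulse_width_ms, duration_ms):
    """Schedule a rectangular pulse train: return a list of
    (onset_ms, amplitude_ma) pairs for every pulse whose full width
    fits inside the requested duration."""
    period_ms = 1000.0 / freq_hz  # time between pulse onsets
    pulses = []
    t = 0.0
    while t + pulse_width_ms <= duration_ms:
        pulses.append((t, amplitude_ma))
        t += period_ms
    return pulses
```

For example, a 50 Hz train of 2 ms pulses over a 100 ms haptic event yields five pulses at 20 ms intervals; in the described system these parameters would be sent wirelessly to the stimulator hardware rather than computed on-device.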
Derek Reilly
Doing While Thinking: Physical and Cognitive Engagement and Immersion in Mixed Reality Games
Designing Interactive Systems, 2016
Co-Authors: Nabil Bin Hannan, Khalid Tearo, Arthur Bastos, Derek Reilly
Abstract: We present a study examining the impact of physical and cognitive challenge on reported immersion for a Mixed Reality game called Beach Pong. Contrary to prior findings for desktop games, we find significantly higher reported immersion among players who engage physically, regardless of their actual game performance. Building a mental map of the real, virtual, and sensed world is a cognitive challenge for novices, and this appears to influence immersion: in our study, participants who actively attended to both physical and virtual game elements reported higher immersion levels than those who attended mainly or exclusively to virtual elements. Without an integrated mental map, in-game cognitive challenges were ignored or offloaded to motor response when possible in order to achieve the minimum required goals of the game. From our results we propose a model of immersion in Mixed Reality gaming that is useful for designers and researchers in this space.
Bruce L. Daniel
A Patient-Specific Mixed Reality Visualization Tool for Thoracic Surgical Planning
The Annals of Thoracic Surgery, 2020
Co-Authors: Stephanie L. Perkins, Brian A. Hargreaves, Bruce L. Daniel, Brooke Krajancich, Chifu Jeffrey Yang, Mark F Berry
Abstract:
Purpose: Identifying small lung lesions during minimally invasive thoracic surgery can be challenging. We describe 3-dimensional Mixed Reality visualization technology that may facilitate noninvasive nodule localization.
Description: A software application and medical image processing pipeline were developed for the Microsoft HoloLens to incorporate patient-specific data and provide a Mixed Reality tool to explore and manipulate chest anatomy with a custom-designed user interface featuring gesture and voice recognition.
Evaluation: A needs assessment between engineering and clinical disciplines identified the potential utility of Mixed Reality technology in facilitating safe and effective resection of small lung nodules. Through an iterative process, we developed a prototype employing a wearable headset that allows the user to (1) view a patient's original preoperative imaging; (2) manipulate a 3-dimensional rendering of that patient's chest anatomy, including the bronchial, osseous, and vascular structures; and (3) simulate lung deflation and surgical instrument placement.
Conclusions: Mixed Reality visualization during surgical planning may facilitate accurate and rapid identification of small lung lesions during minimally invasive surgeries and reduce the need for additional invasive preoperative localization procedures.
A Mixed-Reality System for Breast Surgical Planning
2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), 2017
Co-Authors: Stephanie L. Perkins, Subashini Srinivasan, Amanda J. Wheeler, Brian A. Hargreaves, Bruce L. Daniel
Abstract: One quarter of women who undergo breast lumpectomy to treat early-stage breast cancer in the United States undergo a repeat surgery due to concerns that residual tumor was left behind. This has led to a significant increase in women choosing mastectomy operations in the United States. We have developed a Mixed Reality system that projects a 3D "hologram" of images from a breast MRI onto a patient using the Microsoft HoloLens. The goal of this system is to reduce the number of repeated surgeries by improving surgeons' ability to determine tumor extent. We are conducting a pilot study in patients with palpable tumors that tests a surgeon's ability to accurately identify the tumor location via Mixed Reality visualization during surgical planning. Although early results are promising, it is critical but not straightforward to align holograms to the breast and to account for tissue deformations. More work is needed to improve the registration and holographic display at arm's-length working distance. Nonetheless, first results from breast cancer surgeries have shown that Mixed Reality guidance can indeed provide information about tumor location, and that this exciting new use for AR has the potential to improve the lives of many patients.
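The abstract flags hologram-to-patient alignment as critical but not straightforward. A standard starting point for such registration (a general technique, not necessarily the method used in this system) is rigid point-based alignment of corresponding landmarks, e.g. via the Kabsch algorithm; it cannot capture the tissue deformations the authors mention, which is partly why rigid registration alone is insufficient here.

```python
import numpy as np

def kabsch_align(source, target):
    """Least-squares rigid registration: find rotation R and
    translation t such that R @ source_i + t best matches target_i
    for corresponding 3D landmark points."""
    src = np.asarray(source, dtype=float)
    tgt = np.asarray(target, dtype=float)
    src_c = src - src.mean(axis=0)          # centre both point sets
    tgt_c = tgt - tgt.mean(axis=0)
    H = src_c.T @ tgt_c                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Given landmark positions in MRI coordinates and the same landmarks located on the patient in headset coordinates, the recovered (R, t) would place the hologram over the breast; handling deformation would require a non-rigid model on top of this.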