Niloufar Ramezanzadegan

Master's Thesis

Application of Indoor Positioning in an MR Virtual Environment for a Communication Application

Advisors

Wolfgang Mehringer (M. Sc.), Prof. Dr. Björn Eskofier, Anton Ebert (Siemens Healthineers)

Duration

03 / 2023 – 09 / 2023

Abstract

Recent years have seen rapid development of and growing public interest in mixed reality technologies. Mixed reality blends the physical and digital worlds, unlocking natural and intuitive 3D interaction between humans, computers, and the environment. The technology builds on advances in computer vision, graphics processing, display technologies, input systems, and cloud computing¹. As a hybrid technology, mixed reality (MR) combines aspects of virtual reality (VR) and augmented reality (AR). As a communication tool, VR offers an interactive, participatory environment that can sustain many remote users sharing a virtual place [1]. AR is an interactive experience in which real-world objects are augmented with digital information².

Researchers and companies are investigating the use of extended reality (XR) applications for remote collaboration, learning, and assistance in different fields. The demand for such applications has grown in recent years, especially in the wake of the COVID-19 pandemic. It is already possible to collaborate virtually in VR and AR. Existing platforms and applications for multi-user collaboration and communication provide digital experiences as an alternative to, or a replica of, the real world, including key aspects such as social interaction. They incorporate digital avatars that express each user's emotions and movements and enable social interaction and indirect communication by translating body movements and facial features. They merge reality and the virtual world to provide an immersive experience and use human-computer interfaces to integrate users' activities into the virtual world, offering a shared workroom and improving teamwork. Immersive teamwork can be improved further by sharing more information about people's surroundings or their indoor locations.

Indoor positioning and navigation are already used to enhance the user experience in AR applications. In a smart museum, an indoor localization system was applied to enhance the visitor experience: the proposed system relied on Bluetooth Low Energy (BLE) beacons to localize visitors and an Android application to automatically provide cultural content and information related to the artworks they are observing [2]. In healthcare facilities, an indoor navigation system based on a mobile application provides a wayfinding feature: the patient's or visitor's position is estimated inside the building to guide them to their desired destination [3]. In another study, immersive visualization was used to improve 3D navigation and situational awareness for multiple users performing real-time operations in a real physical environment using indoor navigation; it supports finding teammates in an indoor environment, importing locations, and selecting a navigation path for oneself [4].
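
The BLE-beacon approach used in [2] illustrates the general localization technique: beacon signal strengths (RSSI) are converted into distance estimates and then combined to infer a position. The sketch below is not the cited system's implementation; it is a minimal illustration assuming a log-distance path-loss model and a simple least-squares trilateration, with hypothetical values for the beacon positions, transmit power, and path-loss exponent.

```python
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.5):
    """Estimate distance (m) from a BLE beacon via the log-distance path-loss model.

    tx_power_dbm is the assumed RSSI measured at 1 m; path_loss_exponent
    characterizes the indoor environment (typically between 2 and 4).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def trilaterate(beacon_positions, distances):
    """Least-squares 2D position estimate from three or more beacons.

    Subtracting the last beacon's circle equation from the others linearizes
    the problem into A x = b, which is solved in the least-squares sense.
    """
    p = np.asarray(beacon_positions, dtype=float)   # shape (n, 2), known positions
    d = np.asarray(distances, dtype=float)          # shape (n,), estimated ranges
    ref, d_ref = p[-1], d[-1]
    A = 2 * (p[:-1] - ref)
    b = (d_ref**2 - d[:-1]**2) + np.sum(p[:-1]**2, axis=1) - np.sum(ref**2)
    est, *_ = np.linalg.lstsq(A, b, rcond=None)
    return est                                      # estimated (x, y) in metres

# Hypothetical example: three beacons at known room coordinates and smoothed RSSI readings.
beacons = [(0.0, 0.0), (6.0, 0.0), (0.0, 8.0)]
rssi = [-65.0, -72.0, -70.0]
position = trilaterate(beacons, [rssi_to_distance(r) for r in rssi])
print(position)
```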

This work studies how sharing indoor positions and building information with users affects the experience on mixed-reality communication platforms, and how this information can support collaboration and improve user experience and engagement. The proposed framework covers the research and design of a possible solution for indoor positioning and mapping and implements that solution in an MR collaboration prototype in which users wear a head-mounted display such as the Microsoft HoloLens or a VR headset. In a hybrid meeting, two target groups attend a session together: one group participates in a shared physical room while the other participates remotely via XR devices. The prototype therefore distinguishes two types of users, on-site and remote clients, and the scene is customized for each group. The application immerses remote clients in a 3D virtual room, the virtual model of the same building in which the on-site clients meet. Remote clients see the avatars of on-site clients in this 3D model, with each avatar placed according to the corresponding person's real-world location. Simultaneously, on-site clients see the avatars of remote clients through their HoloLens headsets. Finally, the prototype will be tested with 10 users to evaluate the effect of sharing positions and building information in a mixed-reality application.
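
To give a sense of how an on-site client's real-world position might drive their avatar in the 3D virtual model, the sketch below maps a 2D indoor position (in the building's local coordinate frame, in metres) onto the virtual room's ground plane with a planar rotation, scale, and translation. The alignment parameters, names, and values are illustrative assumptions, not the prototype's actual implementation; in practice such a transform would be calibrated against anchor points shared between the real building and its 3D model.

```python
import math
from dataclasses import dataclass

@dataclass
class MapAlignment:
    """Assumed alignment between the building frame and the virtual model frame."""
    rotation_deg: float   # yaw offset between the two frames
    scale: float          # metres-to-model-units factor (1.0 for a 1:1 model)
    offset_x: float       # translation of the building origin in the model, x axis
    offset_z: float       # translation of the building origin in the model, z axis

def building_to_model(x_m, y_m, alignment):
    """Map a 2D indoor position (metres) to (x, z) on the virtual model's ground plane."""
    theta = math.radians(alignment.rotation_deg)
    xr = x_m * math.cos(theta) - y_m * math.sin(theta)
    zr = x_m * math.sin(theta) + y_m * math.cos(theta)
    return (alignment.scale * xr + alignment.offset_x,
            alignment.scale * zr + alignment.offset_z)

# Hypothetical calibration and a position reported by the indoor-positioning system.
alignment = MapAlignment(rotation_deg=90.0, scale=1.0, offset_x=-2.5, offset_z=4.0)
onsite_position = (3.2, 7.8)          # metres in the building's local frame
avatar_x, avatar_z = building_to_model(*onsite_position, alignment)
print(f"Place on-site avatar at model coordinates ({avatar_x:.2f}, {avatar_z:.2f})")
```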

_________________________________________________________________________

¹ https://learn.microsoft.com/en-us/windows/mixed-reality/discover/mixed-reality
² https://www.techtarget.com/whatis/definition/augmented-reality-AR

References
[1] M. A. Gigante, “Virtual reality: definitions, history and applications,” in Virtual reality systems. Elsevier, 1993, pp. 3–14.
[2] P. Spachos and K. N. Plataniotis, "BLE beacons for indoor positioning at an interactive IoT-based smart museum," IEEE Systems Journal, vol. 14, no. 3, pp. 3483–3493, 2020.
[3] A. Luschi, E. A. B. Villa, M. Gherardelli, and E. Iadanza, “Designing and developing a mobile application for indoor real-time positioning and navigation in healthcare facilities,” Technology and Health Care, no. Preprint, pp. 1–25, 2022.
[4] A. Ayyanchira, E. Mahfoud, W. Wang, and A. Lu, "Toward cross-platform immersive visualization for indoor navigation and collaboration with augmented reality," Journal of Visualization, vol. 25, no. 6, pp. 1249–1266, 2022.