MultiUser Immersive AR Guiding Prefab

  • Area: Immersion & Presence (WP6), Social Visiting (WP4)
  • Contributors: Hochschule RheinMain
  • Key Contact: Ulrike Spierling (ulrike.spierling@hs-rm.de)
  • Date: March 2025

Overview

Augmented Reality (AR) applications can be used to guide people indoors. In a museum, this can be employed to route visitors towards exhibits that feature AR content. While AR guiding applications already exist for single users, multi-user scenarios remain rare, as they add a new layer of complexity.

The MultiUser Immersive AR Guiding prefab offers an authoring solution for guiding multiple users who wear synchronously connected augmented reality glasses through a museum environment. The prefab lets authors customise the guide avatar, with a blue butterfly as the default option. Authors can create a guiding path within Unity on a scanned layout of the intended space. The prefab also allows easy adjustment of guiding parameters, including avatar height, distance from users, speed thresholds for varying flight speeds, and overall movement dynamics.

What to consider in multi-user guiding scenarios

When implementing multi-user functionality, it is crucial to establish how the avatar interacts with the group. The prefab provides flexibility in this regard, allowing the author to choose whether the avatar responds to the entire group or to a designated individual. If the avatar is set to react to the entire group, it will only proceed when every user is within a specified distance. Conversely, if it is set to react to only one person from the group, the avatar will ignore all other users and adapt its guiding behaviour to this person alone.
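The two group modes can be summarised in a few lines. The following is an illustrative Python sketch only (the prefab itself is a Unity component); GuideMode, tracked_index and max_follow_distance are hypothetical names, not the prefab's actual API:

```python
from enum import Enum
from math import dist

# Hypothetical sketch of the group rule described above; all names are
# illustrative stand-ins, not taken from the prefab.

class GuideMode(Enum):
    WHOLE_GROUP = "whole_group"   # avatar waits until everyone has caught up
    SINGLE_USER = "single_user"   # avatar tracks one designated person only

def avatar_may_proceed(avatar_pos, user_positions, mode,
                       tracked_index, max_follow_distance):
    """Return True when the avatar may fly on to the next path node."""
    if mode is GuideMode.SINGLE_USER:
        # All other users are ignored; only the designated person matters.
        return dist(avatar_pos, user_positions[tracked_index]) <= max_follow_distance
    # WHOLE_GROUP: proceed only when every user is within the threshold.
    return all(dist(avatar_pos, p) <= max_follow_distance
               for p in user_positions)
```

In whole-group mode, a single straggler is enough to make the avatar wait.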

Authoring in Unity and on the mobile device

To author the guiding experience in Unity, authors define the flight path of the avatar directly within Unity’s scene editor by distributing path nodes along the route in an imported scan of the target environment (Fig. 1). The prefab’s Inspector provides intuitive controls for adjusting the guiding parameters, so no script needs to be edited. These parameters include the avatar’s flying height, the distance between avatar and user, the thresholds for changing flying speed, and the speeds themselves for slow and fast mode (Fig. 2).
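As a rough illustration of how such speed thresholds might be evaluated at runtime, consider the Python sketch below. It is not the prefab's code; the parameter names are hypothetical, and it assumes the avatar waits when users fall far behind (as in Fig. 4), flies slowly when they lag a little, and flies at full speed otherwise:

```python
# Hedged sketch of threshold-based speed selection; wait_threshold,
# slow_threshold, slow_speed and fast_speed are hypothetical stand-ins
# for the prefab's Inspector fields.
def select_flight_speed(avatar_to_user_distance, wait_threshold,
                        slow_threshold, slow_speed, fast_speed):
    """Pick a flight speed from the current avatar-to-user distance."""
    if avatar_to_user_distance > wait_threshold:
        return 0.0          # users have fallen too far behind: hover and wait
    if avatar_to_user_distance > slow_threshold:
        return slow_speed   # users are lagging a little: slow mode
    return fast_speed       # users are keeping up: fast mode
```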

Once the path is established, authors can deploy the application to the target device, such as the HoloLens 2, where they finalise the configuration. This includes selecting whether the avatar should react to the whole group or to a single user designated as the “expert” (Fig. 3). Instead of using the pre-authored path from Unity, it is also possible to define a new path directly on the device by physically walking the route in the real exhibition and adding nodes along the way. The guiding parameters set in Unity can likewise be tweaked at runtime on the device if the experience needs optimisation. When everything is set up, guiding is started by entering the starting point (Fig. 1).
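Conceptually, on-device path authoring amounts to downsampling the walked trajectory into nodes. The Python sketch below is an assumption about how this could work, not the prefab's implementation; node_spacing is a hypothetical minimum distance between recorded nodes:

```python
from math import dist

# Hedged sketch: record a new path node whenever the author has walked at
# least node_spacing away from the last recorded node.
def record_path(walked_positions, node_spacing):
    """Downsample a walked trajectory into guiding path nodes."""
    nodes = []
    for p in walked_positions:
        if not nodes or dist(p, nodes[-1]) >= node_spacing:
            nodes.append(p)
    return nodes
```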

Figure 1: Setup in Unity. Starting point (blue) and path nodes (yellow) on a scan of the Senckenberg Museum.

Figure 2: Setup in Unity. Here the author can exchange the avatar and set the guiding parameters.

Figure 3: Setup on the HoloLens. Settings menu to define group dynamic and the guiding parameters.

Figure 4: Author test on the HoloLens. Example deployment, transferred through simple prefab adaptation from the Senckenberg Museum to the Hessisches Landesmuseum Darmstadt, Germany; the white butterfly avatar waits for both users at a certain distance. The yellow authoring path nodes, still visible here, are normally hidden during the end-user experience.

The MultiUser Immersive AR Guiding Prefab Team is:

Jessica L. Bitter – Researcher, AR Developer and Designer
Dr. Ulrike Spierling – Principal Investigator