Overview
Augmented Reality (AR) applications can be used to guide people indoors. In a museum, this can be employed to route visitors towards exhibits that feature AR content. While AR guiding applications already exist for single users, multi-user scenarios remain rare, as they add a new layer of complexity.
The MultiUser Immersive AR Guiding prefab offers an authoring solution for multiple users wearing augmented reality glasses who are synchronously connected to each other in a museum environment. The prefab lets authors customise the guide avatar, with a blue butterfly as the default. Users can create a guiding path within Unity using a scanned layout of the intended space. The prefab also allows easy adjustment of guiding parameters, including avatar height, distance from users, speed thresholds for varying flight speeds, and overall movement dynamics.
What to consider in multi-user guiding scenarios
When implementing multi-user functionality, it is crucial to establish how the avatar interacts with the group. The prefab provides flexibility in this regard, allowing the author to choose whether the avatar responds to the entire group or to a designated individual. If the avatar is set to react to the entire group, it will only proceed when every user is within a specified distance. Conversely, if it is set to react to only one person, the avatar ignores all other users and adapts its guiding behaviour to that person alone.
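The prefab's actual scripts are not reproduced here; as an illustration of the group rule described above, the decision of whether the avatar may move on could look like the following sketch (plain Python with hypothetical names, rather than the prefab's Unity C#):

```python
from dataclasses import dataclass


@dataclass
class User:
    position: tuple  # (x, y, z) world position of the user's headset


def dist(a, b):
    # Euclidean distance between two 3D points
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5


def avatar_should_proceed(avatar_pos, users, max_distance, expert=None):
    """Decide whether the avatar may fly on to the next path node.

    Group mode (expert is None): every user must be within max_distance.
    Expert mode: only the designated user's distance is considered,
    all other users are ignored.
    """
    tracked = [expert] if expert is not None else users
    return all(dist(avatar_pos, u.position) <= max_distance for u in tracked)
```

In expert mode the same check runs against a single user, which is why the avatar appears to ignore everyone else in the group.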
Authoring in Unity and on the mobile device
To author the guiding experience in Unity, users define the avatar’s flight path directly in Unity’s scene editor by distributing path nodes along the route in an imported scan of the environment where the experience will take place (Fig. 1). The prefab’s Inspector provides intuitive controls for adjusting the guiding parameters without touching the scripts: the avatar’s flying height, the distance between avatar and user, the distance thresholds that trigger a change of flying speed, and the speeds for slow and fast mode (Fig. 2).
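How the thresholds and the slow/fast speeds interact is not spelled out above; one plausible mapping, sketched with hypothetical names in Python (the prefab itself is Unity C#), is that a user who keeps up lets the avatar fly fast, a lagging user switches it to slow mode, and a user too far behind makes it wait:

```python
def flight_speed(dist_to_user, near_threshold, far_threshold,
                 slow_speed, fast_speed):
    """Map the avatar-to-user distance to a flight speed.

    Assumed mapping (an illustration, not the prefab's guaranteed
    behaviour): within near_threshold the avatar flies in fast mode,
    between the thresholds it switches to slow mode, and beyond
    far_threshold it hovers and waits (speed 0).
    """
    if dist_to_user <= near_threshold:
        return fast_speed
    if dist_to_user <= far_threshold:
        return slow_speed
    return 0.0
```

The Inspector parameters described above would then correspond directly to the two thresholds and the two speed values.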
Once the path is established, authors can deploy the application on the target device, such as the HoloLens 2, where they finalise the configuration. This includes choosing whether the avatar reacts to the whole group or to one user designated as the “expert” (Fig. 3). Instead of using the pre-authored path from Unity, it is also possible to define a new path directly on the device by physically walking the route in the real exhibition and adding nodes along the way. The guiding parameters set in Unity can likewise be tweaked at runtime on the device if the experience needs optimisation. When everything is set up, guiding is started by entering the starting point (Fig. 1).
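The source describes nodes being added manually while walking; as a minimal sketch of one way such on-device recording could work if nodes were instead dropped automatically at a minimum spacing (an assumption, with hypothetical names, in Python rather than Unity C#):

```python
def record_path(trajectory, min_node_spacing):
    """Convert a walked trajectory (sampled headset positions) into
    path nodes.

    A new node is recorded whenever the wearer has moved at least
    min_node_spacing away from the previously recorded node, so the
    node density adapts to how direct the walked route is.
    """
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

    if not trajectory:
        return []
    nodes = [trajectory[0]]
    for pos in trajectory[1:]:
        if dist(nodes[-1], pos) >= min_node_spacing:
            nodes.append(pos)
    return nodes
```

Recording on explicit user input, as the prefab does, would simply replace the spacing check with a button or voice trigger.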

Figure 1: Setup in Unity. Starting point (blue) and path nodes (yellow) on a scan of the Senckenberg Museum.

Figure 2: Setup in Unity. Here the author can exchange the avatar and set the guiding parameters.

Figure 3: Setup on the HoloLens. Settings menu to define group dynamic and the guiding parameters.

Figure 4: Author test on the HoloLens. Example deployment, transferred through easy prefab adaptation from the Senckenberg Museum to the Hessisches Landesmuseum Darmstadt, Germany, where the white butterfly avatar waits for both users at a certain distance. The yellow authoring path nodes, still visible here, are normally hidden during the end-user experience.
Related Publications
Team
- Jessica L. Bitter – Researcher, AR Developer and Designer
- Dr. Ulrike Spierling – Principal Investigator
Related Resources
- Directional Audio Navigation Cues – Sub-pattern of Audio-First Dual-Mode Navigation; primary phase: Guiding. A navigation mechanism that plays periodic spatially positioned musical cues derived from each character’s theme, giving players directional…
- Character-Triggered Layer Transition – High-level pattern; primary phase: Cross-phase. A layer switching mechanism that transitions the player to a different content layer when they complete an encounter with a designated portal…
- Orientation-Based Combat – High-level pattern; primary phase: Presenting. A challenge mechanic that uses head orientation tracking to let players evade spatially positioned threats, turning the same sensing infrastructure used for…
- Head-Directed Target Locking – Sub-pattern of Audio-First Dual-Mode Navigation; primary phase: Guiding. A target selection mechanism that locks onto a navigation target when the player maintains head orientation toward it…
- Progressive Proximity Audio Zones – Sub-pattern of Audio-First Dual-Mode Navigation; primary phase: Presenting. A zone-based audio system that transitions players from silence through character music to narrative dialogue as they physically…
- Audio-First Dual-Mode Navigation – High-level pattern; primary phase: Cross-phase. A navigation system that automatically separates exploration and content delivery into two distinct modes, preventing navigation audio and character audio from competing…
