LUTE: AR Component
See also: LUTE Framework
What is the LUTE AR Component?
The LUTE AR Component provides a set of Augmented Reality (AR) orders that make a game interactable in AR: placing, selecting, and moving virtual 3D objects within real-world scenes detected by the camera. The orders let the user place objects on detected planes or at specific locations, interact with those objects, and track images. In our case studies the AR Component has been used to implement features such as tracking virtual animal prints across real-world fields, scanning for buried items, and visualising a player's virtual creations in the real world.
Several interactions are built in, and the set can be extended later. The two implemented actions are touching an object to interact with it, and dragging an object onto a "ghost" image of it (an example is shown below). Using Unity's XR Environment Simulation, interactions can be tested on a PC before the game is deployed to a device.
Figure 1: Caption.
Technical Details:
The orders use Unity's built-in AR system, which selects ARCore (Android) or ARKit (iOS) depending on the platform the game is built for.

Figure 2: Caption.
“AR functionality allows us to have some great mixed reality interactions. For example, at the beginning of the Avebury experience players use an AR scanner to send out fictional ground penetrating radar pulses to help them locate buried stones – it’s a much more playful way of finding locations and helps balance people’s attention between the real world space and their device.”
Prof. David Millard (University of Southampton PI)
The LUTE Team is:
Dr Jack Brett – Lead Engineer
Dr Charlie Hargood – Academic Investigator and Architect
Prof. David Millard – Academic Investigator and Architect
Dr Yoan Malinov – Engineer
Dr Bob Rimmington – Qualitative Researcher