Technologies used in XR

Next-generation technologies are changing how we interact with the world. By combining AR- and XR-specific technologies, we can create rich, interactive, and immersive experiences that are transforming industries.

Interaction Technologies

Eye Tracking

  • Tracks where the user is looking to provide more responsive and personalized experiences.
  • Enhances the realism and interactivity of VR and AR applications.
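
As a rough illustration of the first step in using gaze data, many trackers report gaze as normalized coordinates that must be mapped onto the display. A minimal sketch, assuming a tracker that reports values in the 0–1 range (function and parameter names are illustrative):

```python
# Illustrative sketch: convert a normalized gaze sample (0..1 per axis,
# as many eye trackers report) into pixel coordinates on a display.
def gaze_to_pixels(gaze_x, gaze_y, screen_w, screen_h):
    """Clamp a normalized gaze sample and scale it to screen pixels."""
    gx = min(max(gaze_x, 0.0), 1.0)  # clamp out-of-range samples
    gy = min(max(gaze_y, 0.0), 1.0)
    return (round(gx * (screen_w - 1)), round(gy * (screen_h - 1)))
```

The clamping matters in practice because trackers emit out-of-range samples when the user looks past the edge of the display.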

Gesture Recognition

  • Uses sensors and cameras to detect and interpret hand and body movements.
  • Enables intuitive interactions with digital content.
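
A minimal sketch of how interpreted hand data becomes an interaction, assuming a sensor that reports 3D fingertip positions in metres (the names and the 2 cm threshold are illustrative, not any particular SDK's values):

```python
import math

# Illustrative sketch: detect a "pinch" gesture from two 3D fingertip
# positions, e.g. thumb and index landmarks from a hand-tracking sensor.
PINCH_THRESHOLD_M = 0.02  # assumed threshold: 2 cm

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_M):
    """Return True when the two fingertips are closer than the threshold."""
    return math.dist(thumb_tip, index_tip) < threshold
```

Real gesture recognizers add hysteresis and temporal filtering so the gesture does not flicker on and off at the threshold boundary.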

Spatial Audio

  • Provides 3D audio effects that change based on the user’s position and movements.
  • Enhances immersion by making sound appear to come from specific directions within the environment.
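
The direction-dependent part can be illustrated with constant-power stereo panning, a simplified stand-in for full 3D spatialization (the function name and azimuth convention are assumptions for this sketch):

```python
import math

# Illustrative sketch: constant-power stereo panning from a source's
# azimuth relative to the listener (-90 deg = hard left, +90 deg = hard right).
def pan_gains(azimuth_deg):
    """Return (left_gain, right_gain) such that total power is constant."""
    # Map the azimuth onto a pan angle in [0, pi/2].
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)
    return (math.cos(theta), math.sin(theta))
```

A source straight ahead (azimuth 0) gets equal gains of about 0.707 per channel, so perceived loudness stays constant as it moves; full spatial audio engines go further with head-related transfer functions and distance attenuation.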

Voice Recognition

  • Technologies like Google Assistant, Siri, and Alexa.
  • Allow users to control AR and XR applications through voice commands.
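
After speech-to-text, an application still has to map the transcript onto an action. A toy keyword-matching sketch (the command names are invented for illustration; production systems use proper intent recognition):

```python
# Illustrative sketch: map a recognized transcript to an app command by
# keyword matching -- a stand-in for the intent step after speech-to-text.
COMMANDS = {
    "place": "PLACE_OBJECT",
    "rotate": "ROTATE_OBJECT",
    "delete": "DELETE_OBJECT",
}

def parse_command(transcript):
    """Return the first matching command action, or None."""
    for keyword, action in COMMANDS.items():
        if keyword in transcript.lower():
            return action
    return None
```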

Hardware Technologies

AR Glasses and XR Headsets

  • Devices like Apple Vision Pro, Microsoft HoloLens, and Magic Leap.
  • Provide hands-free, immersive AR experiences with advanced optics and sensors.

VR Headsets

  • Devices such as Oculus Rift, HTC Vive, and PlayStation VR.
  • Offer fully immersive virtual environments, often used in XR applications.

Smartphones and Tablets

  • Equipped with cameras, GPS, accelerometers, and gyroscopes.
  • Use AR apps to overlay digital content onto the real world.

Motion Tracking Sensors

  • Sensors like UltraLeap, Intel RealSense, Azure Kinect, VicoVR, and Structure Sensor by Occipital.
  • Track hand and body movements to enable interaction with digital content.
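
Raw tracking data is typically jittery, so a common stabilization step before rendering is exponential smoothing. A simplified one-dimensional sketch (the function name and alpha value are illustrative):

```python
# Illustrative sketch: exponential smoothing of noisy tracked values,
# a common way to stabilize jittery sensor data before rendering.
def smooth(samples, alpha=0.5):
    """Blend each new sample with the previous estimate; higher alpha
    follows the sensor more closely, lower alpha smooths more."""
    estimate = None
    for s in samples:
        estimate = s if estimate is None else alpha * s + (1 - alpha) * estimate
    return estimate
```

The same blend is applied per axis for 3D positions; the trade-off is latency, since heavier smoothing makes tracked motion lag the real hand.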

Haptic Devices

  • Devices that provide tactile feedback.
  • Enhance the sense of touch in virtual interactions.

Software Technologies

XR Development Platforms

  • Platforms like Unity and Unreal Engine.
  • Support the creation of both AR and VR experiences with powerful 3D graphics and physics engines.

AR SDKs (Software Development Kits)

  • Tools like ARKit (Apple) and ARCore (Google).
  • Enable developers to create AR applications with features like motion tracking, environmental understanding, and light estimation.
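
Under the hood, anchoring content often comes down to a hit test against a detected surface. A simplified sketch of the ray-plane geometry involved (this is illustrative math, not the actual ARKit/ARCore API):

```python
# Illustrative sketch of the math behind a "hit test": intersect a ray
# from the camera with a detected horizontal plane at height plane_y,
# giving the point where virtual content should be anchored.
def hit_test(origin, direction, plane_y):
    """Return the (x, y, z) hit point on the plane, or None if the ray
    never reaches it."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy == 0:
        return None  # ray runs parallel to the plane
    t = (plane_y - oy) / dy
    if t < 0:
        return None  # the plane is behind the ray origin
    return (ox + t * dx, oy + t * dy, oz + t * dz)
```

For example, a camera 1.5 m above a detected floor looking straight down hits the floor directly beneath it; the SDKs wrap this kind of query together with plane detection and tracking.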

Computer Vision AI

  • Algorithms that enable devices to understand and interpret visual information.
  • Used for object recognition, image tracking, and spatial mapping.

SLAM (Simultaneous Localization and Mapping)

  • Technology that maps the physical environment in real-time.
  • Essential for AR applications to anchor digital content accurately in the real world.
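
The localization half can be illustrated with simple 2D dead reckoning from odometry; real SLAM additionally builds a map and uses it to correct the drift this naive estimate accumulates (names are illustrative):

```python
import math

# Illustrative sketch: the "localization" half of SLAM at its simplest --
# dead-reckoning a 2D pose (x, y, heading) from odometry increments.
def update_pose(pose, forward, turn):
    """Advance (x, y, theta): move `forward` metres along the current
    heading, then turn by `turn` radians."""
    x, y, theta = pose
    x += forward * math.cos(theta)
    y += forward * math.sin(theta)
    return (x, y, theta + turn)
```

Each step's small error compounds, which is exactly why SLAM systems close the loop against the map they build rather than trusting odometry alone.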


Cloud Services

  • Platforms like Amazon Sumerian and Microsoft Azure Spatial Anchors.
  • Provide cloud-based processing and storage for complex AR and XR applications, enabling features like persistent AR experiences and multi-user interactions.

Pixel Streaming

  • Technology that streams high-resolution graphics from powerful servers to end-user devices.
  • Allows for graphically intensive AR and XR experiences on devices with limited processing power.
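
A back-of-envelope calculation shows why compression is central here: uncompressed frames would need enormous bandwidth, so streamed pixels are video-encoded on the server before transmission (the helper below is illustrative):

```python
# Illustrative back-of-envelope: the raw bitrate of uncompressed frames,
# which is why pixel streaming relies on video encoding (e.g. H.264/H.265).
def raw_bitrate_mbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bitrate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6
```

At 1080p and 60 fps this comes to roughly 2,986 Mbps uncompressed, far beyond typical consumer links, whereas an encoded stream of the same content fits in tens of Mbps.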

