imldresden/label-authoring-toolkit

MR Label Authoring Toolkit

A research prototype for authoring and deploying labeled Mixed Reality (MR) experiences across multiple platforms, including handheld AR (Android) and head-mounted MR (Meta Quest 3). This toolkit enables the creation of interactive, spatially anchored labels on 3D models, allowing for guided and proximity-based experiences.

The system is released as an artifact alongside the short paper "The Invisible Hand of the Context: Authoring of Context-Aware Mixed Reality Labels" accepted at the Mensch und Computer 2025 (MuC) conference.

✨ Key Features

  • Desktop Authoring Tool: Create MR experiences by augmenting 3D models with textual descriptions, images, and structured guidance.

  • Cross-Platform Experience Deployment: Deploy authored projects to:

    • Handheld AR (Android) using ARFoundation & ARCore
    • Head-mounted MR (Meta Quest 3) using Meta XR SDK
  • Core Labeling Features:

    • Spatial anchor point definition (manual or automatic)
    • Interactive label expansion
    • Proximity-based visibility
    • Guided narrative sequences
    • Temporal coherence
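
The proximity-based visibility behavior above can be sketched in plain Python. This is a minimal illustration, not the toolkit's implementation: the 0.75 m radius and the hysteresis margin are assumed values chosen for the example.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

class ProximityLabel:
    """Shows a label when the viewer is within `show_radius` of its anchor.

    A small hysteresis margin avoids flickering when the viewer hovers
    right at the threshold. Radii are illustrative, not toolkit values.
    """

    def __init__(self, anchor, show_radius=0.75, hysteresis=0.1):
        self.anchor = anchor
        self.show_radius = show_radius
        self.hysteresis = hysteresis
        self.visible = False

    def update(self, viewer_pos):
        d = distance(viewer_pos, self.anchor)
        if not self.visible and d <= self.show_radius:
            self.visible = True
        elif self.visible and d > self.show_radius + self.hysteresis:
            self.visible = False
        return self.visible

label = ProximityLabel(anchor=(0.0, 1.0, 0.0))
print(label.update((0.0, 1.0, 2.0)))  # far away -> False
print(label.update((0.0, 1.0, 0.5)))  # within radius -> True
print(label.update((0.0, 1.0, 0.8)))  # inside hysteresis band -> still True
```

The hysteresis band is why the third update stays visible: the label only hides again once the viewer moves clearly past the show radius.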

📦 System Overview

1. Authoring Tool (Desktop, Unity-based)

The Unity-based authoring environment lets users import 3D models (.glb format), define features, and place anchor points. Projects are saved as structured directories that bundle all content and metadata.

Main components:

  • Project Management: Create and manage labeled MR projects
  • Feature Configuration: Attach names, descriptions, and optional images to model components
  • Anchor Placement: Define one or multiple anchor points per feature
  • Narrative Sequences: Create step-by-step guided learning experiences
  • Structured Export: All data (model, features, metadata) stored in a readable JSON format
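
A guided narrative sequence is, at its core, an ordered list of steps with next/previous navigation. The following Python sketch illustrates that idea; the class and field names are hypothetical and do not reflect the toolkit's actual data model.

```python
class NarrativeSequence:
    """Ordered list of steps, each naming a feature to highlight.

    Illustrative only; the toolkit's real sequence data may differ.
    """

    def __init__(self, steps):
        self.steps = list(steps)
        self.index = 0

    @property
    def current(self):
        return self.steps[self.index]

    def next(self):
        # Clamp at the last step instead of wrapping around.
        self.index = min(self.index + 1, len(self.steps) - 1)
        return self.current

    def previous(self):
        self.index = max(self.index - 1, 0)
        return self.current

seq = NarrativeSequence(["engine", "wing", "landing gear"])
print(seq.current)     # engine
print(seq.next())      # wing
print(seq.next())      # landing gear
print(seq.next())      # landing gear (clamped at the end)
print(seq.previous())  # wing
```

Clamping at the ends (rather than wrapping) matches the on-screen/thumbstick next-previous navigation described in the Interaction Guide below.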

2. Experience Systems (Runtime MR)

The authored experiences are rendered using platform-specific MR implementations. Each runtime loads the exported project data, performs label placement, and provides interaction mechanisms tailored to the device.

Platform Details:

  • Android (ARCore via AR Foundation):

    • Touchscreen interactions
    • Label tap detection via raycast
    • Model placement on detected surfaces
  • Meta Quest 3 (Meta XR SDK):

    • 3D UI (floating panels)
    • Hand tracking or controller input
    • Spatial positioning of content using pass-through and anchors
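
On Android, tapping a label is resolved by casting a ray from the touch point into the scene. The geometric core of such a hit test can be sketched as a ray-sphere intersection; the 5 cm pick radius below is an illustrative assumption, not a toolkit value.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return True if the ray origin + t*direction (t >= 0) hits a sphere.

    `direction` need not be normalized. Mirrors the kind of check a
    raycast-based label tap test performs.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range((3)))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return False
    near = (-b - math.sqrt(disc)) / (2 * a)
    far = (-b + math.sqrt(disc)) / (2 * a)
    # Accept a hit in front of the ray origin; `far >= 0` covers the
    # case where the origin is inside the sphere.
    return near >= 0 or far >= 0

# Ray from the camera straight ahead toward a label anchor 2 m away.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 2), 0.05))  # True
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (1, 0, 2), 0.05))  # miss -> False
```

In the actual Unity runtimes this geometry is handled by the engine's physics raycast against label colliders; the sketch only shows the underlying math.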

📁 Project Structure

Each project is saved as a self-contained folder with the following structure:

Persistent App Path
├── newModel.glb
├── someImage.png
└── Saves
	├── Project1
	│	├── model.glb
	│	├── project.json
	│	└── thumbnail.png	
	└── Project2

Projects are created in a Saves folder at the Authoring Tool's persistent data path. All the materials needed to create a project (i.e., a 3D model and optional images) must be placed in the app's persistent path. When a project is created, its corresponding files are copied into the project folder.
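
The project.json file bundles the feature metadata. A hypothetical fragment illustrating what such a file might contain (all field names here are assumptions for illustration, not the toolkit's actual schema):

```json
{
  "name": "Project1",
  "model": "model.glb",
  "features": [
    {
      "name": "Turbine",
      "description": "Main turbine assembly.",
      "image": "someImage.png",
      "anchors": [
        { "position": [0.12, 0.40, -0.05] }
      ]
    }
  ],
  "sequence": ["Turbine"]
}
```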

The ExperienceViewers look for available projects at their respective persistent data paths.
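
Project discovery by the viewers amounts to scanning the Saves directory for subfolders that contain a project.json. A minimal Python sketch of that lookup (the real viewers are Unity applications; this only mirrors the directory convention shown above):

```python
import json
import os
import tempfile

def find_projects(saves_dir):
    """Return names of subfolders of `saves_dir` containing a project.json.

    Error handling for malformed JSON is omitted for brevity.
    """
    projects = []
    for entry in sorted(os.listdir(saves_dir)):
        path = os.path.join(saves_dir, entry)
        if os.path.isdir(path) and os.path.isfile(os.path.join(path, "project.json")):
            projects.append(entry)
    return projects

# Demo with a throwaway directory laid out like the structure above.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "Project1"))
with open(os.path.join(root, "Project1", "project.json"), "w") as f:
    json.dump({"name": "Project1"}, f)
os.makedirs(os.path.join(root, "Project2"))  # no project.json -> ignored

print(find_projects(root))  # ['Project1']
```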

🚀 Getting Started

Note: This prototype is a research artifact and not a production-ready toolkit.

Authoring Tool

  1. Clone the repository and open the Unity project.
  2. Build and run the desktop authoring tool.
  3. Place materials (3D models and images) in the application's persistent data path.
  4. Import a .glb model and begin authoring.
  5. Move the project folder to a target device for deployment.

Deploying to Devices

  • Android AR: Build the scene and install on device.
  • Meta Quest 3: Build the scene and install on device.

⚠ Shader Setup Instructions

To ensure correct rendering of 3D models across platforms, Unity's shader stripping behavior needs to be configured for each application (Authoring Tool, Android, Meta Quest):

  1. Import your 3D model and place it into the scene.
  2. Go to Project Settings → Graphics → Shader Stripping.
  3. Click "Save to asset" to generate a shader variant collection.
  4. Then, add that asset to "Preloaded Shaders" under the same menu.
  5. Repeat this process for each target application (Authoring Tool, Android, Meta Quest).

Without this step, shaders may be stripped during the build process, leading to broken or invisible visuals at runtime.

🕹 Interaction Guide

Authoring Tool (Desktop)

Use your mouse to navigate and place anchor points in the 3D model view:

  • Rotate view: Right mouse button
  • Pan view: Middle mouse button
  • Zoom view: Mouse wheel

Mobile (Handheld AR)

  • Place model: Tap on a detected plane
  • Interact with labels: Tap a label directly; tap and hold to show all leader lines
  • Navigate narrative sequence: Use on-screen buttons

Meta Quest 3 (Head-mounted MR)

  • Place model:

    • Controller: Press Button.One (A button) on the right-hand controller
    • Hand tracking: Touch thumb and index finger together
  • Navigate narrative sequence:

    • Next: Thumbstick right
    • Previous: Thumbstick left
  • Label interactions:

    • Show all leader lines: Press PrimaryHandTrigger
    • Interact with individual labels: Press Button.One on the right controller

🛠 Third-Party Libraries & Licenses

This project makes use of third-party components including Unity, AR Foundation/ARCore (Android), and the Meta XR SDK (Meta Quest 3); see the repository for their respective licenses.
