
AI Glasses Compiler

Compiles HoloScript to AI glasses platforms — Meta Ray-Bans, Snap Spectacles, Android XR glasses, and Apple AI glasses. Unlike full VR headsets, AI glasses are always-on, hands-free, and ambient: they overlay information on the real world while the wearer moves naturally.

Overview

The AI glasses compiler (--target ai-glasses) generates platform-appropriate output for the thin-client wearable glasses tier. These devices have:

  • No controllers — input is via voice, head pose, hand gestures, and eye gaze
  • Always-on display — semi-transparent overlays on the real world
  • Edge inference only — lightweight on-device models, no heavy GPU
  • Tight battery budget — experiences must draw less than 5 W total
  • Social context — other people can see the wearer using them
```bash
holoscript compile experience.holo --target ai-glasses --glasses-platform snap --output ./glasses/
```

Supported Platforms

| Platform | Device | SDK | Output Format |
|---|---|---|---|
| `meta-raybans` | Meta Ray-Ban Stories | Meta View SDK | JavaScript + HTML |
| `snap` | Snap Spectacles 3 | Lens Studio (Snap OS) | Lens Script + AR |
| `android-xr-glasses` | Android XR glasses | Android XR SDK | Kotlin + Jetpack |
| `apple-ai-glasses` | Apple AI glasses | visionOS Light | SwiftUI + ARKit |
| `generic` | Generic OpenXR glasses | OpenXR + WebXR | C++ / JS |

Trait Profile for AI Glasses

AI glasses have a restricted trait set due to hardware constraints:

| Trait | Support | Notes |
|---|---|---|
| `@voice_activated` | ✅ Full | Primary input method |
| `@eye_tracked` | ✅ Full | Foveated attention, gaze selection |
| `@billboard` | ✅ Full | Text/info always facing the user |
| `@spatial_audio` | ✅ Full | 3D audio via bone conduction / earbuds |
| `@anchor` | ✅ Full | World-locked info anchors |
| `@llm_agent` | ⚠️ Edge | Lightweight quantized model only |
| `@physics` | ❌ Skip | Too GPU-heavy for glasses |
| `@particle` | ❌ Skip | Battery cost too high |
| `@networked` | ✅ WiFi | Sync via companion app |
| `@hitl` | ✅ Full | Glasses are a primary HITL interface |
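As a quick illustration of this profile, the sketch below (entity names and property values are hypothetical) sticks to fully supported traits; if it also declared `@physics` or `@particle`, the compiler would skip those traits on this target:

```holo
composition "AmbientReminder" {
  template "ReminderCard" {
    @billboard
    @anchor
    @voice_activated
    @spatial_audio

    geometry: "plane"
    scale: [0.25, 0.12, 0.001]
    color: "#22cc88"
    opacity: 0.8
  }
}
```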

Example

```holo
composition "WorkplaceAssistant" {
  template "InfoLabel" {
    @billboard
    @anchor
    @voice_activated

    geometry: "plane"
    scale: [0.3, 0.15, 0.001]
    color: "#0088ff"
    opacity: 0.85
  }

  agent "WorkplacePilot" {
    @llm_agent
    @voice_activated
    @eye_tracked

    on_gaze_dwell(target, duration: 1.5) {
      info = llm.identify(camera.frame, target)
      spawn "InfoLabel" at gaze.hit_point with { text: info }
    }

    on_voice("help me with this") {
      llm.assist(camera.frame)
    }
  }
}
```

Output (Snap Spectacles)

```
glasses/
  Lens/
    Scripts/
      WorkplaceAssistant.js   # Lens Studio JavaScript
      VoiceML.js              # Voice recognition module
    Resources/
      InfoLabel.material
    Public/
      lens.json               # Lens manifest
```

Design Principles for AI Glasses

  1. 3-second rule — any response must appear within 3 seconds of the wearer's intent
  2. Glanceable — information must be readable in under 1 second
  3. Dismissible — everything disappears on a voice command or gesture
  4. Battery aware — use the @low_power trait for ambient experiences
  5. Social — no content that embarrasses the wearer in public
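A minimal sketch of principles 3 and 4 together, assuming hypothetical entity names, that `@low_power` composes with the display traits above, and a `despawn` statement as the counterpart of `spawn` (not shown elsewhere in these docs): an ambient status chip that is cheap to render and removable with one voice command.

```holo
composition "AmbientStatus" {
  template "StatusChip" {
    @billboard
    @anchor
    @voice_activated
    @low_power

    geometry: "plane"
    scale: [0.2, 0.1, 0.001]
    opacity: 0.7

    on_voice("dismiss") {
      despawn self
    }
  }
}
```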

Compiler Options

| Option | Default | Description |
|---|---|---|
| `--glasses-platform` | `generic` | Target platform (see table above) |
| `--glasses-edge-llm` | `false` | Include the edge LLM inference bundle |
| `--glasses-battery-budget` | `5` | Maximum watts; heavier traits are trimmed to fit |
| `--glasses-companion-sync` | `false` | Generate companion phone-app sync code |
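Putting the options together, a hypothetical invocation for Android XR glasses with edge inference, companion-app sync, and a tighter power envelope might look like the following (whether the boolean options are bare flags or take explicit `true`/`false` values is an assumption; the table above only lists their defaults):

```bash
holoscript compile experience.holo \
  --target ai-glasses \
  --glasses-platform android-xr-glasses \
  --glasses-edge-llm \
  --glasses-battery-budget 3 \
  --glasses-companion-sync \
  --output ./glasses/
```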


Released under the MIT License.