AI Architecture: Infinity & Beyond

Status: 🟢 Production Ready
Core Components: Infinity Assistant / Native AI Traits
Integration: @holoscript/infinityassistant


1. The "Infinity" Protocol

HoloScript is the first language designed to be written by both humans and AI agents. The Infinity Protocol standardizes how AI agents perceive, modify, and generate HoloScript code.

Architecture

```mermaid
graph TD
    User["User Implementation"]
    IB["Infinity Builder Client"]
    API["Infinity Assistant API"]
    LLM["Fine-Tuned LLM (GPT-4o)"]

    User -->|Prompts| IB
    IB -->|Context + Scheme| API
    API -->|Optimized Generation| LLM
    LLM -->|HoloScript AST| API
    API -->|Compiled Artifacts| User
```
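The round trip in the diagram can be sketched as a pair of payload shapes. None of these type or field names are official; they are illustrative stand-ins for the data each hop carries.

```typescript
// Hypothetical shapes for one Infinity Protocol round trip.
interface BuildRequest {
  prompt: string;   // the user's natural-language prompt
  scheme: string;   // target scheme, e.g. "vr" or "mobile"
  context?: string; // optional serialized scene context
}

interface BuildResponse {
  ast: object;          // HoloScript AST returned by the LLM
  artifacts: string[];  // compiled artifact paths handed back to the user
}

// Assemble the payload the Infinity Builder client would send to the API.
function toRequest(prompt: string, scheme: string, context?: string): BuildRequest {
  return { prompt, scheme, ...(context !== undefined ? { context } : {}) };
}
```

The optional `context` field is omitted entirely when absent, so the API can distinguish "no scene loaded" from an empty context string.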

Key Components

A. Infinity Builder (@holoscript/infinityassistant)

The official client for AI-driven development.

  • Natural Language to Code: "Create a red bouncing ball" → `composition @prop { ... }`
  • Platform Optimization: Auto-tunes assets for VR vs Mobile.
  • Self-Healing: Automatically fixes syntax errors in generated code.

B. Native AI Traits (Experimental)

New directives that allow objects to expose themselves to AI perception.

  • @detectable: Object broadcasts its semantic tags to AI agents.
  • @affordance: Object declares how it can be used (e.g., "sit", "grasp").

```holoscript
// Object that teaches AI how to use it
chair#wooden @detectable @affordance {
    tags: ["furniture", "seat"]
    actions: {
        "sit": { posture: "seated", offset: [0, 0.5, 0] }
    }
}
```
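On the agent side, the broadcast metadata can be consumed as a simple query over the scene. The `Detectable` shape below mirrors the chair example; the query helper is a hypothetical illustration, not part of the published runtime.

```typescript
// Minimal sketch of an agent filtering a scene for @detectable objects
// that afford a given action.
interface Detectable {
  id: string;
  tags: string[];
  actions: Record<string, { posture?: string; offset?: [number, number, number] }>;
}

// Return every object in the scene that declares the requested affordance.
function findAffordance(scene: Detectable[], action: string): Detectable[] {
  return scene.filter((obj) => action in obj.actions);
}
```

An agent asked to "sit somewhere" can thus resolve candidates purely from declared affordances, without any geometric reasoning.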

2. Experimental Bindings

@holoscript/llm (Local Inference)

Direct bindings to local LLMs (Llama 3, Mistral).

```typescript
import { LLM } from '@holoscript/llm';

// Run inference inside the HoloScript runtime
const response = await LLM.infer('What is this object?', {
  context: object.toString(),
});
```

@holoscript/compiler (VRChat/Unity Targets)

AI-assisted compilation of HoloScript to Udon/Unity outputs.

  • Uses AST transformation to map HoloScript logic to Udon Graph interactables.
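The AST-transformation idea can be sketched as a tree walk that lowers HoloScript nodes into Udon-graph-style descriptors. The node shapes and kind mapping below are purely illustrative assumptions; the real compiler's internals are not documented here.

```typescript
// Hypothetical lowering pass: walk a HoloScript AST and emit Udon-graph
// descriptors for the node kinds that map to interactables.
interface HoloNode { kind: string; name: string; children?: HoloNode[] }
interface UdonNode { type: string; label: string }

// Assumed kind-to-behaviour mapping, for illustration only.
const KIND_MAP: Record<string, string> = {
  interactable: "UdonBehaviour.Interact",
  trigger: "UdonBehaviour.OnTriggerEnter",
};

function lower(node: HoloNode, out: UdonNode[] = []): UdonNode[] {
  const mapped = KIND_MAP[node.kind];
  if (mapped) out.push({ type: mapped, label: node.name });
  for (const child of node.children ?? []) lower(child, out);
  return out;
}
```

Unmapped node kinds pass through silently here; a real compiler would instead report them as unsupported for the target.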

3. Roadmap

| Phase | Feature | Status |
| --- | --- | --- |
| Phase 1 | Text-to-HoloScript Generation | ✅ Live |
| Phase 2 | Context-Aware Modification ("Make it bigger") | ✅ Live |
| Phase 3 | Semantic Scene Understanding (@detectable) | 🟡 Beta |
| Phase 4 | Autonomous Agents (Self-Coding) | 🧪 Alpha |

4. Strategic Direction: Brittney and Studio Primacy

The long-term AI architecture should converge toward a Studio-first operating model:

  • Brittney becomes the primary intelligence interface.
  • HoloScript Studio becomes the primary creation environment.
  • HoloScript remains the native medium for scenes, systems, and agent behavior.
  • Conventional IDEs remain important, but primarily as compatibility and maintenance surfaces.

This means advanced intelligence accumulation should converge around:

  • project memory,
  • scene understanding,
  • asset and world refinement,
  • fine-tuning,
  • deployment orchestration,
  • and spatial debugging.

In practice, generic IDE agents continue to help build HoloScript itself, while Brittney becomes the place where the ecosystem's deepest context and creation capabilities compound over time.

Released under the MIT License.