Resonance Lab · Episode 02 · Frontier Tower, SF

Mapping the Edge Between Memory and Alignment
A field note from the hackathon floor — where Bonfires meets ROOM

// 01 · context

Where We Are

Resonance Lab is a loose collaboration between three builders working at the edges of spatial computing, personal AI, and collective knowledge. We came together at the RP1 Open Metaverse Hackathon at Frontier Tower — not with a finished plan, but with a shared sense that something important was converging.

This is Episode 02. Not a redesign — a field note. A snapshot of what the hackathon clarified, and a careful attempt to map the boundary between Bonfires and ROOM.

The emerging spatial web isn't primarily about virtual worlds. It's about services attached to places — software that activates based on proximity, identity, and context. That reframe changes what we're building.

// 02 · fresh from frontier tower

What the Hackathon Clarified

Spending time inside the RP1 ecosystem — with Patched Reality's tools live and the MSF architecture running — made something concrete that had been abstract: spatial environments are proximity-based service delivery systems. RP1 isn't building a virtual world; it's building a browser. Services register into a shared coordinate system and activate when you're near them.
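That proximity model can be sketched in a few lines. Everything below is hypothetical: the `SpatialService` and `ServiceRegistry` names, the 2D coordinates, and the radius check are toy stand-ins, not RP1's actual registration API.

```python
# Toy sketch of proximity-based service activation: services register into a
# shared coordinate system and activate when a visitor is near them.
import math
from dataclasses import dataclass

@dataclass
class SpatialService:
    name: str
    x: float
    y: float
    radius: float  # activation radius, in meters

class ServiceRegistry:
    """A shared coordinate system that services register into."""

    def __init__(self) -> None:
        self.services: list[SpatialService] = []

    def register(self, service: SpatialService) -> None:
        self.services.append(service)

    def active_near(self, x: float, y: float) -> list[str]:
        """Names of services whose activation radius covers (x, y)."""
        return [
            s.name for s in self.services
            if math.hypot(s.x - x, s.y - y) <= s.radius
        ]

registry = ServiceRegistry()
registry.register(SpatialService("bonfire-memory", 0.0, 0.0, 10.0))
registry.register(SpatialService("room-signals", 50.0, 0.0, 5.0))

print(registry.active_near(3.0, 4.0))  # within 10 m of bonfire-memory
```

The browser analogy falls out of this shape: the registry resolves *where you are* into *what runs*, the way a URL resolves into a page.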

🔴 Floor note · Frontier Tower, SF
ManifolderMCP — Patched Reality's new open-source tool — lets Claude edit MSF scenes through natural language. That's AI-native spatial authoring: a daemon that speaks an environment into existence. The connection to our architecture is direct and worth watching.

The second thing that landed: most spatial tooling today solves rendering, not meaning. The semantic layer — what a place is, what happened there, who cares about it — is almost entirely absent from the open stack. That's the gap all three of our systems are filling, from different angles.


The Spatial Information Stack — Where We Operate

The SensAI World Models framework maps spatial information into four ascending layers. Most open-stack tooling today lives at layers 2–3. Our collaboration lives entirely at layer 1 — the semantic layer — and that's not an accident.

The Semantic Engine · Spatial Information Stack
Bonfires — data substrate  ·  ROOM — interpretation layer  ·  Daemons — identity & signal agent
SensAI Hackathon · March 14–15

// 03 · mapping the edge with bonfires

Where Bonfires Ends and ROOM Begins

Getting this boundary right matters — not just technically, but as a foundation for ongoing collaboration. We don't want redundant systems. We want complementary ones.

Bonfires remembers. ROOM interprets. Daemons express. RP1 visualizes.

Bonfires excels at extracting structured relationships from conversations, preserving them over time, making them queryable. That's foundational. ROOM has no interest in duplicating it — only in adding the layer above: asking what those relationships mean for the people inside the graph right now.

// boundary sketch

Bonfires nodes · edges · entity extraction · collective memory · graph query
────────────────────────────────────── ← edge
ROOM alignment detection · cluster emergence · collaboration signal

// ROOM reads Bonfires. Does not replace it.
// Daemons write to Bonfires directly — no intermediary required.

A thought for Joshua: knowledge graphs capture relationships well. What they don't surface naturally is emerging clusters of shared intent — moments when several nodes converge around a question before anyone has spoken. That detection problem is exactly what ROOM is exploring. Not a competing layer — a human-centric interface to the intelligence already in the graph.
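The division of labor above can be sketched as read-only code. The edge list and the `detect_clusters` threshold are invented for illustration; Bonfires' real schema is richer than person-to-topic pairs, and ROOM's detection will be subtler than counting.

```python
# Bonfires owns the graph (nodes, edges, query). ROOM only reads it,
# looking for topics where several nodes converge before anyone has spoken.
from collections import defaultdict

# Bonfires side: a queryable edge list of (person, topic) relationships.
bonfires_edges = [
    ("bret", "identity-daemons"),
    ("joshua", "knowledge-graphs"),
    ("mike", "alignment-signals"),
    ("bret", "spatial-web"),
    ("joshua", "spatial-web"),
    ("mike", "spatial-web"),
]

# ROOM side: read-only detection of emerging clusters of shared intent.
def detect_clusters(edges, min_people=3):
    """Topics that at least `min_people` distinct nodes connect to."""
    by_topic = defaultdict(set)
    for person, topic in edges:
        by_topic[topic].add(person)
    return {t: sorted(p) for t, p in by_topic.items() if len(p) >= min_people}

print(detect_clusters(bonfires_edges))
```

Note that `detect_clusters` never writes: ROOM reads Bonfires, it does not replace it.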

// 04 · hackathon scope

The Loop We're Closing

The goal this weekend is deliberately small: prove the loop.

// the loop

1. Daemon emits → user intent as structured identity signals
2. Bonfires stores → relationships written to the knowledge graph
3. ROOM detects → alignment patterns surface from the graph
4. RP1 shows → signals appear in the spatial environment

// if this closes once, the foundation is real

A working version — even rough — demonstrates something absent from the current open metaverse stack: spatial proximity activating meaningful alignment signals, not just geometry.
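The four steps can be sketched end-to-end with in-memory stand-ins. Every name here is hypothetical; the point is the direction of data flow, not any real API.

```python
# One pass through the loop: daemon -> Bonfires -> ROOM -> RP1.
from collections import Counter

graph: list[tuple[str, str]] = []  # Bonfires stand-in: (user, intent) edges

def daemon_emit(user: str, intent: str) -> dict:
    """1. Daemon emits user intent as a structured identity signal."""
    return {"user": user, "intent": intent}

def bonfires_store(signal: dict) -> None:
    """2. Bonfires stores the signal as a relationship in the graph."""
    graph.append((signal["user"], signal["intent"]))

def room_detect(min_overlap: int = 2) -> list[str]:
    """3. ROOM reads the graph and surfaces intents shared by several users."""
    counts = Counter(intent for _, intent in graph)
    return [intent for intent, n in counts.items() if n >= min_overlap]

def rp1_show(alignments: list[str]) -> list[str]:
    """4. RP1 renders each alignment as a visible signal in the environment."""
    return [f"[beacon] {intent}" for intent in alignments]

# Close the loop once.
for user, intent in [("bret", "semantic-layer"),
                     ("mike", "semantic-layer"),
                     ("joshua", "graph-query")]:
    bonfires_store(daemon_emit(user, intent))

print(rp1_show(room_detect()))  # the shared intent becomes a spatial signal
```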

// 05 · mutual purpose

Why This Works

Bret is exploring what it means for AI to carry and express personal identity over time. Joshua is building infrastructure for collective memory — structured, persistent, queryable. Mike is asking whether genuine alignment between people can be made visible before they've found each other.

Together these form something none of us set out to build alone: a living map of people, knowledge, and emerging collaboration — navigable in space, not just queryable in a database.

Most details are still unresolved. This document exists to clarify where edges currently meet — and to keep the space open for what emerges.

// field note ends here

Still Mapping

The architecture is evolving. The most interesting discoveries will probably be the ones that surprise all of us.

Resonance Lab · Episode 02 · March 2026 · Frontier Tower, SF
ROOM · Bonfires.ai · Daemons · NotebookLM · OMB Wiki