PROJECT HOLOGRAM
SYS.ONLINE // OP.READY
[00] PHASE ONE // ADAPTIVE MAPPING

KNOW
WHERE
TO PROVE.

A measured first phase for projection mapping and future live hologram work: prove the timing backbone, stabilize a deforming surface, then gate the fogscreen/NIR path.

[01] SYSTEM LEAKAGE

EVERY LIVE SYSTEM RUNS ON THREE RESOURCES:
TIME, LIGHT, AND LATENCY.

This phase exists to reveal where the real-time loop breaks before fog, dense reconstruction, or hidden sensing can hide the failure.

RESOURCE .01

TIME

Capture, ingest, solve, warp, render, and projector output must all land inside a measured 120fps loop without cadence collapse.
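The budget arithmetic is simple to state: at 120fps the whole chain shares one frame period, about 8.33 ms. A minimal sketch of that budget; the stage names and per-stage allocations below are illustrative assumptions, not measured values:

```python
# 120fps timing budget sketch. Per-stage numbers are placeholders;
# real allocations come from photodiode/trigger measurement.
FPS = 120
FRAME_BUDGET_MS = 1000.0 / FPS  # ~8.333 ms end to end

STAGE_BUDGET_MS = {
    "capture": 1.0,
    "ingest": 0.5,
    "solve": 2.5,
    "warp": 1.0,
    "render": 2.0,
    "scanout": 1.3,
}

# The loop only holds if the sum of every stage fits one frame period.
assert sum(STAGE_BUDGET_MS.values()) <= FRAME_BUDGET_MS
```

The point of writing it down is that any stage creeping past its slice steals time from another, and the photodiode trace is where the theft shows up.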

RESOURCE .02

LIGHT

Visible projection proves the mapping stack. 830/850nm NIR gets its own fog visibility gate before expensive integration.

RESOURCE .03

LATENCY

Photodiode and trigger traces are the authority. Software timestamps are useful, but they do not get the final vote.
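One way to read latency directly off the traces, sketched in Python; the threshold-edge detector and the shared-clock, aligned-start trace format are assumptions about how the scope data is exported:

```python
def first_rising_edge(samples, threshold, sample_rate_hz):
    """Time (s) of the first threshold crossing in a scope trace."""
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:
            return i / sample_rate_hz
    return None

def sensor_to_photon_latency(trigger_trace, photodiode_trace,
                             threshold, sample_rate_hz):
    """Latency (s) from camera trigger edge to first projected photons.

    Assumes both traces share one clock and start at the same sample,
    e.g. two channels of the same scope capture.
    """
    t_trigger = first_rising_edge(trigger_trace, threshold, sample_rate_hz)
    t_photon = first_rising_edge(photodiode_trace, threshold, sample_rate_hz)
    if t_trigger is None or t_photon is None:
        return None  # no edge found: the trace, not software, decides
    return t_photon - t_trigger
```

Software timestamps can cross-check this number, but the subtraction above is the one that counts.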

[02] OPERATIONAL PROTOCOL

HOW WE
WORK

01

Backbone

Bring up the reusable camera → CXP → GPU → projector path on a rigid board. Measure the truth before introducing a moving target.

02

Membrane

Move the same stack onto a fan-driven matte sheet with passive visible features and a coarse warp mesh.
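A coarse warp mesh is just a grid of control points that the tracked features drag around; sampling it is bilinear interpolation. A minimal sketch, where the grid layout (row-major tuples) and normalized texture coordinates are illustrative assumptions:

```python
def bilinear_warp(u, v, grid):
    """Map normalized coords (u, v in [0, 1]) through a coarse
    control-point grid; grid[row][col] is an (x, y) position."""
    rows, cols = len(grid), len(grid[0])
    gx, gy = u * (cols - 1), v * (rows - 1)
    c0, r0 = int(gx), int(gy)
    c1, r1 = min(c0 + 1, cols - 1), min(r0 + 1, rows - 1)
    fx, fy = gx - c0, gy - r0

    def lerp(a, b, t):
        return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)

    top = lerp(grid[r0][c0], grid[r0][c1], fx)
    bot = lerp(grid[r1][c0], grid[r1][c1], fx)
    return lerp(top, bot, fy)
```

The GPU version is the same math in a vertex shader; the coarse mesh keeps the per-frame solve cheap enough to fit the 120fps budget.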

03

NIR Scout

On a darkened bench, project an 830/850nm line or sparse feature pattern. Rigid board first, fog second. No decode logic yet.

04

MVP Loop

Sustain the strict 120fps adaptive loop for ten measured minutes and prove that adaptive mapping beats static mapping.
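The ten-minute verdict reduces to interval statistics over presentation timestamps. A sketch of that check; the 25% tolerance and the report shape are assumptions to be tightened against the photodiode data:

```python
def cadence_report(frame_times_s, fps=120, tolerance=0.25):
    """Count frame intervals that deviate from the nominal period
    by more than `tolerance` periods (a drop, repeat, or stall)."""
    period = 1.0 / fps
    misses = 0
    for t0, t1 in zip(frame_times_s, frame_times_s[1:]):
        if abs((t1 - t0) - period) > tolerance * period:
            misses += 1
    total = max(len(frame_times_s) - 1, 1)
    return {"intervals": total, "misses": misses,
            "miss_rate": misses / total}
```

Run it over the full ten-minute log, not a cherry-picked window: thermal and buffer instability tend to show up late.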

[03] MVP DEFINITION

THE MVP IS A DEFORMING-SHEET LOOP,
NOT A FOGSCREEN SYSTEM.

TARGET .01

120 FPS

Strict 120fps cadence, sustained end to end on the deforming sheet.

TARGET .02

10 MIN

Long enough to reveal thermal, cadence, buffer, and operational instability.

TARGET .03

≤2 FRAMES

Sensor-to-photon latency at or below two 120fps frame intervals.
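The pass/fail check is trivial arithmetic, written out here so the units stay honest; the function names are illustrative:

```python
def latency_in_frames(latency_s, fps=120):
    """Convert a measured sensor-to-photon latency into frame periods."""
    return latency_s * fps

def meets_target(latency_s, fps=120, max_frames=2):
    """Target: at or below two frame intervals at 120fps (~16.7 ms)."""
    return latency_in_frames(latency_s, fps) <= max_frames

# 12 ms is ~1.44 frames at 120fps: inside the target.
# 20 ms is 2.4 frames: outside it.
```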

TARGET .04

VISIBLE WIN

Adaptive projection visibly outperforms static mapping on a disturbed membrane.

[04] SYSTEM ARCHITECTURE

ONE WORKSTATION.
ONE HOT PATH.
NO HIDING.

CAPTURE // 5MP mono global-shutter CXP camera
INGEST // Euresys Coaxlink Duo CXP-12
SOLVE // GPU-side surface proxy + coarse mesh
OUTPUT // 1080p 120Hz projector, 240Hz stress available
VERIFY // photodiode + trigger traces
NO Ethernet hop in the hot path
NO second render node
NO middleware between timing-critical stages
NO fog hardware until the core loop is trusted
NO NIR spending until same-side visibility is real

[05] PHASE GATES

HOW TO START

GATE 01

Rigid Board

Is the camera-to-GPU-to-projector path measurable and stable before any moving target is introduced?

GATE 02

Membrane

Can passive visible features support stable coarse warp at 120fps?

GATE 03

NIR Fog

Can projected same-side NIR structure be seen on fog as structure, not glow?

GATE 04

MVP Run

Can the deforming-sheet loop sustain 120fps cleanly for ten minutes?

GATE 05

240Hz

Can the visible path sustain 240Hz cleanly when future stress tests ask for it?
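Stress math for this gate: doubling the rate halves the budget, so every stage that fit at 120Hz must fit in half the time at 240Hz. A one-line sketch:

```python
def frame_budget_ms(fps):
    """End-to-end time available per frame at a given refresh rate."""
    return 1000.0 / fps

# 120Hz leaves ~8.33 ms per frame; 240Hz leaves ~4.17 ms.
assert frame_budget_ms(240) == frame_budget_ms(120) / 2
```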

INITIATE
PROTOCOL.

Buy the reusable timing backbone first. Prove rigid-board timing. Prove membrane stabilization. Treat NIR-on-fog as a major gate, not an assumption.