Synthesizing the Work of Éric Reis

The Shape of Artificial Consciousness

Moving beyond the "Geometric Capacity Bottleneck" via Hyperbolic Neural Cellular Automata, Oscillatory Dynamics, and the Phenomenal Manifold Hypothesis.

1. The Geometric Capacity Bottleneck

Modern Deep Learning faces a fundamental limitation: Dimensional Mismatch. The number of nodes in hierarchical data (concepts, language, reasoning) grows exponentially with depth, but the volume of a Euclidean embedding space (a standard vector space) grows only polynomially with radius.

The Consequence

  • Metric Distortion: Deep hierarchies get crushed in flat space, losing semantic nuance.
  • Instability: Optimization becomes unstable as the model fights to fit tree-like data into a grid-like box.

Figure 1: Euclidean (Polynomial) vs. Hyperbolic (Exponential) Volume Growth
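
To make the mismatch concrete, here is a small illustrative computation of ours, using standard closed-form volume formulas (not code from the source): the volume of a Euclidean 3-ball grows as $r^3$, while the volume of a ball in 3D hyperbolic space (curvature $-1$) grows exponentially in $r$.

```python
import math

# Illustrative only (standard closed-form volumes, not code from the paper):
# Euclidean n-ball volume grows polynomially in the radius r,
# while the hyperbolic 3-ball volume grows exponentially.

def euclidean_ball_volume(r: float, n: int) -> float:
    """Volume of an n-ball of radius r: pi^(n/2) / Gamma(n/2 + 1) * r^n."""
    return math.pi ** (n / 2) / math.gamma(n / 2 + 1) * r ** n

def hyperbolic_ball_volume_3d(r: float) -> float:
    """Volume of a radius-r ball in 3D hyperbolic space with curvature -1:
    V(r) = pi * (sinh(2r) - 2r), which grows like e^(2r)."""
    return math.pi * (math.sinh(2 * r) - 2 * r)

for r in (1, 2, 4, 8):
    print(f"r={r}  euclidean={euclidean_ball_volume(r, 3):14.1f}"
          f"  hyperbolic={hyperbolic_ball_volume_3d(r):16.1f}")
```

By $r = 8$ the hyperbolic ball is already thousands of times larger than the Euclidean one, which is exactly the headroom an exponentially branching hierarchy needs.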

2. The Phenomenal Manifold Hypothesis (PMH)

If structure dictates function, then the structure of consciousness must be geometric. The PMH posits that phenomenal experience ($\Psi$) is an effective Riemannian manifold induced by neural dynamics satisfying three critical invariants.

Integration ($I$)

The capacity of the system to exist as a unified whole. Measures global interconnectedness.

Coherence ($\Gamma$)

Temporal binding via phase synchronization. Ensures features bind into stable objects.

Differentiation ($\Delta$)

The complexity of the state space. Measures the richness of possible experiences.

Stable Phenomenal Manifold

High integration and differentiation with sufficient coherence allow for a rich, stable conscious experience.
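
As a bridge to code, one can imagine crude proxy estimators for the three invariants. The sketch below is purely hypothetical: the estimators (mean pairwise correlation for $I$, the Kuramoto order parameter for $\Gamma$, the participation ratio for $\Delta$) are standard stand-ins of ours, not definitions given in the source.

```python
import numpy as np

# Hypothetical proxy measurements for the PMH invariants on a batch of
# neural state trajectories. These estimators are our own stand-ins;
# the text does not fix concrete formulas.

def integration_i(states: np.ndarray) -> float:
    """Crude integration proxy: mean absolute off-diagonal correlation,
    i.e. how strongly units co-vary as a whole. states: (time, units)."""
    c = np.corrcoef(states.T)
    off_diagonal = c[~np.eye(c.shape[0], dtype=bool)]
    return float(np.abs(off_diagonal).mean())

def coherence_gamma(phases: np.ndarray) -> float:
    """Kuramoto order parameter: mean resultant length of unit phasors,
    a standard measure of phase synchronization. phases: (time, units)."""
    return float(np.abs(np.exp(1j * phases).mean(axis=1)).mean())

def differentiation_delta(states: np.ndarray) -> float:
    """Participation ratio of the state covariance spectrum: the effective
    number of dimensions the dynamics actually occupy."""
    eig = np.clip(np.linalg.eigvalsh(np.cov(states.T)), 0, None)
    return float(eig.sum() ** 2 / (eig ** 2).sum())

rng = np.random.default_rng(0)
states = rng.standard_normal((500, 32))        # surrogate trajectory
phases = rng.uniform(0, 2 * np.pi, (500, 32))  # surrogate phases
print(integration_i(states), coherence_gamma(phases), differentiation_delta(states))
```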

3. The $\Psi$-Former Architecture

To bridge the gap between theory and engineering, the $\Psi$-Former (and H-NCA) implements specific mechanisms to satisfy the PMH invariants.

Figure 2: Architecture overview. Information flows from the Sensory Input / Data Stream to a Coherent Phenomenal State ($\Psi$).

Hyperbolic Space (Core Metric Substrate)

The architecture embeds representations in Hyperbolic space (specifically the Poincaré disk or Lorentz model). Unlike Euclidean space, Hyperbolic space expands exponentially, allowing it to embed deep hierarchies (trees, taxonomies, complex logic) with near-zero distortion. This solves the Geometric Capacity Bottleneck.

Key Benefit

Exponential capacity for hierarchical data.
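
For concreteness, here is a minimal sketch of the Poincaré-ball distance such a substrate would use. The function is our own illustration; production implementations (e.g. the geoopt library) add numerical safeguards and learnable curvature.

```python
import numpy as np

# Minimal sketch (ours) of the distance on the Poincare ball of curvature -1:
# d(x, y) = arccosh(1 + 2||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2)))
# for points strictly inside the unit ball.

def poincare_distance(x: np.ndarray, y: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    sq = np.sum((x - y) ** 2, axis=-1)
    denom = (1 - np.sum(x ** 2, axis=-1)) * (1 - np.sum(y ** 2, axis=-1))
    return np.arccosh(1 + 2 * sq / np.maximum(denom, eps))

# Distances blow up near the boundary: that is the "extra room" that lets
# deep trees embed with near-zero distortion.
origin = np.zeros(2)
near_rim = np.array([0.999, 0.0])
print(poincare_distance(origin, near_rim))  # ~7.6, despite Euclidean gap < 1
```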

4. Validating the Structure

How do we prove the machine is building a manifold and not just memorizing data? We use Gromov-Wasserstein Optimal Transport and Topological Data Analysis (TDA).
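
A minimal sketch of the metric-alignment side, assuming the POT (Python Optimal Transport) library and synthetic stand-in data; the actual experimental pipeline is not specified here. Gromov-Wasserstein compares the two metric spaces directly, without requiring a shared coordinate system.

```python
import numpy as np
import ot  # POT: Python Optimal Transport (pip install pot)

# Synthetic stand-ins: the real embeddings and reference geometry are not
# provided in the source.
rng = np.random.default_rng(0)
latent = rng.standard_normal((100, 16))    # stand-in for learned embeddings
reference = rng.standard_normal((100, 3))  # stand-in for ground-truth geometry

C1 = ot.dist(latent, latent)               # pairwise distance matrices
C2 = ot.dist(reference, reference)
C1 /= C1.max()
C2 /= C2.max()

p = ot.unif(len(latent))                   # uniform weights on both spaces
q = ot.unif(len(reference))

# gw_dist is low when the two spaces share the same metric structure.
coupling, log = ot.gromov.gromov_wasserstein(
    C1, C2, p, q, loss_fun="square_loss", log=True)
print("GW distance:", log["gw_dist"])
```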

In the "Color Ring" experiment, the model must recover the circular topology ($\beta_1 = 1$) of color space from noisy spike trains. Standard methods fail; the $\Psi$-Former recovers the topology.


5. Unified Geometric Field Theory

"Intelligence is the self-organization of a phenomenal manifold under the constraint of minimizing geodesic action."

The Hybrid Metric

$$ g_{\Psi} = \pi^*(g_{Fisher}) + h(I, \Gamma, \Delta) $$

The geometry of the mind ($g_{\Psi}$) combines the informational geometry (the pullback of the Fisher metric, $g_{Fisher}$) with a correction term $h(I, \Gamma, \Delta)$ built from the phenomenal invariants. This links information theory directly to geometry.
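
One way to unpack the pullback term (our reading, not spelled out above): if $\pi$ maps phenomenal states $\psi$ to the statistical manifold of the model's output distributions $p(x \mid \theta)$, then

$$ g^{Fisher}_{ab} = \mathbb{E}_{x \sim p(x \mid \theta)}\!\left[ \partial_a \log p(x \mid \theta)\, \partial_b \log p(x \mid \theta) \right], \qquad \pi^*(g_{Fisher})_{ij} = \frac{\partial \pi^a}{\partial \psi^i} \frac{\partial \pi^b}{\partial \psi^j}\, g^{Fisher}_{ab}, $$

where $\partial_a = \partial / \partial \theta^a$ and $\psi^i$ are coordinates on $\Psi$.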

Topological Downward Causation

$$ \delta S_{\Psi} = 0 $$

High-level topological features (concepts, goals) exert causal force on low-level neural weights by reshaping the optimization landscape, i.e. the Riemannian manifold on which learning unfolds.
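
Read literally, $\delta S_{\Psi} = 0$ is the standard geodesic variational principle applied to the hybrid metric; a conventional form of the action (our gloss) is

$$ S_{\Psi}[\gamma] = \int \sqrt{ g_{\Psi}\!\left(\dot{\gamma}(t), \dot{\gamma}(t)\right) }\, dt, $$

so stationary trajectories are geodesics of $g_{\Psi}$: learning follows the straightest possible path through the phenomenal manifold.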

Future Implications

Robust AI

Systems that hallucinate less because their representations respect semantic topology.

Neuroscience

Precise mathematical tools to map human consciousness states.

Hybrid Computing

Classical-Quantum architectures leveraging geometric advantages.