Why four, and why these four.
Each primitive names a field and proposes a new shape for it. Reduction — for the mathematics of complex multi-agent systems (BTUT). Crystallization — for the training paradigm of AI (TCD-JEPA / Crystara). Signal — for the philosophical foundation of macroeconomics (Regenerationism, operationalized by NIV). Ingestion — for the architecture of digital data governance (the Participatory Data Estate). Four fields, four paradigm proposals, each with working code and measured results.
BTUT — a new substrate for complex multi-agent systems. Multi-agent coordination is the mathematical problem underneath traffic networks, power grids, drone fleets, logistics, civic coordination, and autonomous vehicles — the systems the 21st century depends on. The dominant default is a PDE on agent density with O(N³) numerics, which crashes before 10,000 agents. BTUT replaces the default with a phase transition on a scale-free network under Fermi-rule updates, mean-field universality, constant 12-iteration convergence from 500 to 10,000 agents, and six live applied domains. Along the way it is also a new approach to DARPA Mathematical Challenge 13.
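To make the named machinery concrete: the Fermi rule is the standard imitation dynamic from evolutionary game theory, and scale-free networks are conventionally generated by Barabási–Albert preferential attachment. The sketch below is a minimal self-contained illustration of that combination; the payoffs, update schedule, and temperature here are toy placeholders, not BTUT's own model.

```python
import math
import random

def fermi_prob(payoff_i, payoff_j, K=0.1):
    # Fermi rule: probability that agent i copies neighbor j's strategy.
    # Higher-payoff neighbors are imitated with probability near 1;
    # K is the selection temperature (noise).
    return 1.0 / (1.0 + math.exp((payoff_i - payoff_j) / K))

def ba_graph(n, m, rng):
    # Minimal Barabási–Albert preferential attachment: each new node
    # links to m distinct existing nodes, chosen proportionally to degree
    # (the pool list repeats a node id once per unit of degree).
    adj = {v: set() for v in range(n)}
    pool = list(range(m))
    for v in range(m, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(pool))
        for t in chosen:
            adj[v].add(t)
            adj[t].add(v)
            pool.extend((v, t))
    return adj

def step(adj, strategy, payoff, K, rng):
    # One asynchronous sweep: each agent compares itself with one random
    # neighbor and copies that neighbor's strategy with Fermi probability.
    for i in rng.sample(list(adj), len(adj)):
        if not adj[i]:
            continue
        j = rng.choice(sorted(adj[i]))
        if rng.random() < fermi_prob(payoff[i], payoff[j], K):
            strategy[i] = strategy[j]

# Toy run: 200 agents, binary strategies, random payoffs, 12 sweeps
# (the constant-iteration regime noted above).
rng = random.Random(0)
adj = ba_graph(200, 3, rng)
strategy = {v: rng.randint(0, 1) for v in adj}
payoff = {v: rng.random() for v in adj}
for _ in range(12):
    step(adj, strategy, payoff, 0.1, rng)
```

The point of the sketch is the complexity contrast: each sweep is O(N) agent-neighbor comparisons rather than an O(N³) PDE solve, which is why iteration counts rather than problem size dominate the cost.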
TCD-JEPA (Crystara) — a new AI training paradigm beyond transformers. Transformer scaling fixes the architecture at self-attention and pours more parameters and compute through it. TCD-JEPA refuses the fixed-architecture assumption: a recursive three-system loop explores the energy landscape with Fisher-information Langevin dynamics, runs Vietoris–Rips persistent homology on the trajectories, and crystallizes the stable features into typed H₀ / H₁ / H₂ predictor modules at runtime. The architecture is not designed; it is discovered. The first runtime-discovered predictor architecture for the JEPA family, and the first concrete instance of a post-transformer paradigm on working benchmarks: beats vanilla JEPA on three real heterogeneous graphs (+20 to +36.6 AUC pts) and beats supervised GAT, GCN, and GraphSAGE on a Georgetown CSET semiconductor supply chain.
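The loop above can be illustrated at toy scale. The sketch below is a self-contained stand-in, not Crystara's implementation: it runs plain overdamped Langevin dynamics (no Fisher preconditioning) and extracts only the H₀ barcode of the Vietoris–Rips filtration, which for a finite point cloud reduces to the minimum-spanning-tree edge lengths (Kruskal's algorithm with union-find). H₁ / H₂ need the full boundary-matrix machinery and are omitted.

```python
import math
import random

def langevin_sample(grad, x0, eps=0.01, steps=500, rng=None):
    # Overdamped Langevin dynamics: x <- x - eps*grad(x) + sqrt(2*eps)*N(0,1).
    rng = rng or random.Random(0)
    x = list(x0)
    traj = []
    for _ in range(steps):
        g = grad(x)
        x = [xi - eps * gi + math.sqrt(2 * eps) * rng.gauss(0, 1)
             for xi, gi in zip(x, g)]
        traj.append(tuple(x))
    return traj

def h0_barcode(points):
    # H0 persistence of the Vietoris-Rips filtration on a point cloud:
    # every component is born at 0 and dies at an MST edge length,
    # plus one essential class that never dies.
    n = len(points)
    edges = sorted((math.dist(points[i], points[j]), i, j)
                   for i in range(n) for j in range(i + 1, n))
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    deaths = []
    for w, i, j in edges:          # Kruskal: one H0 class dies per merge
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(w)
    return [(0.0, d) for d in deaths] + [(0.0, math.inf)]

# Toy run: explore a double-well potential x^4 - 2x^2, then read off
# the H0 bars of a subsampled trajectory; the two wells show up as a
# long-lived pair of components (one large finite death).
traj = langevin_sample(lambda x: [4 * x[0] ** 3 - 4 * x[0]], [0.0], steps=400)
bars = h0_barcode(traj[::20])
```

In TCD-JEPA's terms, the long bars are the "stable features" that earn crystallization into predictor modules; short bars are noise and are discarded.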
Regenerationism — a new economic philosophy. The claim: the leading indicator of macroeconomic regime health is the velocity of capital formation with compounding margins, measured against cumulative friction — not bond-market sentiment, not linear averages of coincident series, not equilibrium-return dynamics. NIV is Regenerationism’s first operational instrument. It beats the Fed yield curve on several recession benchmarks (ROC-AUC 0.8538 at 18 months, 41.71% orthogonal variance), and the ensemble’s own Gini importance picks the regenerative-capital term unprompted at 0.9328. The repository is named regenerationism for a reason — the school precedes the signal.
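For reference, the headline ROC-AUC figures are the standard ranking metric: the probability that a randomly chosen positive (pre-recession) month scores above a randomly chosen negative month, equivalently the normalized Mann–Whitney U statistic. A minimal computation, with toy data standing in for NIV's series:

```python
def roc_auc(scores, labels):
    # ROC-AUC via Mann-Whitney U: fraction of (positive, negative)
    # pairs where the positive outscores the negative; ties count half.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: one misranked pair out of four -> AUC 0.75.
print(roc_auc([0.9, 0.6, 0.5, 0.1], [1, 0, 1, 0]))  # → 0.75
```

An AUC of 0.8538 therefore reads directly as: given a random pre-recession month and a random ordinary month, NIV ranks them correctly about 85% of the time.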
Participatory Data Estate — a new framework for digital data governance. Every governance environment whose knowledge corpus is amended through human submissions (municipal agencies, NGOs, regulators, standards bodies, policy platforms, student governments) needs a data architecture that is continuous rather than batched, publicly auditable rather than privately moderated, and hardened to federal-agency patterns. The Participatory Data Estate proposes exactly that: Submit → Moderate → Thin → Crystallize, over pgvector + GIN FTS hybrid retrieval, with a publicly readable approval ledger and federal-hardening controls. SGUNCCH is its first live deployment — a full UNC student-government stack running the framework end-to-end. A security posture student government has never needed, and has never had.
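How a hybrid pgvector + GIN FTS retriever merges its two ranked result lists is a design choice. One common fusion rule is reciprocal rank fusion (RRF), sketched below as an illustration; the Estate's own fusion and fallback logic may differ, and the document ids here are hypothetical.

```python
def rrf_fuse(rankings, k=60):
    # Reciprocal rank fusion: each ranked list contributes 1/(k + rank)
    # per document, so agreement between lists compounds while any one
    # list's head is damped by k. Returns doc ids, best first.
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical hits: vector similarity vs. full-text search.
vector_hits = ["d3", "d1", "d7"]
fts_hits = ["d1", "d9", "d3"]
print(rrf_fuse([vector_hits, fts_hits]))  # → ['d1', 'd3', 'd9', 'd7']
```

RRF also gives graceful fallback for free: if one retriever returns nothing (e.g., the vector index is unavailable), the fused ranking degrades to the surviving list rather than failing.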
The four beats, stated plainly.
- BTUT — a new substrate for complex multi-agent systems. The class that civilization runs on: traffic, grids, drones, logistics, civic coordination, markets. BTUT replaces the O(N³) PDE default with a phase transition on a scale-free network (mean-field universality, β ≈ 0.5, constant 12-iteration convergence from 500 to 10,000 agents). One primitive, six live applied domains: traffic (Eclipse SUMO, 800-vehicle peak stress, zero gridlock), robotics (ROS / Turtlebot3), drone swarms (50 – 200, 100% cooperation), civic data (franklinstreetdata.com), game modeling (bigdunc.com), and four production cloud surfaces. Also: a new approach to DARPA Mathematical Challenge 13.
- TCD-JEPA (Crystara) — a new AI training paradigm beyond transformers. Transformer scaling pours compute through a fixed attention architecture. TCD-JEPA refuses the fixed-architecture assumption — a recursive three-system loop explores the energy landscape with Fisher-information Langevin dynamics, runs Vietoris–Rips persistent homology on the trajectories, and crystallizes stable features into typed H₀ / H₁ / H₂ predictor modules at runtime. +36.6 AUC pts over baseline JEPA on Georgetown CSET’s 519-entity semiconductor supply chain (82.7% vs 46.1%); also beats supervised GAT, GCN, and GraphSAGE. +22.1 on GDELT; +20.0 at SEC EDGAR scale, where GAT runs out of memory. 16 interpretable modules crystallized from persistent homology, 1-to-1 with real industry clusters — no labels, no prompting. The first runtime-discovered predictor architecture for the JEPA family.
- Regenerationism — a new economic philosophy, with NIV as its first operational instrument. The school’s claim: the leading indicator of macroeconomic regime health is the velocity of capital formation with compounding margins, measured against cumulative friction — not bond-market sentiment, not linear averages of coincident series, not equilibrium-return dynamics. NIV writes that school as a scalar. Ensemble ROC-AUC 0.8538 at 18 months across 504 months (1970 – 2024) and six OOS tests. 98.5% false-alarm suppression — 7 critical alerts in 42 years. 41.71% orthogonal variance to the Fed 10Y – 3M spread. Under Gini importance, the regenerative-capital term scores 0.9328 and the yield spread scores 0.0298 — the model picked the school unprompted.
- Participatory Data Estate — a new framework for digital data governance. The dominant data-governance stacks treat ingestion as batched, moderation as private, and audit trails as compliance tax. The Participatory Data Estate inverts all three: continuous ingestion (Submit → Moderate → Thin → Crystallize), moderation as a public transition (publicly readable approval ledger via RLS), hybrid pgvector + GIN FTS retrieval with graceful fallback, federal-hardening controls (time-constant auth, RLS on every table, rate limiting on four action classes, CSP/HSTS, XSS detection). SGUNCCH is the first live deployment — a full UNC student-government stack running the framework end-to-end. A security posture student government has never needed, and has never had.
Every “paradigm” cited above has a working implementation, a measured benchmark against a named incumbent, and an open repository. The deep-dive chapters below give the mathematics, the tables, and the tear sheets in full.