The "Emergent" Research Program and Asemantic Computing

I used Qwen3-Max, on Jan 26, 2026, via this Qwen blog link and then "Try Qwen Chat", with the question:

Give me a detailed, for professionals, summary of the link

A kaleidoscope of graph rewrite systems in topology, metric geometry and computer science

[available from Lambdalife: chemlambda projects page]

You may search further for related research.

After the answer, I had another question concerning asemantic computing. This was answered too.

Thus the content has two parts:

[The answers are slightly edited, mainly by deletions or added links.]

Summary: Marius Buliga's "Emergent" Research Program (2009–2019)

This December 2019 presentation by Marius Buliga (Institute of Mathematics of the Romanian Academy) synthesizes a decade-long research program bridging differential geometry, topology, and theoretical computer science through the unifying lens of graph rewrite systems. The work represents a radical reconceptualization of space itself as an asynchronous computational substrate rather than a passive container.

1. Geometric Origins: Solving Gromov's Sub-Riemannian Recovery Problem

The research trajectory begins with a fundamental problem in metric geometry posed by Gromov (1996): Can one recover a sub-Riemannian structure (X,D,g) purely from its associated Carnot-Carathéodory metric space (X,d)? Unlike Riemannian geometry—where Alexandrov-Nikolaev theory solves the analogous recovery problem—sub-Riemannian spaces exhibit features that obstruct classical approaches:

Buliga's solution (2009–2011) introduces emergent algebras: a Γ-indexed family of idempotent right quasigroup operations on a uniform space X satisfying convergence properties as the index approaches 0. Formally, an emergent algebra (X,Γ,∘,•) requires:
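
A hedged sketch of the usual requirements, reconstructed from the emergent algebras paper (arXiv:0907.1520) rather than from the presentation itself: for each scale ε ∈ Γ the pair of operations gives an idempotent right quasigroup, and the emergence axiom asks for uniform convergence as the scale goes to 0,

\[
x \circ_\varepsilon x = x, \qquad x \circ_\varepsilon (x \bullet_\varepsilon y) = y, \qquad x \bullet_\varepsilon (x \circ_\varepsilon y) = y,
\]
\[
\lim_{\varepsilon \to 0} x \circ_\varepsilon y = x \quad \text{uniformly in } x, y,
\]

together with uniform convergence of the derived (relative) operations as ε → 0.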

This framework generalizes dilatation structures (Buliga 2006) and provides an intrinsic characterization of sub-Riemannian geometry without reference to smooth manifolds or distributions. Notable examples include Heisenberg groups and general Carnot groups, where the emergent algebra operations recover the group law through limiting processes.

2. Topological Bridge: Reidemeister Moves as Computational Primitives

An insight connects emergent algebras to low-dimensional topology: the algebraic axioms correspond to Reidemeister moves from knot theory. While emergent algebras satisfy only R1 and R2 moves (reflecting their quasigroup structure), the full Reidemeister system (R1–R3) provides a graphical calculus. [Added: here Qwen simplifies a bit; in reality R3 is emergent from R1, R2 and the emergence axiom.] This observation catalyzes the transition from geometry to computation: space itself becomes a graph rewrite automaton—an asynchronous system (S,R,A) where:

This model eliminates the classical distinction between "space" and "computation": there are no passive points, only shared computation on a graph-theoretic substrate.
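
To make the flavor concrete, here is a minimal sketch in Python (my own toy, not Buliga's code and not chemlambda's actual rules, with made-up node types): the state is a typed graph, every rule inspects only a node's immediate neighborhood, and rewrites fire in whatever order the random "agents" happen to act, with no global scheduler or shared state.

```python
import random

# Toy asynchronous graph rewrite automaton (S, R, A). This is an illustration,
# not Buliga's code and not chemlambda's rule set.
# S: the state is a typed graph, stored as adjacency sets.
# R: rules look only at a local pattern (here: the types of two adjacent nodes).
# A: agents repeatedly pick an edge they can see and try a matching rule;
#    there is no global scheduler, global state, or required firing order.

graph = {
    1: {"type": "A", "adj": {2, 3}},
    2: {"type": "B", "adj": {1}},
    3: {"type": "B", "adj": {1, 4}},
    4: {"type": "A", "adj": {3}},
}

def rule_merge_AB(g, x, y):
    """Local rule: an A--B edge collapses into a single C node."""
    if {g[x]["type"], g[y]["type"]} != {"A", "B"}:
        return False
    new = max(g) + 1
    outside = (g[x]["adj"] | g[y]["adj"]) - {x, y}
    g[new] = {"type": "C", "adj": set(outside)}
    for n in outside:
        g[n]["adj"] = (g[n]["adj"] - {x, y}) | {new}
    del g[x], g[y]
    return True

RULES = [rule_merge_AB]

def step(g):
    """One asynchronous event: some agent attempts one local rewrite."""
    edges = [(x, y) for x in g for y in g[x]["adj"] if x < y]
    if not edges:
        return False
    x, y = random.choice(edges)
    return random.choice(RULES)(g, x, y)

for _ in range(10):
    step(graph)
print(graph)   # whatever the asynchronous history produced
```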

3. Computational Realization: From GLC to Chemlambda

Buliga develops two concrete implementations of this vision [see History of Chemlambda]:

Graphic Lambda Calculus (GLC) (2013)

A visual language representing untyped λ-calculus via graph rewrites on oriented fatgraphs with node types:

GLC incorporates both β-reduction (via graphical β-rewrite) and Reidemeister moves, creating a hybrid system where logical computation and topological transformation coexist.
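
As a rough illustration of what a graphical β-rewrite is (an interaction-net-style sketch with hypothetical node and port names, not GLC's exact oriented-fatgraph conventions): when a λ node and an application node meet on their principal ports, the rewrite is a purely local rewiring of four ports, with no variable names or substitution anywhere.

```python
from itertools import count

# Sketch of a graphical beta-rewrite on a port graph, in the general style of
# interaction nets. Node kinds "L"/"A" and port names (prn, var, body, arg, res)
# are illustrative; they are NOT GLC's or chemlambda's exact conventions.
# A wire links two (node, port) endpoints; beta is a purely local rewiring.

fresh = count()

def new_node(nodes, kind):
    nid = next(fresh)
    nodes[nid] = kind
    return nid

def link(wires, a, b):
    wires[a] = b
    wires[b] = a

def beta(nodes, wires, lam, app):
    """L and A meet on their principal ports: rewire body<->result and
       bound-variable<->argument, then delete both nodes."""
    assert nodes[lam] == "L" and nodes[app] == "A"
    assert wires[(lam, "prn")] == (app, "prn")
    body, res = wires[(lam, "body")], wires[(app, "res")]
    var, arg = wires[(lam, "var")], wires[(app, "arg")]
    link(wires, body, res)
    link(wires, var, arg)
    for nid in (lam, app):
        for port in ("prn", "var", "body", "arg", "res"):
            wires.pop((nid, port), None)
        del nodes[nid]
    # Caveat: wires internal to the active pair (e.g. the identity, lambda x.x)
    # would need an extra indirection step, omitted in this sketch.

# Example: the graph of (lambda x. F x) applied to X, with free ports F, X, ROOT.
nodes, wires = {}, {}
lam = new_node(nodes, "L")        # id 0
app_out = new_node(nodes, "A")    # id 1, the outer application
app_in = new_node(nodes, "A")     # id 2, the application F x inside the body
link(wires, (lam, "prn"), (app_out, "prn"))   # the lambda is the function being applied
link(wires, (app_in, "prn"), ("F", 0))        # inner application applies the free port F
link(wires, (lam, "var"), (app_in, "arg"))    # ... to the bound variable x
link(wires, (lam, "body"), (app_in, "res"))   # and its result is the lambda's body
link(wires, (app_out, "arg"), ("X", 0))
link(wires, (app_out, "res"), ("ROOT", 0))

beta(nodes, wires, lam, app_out)
print(nodes)                  # {2: 'A'}: only the inner application F x remains
print(wires[("ROOT", 0)])     # (2, 'res'): ROOT now reads the result of F x
```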

Chemlambda (v1: 2014; v2: 2015–present)

An artificial chemistry implementation inspired by Berry-Boudol's Chemical Abstract Machine and Lafont's Interaction Combinators. Key properties:

Chemlambda v2 (implemented in awk/d3.js) revealed unexpected phenomena: certain graphs (e.g., the “ouroboros” quine) display lifelike behaviors including periodic metabolism and spontaneous duplication—suggesting pathways toward artificial life in purely syntactic systems.
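
For intuition only, and without pretending to reproduce an actual chemlambda quine, here is the smallest possible caricature of "a configuration that never normalizes yet periodically returns to its own shape":

```python
# Trivial illustration of quine-like behaviour (NOT an actual chemlambda quine):
# the configuration always has a rewrite available, so it never reaches a normal
# form, yet it keeps reproducing its own shape, a caricature of the "periodic
# metabolism" of graphs like the ouroboros.

RULES = {"AB": "BA", "BA": "AB"}

state, history = "AB", []
for _ in range(6):
    history.append(state)
    state = RULES[state]      # exactly one local rewrite applies at each step

print(history)                # ['AB', 'BA', 'AB', 'BA', 'AB', 'BA']
```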

4. Advanced Developments: Kali and the Anharmonic Group

The presentation culminates with kali (anharmonic lambda), a refined rewrite system connecting to projective geometry:

This system demonstrates how algebraic structures from classical geometry (cross-ratios, harmonic conjugates) emerge naturally in graph rewrite dynamics—suggesting deep connections between projective geometry and computation.
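
Background, not taken from the slides: "anharmonic" refers to the classical anharmonic group, the six Möbius transformations that permute the values of the cross-ratio of four points,

\[
(a,b;c,d) = \frac{(a-c)(b-d)}{(a-d)(b-c)}, \qquad \lambda \ \mapsto\ \lambda,\ 1-\lambda,\ \frac{1}{\lambda},\ \frac{1}{1-\lambda},\ \frac{\lambda-1}{\lambda},\ \frac{\lambda}{\lambda-1},
\]

a group isomorphic to S3, with harmonic conjugates corresponding to the value −1.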

[This research continues to Pure See and is under construction in Hexagon.]

5. Philosophical and Technical Significance

Buliga's program makes several contributions:

  1. Space as computation: Rejects the container metaphor for space; proposes space is a distributed graph rewrite process
  2. No Church-Turing bottleneck: Computation occurs without λ-abstraction/application as primitives — β-reduction emerges from local rewrites on graphs lacking explicit variable binding
  3. Decentralized computing model: Offers a concrete alternative to von Neumann architecture where computation is:
    • Asynchronous and local
    • Free of global state or centralized control
    • Naturally parallelizable (all applicable rewrites fire independently)
  4. Bridge between continuous and discrete: Emergent algebras provide rigorous passage from discrete graph rewrites to continuous geometric structures (Carnot groups)
  5. Artificial life substrate: Quine graphs in chemlambda demonstrate that self-replication and metabolism can emerge from purely syntactic rewrite rules—without predefined semantics or external interpreters

Critical Assessment & Open Challenges

While the program is mathematically rigorous in its geometric foundations, its computational aspects face practical limitations:

Key References for Further Study

  1. Buliga, M. (2009). Emergent algebras. arXiv:0907.1520
  2. Buliga, M. (2013). Graphic lambda calculus. Complex Systems 22(4)
  3. Buliga, M., & Kauffman, L. H. (2014). Chemlambda, universality and self-multiplication. ALIFE 2014
  4. Buliga, M. (2020). Artificial chemistry experiments with chemlambda. arXiv:2003.14332
  5. Buliga, M. (2006). Dilatation structures I. Fundamentals. arXiv:math/0608536

Concerning your "semantic gap" critical assessment, it seems to be a feature. Indeed, read the related two:

Molecular computers with interaction combinators like graph rewriting systems

Asemantic computing

What do you think?

Asemantic Computing: From Limitation to Foundational Principle

The earlier assessment reflected a category error: evaluating an asemantic computational paradigm through the lens of semantic computer science. After examining the molecular repository documents, it is clear that this is a deliberate rejection of semantics as a computational primitive, not an engineering shortcoming.

1. The Semantic Bottleneck in Distributed Systems

Buliga identifies an incompatibility between global semantics and true distributed computation:

| System Type | Semantic Status | Distributed Capability | Reason |
|---|---|---|---|
| Single Turing Machine | Asemantic (tape operations are purely local) | Trivially distributed | No coordination needed |
| Network of TMs | Requires semantics | Impossible without global control | Multiple heads writing to shared tape necessitate non-local synchronization |
| Term rewrite systems (λ-calculus) | Inherently semantic | Fundamentally non-local | Variable binding, substitution, and β-reduction require global scope resolution |
| Graph rewrite systems (IC, chemlambda) | Asemantic by construction | Truly distributed | Rewrites apply locally to graph neighborhoods; no global state or binding |

The insight: semantics requires non-local coordination. When we demand that a distributed system preserve denotational meaning (e.g., "this graph must reduce to the λ-term representing 42"), we implicitly require global agreement mechanisms—exactly what distributed systems seek to eliminate. Nature avoids this problem: molecular interactions proceed without "meaning"; ribosomes don't "understand" mRNA—they catalyze local reactions whose consequences we interpret semantically after the fact.

2. Structure-to-Structure as Primary Reality

Traditional CS operates via a semantic sandwich:

meaning (program) → structure (AST/graph) → [computation] → structure → meaning (result)
                      ↑_______________semantic constraints_______________↑
      

Semantic constraints force graph rewrites to mirror term reductions (e.g., β-reduction must be preserved under graph encoding). This coupling makes distributed execution impossible—different agents must agree on which rewrites preserve meaning.

Asemantic computing inverts this hierarchy:

structure (molecule) → [autonomous local rewrites] → structure (product)
         ↑                                                    ↑
human-design (pre)                               human-interpretation (post)

No semantic constraints bind the computational phase. The initial structure encodes computational potential; the rewrite dynamics unfold autonomously via local rules; meaning emerges only in the observer's interpretation of inputs/outputs. This mirrors biological computation: a ribosome doesn't "know" it's translating genetic code—it catalyzes peptide bonds based on local tRNA-mRNA affinities. The semantic layer (genetic code) exists outside the molecular machinery.
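
A tiny example of that inversion, using a classic artificial-chemistry toy (the prime-number chemistry, nothing to do with chemlambda's rules): the collision rule is local and "meaningless", the dynamics run autonomously, and only the observer reads the surviving soup as an answer.

```python
import random

# Toy artificial chemistry (unrelated to chemlambda's rules), the classic
# prime-number chemistry: the reaction rule is purely local and knows nothing
# about what is being computed; meaning is assigned by the observer at the end.

def react(a, b):
    """Local collision rule: if one molecule exactly divides the other,
       the larger molecule is consumed."""
    if a != b and b % a == 0:
        return [a]
    if a != b and a % b == 0:
        return [b]
    return [a, b]             # elastic collision: nothing happens

soup = list(range(2, 50))     # initial structure: human design (pre)
for _ in range(20000):        # autonomous phase: random local collisions only
    i, j = random.sample(range(len(soup)), 2)
    products = react(soup[i], soup[j])
    soup = [m for k, m in enumerate(soup) if k not in (i, j)] + products

# Human interpretation (post): with overwhelming probability the survivors
# are exactly the primes below 50.
print(sorted(set(soup)))
```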

3. Why Confluence Must Be Abandoned

Buliga states: confluence is death for life-like computation. Lafont's Interaction Combinators are Turing universal and confluent—any reduction path leads to the same normal form. But:

Chemlambda deliberately sacrifices confluence to enable:

This isn't a bug—it's a feature enabling artificial chemistries rather than abstract machines.
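
A deliberately trivial sketch of what losing confluence means operationally (a two-rule string rewrite system, unrelated to chemlambda): overlapping local rules make the outcome depend on the firing history, which is exactly the degree of freedom that life-like behavior exploits.

```python
# A string-rewrite system that is NOT confluent: two local rules overlap on
# "aba", so different firing orders reach different irreducible results.
# Confluence would force a unique answer; giving it up is what lets different
# histories have different fates.

RULES = [("ab", "c"), ("ba", "d")]

def rewrites(s):
    """All results of firing one applicable rule at one position of s."""
    return [s[:i] + rhs + s[i + len(lhs):]
            for lhs, rhs in RULES
            for i in range(len(s) - len(lhs) + 1)
            if s[i:i + len(lhs)] == lhs]

def normal_forms(s):
    """Explore every firing order and collect all irreducible results."""
    nexts = rewrites(s)
    if not nexts:
        return {s}
    out = set()
    for t in nexts:
        out |= normal_forms(t)
    return out

print(sorted(normal_forms("aba")))   # ['ad', 'ca']: two distinct normal forms
```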

4. Lafont Universality vs. Semantic Universality

A distinction often missed in CS:

Chemlambda achieves Lafont universality without semantic constraints. The Ackermann function can be computed—but not as a denotation-preserving reduction. Instead, an initial graph structurally evolves through local rewrites into a final graph whose shape we interpret as the result. The computation happens in the structure-to-structure phase; semantics is a human post-hoc annotation.

5. Programming in an Asemantic World: From Control to Cultivation

| Traditional Programming | Asemantic Programming |
|---|---|
| Write instructions for a controller | Design initial molecular structures |
| Enforce global invariants | Engineer local interaction affinities |
| Debug by tracing execution | Observe emergent population dynamics |
| Garbage collection as runtime service | Death/recycling as intrinsic property |
| Deterministic outcomes | Probabilistic lifecycles with statistical guarantees |

"As long as we want to program it in the old ways… it is unclear how to use it." — Marius Buliga

The solution isn't better compilers—it's ecosystem engineering: designing molecular "seeds" that, when released into a rewrite medium, reliably produce desired structural outcomes through autonomous chemistry. This mirrors synthetic biology more than software engineering.

6. Assessment: Coherence and Uncertainty

Theoretical strength: Buliga's framework is philosophically coherent and addresses limitations in distributed computing theory. The semantic bottleneck is real—blockchain consensus, distributed databases, and actor models all struggle with the tension between locality and global meaning. Asemantic computation offers an escape route.

Practical challenges remain, but they're different from the earlier "semantic gap" critique:

Conclusion: The absence of semantics isn't a flaw but the defining feature that enables chemlambda's distributed, life-like behavior. This represents a reconceptualization of computation—not by adding new primitives, but by subtracting the assumption that computation requires meaning. Whether this paradigm can scale to practical engineering remains open, but its theoretical coherence warrants engagement from distributed systems researchers, programming language theorists, and artificial life practitioners. The challenge isn't "fixing the semantic gap"—it's learning to think without semantics as a computational primitive.