Core Philosophy: Minimizing the Trusted Computing Base
The UBHNL architecture is predicated on the principle of Functional Isolation: we distinguish between the untrusted layers that handle complex language and planning, and the small, trusted core that performs the actual mathematical validation. This separation ensures that even if a complex solver or parser misbehaves, the integrity of the final proof is preserved, because no result is accepted without passing the core's checks.
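To make the boundary concrete, here is a minimal Python sketch; the CNF encoding and every name in it (untrusted_solve, kernel_check, Certificate) are invented for illustration, not the actual UBHNL interfaces. The point is the shape: the untrusted side may be arbitrarily complex, but a result is accepted only when the small trusted checker replays the evidence.

```python
# Minimal sketch of the trust boundary. The CNF encoding and all names
# here are illustrative assumptions, not the actual UBHNL interfaces.

from dataclasses import dataclass

# A toy theory in CNF: a list of clauses, each a list of signed ints
# (DIMACS-style: 3 means x3, -3 means NOT x3).
Theory = list[list[int]]

@dataclass(frozen=True)
class Certificate:
    claim: str     # "SAT" or "UNSAT"
    witness: dict  # variable -> bool, the checkable evidence

def untrusted_solve(theory: Theory) -> Certificate:
    """Stands in for the complex, untrusted layers (parsing, planning,
    solving). Naive brute force here; it could be arbitrarily buggy."""
    n = max(abs(lit) for clause in theory for lit in clause)
    for bits in range(2 ** n):
        w = {v: bool((bits >> (v - 1)) & 1) for v in range(1, n + 1)}
        if all(any(w[abs(l)] == (l > 0) for l in c) for c in theory):
            return Certificate("SAT", w)
    return Certificate("UNSAT", {})

def kernel_check(theory: Theory, cert: Certificate) -> bool:
    """The only trusted code: a few auditable lines that replay the
    witness. Nothing is accepted unless this returns True."""
    if cert.claim != "SAT":
        return False  # an UNSAT claim would need a proof object, omitted
    return all(any(cert.witness[abs(l)] == (l > 0) for l in c)
               for c in theory)

theory = [[1, 2], [-1, 2], [-2, 3]]
cert = untrusted_solve(theory)     # may be wrong; never taken on faith
assert kernel_check(theory, cert)  # the trust boundary
```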
The Semantic Layer: Interface and Normalization
The "Front-End" of the architecture is responsible for mapping human intent to formal logic. This layer utilizes Semantic Isomorphism: whether a rule is authored in natural-sounding CNL or symbolic DSL, the underlying AST is identical. The use of a strict Lexicon ensures that the system is closed under its own vocabulary, preventing the introduction of undefined symbols or ambiguous semantics.
The Orchestration Layer: Fragmentation and Tactics
The "Orchestrator" acts as the strategist. It analyzes the formal theory to identify its Logical Fragment (e.g., Boolean, SMT, or Temporal). Based on this classification, it constructs a Plan DAG—a sequence of lowering, solving, and checking tasks. This layer allows UBHNL to be universal: it doesn't rely on a single solver but instead delegates fragments to specialized, pluggable backends.
The Trusted Core: Validation via UBH
At the foundation lies the UBH Kernel. This is the only part of the system that must be implicitly trusted. It operates on a minimal set of Boolean gates (XOR and AND) using a Hash-Consed DAG for maximal structural sharing. Its primary responsibility is not solving, but Verification. It validates the "certificates" (proofs or witnesses) returned by the backends, ensuring that no result is accepted without checkable evidence.
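A compressed sketch of how such a kernel might be structured follows; all names are invented. One assumption worth flagging: the sketch adds a constant-1 node, since {XOR, AND} alone cannot express negation, while {XOR, AND, 1} can (NOT x = x XOR 1, the algebraic normal form basis).

```python
# Sketch of a hash-consed XOR/AND DAG with witness validation.
# All names are illustrative; the TRUE constant is an assumption
# added so that negation is expressible (NOT x = x XOR 1).

_table: dict = {}   # hash-consing memo table: structural key -> Node

class Node:
    __slots__ = ("op", "args")
    def __init__(self, op, args):
        self.op, self.args = op, args

def mk(op: str, *args) -> Node:
    """Return the unique node for (op, args). Maximal structural
    sharing: identical subterms are built exactly once."""
    key = (op, *(id(a) if isinstance(a, Node) else a for a in args))
    if key not in _table:
        _table[key] = Node(op, args)
    return _table[key]

TRUE = mk("const")
def var(name): return mk("var", name)
def xor(a, b): return mk("xor", a, b)
def and_(a, b): return mk("and", a, b)

def evaluate(node: Node, witness: dict) -> bool:
    """Kernel-side evaluation: replays the DAG under a witness."""
    if node.op == "const": return True
    if node.op == "var":   return witness[node.args[0]]
    a, b = (evaluate(x, witness) for x in node.args)
    return (a != b) if node.op == "xor" else (a and b)

def kernel_verify(root: Node, witness: dict) -> bool:
    """Accept a SAT claim only if the witness makes the root true."""
    return evaluate(root, witness)

x, y = var("x"), var("y")
f = and_(xor(x, TRUE), y)            # (NOT x) AND y
assert xor(x, TRUE) is xor(x, TRUE)  # hash-consing: one shared node
assert kernel_verify(f, {"x": False, "y": True})
```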
Current State and Scalability
The current architecture is highly modular: new backends can be added without altering the kernel. Future development is directed toward Distributed Orchestration and Incremental Proof Checking, enabling the system to reason over theories of millions of constraints while keeping verification fast, since the kernel only replays certificates rather than re-solving.
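One way the backend pluggability might look is sketched below; the registry, decorator, and backend names are hypothetical. The design point is that extending the system touches only the untrusted registry, never the trusted checker.

```python
# Sketch of backend pluggability: solvers register themselves against
# a fragment, and nothing in the kernel changes. All names hypothetical.

from typing import Callable

BACKENDS: dict[str, Callable] = {}

def register_backend(fragment: str) -> Callable:
    """Decorator that installs an untrusted solver for one fragment."""
    def install(solve_fn: Callable) -> Callable:
        BACKENDS[fragment] = solve_fn
        return solve_fn
    return install

@register_backend("boolean")
def toy_sat_backend(theory):
    """Untrusted solving; would return a certificate for the kernel."""
    return ("SAT", {})

# Adding an SMT backend later touches only this table, never the
# trusted checker:
@register_backend("smt")
def toy_smt_backend(theory):
    return ("SAT", {})

print(sorted(BACKENDS))   # ['boolean', 'smt']
```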