AI-assisted implementation
AI plays two distinct roles in NEXUS. Conflating
them misrepresents both. Understanding the
distinction is essential to understanding how
NEXUS works.
The Two AI Roles
AI as design collaborator — works alongside
the human during the design phase. Helps explore
the Event Model, challenges assumptions, identifies
gaps in paths, refines specs, asks clarifying
questions. The human drives; the AI is a highly
informed thinking partner that extends human
judgment rather than replacing it.
This is not a future capability — it is how NEXUS
design work happens now. AI conversations that
shape designs, refine requirements, and work
through specifications are part of the process.
Those conversations are preserved in LOGOS as
provenance — the reasoning behind decisions is
part of the permanent record.
CORTEX is the long-term vision for this role:
an AI deeply familiar with the accumulated
knowledge base, established patterns, previous
decisions, and the full history of the system
being designed. Not a generic AI assistant —
a collaborator you have shaped: trained on this
system's history, this methodology as you
practice it, and your own preferences,
constraints, and ways of thinking.
AI as renderer — receives an approved spec
and produces code. A fundamentally different
task requiring no design judgment. The design
is done. The renderer's job is faithful,
idiomatic translation of the spec into the
target language following the patterns and
conventions NEXUS prescribes.
The renderer could be a purpose-trained or
fine-tuned model optimized specifically for
rendering NEXUS specs into idiomatic F#.
Different training, different optimization,
different role than the design collaborator.
The Handoff Point
The approved spec is the handoff between the
two AI roles.
Design collaborator works up to approval —
helping shape the spec, identify gaps, refine
rules and constraints, validate example data.
Renderer takes over after approval — consuming
the spec as a complete, unambiguous input and
producing code from it.
This separation has a practical consequence:
the two roles can be filled by different
AI systems. The design collaborator needs
broad reasoning ability, knowledge of the
domain, and familiarity with the accumulated
LOGOS knowledge base. The renderer needs deep,
specific knowledge of F# and NEXUS patterns —
trained and tuned for that specific task.
Rendering, Not Implementing
The word "implementation" slightly misrepresents
what the renderer does. Implementation implies
creative problem-solving — figuring out how to
make something work. The renderer does not do
that. The design has solved the problem. The
renderer translates the solution into code.
Rendering is the better term. The same approved
spec can be rendered into multiple target
languages. The render
target is a parameter — F#, JavaScript, Haskell,
Kotlin — not a design decision. Changing the
render target does not require redesigning.
NEXUS currently uses F# as its primary render
target. The methodology does not require it.
See: NEXUS → Language-agnostic design — one
model, many implementations.
The Rendering AI
The rendering AI for NEXUS F# is a specific,
trainable target. It has:
- Well-defined inputs — approved specs in a consistent format
- Well-defined outputs — idiomatic F# following NEXUS patterns
- Explicit rules — how to approach specific tasks, which patterns to apply, which conventions to follow
- A growing training corpus — every spec-to-implementation pair that accumulates over time becomes training data
The renderer improves as more specs are rendered.
Early renders are informed by the spec and
general F# knowledge. Later renders are informed
by accumulated examples of how NEXUS specs
become NEXUS F# — the specific patterns, the
type-driven conventions, the event sourcing
idioms that NEXUS prescribes.
This is a tractable fine-tuning target. The input
format is controlled. The output conventions are
defined. The corpus grows naturally through use.
F# and Type-Driven Design
Within NEXUS, F# is the chosen primary render
target for reasons that align directly with
the methodology:
Invalid states are unrepresentable — F#'s
type system, used correctly, makes it impossible
to construct values that violate business rules.
Constraints specified in a spec map directly to
F# types and smart constructors that enforce
them at compile time.
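What that mapping looks like can be sketched with a hypothetical constraint ("quantity must be between 1 and 100", invented for this example, not taken from any real NEXUS spec) rendered as a type with a smart constructor:

```fsharp
// Hypothetical spec constraint: quantity must be between 1 and 100.
// The type and names are illustrative, not from a real NEXUS spec.
type Quantity = private Quantity of int

module Quantity =
    // The only way to obtain a Quantity; an out-of-range value
    // cannot be constructed anywhere in the program.
    let create n =
        if n >= 1 && n <= 100 then Ok (Quantity n)
        else Error (sprintf "quantity %d out of range 1..100" n)

    let value (Quantity n) = n
```

Because the constructor is private, any code holding a Quantity holds a value the constraint has already validated; the invalid state simply has no representation.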
Event sourcing is natural — discriminated
unions model event types cleanly. Pattern
matching over events is exhaustive and
compiler-checked. Projections as pure functions
from events to state are idiomatic F#.
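As a sketch (event and state names invented for illustration, not drawn from a real model): a discriminated union of events, an exhaustive match, and a projection as a pure fold.

```fsharp
// Illustrative only: the events and state here are invented.
type CartEvent =
    | ItemAdded of sku: string * qty: int
    | ItemRemoved of sku: string
    | CheckedOut

type CartState = { Items: Map<string, int>; IsOpen: bool }

let initial = { Items = Map.empty; IsOpen = true }

// The match is exhaustive: adding a new case to CartEvent makes
// the compiler flag every projection that does not handle it.
let apply state event =
    match event with
    | ItemAdded (sku, qty) -> { state with Items = Map.add sku qty state.Items }
    | ItemRemoved sku -> { state with Items = Map.remove sku state.Items }
    | CheckedOut -> { state with IsOpen = false }

// A projection is a pure function from an event stream to state.
let project events = List.fold apply initial events
```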
Functions are first class — commands and
projections are functions. F# treats functions
as first-class values. The alignment between
the Event Modeling concept and F# implementation
is direct.
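A small sketch of that alignment (types invented for illustration): the command handler and the projection are ordinary values of function type, so they compose and test like any other function.

```fsharp
// Invented types for illustration; not from a real NEXUS model.
type OrderEvent = OrderPlaced of orderId: int
type OrderState = { Placed: bool }
type OrderCommand = PlaceOrder of orderId: int

// A command handler is a pure function: current state and intent
// in, new events (or a rejection) out.
let decide state (PlaceOrder id) =
    if state.Placed then Error "order already placed"
    else Ok [ OrderPlaced id ]

// The projection is likewise just a function, usable directly as
// the folder argument of List.fold.
let apply state (OrderPlaced _) = { state with Placed = true }

let replay events = List.fold apply { Placed = false } events
```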
The rendering AI, working from an approved spec,
produces F# that reflects these properties —
not just code that passes tests but code that
uses the type system to make correct states
natural and incorrect states impossible.
The Design Collaborator in Practice
Using AI as a design collaborator is not a
special mode or a future state — it is integrated
into how NEXUS design work happens. An AI
conversation that works through an Event Model,
refines a path, or challenges a spec is part
of the design process.
What makes this work well rather than poorly:
The AI needs context. A generic AI assistant
with no knowledge of the system produces generic
suggestions. CORTEX — fed from LOGOS — produces
informed suggestions grounded in the actual
history and decisions of the system being
designed.
The human drives. The AI surfaces options,
identifies gaps, asks clarifying questions. The
human makes decisions. The AI does not approve
its own suggestions — the human does.
Conversations are preserved. The reasoning
that happens during AI-assisted design is not
lost when the conversation ends. It lives in
LOGOS, linked to the design decisions it
produced, permanently part of the record.
See: NEXUS → The knowledge accumulation
principle.
See: CORTEX — Start Here.
Acceptance
Design collaborator: An AI with access to
the LOGOS knowledge base can participate
meaningfully in Event Modeling, path definition,
and spec refinement — producing suggestions
grounded in the system's accumulated decisions
rather than generic patterns.
Renderer: An AI given only an approved spec
produces an implementation that passes all
acceptance criteria and satisfies all example
data cases. The implementation uses F# types
to make invalid states unrepresentable where
the spec prescribes constraints. No design
decisions are made during rendering — all
design decisions are resolved in the spec.
See Also
- Event Modeling as the design foundation
- Specs and the design-before-implementation discipline
- Paths as first-class artifacts
- The problem NEXUS solves
- NEXUS → Language-agnostic design — one model, many implementations
- NEXUS → The knowledge accumulation principle
- NEXUS → Tools and how they fit