Module system primitives
The Nix module system implements a sophisticated configuration composition mechanism that can be understood through its core algebraic primitives. This document provides three-tier explanations (intuitive, computational, and formal) for each primitive to illuminate both practical usage and mathematical structure.
Understanding these primitives is essential for working with Nix at scale because they explain why the module system supports complex patterns like conditional imports, priority overrides, and recursive submodules while maintaining predictable semantics. The algebraic foundations are not merely theoretical—they are the reason NixOS configurations compose reliably.
Where this fits
This document provides the foundational understanding of Nix module system primitives. After mastering these concepts, see:
- Flake-parts and the module system — How flake-parts wraps these primitives for flake integration
- Deferred module composition — Organizational patterns built on these primitives
deferredModule
Source reference: nixpkgs lib/types.nix:1138-1180
Intuitive explanation
A deferred module is a module definition that hasn’t been evaluated yet because it needs to wait for the final configuration to be computed. Think of it as a recipe that says “once you know what the final meal looks like, I’ll tell you what ingredients I need.” This is the fundamental mechanism that allows modules to reference the final configuration value without creating infinite loops.
The most common form is a function taking { config, ... } arguments, where config refers to the fully-merged configuration after all modules have been combined.
This enables conditional logic like “if some other option is enabled, then enable this feature too” without the evaluator getting stuck in circular dependencies.
Computational explanation
When you write a module as a function:
```nix
{ config, pkgs, ... }: {
  services.nginx.enable = config.services.web.enable;
}
```
This function IS called immediately during the collection phase. The module system:
- Collects all modules by calling their functions with args including a lazy `config` reference
- Normalizes returned attrsets to the canonical form `{ _file, key, imports, options, config }`
- Computes a fixpoint where the lazy `config` reference resolves to the merged result
The “deferral” is in the lazy evaluation of config values within the returned attrsets, not in suspending function calls.
The type implementation shows this clearly:
```nix
deferredModuleWith = { staticModules ? [] }: mkOptionType {
  name = "deferredModule";
  check = x: isAttrs x || isFunction x || path.check x;
  merge = loc: defs: {
    imports = staticModules ++ map (def:
      lib.setDefaultModuleLocation "${def.file}, via option ${showOption loc}" def.value
    ) defs;
  };
  # ...
};
```
The merge function doesn’t evaluate the modules—it just collects them into an imports list.
The actual evaluation happens later in evalModules via the fixpoint computation.
Type vs value: deferredModule is an option type (specifically types.deferredModuleWith), not a value. When you write flake.modules.homeManager.foo = { ... }: { ... }, you’re assigning a value of type deferredModule to an option. The type’s merge function defines how multiple such values compose.
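To make the type-versus-value distinction concrete, here is a minimal standalone sketch (assuming `<nixpkgs>` is on `NIX_PATH`; the option name `myModules.base` is purely illustrative): a deferredModule-typed option collects a module in one evaluation, and that module only takes effect when a second `evalModules` call imports it.

```nix
let
  lib = import <nixpkgs/lib>;

  # Module that declares a deferredModule-typed option and assigns a value to it.
  producer = { lib, ... }: {
    options.myModules.base = lib.mkOption {
      type = lib.types.deferredModule;  # the value is collected here, not evaluated
    };
    config.myModules.base = { lib, ... }: {
      options.greeting = lib.mkOption {
        type = lib.types.str;
        default = "hello";
      };
    };
  };

  outer = lib.evalModules { modules = [ producer ]; };

  # The deferred module is only evaluated when a consumer imports it into
  # its own evalModules call.
  inner = lib.evalModules { modules = [ outer.config.myModules.base ]; };
in
  inner.config.greeting  # => "hello"
```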
What “deferred” means (and doesn’t mean)
The term “deferred” in module system context is frequently misunderstood:
NOT deferred (happens immediately during collection):
- Module function calls — `applyModuleArgs` invokes functions eagerly
- Import resolution — paths are imported and processed during collection
- Structural normalization — modules converted to canonical form
IS deferred (resolved later):
- Config value evaluation — thunks referencing `config` resolve on demand during fixpoint computation
- deferredModule-typed option consumption — values stored in options like `flake.modules.*` are only evaluated when a consumer imports them into their own `evalModules` call
The circularity in { config, ... }: { foo = config.bar; } works NOT because the function call is deferred, but because config is a lazy reference to the fixpoint result.
The function executes immediately and returns an attrset containing an unevaluated thunk (config.bar).
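A small sketch can make this observable (assuming `<nixpkgs>` is on `NIX_PATH`): the `builtins.trace` call fires as soon as the module list is collected and normalized, even though no option value is ever forced.

```nix
let
  lib = import <nixpkgs/lib>;

  result = lib.evalModules {
    modules = [
      ({ lib, ... }: {
        options.unused = lib.mkOption {
          type = lib.types.str;
          default = "never forced";
        };
        options.alias = lib.mkOption {
          type = lib.types.str;
          default = "unset";
        };
      })
      # The trace fires as soon as this module is collected and normalized,
      # showing that the function call itself is not deferred.
      ({ config, ... }: builtins.trace "module function called eagerly" {
        # This definition references config.unused, but it stays an
        # unevaluated thunk until something demands config.alias.
        alias = config.unused;
      })
    ];
  };
in
  builtins.attrNames result.config  # trace prints; no option value is forced
```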
Formal characterization
A deferred module is a morphism in a Kleisli category over a reader-like effect that provides access to the final configuration:

  m : Env → Defs   (a Kleisli arrow for the reader monad Reader_Env(X) = Env → X)

where Env is the environment containing the fixpoint configuration `config`, and Defs is a structure defining options and configuration values.
More precisely, modules form a category where:
- Objects are module interfaces (sets of option declarations)
- Morphisms are module implementations (functions from configurations to definitions)
The deferred module type embeds the Kleisli category for the reader monad into this category of modules.
A module expression of the form `{ config, ... }: { ... }` awaits the fixpoint-resolved configuration as its function argument:

  config = μ c. ⊔_i m_i(c)

where μ denotes the least fixpoint and ⊔ is the join operation in the configuration lattice.
The module system orchestrates two complementary algebraic structures:
- Type-level monoid (module collection): deferredModule values form a monoid under concatenation of imports lists—identity is the empty list `[]`, operation is list concatenation `++`, and composition happens before fixpoint computation
- Semantic-level join-semilattice (configuration merging): merged configuration values form a join-semilattice with type-specific merge operations, computed after the fixpoint resolves cross-module references
The transition from monoid (module collection) to semilattice (configuration merging) happens via evalModules fixpoint computation.
Deferred modules enable a traced monoidal category structure where the trace operation implements the fixpoint that ties the configuration back to itself.
Connecting formalism to implementation
The Kleisli category characterization directly corresponds to everyday Nix module syntax for the reader-like operations:
| Reader Operation | Module System Primitive | Nix Manifestation |
|---|---|---|
| `ask` | Config access | `{ config, ... }: config.foo.bar` |
| `fmap` | Option transformation | Defining values in terms of other options |
The reader monad captures deferred access to configuration: modules are functions awaiting the final config value.
However, the fixpoint computation that provides that config value requires a different categorical framework.
Fixpoint as trace: The equation config = F(config) is a trace operation in a traced monoidal category, not a Kleisli operation.
In categorical terms, trace “ties the knot” on an internal state:

  Tr^X_{A,B} : Hom(A ⊗ X, B ⊗ X) → Hom(A, B)

For modules: A is external input (pkgs, lib), B is output configuration, and X is the configuration being computed.
The trace feeds X back into itself, implementing the recursive config = F(config) binding.
These two frameworks work together:
- Reader monad (Kleisli): explains why modules can reference `config` without explicit threading
- Traced monoidal category: explains why the self-referential fixpoint is well-defined
Concretely:
```nix
{ config, lib, ... }: {
  options.paths.base = lib.mkOption { type = lib.types.str; };
  options.paths.processed = lib.mkOption { type = lib.types.str; };

  config.paths.processed = "${config.paths.base}/processed";
}
```
This seemingly circular reference works because it combines both structures: the module is a reader computation (accessing config), and evalModules applies the trace operation to resolve the fixpoint.
The module doesn’t immediately evaluate config.paths.base—it constructs a function from config to definitions.
When evalModules computes the fixpoint via demand-driven lazy evaluation, it ties the knot: the final config becomes the argument to all module functions, resolving config.paths.processed without explicit threading.
Note that while this demonstrates the fixpoint mechanism, paths.base must still be defined by another module or have a default value.
The module system resolves the reference to the final merged value, but doesn’t create values from nothing.
If paths.base is undefined, evaluation will fail with “The option `paths.base’ was accessed but has no value defined. Try setting the option.”
This is not a limitation of the fixpoint—it’s the correct behavior: the module declares it needs a base path but doesn’t provide one.
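For completeness, here is a sketch of how the example above evaluates once a second module supplies `paths.base` (assuming `<nixpkgs>` is on `NIX_PATH`; the option names are taken from the example):

```nix
let
  lib = import <nixpkgs/lib>;

  # The module from above, declaring both options and deriving one from the other.
  pathsModule = { config, lib, ... }: {
    options.paths.base = lib.mkOption { type = lib.types.str; };
    options.paths.processed = lib.mkOption { type = lib.types.str; };
    config.paths.processed = "${config.paths.base}/processed";
  };

  result = lib.evalModules {
    modules = [
      pathsModule
      # A second module supplies the missing base path; without it, demanding
      # paths.processed would fail with "was accessed but has no value defined".
      { paths.base = "/var/data"; }
    ];
  };
in
  result.config.paths.processed  # => "/var/data/processed"
```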
evalModules
Source reference: nixpkgs lib/modules.nix:84-367
Intuitive explanation
evalModules is the function that takes a list of modules and produces a final configuration by:
- Recursively discovering all imported modules
- Collecting all option declarations
- Collecting all configuration definitions
- Computing a fixpoint where configuration values can reference the final merged result
- Merging all definitions according to option types and priorities
- Checking that all definitions match declared options
It’s the “main” function of the module system—the evaluator that turns a collection of module fragments into a coherent configuration.
The fixpoint computation is what enables powerful patterns like “enable this service if that service is enabled” without falling into infinite recursion, because all references to config are accessing the same final value.
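As a minimal illustration, here is a self-contained `evalModules` call (assuming `<nixpkgs>` is on `NIX_PATH`; the `services.web` and `services.monitoring` option names are hypothetical) showing the “enable this service if that service is enabled” pattern:

```nix
let
  lib = import <nixpkgs/lib>;

  result = lib.evalModules {
    modules = [
      # Declaration module: declares two boolean options.
      ({ lib, ... }: {
        options.services.web.enable = lib.mkEnableOption "the web service";
        options.services.monitoring.enable = lib.mkEnableOption "monitoring";
      })
      # Definition module: references the final merged config.
      ({ config, lib, ... }: {
        services.monitoring.enable = lib.mkDefault config.services.web.enable;
      })
      # User module: the value everything else reacts to.
      { services.web.enable = true; }
    ];
  };
in
  result.config.services.monitoring.enable  # => true
```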
Computational explanation
The evaluation proceeds in phases:
Phase 1: Collection (collectModules)
```nix
collectModules class modulesPath (regularModules ++ [internalModule]) args
```
Recursively expands all imports, filters disabledModules, and produces a flat list of normalized modules.
Phase 2: Merging (mergeModules)
```nix
merged = mergeModules prefix (reverseList (doCollect {}).modules);
```
Traverses the option tree, matching definitions to declarations and recursing into submodules.
Phase 3: Fixpoint resolution
```nix
config = let
  declaredConfig = mapAttrsRecursiveCond (v: !isOption v) (_: v: v.value) options;
  freeformConfig = /* handle unmatched definitions */;
in
  if declaredConfig._module.freeformType == null
  then declaredConfig
  else recursiveUpdate freeformConfig declaredConfig;
```
The fixpoint is implicit in Nix’s lazy evaluation: when a module function is called with config, it receives a thunk that will eventually evaluate to the merged result—which may include values produced by that same function.
Nix’s laziness ensures this only works if there are no strict cycles (e.g., a = a + 1 fails, but a = if b then x else y; b = someCondition works).
The key implementation detail:
```nix
config = addErrorContext
  "if you get an infinite recursion here, you probably reference `config` in `imports`..."
  config;
```
This shows config is a self-referential binding that relies on lazy evaluation to resolve.
Formal characterization
evalModules computes the least fixpoint of a module configuration functor in a domain-theoretic framework via demand-driven lazy evaluation, not classical Kleene iteration.
Let M be the set of all modules, and define the configuration space C as a complete lattice of partial configurations ordered by information content (the Smyth order: c1 ⊑ c2 iff c2 extends c1).
Each module m_i ∈ M defines a function:

  m_i : C → C

that takes a configuration and produces additional definitions. The combined module system defines:

  F(c) = ⊔_i m_i(c)

where ⊔ is the join operation in the configuration lattice (merging definitions according to type-specific merge functions and priority ordering).
By the Knaster-Tarski theorem, since F is monotone on the complete lattice C, it has a unique least fixpoint:

  config = μF = ⊓ { c ∈ C | F(c) ⊑ c }

The classical Kleene characterization describes the mathematical object config = ⊔_{n≥0} F^n(⊥) (where ⊥ is the minimal configuration and F^n denotes n applications of F), but Nix does not compute this series directly—it uses demand-driven thunk evaluation instead, computing only the portions of the fixpoint actually demanded.
Lattice structure: The configuration lattice C is a product of per-option lattices:

  C = ∏_o C_o

where each option’s lattice C_o is determined by:
- Primitive types (int, string): flat lattice (only ⊥ and incomparable values)
- mkMerge: forms the join of sublattices
- mkOverride: imposes priority ordering (values with priority p dominate those with priority p' > p; see the sketch after this list)
- submodule: recursive fixpoint on the nested configuration lattice
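A concrete sketch of the flat lattice and the priority ordering for a primitive type (assuming `<nixpkgs>` is on `NIX_PATH`; the `port` option is hypothetical, and the exact error wording may differ between nixpkgs versions):

```nix
let
  lib = import <nixpkgs/lib>;

  eval = defs: (lib.evalModules {
    modules = [
      ({ lib, ... }: { options.port = lib.mkOption { type = lib.types.int; }; })
    ] ++ defs;
  }).config.port;
in {
  # Two equal-priority definitions of an int are incomparable in the flat
  # lattice: forcing this attribute throws a "conflicting definition values"
  # error (message wording may vary by nixpkgs version).
  conflicting = eval [ { port = 80; } { port = 8080; } ];

  # mkForce (priority 50) dominates an unmodified definition (priority 100),
  # so the merge is well-defined.
  forced = eval [ { port = 80; } { port = lib.mkForce 8080; } ];  # => 8080
}
```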
Convergence: Nix reaches the fixpoint without exhaustive iteration because:
- Lazy evaluation computes only demanded portions of the configuration space
- Each demand resolves more thunks, monotonically increasing defined values
- The demanded slice has finite height (no infinite ascending chains in practice)
- Stabilization happens per-demanded-value, not globally across the entire lattice
Unlike classical Kleene iteration (which computes until global stabilization), Nix evaluates thunks on demand. The mathematical result is identical (the least fixpoint), but the computational path is fundamentally different.
Category theory perspective: The fixpoint computation implements a trace operation in a traced monoidal category. The module system forms a compact closed category where:
- Objects are option sets (interfaces)
- Morphisms are modules (implementations)
- The trace of a morphism f : A ⊗ X → B ⊗ X is Tr^X_{A,B}(f) : A → B, which “ties the knot” on the internal state (the configuration being computed)
The equation config = F(config) is precisely the trace operation that connects the output configuration back to the input, relying on domain-theoretic fixpoints for well-definedness.
Option merging primitives
Source references:
- nixpkgs lib/modules.nix:1469-1509 (mkIf, mkMerge, mkOverride)
- nixpkgs lib/modules.nix:1155-1257 (mergeDefinitions)
Intuitive explanation
Option merging determines what happens when multiple modules define the same option. The module system provides several primitives to control this:
mkMerge: Explicitly combine multiple values. Without this, writing foo = [a]; foo = [b]; in the same module would be an error (duplicate attribute). mkMerge [a b] says “I’m intentionally providing multiple values to be merged.”
mkOverride (and its aliases mkDefault, mkForce): Attach a priority to a value.
Lower numeric priorities win.
This enables the pattern where modules can set sensible defaults (mkDefault) that users can override without conflicts.
mkIf: Conditionally include a value.
Unlike a plain Nix if expression, mkIf conditions can reference the final configuration, and mkIf false values disappear entirely (don’t contribute to merging).
mkOrder (and mkBefore/mkAfter): Control the order of list elements when merging.
Useful for ensuring certain items appear first or last in merged lists.
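The following standalone sketch exercises these primitives together (assuming `<nixpkgs>` is on `NIX_PATH`; all option names are hypothetical):

```nix
let
  lib = import <nixpkgs/lib>;

  result = lib.evalModules {
    modules = [
      ({ lib, ... }: {
        options.banner = lib.mkOption { type = lib.types.str; };
        options.features = lib.mkOption {
          type = lib.types.listOf lib.types.str;
          default = [ ];
        };
        options.debug = lib.mkEnableOption "debug mode";
      })
      # A "library" module providing overridable defaults and conditional values.
      ({ config, lib, ... }: {
        banner = lib.mkDefault "welcome";                 # priority 1000
        features = lib.mkMerge [
          [ "core" ]
          (lib.mkIf config.debug [ "verbose-logging" ])   # vanishes when debug = false
        ];
      })
      # A "user" module overriding and extending.
      { debug = true;
        banner = lib.mkForce "maintenance";               # priority 50 wins
        features = lib.mkBefore [ "init" ];               # ordered first
      }
    ];
  };
in {
  inherit (result.config) banner debug features;
}
# => { banner = "maintenance"; debug = true;
#      features = [ "init" "core" "verbose-logging" ]; }
```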
Computational explanation
The merging process (mergeDefinitions) operates in stages:
Stage 1: Discharge properties
```nix
defsNormalized = concatMap (m:
  map (value:
    if value._type or null == "definition"
    then value
    else { inherit (m) file; inherit value; })
    (dischargeProperties m.value))
  defs;
```
This expands mkMerge (flattens nested merges) and evaluates mkIf conditions:
- `mkMerge [a b c]` becomes `[a, b, c]`
- `mkIf true x` becomes `[x]`
- `mkIf false x` becomes `[]`
Stage 2: Filter by priority
```nix
defsFiltered = filterOverrides' defsNormalized;
```
Examines all mkOverride priorities and keeps only definitions with the highest priority (lowest number):
```nix
getPrio = def:
  if def.value._type or "" == "override"
  then def.value.priority
  else defaultOverridePriority; # 100

highestPrio = foldl' (prio: def: min (getPrio def) prio) 9999 defs;
values = filter (def: getPrio def == highestPrio) defs;
```
Priority values (lower number = higher priority, “wins” in merge; a short sketch follows the list):
- `mkForce`: 50 (force override)
- No modifier: 100 (user value)
- `mkDefault`: 1000 (module default)
- `mkOptionDefault`: 1500 (option’s own default)
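These modifiers are thin wrappers around mkOverride (`mkForce = mkOverride 50`, `mkDefault = mkOverride 1000`), so an explicit priority can slot between them. A small illustrative fragment (the `services.nginx.enable` option is just an example):

```nix
{ lib, ... }: {
  # Priority 900: beats mkDefault (1000), loses to mkForce (50).
  services.nginx.enable = lib.mkOverride 900 true;
}
```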
Stage 3: Sort by order
```nix
defsSorted =
  if any (def: def.value._type or "" == "order") defsFiltered.values
  then sortProperties defsFiltered.values
  else defsFiltered.values;
```
For list-valued options, mkBefore (priority 500) items appear before default (1000) before mkAfter (1500).
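A short sketch of this sorting stage in isolation (assuming `<nixpkgs>` is on `NIX_PATH`; the `shellInit` option is hypothetical):

```nix
let
  lib = import <nixpkgs/lib>;

  result = lib.evalModules {
    modules = [
      ({ lib, ... }: {
        options.shellInit = lib.mkOption {
          type = lib.types.listOf lib.types.str;
          default = [ ];
        };
      })
      { shellInit = lib.mkAfter [ "cleanup" ]; }   # order 1500, sorted last
      { shellInit = [ "main step" ]; }             # order 1000 (default)
      { shellInit = lib.mkBefore [ "set -e" ]; }   # order 500, sorted first
    ];
  };
in
  result.config.shellInit  # => [ "set -e" "main step" "cleanup" ]
```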
Stage 4: Type-specific merge
```nix
mergedValue =
  if type.merge ? v2
  then checkedAndMerged.value    # new v2 protocol
  else type.merge loc defsFinal; # classic merge
```
Each type defines its own merge function:
- Lists: concatenate
- Attribute sets: recursive merge
- Integers: must all be equal (or use `mergeEqualOption`)
- Submodules: recursive `evalModules`
Formal characterization (semantic-level join-semilattice)
Option merging forms a join-semilattice with priority stratification—this is the semantic-level algebraic structure that operates after fixpoint computation resolves cross-module references.
Join-semilattice structure: For each option of type τ, the set of possible merged values V_τ forms a join-semilattice (V_τ, ⊔_τ) where:

  ⊔_τ : V_τ × V_τ → V_τ

The join operation ⊔_τ is type-dependent:
Lists: V_{listOf τ} = List(V_τ) with join:

  v1 ⊔ v2 = v1 ++ v2

(list concatenation, associative with identity []).
Note: List concatenation is technically a monoid operation, not a true join-semilattice operation (it is not commutative or idempotent). The notation here emphasizes the merge semantics rather than strict lattice-theoretic structure.
Attribute sets: V_{attrsOf τ} = (names ⇀ V_τ) with pointwise join:

  (a1 ⊔ a2)(k) = a1(k) ⊔_τ a2(k)
Submodules: V_{submodule ms} is the configuration lattice of a nested evalModules call on ms (recursive fixpoint).
Priority stratification: Definitions carry a priority p ∈ ℕ (lower is higher priority). The priority-filtered merge is:

  merge(defs) = ⊔_τ { v | (v, p) ∈ defs, p = min { p' | (v', p') ∈ defs } }
This forms a lexicographic ordering: first compare priorities, then merge equal-priority values.
Formally, we have a stratified lattice:

  V = ℕ × V_τ

ordered by (p1, v1) ⊑ (p2, v2) iff p1 ≥ p2 and (p1 > p2 or v1 ⊑_τ v2).
Conditional merging (mkIf): The condition mechanism extends the lattice with a bottom element ⊥ representing “not defined”:

  V_τ^⊥ = V_τ ∪ { ⊥ }

with ⊥ ⊔ v = v and ⊥ absorbing in the priority ordering.
The mkIf operation:

  mkIf(cond, v) = v if cond is true, ⊥ otherwise
Order control (mkBefore, mkAfter): For list-typed options, elements carry an order priority o ∈ ℕ (default 1000).
The merge operation first sorts by order priority, then concatenates:

  merge(defs) = concat (sort [ (o1, v1), …, (on, vn) ])

where sort orders tuples by their first component.
Why this matters: The algebraic structure ensures:
- Associativity: Order of module evaluation doesn’t matter (up to priority ties)
- Commutativity: Module order doesn’t matter (except for lists without order annotations)
- Idempotence: Importing a module twice has the same effect as once (if no side effects)
- Composability: Submodules compose cleanly because they use the same merge algebra
The join-semilattice structure is the key to modular reasoning: you can understand each module’s contribution independently and combine them without global analysis.
Type system and constraints
Source reference: nixpkgs lib/types.nix (entire file, especially type definitions)
Intuitive explanation
Option types serve two purposes:
- Runtime validation: Check that defined values match the expected shape (e.g., “is this actually an integer?”)
- Merge behavior specification: Define how to combine multiple definitions into a single value
Every option has a type (defaulting to types.unspecified if not given).
The type’s check function validates values, and its merge function combines them.
Common types:
- Primitives (`int`, `str`, `bool`, `path`): Validate structure, require all definitions to be equal
- Containers (`listOf`, `attrsOf`): Recursively validate elements, concatenate or recursively merge
- Submodules (`submodule`, `submoduleWith`): Nest entire module evaluations
- Combinators (`nullOr`, `either`, `enum`): Logical combinations of types
Types form a little language for describing configuration schemas, similar to JSON Schema or TypeScript types, but with merge semantics baked in.
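A small schema sketch combining these type families (all option names under `backup` are hypothetical):

```nix
{ lib, ... }: {
  options.backup = {
    enable = lib.mkOption {
      type = lib.types.bool;
      default = false;
    };
    paths = lib.mkOption {
      type = lib.types.listOf lib.types.path;   # list elements validated recursively
      default = [ ];
    };
    schedule = lib.mkOption {
      type = lib.types.nullOr lib.types.str;    # either null or a string
      default = null;
    };
    targets = lib.mkOption {
      # attrsOf submodule: a nested schema per named target,
      # merged by its own recursive evalModules call.
      type = lib.types.attrsOf (lib.types.submodule {
        options.url = lib.mkOption { type = lib.types.str; };
        options.compression = lib.mkOption {
          type = lib.types.enum [ "none" "zstd" "gzip" ];
          default = "zstd";
        };
      });
      default = { };
    };
  };
}
```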
Computational explanation
A type is an attribute set with specific attributes:
```nix
mkOptionType {
  name = "descriptive-name";
  description = "human-readable description";
  check = value: /* returns true if value is valid */;
  merge = loc: defs: /* combines list of definitions into single value */;

  # Optional but important:
  getSubOptions = prefix: /* for documentation and submodules */;
  getSubModules = /* list of submodules if this is a submodule type */;
  substSubModules = m: /* rebind submodules for type merging */;
  typeMerge = functor: /* merge two types (e.g., two listOfs) */;
}
```
Type checking happens during merge:
```nix
mergedValue =
  if isDefined then
    if type.merge ? v2 then
      if checkedAndMerged.headError or null != null
      then throw "not of type ${type.description}"
      else checkedAndMerged.value
    else if all (def: type.check def.value) defsFinal
    then type.merge loc defsFinal
    else throw "not of type ${type.description}"
  else throw "option has no value defined";
```
Type merging allows combining option declarations:
```nix
# Module 1
options.foo = mkOption { type = types.listOf types.int; };

# Module 2
options.foo = mkOption { type = types.listOf types.int; };

# Result: types are merged, option is declared once
```
The typeMerge function checks compatibility:
```nix
defaultTypeMerge = f: f':
  if f.name != f'.name then null   # incompatible
  else if hasPayload then
    if mergedPayload == null
    then null
    else f.type mergedPayload      # e.g., merge elemTypes
  else f.type;
```
Submodule nesting creates recursive lattices:
```nix
types.submoduleWith { modules = [ ... ]; }
```
When merged, creates a nested evalModules call with the submodule list, creating a recursive fixpoint:
```nix
merge = loc: defs:
  let
    configuration = base.extendModules {
      modules = allModules defs;
      prefix = loc;
    };
  in configuration.config;
```
Formal characterization
The type system forms a category with:
- Objects: Types (potentially infinite sets equipped with merge operations)
- Morphisms: Type refinements (subtyping relations)
Each type defines:
- Value space: V_τ (the set of valid values)
- Merge algebra: (V_τ, ⊔_τ), where ⊔_τ is a join-semilattice operation
- Interpretation function: ⟦τ⟧ = V_τ, mapping the type expression to its value space
Type constructors are functors T : Type → Type:

ListOf functor: V_{listOf τ} = List(V_τ), with merge by concatenation.
AttrsOf functor: V_{attrsOf τ} = (names ⇀ V_τ), with merge applied pointwise.
Submodule as fixpoint: V_{submodule ms} = μ C. evalModules(ms, C)
This is a recursive type equation: the type of a submodule is defined as the fixpoint of evaluating its modules, which may themselves contain submodules.
Type merging as coproduct: When two modules declare the same option with different types, the system attempts to merge the types. This is a pushout in the category of types.
For compatible types (e.g., both listOf int), the pushout exists.
For incompatible types (e.g., int and string), it doesn’t, and the system throws an error.
Constraint propagation: Type checking is interleaved with merging via the v2 merge protocol:

  merge_v2 : Loc × Defs → Error + V_τ

returning either the merged value or a type error. This enables fine-grained error messages pointing to specific problematic definitions.
Categorical perspective: The type system implements a graded monad where:
- The grade is the type τ
- The functor T_τ maps values to “typed optional values”
- The join operation is the merge function ⊔_τ
The grading ensures type-safe composition: you can only merge values of compatible types.
Why types matter for composition: The type constraint lattice ensures:
- Local type checking: Each module’s definitions can be checked independently against declared types
- Compositional merging: Merge operations distribute over type constructors (e.g., merging two `listOf int` gives `listOf int`)
- Submodule isolation: Submodules can’t violate their parent’s type constraints
- Documentation generation: Types provide machine-readable schemas for automatic documentation
The type system transforms the module system from untyped attribute set merging into a typed configuration language with static guarantees.
Fixpoint computation and lazy evaluation
While not a single primitive, the interaction between Nix’s lazy evaluation and the module system’s fixpoint computation deserves explicit treatment.
Intuitive explanation
The module system’s “killer feature” is allowing modules to reference the final configuration while that configuration is still being computed. This works because Nix doesn’t evaluate expressions until their values are actually needed.
When you write:
```nix
{ config, ... }: {
  services.foo.enable = config.services.bar.enable;
}
```
Nix doesn’t immediately try to look up config.services.bar.enable.
Instead, it creates a thunk (a suspended computation).
Later, when something needs the value of services.foo.enable, Nix evaluates the thunk, which triggers evaluation of config.services.bar.enable, which may trigger other evaluations, and so on.
As long as there’s no strict cycle (A needs B’s value before B is computed, and B needs A’s value before A is computed), lazy evaluation finds a consistent solution.
Computational explanation
The fixpoint is established via Nix’s recursive let binding:
```nix
let
  # Simplified view of evalModules internals
  config = mapAttrs (_: opt: opt.value) options;
  options = /* compute options by evaluating modules with 'config' */;
in config
```
This is a mutually recursive definition (a miniature sketch using lib.fix follows the list below). Nix resolves it by:
- Allocating thunks for both `config` and `options`
- When `options` is demanded, evaluate it, which may demand parts of `config`
- When parts of `config` are demanded, evaluate those specific attributes, which demands parts of `options`
- Continue until a consistent fixpoint is reached (or detect infinite recursion)
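The same knot-tying can be seen in miniature with `lib.fix`, which is essentially this recursive let binding packaged as a function. A small sketch (assuming `<nixpkgs>` is on `NIX_PATH`; the attribute names are hypothetical):

```nix
let
  lib = import <nixpkgs/lib>;

  # lib.fix f = let x = f x; in x  (the self-referential let binding in isolation)
  settings = lib.fix (self: {
    stateDir = "/var/lib/app";
    logDir   = "${self.stateDir}/logs";    # references the *final* attrset
    cacheDir = "${self.stateDir}/cache";
  });
in
  settings.logDir  # => "/var/lib/app/logs"
```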
Infinite recursion detection: Nix tracks which thunks are currently being evaluated. If evaluating thunk A demands the value of thunk A (before A has finished), that’s infinite recursion:
```nix
# This fails:
{ config, ... }: {
  services.foo.value = config.services.foo.value + 1;
}
```
But conditional recursion works:
```nix
# This works:
{ config, ... }: {
  services.foo.value =
    if config.services.bar.enable
    then config.services.baz.value
    else 42;
}
```
As long as the chain of demands terminates before cycling back, lazy evaluation succeeds.
The practical implication: You can write modules that make decisions based on what other modules decided, creating a declarative “logic programming” style configuration where the order of module evaluation doesn’t matter.
Formal characterization
Terminology note: When describing the computational mechanism, we use “fixpoint” or “lazy fixpoint” to emphasize self-referential binding with demand-driven evaluation.
“Least fixpoint” is appropriate only in formal domain-theoretic contexts (like this section) where we characterize the mathematical object.
Nix does not compute the least fixpoint via Kleene iteration ⊥, f(⊥), f²(⊥), ...—it uses let x = f x in x with lazy thunk resolution.
The fixpoint computation implements a domain-theoretic least fixpoint via Nix’s lazy evaluation strategy.
Scott domains: Nix values form a Scott domain—a partially ordered set where:
- Every directed subset has a least upper bound (join)
- The ordering ⊑ represents “information content” (x ⊑ y means x is “less defined” than y)
The bottom element represents “not yet evaluated” (a thunk). As evaluation proceeds, thunks are replaced with more defined values.
Continuous functions: Each module defines a continuous function:

  f : D → D

where D is the Scott domain of configurations, and continuity means:

  f(⊔ S) = ⊔ { f(s) | s ∈ S }

for any directed set S ⊆ D.
Fixpoint theorem: For a continuous function f on a pointed domain (D, ⊑, ⊥), the least fixpoint exists and equals:

  fix(f) = ⊔_{n≥0} f^n(⊥)
Nix’s lazy evaluation implements this via demand-driven thunk resolution (not iteration):
- All values start as thunks (unevaluated expressions)
- When a value is demanded, its thunk is forced, which may demand other thunks
- Memoization caches evaluated thunks to avoid recomputation
- No explicit iteration sequence is constructed
Well-founded recursion: The domain D has finite height in the demanded slice (the portion of the configuration actually needed). This ensures:

  ∃ n. f^n(⊥) = f^{n+1}(⊥)

meaning iteration converges in finitely many steps.
Thunk semantics: A thunk is a suspended computation—an unevaluated expression in the domain D. Thunks represent the bottom element ⊥ before evaluation; forcing a thunk either produces a value or another thunk (in case of lazy chains). The partiality aspect comes from non-termination: a thunk that loops forever never produces a value.
Infinite recursion as non-termination: A strict cycle creates a strict self-dependency where:

  f(⊥) = ⊥

The least fixpoint is ⊥ (undefined), and Nix’s evaluator detects this by tracking the call stack.
Categorical perspective: The fixpoint construction is a trace operation in the category of domains and continuous functions:

  Tr^X_{A,B} : Hom(A ⊗ X, B ⊗ X) → Hom(A, B)

For f : A ⊗ X → B ⊗ X, the trace Tr^X_{A,B}(f) : A → B is obtained by taking the fixpoint of the internal state X.
In the module system:
- A is the external input (module parameters like `pkgs`)
- B is the output configuration
- X is the internal state (the configuration being computed)
- f is the module evaluation function
- Tr(f) is `evalModules`, which ties the configuration back to itself
The trace operation implements the “knot-tying” that makes self-referential configurations work.
Why this enables modular reasoning: The fixpoint is deterministic (least fixpoint is unique) and depends only on the module functions, not evaluation order. This means:
- Module composition is order-independent (commutative)
- Adding modules is monotone (more modules = more defined configuration)
- Local reasoning is sound (understand each module in isolation, combine via fixpoint)
The domain-theoretic foundation ensures the module system’s declarative semantics are mathematically rigorous, not just “it works because Nix is lazy.”
Summary
The Nix module system’s algebraic primitives form a coherent mathematical structure:
- Deferred modules embed Kleisli category morphisms, enabling computation to reference fixpoint results; at the type level, they form a monoid under imports list concatenation
- evalModules computes the unique least fixpoint in domain-theoretic configuration lattices via demand-driven lazy evaluation, not classical Kleene iteration
- Merging primitives implement semantic-level join-semilattice operations with priority stratification (after fixpoint resolves references)
- Types define graded monads constraining merge algebras and enabling compositional reasoning
- Lazy evaluation realizes domain-theoretic fixpoints via demand-driven thunk evaluation, computing only demanded portions of the configuration
Together, these primitives transform attribute set merging into a powerful typed functional language for declarative configuration, where:
- Composition is order-independent (associative, commutative)
- Submodules nest cleanly (recursive fixpoints)
- Overrides work predictably (priority lattice)
- Circular dependencies resolve automatically (lazy fixpoint)
Understanding these algebraic foundations explains why the module system supports complex patterns (conditional imports, priority overrides, recursive submodules) while maintaining predictable semantics. The mathematics isn’t just theoretical—it’s the reason NixOS configurations compose reliably at scale.
Further reading
Related documentation
- Flake-parts and the module system — How flake-parts wraps evalModules for flake composition
- Deferred module composition — Aspect-based organizational patterns using these primitives
External resources
- nix.dev module system tutorial — Official introduction to module system concepts
- nixpkgs lib/modules.nix — evalModules implementation
- nixpkgs lib/types.nix — Type system implementation including deferredModule