2026-01-30

recursion

this document outlines the main choices when designing recursive control.

design variables

each recursive construct can be analyzed along the following axes:

termination and descent

  • descent metric: a measure that strictly decreases on every recursive call (size, depth budget, fuel, rank)
  • base case: a closed-form result when the metric is minimal
  • totality surface: which inputs are guaranteed to reach a base case without dynamic checks
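
the axes above can be sketched with an explicit fuel parameter (a hypothetical evaluator; all names are illustrative):

```python
def eval_with_fuel(expr, fuel):
    """evaluate nested ("+"/"*", left, right) tuples with an explicit descent metric.

    fuel strictly decreases on every recursive call; fuel == 0 is the
    minimal-metric surface, discharged with None to mark inputs that fall
    outside the totality surface."""
    if fuel == 0:
        return None  # metric exhausted: no guarantee of reaching a base case
    if isinstance(expr, int):
        return expr  # closed-form result at a leaf
    op, left, right = expr
    l = eval_with_fuel(left, fuel - 1)   # strict decrease
    r = eval_with_fuel(right, fuel - 1)  # strict decrease
    if l is None or r is None:
        return None
    return l + r if op == "+" else l * r
```

here the totality surface is "expressions nested less deeply than fuel"; everything else needs the dynamic None check.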

generative vs structural descent

  • structural

    • descent follows the input’s inductive structure
    • descent metric and base-case surface are induced by data shape
    • termination and totality collapse into a single well-foundedness condition
  • generative

    • descent follows a computed subproblem
    • requires an explicit well-founded metric over the full call state
    • termination and base-case coverage are independent obligations
    • every reachable state must monotonically approach the base surface
    • peel and recomposition must satisfy a coherence condition
    • reachability may be a strict subset of the totality surface
    • traversal order, pruning, and pre-order effects become semantically observable
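
the distinction in miniature (list sum and gcd are illustrative choices, not from the source):

```python
def sum_list(xs):
    """structural descent: the recursive call follows the input's inductive
    structure (the tail), so termination is induced by data shape."""
    if not xs:  # base case induced by the empty constructor
        return 0
    return xs[0] + sum_list(xs[1:])

def gcd(a, b):
    """generative descent: the subproblem (b, a % b) is computed rather than
    taken from the input's structure; termination needs an explicit
    well-founded metric (the second argument strictly decreases toward 0)."""
    if b == 0:
        return a
    return gcd(b, a % b)
```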

decomposition interface

  • peel operation: extracts a smaller subproblem plus the information needed to recompose
  • branching factor: single descent, multi-branch, nested, or mutual recursion
  • mutual recursion: descent metric spans a product state space across functions
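
a minimal peel/recompose sketch (function names hypothetical):

```python
def peel(xs):
    """extract a smaller subproblem (the tail) plus the information
    needed to recompose (the head)."""
    return xs[0], xs[1:]

def rev(xs):
    """single-descent reversal built on the peel interface."""
    if not xs:
        return []
    head, sub = peel(xs)      # smaller subproblem + recomposition info
    return rev(sub) + [head]  # recompose on ascent
```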

recomposition semantics

  • post-order: recurse first, then combine results
  • pre-order: act or rewrite before recursion
  • enter/exit: both descent and ascent actions
  • result algebra: how recursive results are combined

    • identity base value vs sentinel base value
    • order sensitivity and algebraic laws
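
the identity-vs-sentinel choice, sketched (illustrative functions):

```python
def total(xs):
    """identity base value: 0 is the identity of +, so the empty case
    combines transparently with any other result."""
    if not xs:
        return 0
    return xs[0] + total(xs[1:])

def smallest(xs):
    """sentinel base value: None marks "no result", and every combining
    step must test for it explicitly."""
    if not xs:
        return None
    rest = smallest(xs[1:])
    return xs[0] if rest is None else min(xs[0], rest)
```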

state threading

  • parameters as state: explicit state passing
  • accumulators: monotone state toward final result
  • multiple accumulators: parallel invariants
  • derived state: implicit in call depth or path
  • continuation as state: suspended recomposition
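
accumulators in miniature (hypothetical functions):

```python
def rev_acc(xs, acc=None):
    """single accumulator: acc grows monotonically toward the final result
    and is returned unchanged at the base case."""
    if acc is None:
        acc = []
    if not xs:
        return acc
    return rev_acc(xs[1:], [xs[0]] + acc)

def mean(xs, total=0, count=0):
    """multiple accumulators: total and count each maintain a parallel
    invariant across the same descent."""
    if not xs:
        return total / count if count else 0
    return mean(xs[1:], total + xs[0], count + 1)
```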

stack discipline and representation

  • non-tail recursion: pending recomposition stored implicitly on the call stack
  • tail recursion: recursive call in tail position
  • CPS or explicit stack: recomposition frames made explicit
  • tail-call optimization: implementation property, not a semantic one
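
the implicit/explicit contrast, sketched on a tuple tree (assumed encoding: leaves are ints, interior nodes are pairs):

```python
def tree_sum(node):
    """non-tail recursion: the pending + recomposition is stored
    implicitly in each call-stack frame."""
    if isinstance(node, int):
        return node
    left, right = node
    return tree_sum(left) + tree_sum(right)

def tree_sum_explicit(node):
    """the same computation with pending work reified as data: no
    call-stack depth is consumed. folding into one running total is valid
    here because + is associative and commutative."""
    stack, total = [node], 0
    while stack:
        n = stack.pop()
        if isinstance(n, int):
            total += n
        else:
            stack.extend(n)  # push both children as pending subproblems
    return total
```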

effects and observability

  • pure recursion: only return values matter
  • effectful recursion: effects on descent, ascent, or both
  • idempotence hazards: repeated visits in multi-branch recursion duplicate effects

core recursion forms

each form is characterized by where work occurs and what state is explicit.

  • post-order structural recursion

    • recurse on subproblem
    • combine results on return
    • pending recomposition stored in stack frames
  • tail-recursive accumulator form

    • update state before recursive call
    • return accumulator at base case
    • constant stack if TCO holds
  • pre-order recursion

    • rewrite or act before descent
    • useful for pruning and generative descent
    • termination depends on rewritten state decreasing the metric
  • multi-branch recursion

    • recurse into multiple subproblems
    • recomposition is a fold over recursive results
    • termination requires metric decrease along every branch
  • mutual recursion

    • functions call each other
    • termination metric spans the combined state space
  • continuation-passing style

    • suspended recomposition made explicit as a value
    • converts post-order recursion into tail calls
    • enables early exit and non-local control
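
continuation-passing style in miniature (hypothetical function):

```python
def product_cps(xs, k=lambda v: v):
    """the suspended recomposition after each call is reified as the
    continuation k, so every recursive call is a tail call; a zero
    short-circuits by discarding k and all pending recomposition."""
    if not xs:
        return k(1)
    if xs[0] == 0:
        return 0  # early exit: non-local, skips every pending frame
    return product_cps(xs[1:], lambda v: k(xs[0] * v))
```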

recursion as a two-phase process

recursion factors into descent and ascent.

  • descent

    • transition over problem states
    • governed by a strictly decreasing metric
    • defines reachability of base cases and totality
  • ascent

    • fold over suspended recomposition contexts
    • governed by a result algebra
    • defines meaning and output structure

the call stack is an encoding of suspended recomposition, not a semantic primitive.

recursion as programmable traversal

some recursive designs make three roles explicit and independently configurable: how recursion chooses the next subproblems, how results are combined, and what information survives across calls. this treats recursion as a small control language rather than a single fixed pattern.

descent relation as a parameter

  • choose-children: a function that selects which recursive calls to make

    • examples: recurse into all children, only the first child, only nodes matching a predicate
  • prune?: a predicate that stops recursion on a branch

    • examples: stop when depth exceeds k, stop when a value is found, stop when a condition holds
  • rewrite-before-descent: transform a node before recursing

    • examples: normalize a subtree, drop empty nodes, expand a symbolic node
  • traversal order: order in which chosen children are visited

    • examples: left-to-right, right-to-left, depth-first, breadth-first
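
these parameters can be made literal function arguments (all names hypothetical):

```python
def traverse(node, choose, prune, visit, depth=0):
    """descent relation as data: prune cuts a branch before descent,
    visit is the pre-order action, choose selects which children to
    recurse into and in what order."""
    if prune(node, depth):
        return
    visit(node, depth)
    for child in choose(node):
        traverse(child, choose, prune, visit, depth + 1)

# usage: collect leaf ints from a nested list, stopping below depth 2
seen = []
traverse(
    [1, [2, [3, [4]]]],
    choose=lambda n: n if isinstance(n, list) else [],
    prune=lambda n, d: d > 2,
    visit=lambda n, d: seen.append(n) if isinstance(n, int) else None,
)
# seen now holds only the leaves reachable within the depth bound
```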

recomposition algebra as a parameter

  • rebuild: return a structure of the same shape with transformed elements

    • examples: map over a tree, replace selected nodes
  • summarize: fold child results into a single value

    • examples: sum all leaves, test if any element satisfies a predicate
  • collect: produce a flat list of selected elements

    • examples: collect all leaves, collect all matches
  • short-circuit: stop combining when a condition is met

    • examples: stop at the first match, stop when an accumulator reaches a bound
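
short-circuiting sketched (hypothetical function):

```python
def find_first(node, pred):
    """stop combining at the first child result that satisfies the
    condition; branches after the match are never visited."""
    if not isinstance(node, list):
        return node if pred(node) else None
    for child in node:
        hit = find_first(child, pred)
        if hit is not None:
            return hit  # first match wins; remaining children skipped
    return None
```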

state and control carrier as a parameter

  • depth: pass the current nesting level

    • examples: count depth, apply different rules at different depths
  • context: pass the path from the root to the current node

    • examples: build filesystem paths, track enclosing constructs
  • accumulators: carry partial results or counters

    • examples: running totals, output buffers
  • continuations: decide whether and how to continue recursion

    • examples: early exit, skipping recomposition, resuming later
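
context threading in miniature (filesystem-flavored; names hypothetical):

```python
def paths(node, prefix=""):
    """the path from the root is passed down as state; each leaf reports
    where it lives. nested dicts play the role of directories."""
    if isinstance(node, dict):
        out = []
        for name, child in node.items():
            out.extend(paths(child, prefix + "/" + name))
        return out
    return [prefix]  # leaf: the accumulated context is the result
```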

concrete pattern

  • input: a nested structure
  • descent: recurse only into nodes that satisfy a predicate; stop at depth 5
  • recomposition: rebuild the structure but drop elements that fail a test
  • state: carry current depth and path from the root
  • control: allow early exit when a match is found
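
one way to realize this pattern (every name here is hypothetical, and the early exit is encoded with an exception):

```python
class Found(Exception):
    """signals early exit; carries the path of the first match."""
    def __init__(self, path):
        self.path = path

def rebuild(node, keep, descend, match, depth=0, path=()):
    """descend gates recursion, depth is capped at 5, keep filters
    elements during recomposition, and match aborts the whole traversal
    with the current path."""
    if match(node):
        raise Found(path)          # control: early exit on a match
    if isinstance(node, list):
        if depth >= 5 or not descend(node):
            return node            # pruned: children never visited
        return [
            rebuild(c, keep, descend, match, depth + 1, path + (i,))
            for i, c in enumerate(node)
            if keep(c)             # recomposition drops failing elements
        ]
    return node
```

changing any one of keep, descend, or match yields a different-looking recursive function with the same skeleton.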

this makes explicit what is usually implicit in plain recursion: which subproblems are visited, in what order, with what carried information, and how results are assembled. it also explains why many recursive functions differ only in small changes to these three roles.

considerations when designing recursion

  • identify the descent metric and prove strict decrease
  • ensure base cases cover all minimal metric states
  • decide where work occurs: pre-order, post-order, or both
  • determine what state must survive descent
  • decide whether recomposition must be deferred or can be accumulated
  • if descent is generative, justify termination explicitly
  • if effects occur, specify whether they run on descent, ascent, or both
  • if early exit is required, encode it via short-circuit, explicit continuation, or sum types

conversion between recursion and iteration

recursion and iteration are two surface encodings of the same three invariants: a descent metric, a state vector, and a recomposition rule. conversion consists of relocating these invariants between parameters, local variables, and control structure.

recursion to iteration

eliminate self-calls while preserving descent, state evolution, and recomposition.

  • descent metric

    • the recursive argument that strictly decreases becomes the loop variant
    • the base-case predicate becomes the loop termination condition
  • state

    • recursive parameters become loop-local variables
    • arguments of the recursive call become assignments in the loop body
    • parameter updates become rebinding of loop variables
  • base discharge: the base-case return value becomes the value produced at loop exit
  • recomposition

    • if no work occurs after the recursive call (tail recursion):

      • the self-call becomes reassignment followed by the next iteration
      • the result is already encoded in loop state
    • if work occurs after the recursive call (post-order recursion):

      • the recomposition rule must be made explicit as state or control mode
      • the information needed to finish the computation must no longer be implicit in control flow
  • evaluation polarity

    • pre-order work remains before the metric step
    • post-order work remains after base discharge

the conversion preserves meaning if and only if:

  • the descent metric strictly decreases on every recursive call
  • the same state values survive across descent
  • recomposition is applied in the same logical order
  • the base-case value matches the loop exit value
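
the tail-recursive case of this mapping, sketched (hypothetical functions):

```python
def range_sum_rec(n, acc=0):
    """n is the descent metric, acc the threaded state."""
    if n == 0:
        return acc                   # base discharge
    return range_sum_rec(n - 1, acc + n)

def range_sum_loop(n):
    """the same invariants relocated: n becomes the loop variant, acc a
    loop-local variable, the base-case predicate the termination test,
    and the base-case value the loop-exit result."""
    acc = 0
    while n != 0:                    # negated base-case predicate
        acc, n = acc + n, n - 1      # call arguments become rebinding
    return acc                       # loop exit matches base discharge
```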

iteration to recursion

factor a loop into well-founded descent plus recomposition.

  • descent metric

    • the loop variant becomes a recursive parameter
    • the loop termination condition becomes the base-case predicate
  • state

    • loop-local variables become function parameters
    • variable updates become recursive argument changes
  • base discharge: the loop exit value becomes the base-case return value
  • recomposition

    • if the result is already encoded in loop state:

      • use direct tail recursion
    • if work occurs after the loop finishes:

      • post-loop work becomes post-order recomposition
      • the loop exit value becomes the base-case value

the conversion preserves meaning if and only if:

  • the descent metric strictly decreases on every recursive call
  • the same state values survive across descent
  • recomposition is applied in the same logical order
  • the base-case value matches the loop exit value
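
the post-loop-work case, sketched (hypothetical functions):

```python
def digits_loop(n):
    """loop with work after it finishes: digits are collected low-to-high,
    then reversed once the loop exits."""
    out = []
    while n > 0:
        out.append(n % 10)
        n //= 10
    out.reverse()                    # post-loop work
    return out

def digits_rec(n):
    """factored into well-founded descent (n strictly shrinks under // 10)
    plus post-order recomposition: the reversal disappears because each
    frame appends its digit after the recursive call returns."""
    if n == 0:
        return []                    # loop termination as base case
    return digits_rec(n // 10) + [n % 10]
```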