Quality Signal

sentrux computes a single continuous score called Quality Signal. It’s the geometric mean of 5 root cause metrics, each normalized to [0, 1], then scaled to 0–10,000.

quality_signal = (modularity × acyclicity × depth × equality × redundancy) ^ (1/5) × 10000
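In Python, the aggregation can be sketched as follows (the metric names and dict interface here are illustrative, not sentrux's actual API):

```python
from math import prod

def quality_signal(metrics: dict[str, float]) -> float:
    """Geometric mean of normalized root-cause metrics, scaled to 0-10,000.

    `metrics` maps each root cause to a value in [0, 1]; the keys below
    are illustrative placeholders.
    """
    values = list(metrics.values())
    return prod(values) ** (1 / len(values)) * 10_000

score = quality_signal({
    "modularity": 0.8, "acyclicity": 0.9, "depth": 0.7,
    "equality": 0.6, "redundancy": 0.85,
})
```

Note that a single zero-valued metric zeroes the entire signal, which is what makes the score non-compensatory.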

Nash's bargaining solution (1950) characterizes the geometric mean (the Nash product) as the unique aggregation satisfying:

  1. Pareto optimality — if all scores improve, signal improves
  2. Symmetry — all root causes weighted equally
  3. Independence of irrelevant alternatives — dimensions that don't change don't affect the comparison

Practically: you cannot compensate for tanking one metric by inflating another. A near-zero factor drags the whole product toward zero, so the fastest way to raise the geometric mean is to improve the weakest factors. This forces genuine architectural improvement.
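A minimal sketch of this property: two metric vectors with the same arithmetic mean, where the imbalanced one scores strictly lower under the geometric mean (the values are made up for illustration):

```python
from math import prod

def geo_mean(xs):
    # Geometric mean of positive scores.
    return prod(xs) ** (1 / len(xs))

balanced = [0.7, 0.7, 0.7, 0.7, 0.7]
gamed    = [1.0, 0.9, 0.7, 0.5, 0.4]  # same total as `balanced`

assert abs(sum(balanced) - sum(gamed)) < 1e-9   # arithmetic mean can't tell them apart
assert geo_mean(gamed) < geo_mean(balanced)     # geometric mean penalizes imbalance
```

An arithmetic mean would reward maxing out one metric while tanking another; the geometric mean does not.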

Letter grades create artificial boundaries. Score 79 = B, score 81 = A. AI agents game the boundary instead of genuinely improving. With continuous scores, every +1 matters equally. The agent converges naturally when improvements become marginal — like gradient descent.

Proxy metrics (coupling ratio, dead code %, function length) measure symptoms, not root causes. An AI agent can game individual proxies — add fake imports to boost cohesion, split functions superficially. The metrics improve but the code doesn’t.

Root cause metrics measure fundamental structural properties that can only improve through genuine architectural change.

A codebase is a directed graph G = (V, E) where V = files and E = dependencies. This graph has exactly 5 independent structural properties:

| Metric     | Theory          | What it measures                   | Edge or node? |
|------------|-----------------|------------------------------------|---------------|
| Modularity | Newman 2004     | Do edges cluster into modules?     | Edge          |
| Acyclicity | Martin 2003     | Are there circular edges?          | Edge          |
| Depth      | Lakos 1996      | How deep are edge chains?          | Edge          |
| Equality   | Gini 1912       | Are node properties concentrated?  | Node          |
| Redundancy | Kolmogorov 1963 | Are there unnecessary nodes?       | Node          |

3 edge properties + 2 node properties = 5 total. Adding more would be redundant.
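To make the edge/node distinction concrete, here is a hedged sketch: a toy dependency graph as an adjacency dict, with the Gini coefficient of file fan-in feeding the equality metric. The file names, and the choice of fan-in as the measured node property, are assumptions for illustration, not sentrux's actual definition.

```python
def gini(values: list[float]) -> float:
    """Gini coefficient: 0 = perfectly even, approaching 1 = fully concentrated."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Standard closed form over sorted values.
    return sum((2 * i - n - 1) * x for i, x in enumerate(xs, 1)) / (n * total)

# Toy dependency graph: file -> files it imports (names are illustrative).
deps = {
    "app.py": ["utils.py", "models.py"],
    "api.py": ["utils.py", "models.py"],
    "cli.py": ["utils.py"],
    "utils.py": [],
    "models.py": ["utils.py"],
}

# Fan-in is a per-node property: how many files depend on each file.
fan_in = {f: 0 for f in deps}
for targets in deps.values():
    for t in targets:
        fan_in[t] += 1

equality = 1 - gini(list(fan_in.values()))
```

Here dependencies concentrate on `utils.py`, so the Gini coefficient is high and equality is correspondingly low.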

modularity = (Q + 0.5) / 1.5 # bounded [-0.5, 1] → [0, 1]
acyclicity = 1 / (1 + cycles) # unbounded → sigmoid
depth = 1 / (1 + max_depth / 8) # unbounded → sigmoid, midpoint 8
equality = 1 - gini # bounded [0, 1] → invert
redundancy = 1 - ratio # bounded [0, 1] → invert

3 of 5 metrics have zero arbitrary parameters. Only 2 need a sigmoid midpoint.
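The normalizations above can be collected into one function. This is a sketch: the raw inputs are assumed to come from upstream graph analysis (Newman's Q, Tarjan's cycle count, Lakos depth, a Gini coefficient, and a redundancy ratio), and only the normalization step is shown.

```python
def normalize(Q: float, cycles: int, max_depth: int,
              gini: float, ratio: float) -> dict[str, float]:
    """Map each raw measurement into [0, 1], mirroring the listing above."""
    return {
        "modularity": (Q + 0.5) / 1.5,      # Q in [-0.5, 1] -> [0, 1]
        "acyclicity": 1 / (1 + cycles),     # 0 cycles -> 1.0
        "depth": 1 / (1 + max_depth / 8),   # midpoint: depth 8 -> 0.5
        "equality": 1 - gini,               # invert inequality
        "redundancy": 1 - ratio,            # invert redundancy ratio
    }
```

A perfectly clean graph (Q = 1, no cycles, depth 0, Gini 0, ratio 0) normalizes every metric to 1.0 and yields the maximum signal of 10,000.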

Iteration 1: signal = 5800 → equality is lowest (3500)
→ refactor god function → equality improves → signal = 6300
Iteration 2: signal = 6300 → modularity is lowest (5500)
→ extract module → modularity improves → signal = 6900
Iteration 3: signal = 6900 → redundancy is lowest (7200)
→ remove dead functions → redundancy improves → signal = 7400
...diminishing returns → natural convergence

No artificial stopping point. The AI converges when marginal improvement per change approaches zero — exactly like gradient descent.
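The iteration loop above can be sketched as greedy weakest-metric-first improvement with a marginal-gain stopping rule. Both `toy_improve` and the `min_gain` threshold are illustrative stand-ins, not sentrux internals: in practice the improvement step is a real refactoring performed by the agent.

```python
from math import prod

def signal(metrics: dict[str, float]) -> float:
    return prod(metrics.values()) ** (1 / len(metrics)) * 10_000

def improve_until_converged(metrics, improve, min_gain=50.0):
    """Repeatedly target the weakest metric; stop when gains go marginal."""
    while True:
        weakest = min(metrics, key=metrics.get)
        candidate = improve(metrics, weakest)
        if signal(candidate) - signal(metrics) < min_gain:
            return metrics  # marginal gain: natural convergence
        metrics = candidate

# Toy improvement step: close 30% of the gap to 1.0 on the chosen metric.
def toy_improve(metrics, name):
    out = dict(metrics)
    out[name] += (1.0 - out[name]) * 0.3
    return out

final = improve_until_converged(
    {"modularity": 0.55, "acyclicity": 0.9, "depth": 0.7,
     "equality": 0.35, "redundancy": 0.72},
    toy_improve,
)
```

Because the signal is bounded above by 10,000 and each accepted step must gain at least `min_gain`, the loop is guaranteed to terminate.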

| Theory                       | Year | What it provides                          |
|------------------------------|------|-------------------------------------------|
| Cybernetics (Wiener)         | 1948 | Feedback loop architecture                |
| Systems Engineering (Tsien)  | 1954 | Decomposition into independent subsystems |
| Kolmogorov Complexity        | 1963 | Theoretical ground truth                  |
| Graph Modularity (Newman)    | 2004 | Modularity Q metric                       |
| Nash Bargaining              | 1950 | Geometric mean as optimal aggregation     |
| Gini Coefficient             | 1912 | Inequality measurement                    |
| Tarjan's Algorithm           | 1972 | Cycle detection                           |
| Lakos Levelization           | 1996 | Dependency depth                          |