The Diffusion of Meaning: Entropy and Mutual Information as Measures of Strategic Ambiguity
A clear account concentrates meaning. Each statement narrows the space of possible interpretations, reducing uncertainty about what happened, who was involved, and what was intended. A vague account diffuses meaning across many possibilities, each statement adding to rather than reducing the ambiguity. Shannon entropy — the foundational quantity of information theory — provides the formal measure for this concentration or diffusion. When applied to the thematic distribution of a narrative, entropy reveals whether a party is narrowing their account toward specific claims or spreading it across diffuse topics. Combined with mutual information measurements between parties' narratives and between narratives and evidence, entropy analysis becomes a powerful detection tool for the deliberate production of ambiguity.
The Mathematical Foundation
Shannon entropy measures the uncertainty inherent in a probability distribution. A distribution concentrated on a single outcome has zero entropy — there is no uncertainty about what will happen. A uniform distribution across many outcomes has high entropy — observing the outcome tells you little because any outcome was roughly equally likely. Applied to narrative analysis, the "distribution" is the allocation of thematic mass across the semantic space covered by a party's communications.
Formally, for a thematic distribution P over n themes, the entropy is H(P) = -Σᵢ pᵢ log pᵢ; taking logarithms to base 2 gives entropy in bits. Low entropy means the narrative is concentrated on a few dominant themes. High entropy means the narrative is scattered across many themes with roughly equal emphasis. The entropy is maximised at log n when the distribution is uniform — the party is talking about everything with equal intensity — and minimised at zero when one theme dominates completely.
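As a minimal sketch, the entropy of a thematic weight vector can be computed directly from the definition above. Base-2 logarithms are used here, so values are in bits; the function name and the example weight vectors are illustrative, not part of the framework itself.

```python
import math

def shannon_entropy(weights):
    """H(P) = -sum p_i log2 p_i for a thematic weight vector.

    Weights need not be pre-normalised; zero-weight themes
    contribute nothing to the sum.
    """
    total = sum(weights)
    probs = [w / total for w in weights if w > 0]
    return -sum(p * math.log2(p) for p in probs)

# A focused account: one dominant theme, low entropy.
focused = [0.7, 0.1, 0.1, 0.05, 0.05]
# A diffuse account: near-uniform mass, entropy close to log2(5) bits.
diffuse = [0.22, 0.20, 0.20, 0.19, 0.19]
```

A single dominant theme yields zero entropy, and a uniform spread over n themes yields the maximum log2(n), matching the extremes described above.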
Mutual information extends this analysis to pairs of distributions. Where entropy measures the uncertainty in a single distribution, mutual information measures the reduction in uncertainty about one distribution given knowledge of another. I(X; Y) = H(X) + H(Y) - H(X, Y). High mutual information means the two narratives are statistically coupled — knowing one constrains expectations about the other. Zero mutual information means they are independent — one reveals nothing about the other.
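The identity I(X; Y) = H(X) + H(Y) - H(X, Y) can be sketched directly from a joint theme distribution. The example joints below are illustrative: one perfectly coupled pair of narratives and one independent pair.

```python
import math

def entropy(probs):
    """Entropy in bits of a probability list; zero entries are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), where joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]            # marginal over X
    py = [sum(col) for col in zip(*joint)]      # marginal over Y
    flat = [p for row in joint for p in row]    # joint as a flat list
    return entropy(px) + entropy(py) - entropy(flat)

# Perfectly coupled: knowing one theme determines the other (I = 1 bit).
coupled = [[0.5, 0.0], [0.0, 0.5]]
# Independent: knowing one reveals nothing about the other (I = 0).
independent = [[0.25, 0.25], [0.25, 0.25]]
```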
Theme Entropy as Vagueness Detector
The primary diagnostic application is tracking thematic entropy over time within a single party's narrative. Honest communicators maintain moderate, stable entropy — they focus on the themes that are actually relevant to the dispute, and the relevance is externally constrained by events. Their thematic distribution has clear peaks: the most important themes dominate, secondary themes support, peripheral themes are minimal. The entropy is high enough to show richness and context but low enough to demonstrate focus.
Manipulative communicators who are strategically vague produce high and increasing entropy. By avoiding commitment to specific themes — by raising every issue briefly and committing to none — they produce a distribution that is spread across the semantic space. There are no peaks because the party does not want peaks; peaks create accountability. The entropy rises as the strategy intensifies because the diffuseness is the strategy.
The rate of entropy increase is diagnostic. A party who is genuinely elaborating on a complex situation shows entropy increases that correlate with new information — each new relevant topic adds genuinely to the account. A party who is strategically vague shows entropy increases that correlate with avoidance — each new topic is a diversion from the specific claim being evaded. Organic evolution produces temporary entropy increases that subsequently reorganise around new focal points — the party addresses a new issue, then settles into a new stable distribution. Strategic vagueness produces monotonically increasing entropy with no reorganisation — the diffuse distribution persists and deepens because no new stable focal point is ever established.
The Concentration Index
The entropy can be normalised by its maximum value, log n, to produce a concentration index H(P)/log n that ranges from 0 (perfect concentration on one theme) to 1 (perfect uniformity across all themes). This normalised measure enables direct comparison between parties regardless of how many themes are identified in their communications.
Honest communicators typically exhibit concentration indices in the 0.4 to 0.6 range — enough concentration to show focus, enough dispersion to show context. Manipulative communicators exhibit concentration indices above 0.8 — the distribution is so diffuse that it carries almost no information about what the party actually regards as important. This extreme diffuseness is itself a signal: genuine communication about real events cannot maintain such high entropy without the narrative becoming meaningless.
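One standard normalisation is H(P)/log2(n), where n is the number of themes identified. This is a sketch under that assumption; the weight vectors are illustrative.

```python
import math

def concentration_index(weights):
    """Normalised entropy H(P)/log2(n) in [0, 1]:
    0 = all mass on one theme, 1 = uniform across all n themes."""
    n = len(weights)
    if n < 2:
        return 0.0
    total = sum(weights)
    probs = [w / total for w in weights if w > 0]
    h = -sum(p * math.log2(p) for p in probs)
    return h / math.log2(n)

focused = [0.7, 0.1, 0.1, 0.05, 0.05]      # clear peak, lower index
diffuse = [0.22, 0.20, 0.20, 0.19, 0.19]   # near-uniform, index near 1
```

Because the index is scaled by log2(n), a five-theme account and a fifty-theme account are directly comparable, which is the point of the normalisation.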
Cross-Narrative Mutual Information
Beyond single-party entropy, mutual information measures the statistical coupling between two parties' narratives. When two parties describe the same events, their accounts should share informational content — they are describing the same reality, filtered through different perspectives. Mutual information quantifies how much knowing one party's account reduces uncertainty about the other's.
Genuine dispute produces high mutual information. Both parties are responding to the same events, referring to the same timeline, discussing the same characters. Their thematic distributions, while different in emphasis, are statistically constrained by the shared underlying reality. Knowing what Party A is talking about provides information about what Party B will talk about because both are generating from the same event space.
Manipulation produces low mutual information — but for different reasons depending on the strategy. In coordinated fabrication, both parties construct narratives independently, and the independent generation produces distributions that are uncorrelated. In strategic avoidance, one party deliberately operates in a different thematic space than the other, producing distributions that are explicitly decoupled. In either case, the mutual information drops below what the shared events would naturally produce.
In coordinated messaging — where parties have agreed on a shared narrative or where one party is controlling the other's account — mutual information will be anomalously high. The two narratives are not independently generated from observation of events; they are generated from a common script. This produces a distinctive signature: mutual information that is too high, too stable, or too insensitive to the natural variation that independent accounts would show. The diagnostic is the deviation from expected mutual information given the documented overlap in events.
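A common way to judge whether observed coupling is anomalous is a permutation baseline: shuffle one party's theme labels across time windows and re-estimate the mutual information expected if the two narratives were independent. The sketch below assumes each window has been reduced to a dominant-theme label per party; the function names and data are illustrative.

```python
import math
import random
from collections import Counter

def mi_from_pairs(pairs):
    """Plug-in estimate of I(A;B) in bits from paired theme labels,
    one (party A theme, party B theme) tuple per time window."""
    n = len(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    pab = Counter(pairs)
    return sum((c / n) * math.log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in pab.items())

def permutation_baseline(pairs, trials=200, seed=0):
    """MI expected under independence: shuffle one party's labels and
    average the re-estimated MI. Observed MI far above this band is the
    'too coupled' signature; observed MI far below what documented
    shared events predict is the decoupling signature."""
    rng = random.Random(seed)
    a_side = [a for a, _ in pairs]
    b_side = [b for _, b in pairs]
    samples = []
    for _ in range(trials):
        rng.shuffle(b_side)
        samples.append(mi_from_pairs(list(zip(a_side, b_side))))
    return sum(samples) / trials
```

A scripted pair of narratives, where the two parties' window themes move in lockstep, sits far above the shuffled baseline.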
Narrative-Evidence Mutual Information
A third application measures mutual information between a party's narrative and external evidence — documents, timestamps, communications records, witness accounts. The question is how much the narrative constrains expectations about what the evidence will show, and vice versa.
Honest accounts have high mutual information with evidence because both are generated from the same underlying events. The narrative predicts the evidence and the evidence confirms the narrative. This coupling is measurable: the thematic distribution in the narrative should overlap substantially with the thematic distribution in contemporaneous documents, and the temporal alignment should be tight.
Fabricated narratives have low mutual information with evidence because the fabrication is not constrained by the actual events. The narrative may describe events that leave no documentary trace, or describe them in ways that conflict with the available evidence. The mutual information drops because the narrative and evidence are generated from different sources — the narrative from imagination, the evidence from reality.
Strategic avoidance minimises mutual information differently: by staying so vague that the narrative is compatible with any evidence, the party ensures that no specific prediction is made and no specific contradiction can arise. The mutual information is low not because the narrative conflicts with evidence but because it makes no testable claims about evidence at all.

The evidence-narrative mutual information curve over time is particularly revealing. Honest accounts show mutual information tracking the evidence as it emerges — as new documents are disclosed, the narrative constrains to reflect them. Fabricated accounts show mutual information decoupling from the evidence over time — the narrative drifts further from what the documents actually show as the fabrication becomes more elaborate.
The Information Triangle
The three measures — within-narrative entropy, cross-narrative mutual information, and narrative-evidence mutual information — form an information-theoretic triangle that characterises a party's communicative posture. Honest engagement produces low within-narrative entropy, high cross-narrative mutual information, and high narrative-evidence mutual information. The narrative is focused, coupled to the other party, and grounded in evidence.
Strategic vagueness produces the opposite profile: high within-narrative entropy, low cross-narrative mutual information, and low narrative-evidence mutual information. The narrative is diffuse, decoupled from the other party, and unmoored from evidence. The honest triple — concentrated, coupled, grounded — is the information-theoretic fingerprint of genuine communication, and its negation is the fingerprint of manipulation.
The Entropy-Mutual Information Matrix
Combining entropy and mutual information measurements produces a diagnostic matrix that classifies narrative segments by their information-theoretic properties. High entropy with low mutual information between parties indicates diffuse, uncoordinated narratives — characteristic of parties who are avoiding specifics without direct coordination. Low entropy with anomalously high mutual information indicates focused, coordinated narratives — accounts whose consistency may reflect either genuine agreement or scripted coordination.
The matrix approach moves beyond single-metric diagnostics to multi-dimensional classification. A party's communications can be characterised by their position in the entropy-mutual information space at each time point, and the trajectory through that space over the course of a dispute reveals the strategy being employed.
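A minimal sketch of that classification, assuming a normalised within-narrative entropy in [0, 1] and a cross-narrative mutual information expressed as a z-score against the expected baseline (positive means anomalously high). The quadrant labels follow the matrix described above; the thresholds are illustrative, not calibrated.

```python
def classify(entropy_norm, mi_z):
    """Position in the entropy-mutual-information matrix.

    entropy_norm: normalised within-narrative entropy in [0, 1].
    mi_z: cross-narrative MI as a z-score against the level the
    documented shared events would be expected to produce.
    """
    diffuse = entropy_norm > 0.7          # illustrative threshold
    if diffuse and mi_z < -1:
        return "diffuse, uncoordinated (avoidance without coordination)"
    if not diffuse and mi_z > 1:
        return "focused, anomalously coupled (possible scripting)"
    if not diffuse and -1 <= mi_z <= 1:
        return "focused, normally coupled (consistent with honest engagement)"
    return "mixed posture (inspect trajectory over time)"
```

Applying this per time point yields the trajectory through entropy-mutual-information space that the matrix approach relies on.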
Complementary Measurement
Theme entropy and mutual information operate on distinct axes from the other metrics in this framework. Where KL divergence measures drift from an original position and Wasserstein measures redistribution of emphasis, entropy measures the diffuseness of the distribution itself. Where mutual information between a party's own statements over time measures temporal coherence, cross-narrative mutual information measures coordination between parties. The combination captures dimensions of manipulation that other methods miss — specifically, the intentional spread of attention to avoid accountability and the decoupling of narratives from shared reality. The mathematics does not prove deception, but it provides a rigorous, reproducible framework for characterising narrative structure in terms that are legally defensible and computationally robust.