The Fraying Thread: Mutual Information Decay as a Measure of Narrative Fragmentation
A truthful account of events is a single fabric. Pull any thread and it connects to every other — statements about different topics, made at different times, to different audiences, all share a common generative source: the communicator's actual knowledge and experience. This interconnectedness is not a metaphor but a measurable statistical property. Mutual information — the quantity that captures statistical dependence between random variables — provides the formal tool for measuring how tightly a communicator's statements are woven together and how quickly that coherence decays across time and topic.
The Mathematical Foundation
Mutual information between two random variables X and Y measures the reduction in uncertainty about one variable given knowledge of the other. If X and Y are independent, mutual information is zero — knowing one tells you nothing about the other. If they are perfectly dependent, mutual information equals the entropy of either variable — knowing one determines the other completely. Between these extremes, mutual information quantifies the degree of statistical coupling.
Formally, I(X; Y) = H(X) + H(Y) - H(X, Y), where H denotes Shannon entropy. Equivalently, mutual information is the Kullback–Leibler divergence between the joint distribution P(X, Y) and the product of the marginals P(X)P(Y) — a measure of how far the two variables are from independence. Applied to communication analysis, X and Y become the semantic content of statements at different time points, and mutual information measures how much the content of one statement constrains expectations about another.
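The identity I(X; Y) = H(X) + H(Y) - H(X, Y) can be checked directly on a small joint distribution table. The sketch below (Python with NumPy; the function names are illustrative, not from any particular library) computes mutual information from a normalized joint table and confirms the two limiting cases described above: perfect dependence and independence.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) for a 2-D joint distribution table."""
    joint = joint / joint.sum()          # normalize counts to probabilities
    px = joint.sum(axis=1)               # marginal distribution of X
    py = joint.sum(axis=0)               # marginal distribution of Y
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Perfect dependence: I equals the entropy of either variable (1 bit here).
joint_dep = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
print(mutual_information(joint_dep))   # 1.0

# Independence: the joint factorizes, so I is zero.
joint_ind = np.outer([0.5, 0.5], [0.5, 0.5])
print(mutual_information(joint_ind))   # 0.0
```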
Temporal Coherence Analysis
The primary application is measuring how mutual information between a party's statements decays as a function of temporal distance. For each pair of statements separated by time gap Δt, compute the mutual information between their semantic representations. Plot I(Δt) against Δt to produce a temporal coherence curve.
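The pairing-and-binning step can be sketched as follows. This is a minimal outline, assuming statements arrive as (timestamp, representation) pairs and that some mutual information estimator for semantic representations is supplied upstream; `mi_estimator` is a placeholder for that estimator, and the dot-product stand-in used in the demonstration is a crude coupling proxy, not a true MI estimate.

```python
import numpy as np
from collections import defaultdict

def coherence_curve(statements, mi_estimator, bin_days=30):
    """Average pairwise MI estimate as a function of temporal gap.

    statements   : list of (timestamp_in_days, representation) pairs.
    mi_estimator : callable(rep_a, rep_b) -> float (assumed supplied upstream).
    Returns {gap_bin_start_in_days: mean estimate}, i.e. I(dt) against dt.
    """
    curve = defaultdict(list)
    for i in range(len(statements)):
        for j in range(i + 1, len(statements)):
            (t1, r1), (t2, r2) = statements[i], statements[j]
            gap = abs(t2 - t1)
            bucket = int(gap // bin_days) * bin_days   # left edge of the dt bin
            curve[bucket].append(mi_estimator(r1, r2))
    return {g: float(np.mean(v)) for g, v in sorted(curve.items())}

# Demonstration with toy unit vectors and a dot-product proxy for coupling.
statements = [(0, np.array([1.0, 0.0])),
              (10, np.array([1.0, 0.0])),
              (70, np.array([0.0, 1.0]))]
proxy = lambda a, b: float(np.dot(a, b))
print(coherence_curve(statements, proxy))  # {0: 1.0, 60: 0.0}
```

Plotting the returned dictionary's values against its keys yields the temporal coherence curve described above.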
Consistent communicators show a slowly decaying curve — high mutual information even between statements separated by months. Their message in January still strongly predicts their message in June because both are generated from the same stable model of reality. The decay is gradual and often approximately exponential with a long time constant, reflecting the natural evolution of emphasis and context while preserving the underlying structure.
Manipulative communicators show rapid decay. Statements made months apart share progressively less mutual information because they were not generated from a single consistent model. Each statement was constructed to serve the immediate context — to respond to a specific challenge, to advance a specific argument — without regard for structural coherence with the historical record. Over time, these context-dependent constructions accumulate into a corpus where distant statements are effectively independent.
The Half-Life Diagnostic
The mutual information half-life — the time gap at which I(Δt) drops to half its initial value — provides a single scalar summary of temporal coherence. Honest communicators typically have half-lives measured in months. Their coherence persists because the generative source is stable. Manipulative communicators have half-lives measured in weeks or even days. The contrast is typically dramatic: an order-of-magnitude difference in half-life between honest and dishonest communicators in the same dispute.
This half-life can be computed independently for each party in a dispute, providing a direct comparison that requires no subjective judgment about content. It answers a simple question: how quickly does this person's communication become independent of itself? A short half-life is not proof of deception — it could reflect genuine changes in circumstance or opinion — but it is a measurable structural property that distinguishes narratives with persistent internal coherence from narratives that fragment over time.
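Under the approximately exponential decay described above, I(Δt) = I₀ · exp(−Δt/τ), and the half-life is τ·ln 2. One simple way to extract it — a sketch, assuming the coherence curve has already been computed and is strictly positive — is a linear regression on log I(Δt):

```python
import numpy as np

def mi_half_life(gaps, mi_values):
    """Half-life of an approximately exponential MI decay curve.

    Fits I(dt) = I0 * exp(-dt / tau) by least-squares regression of
    log I on dt (log I = -dt/tau + log I0), then returns tau * ln 2.
    """
    gaps = np.asarray(gaps, dtype=float)
    logs = np.log(np.asarray(mi_values, dtype=float))
    slope, _intercept = np.polyfit(gaps, logs, 1)
    tau = -1.0 / slope
    return tau * np.log(2)

# Synthetic curve with time constant tau = 30 days.
gaps = np.arange(0, 180, 10)
mi = 2.0 * np.exp(-gaps / 30.0)
print(round(mi_half_life(gaps, mi), 1))  # 20.8  (i.e. 30 * ln 2 days)
```

Computed per party, the two resulting half-lives give the direct comparison described above.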
Cross-Topic Coherence
Mutual information analysis extends beyond same-topic temporal comparisons to cross-topic structural analysis. Compute the mutual information between a party's statements on Topic A and their statements on Topic B. The resulting cross-topic mutual information matrix reveals the full architecture of narrative coherence.
Honest communicators produce dense cross-topic matrices — high mutual information between statements on different subjects because their accounts of workload, relationships, timelines, and responsibilities are all generated from a single consistent worldview. Each topic constrains and is constrained by every other topic. Manipulative communicators produce sparse cross-topic matrices — fabrications in one domain are constructed independently of fabrications in another because maintaining cross-domain consistency requires a level of coordination that exceeds normal human cognitive capacity.
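Building the cross-topic matrix is mechanically simple once a per-pair estimator exists. The sketch below assumes one representation per topic and, as before, treats the estimator as supplied upstream; the dot-product stand-in in the demonstration is a hypothetical proxy, not a real MI estimate.

```python
import numpy as np

def cross_topic_matrix(topic_reps, mi_estimator):
    """Pairwise MI estimates between a party's per-topic representations.

    topic_reps   : dict mapping topic name -> representation.
    mi_estimator : callable(rep_a, rep_b) -> float (assumed supplied).
    Returns (topics, matrix) with matrix[i, j] = estimate for topic pair.
    """
    topics = sorted(topic_reps)
    n = len(topics)
    m = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            m[i, j] = mi_estimator(topic_reps[topics[i]], topic_reps[topics[j]])
    return topics, m

# Toy example: "timeline" and "workload" are coupled; "relations" is not.
reps = {"workload": np.array([1.0, 0.0]),
        "timeline": np.array([1.0, 0.0]),
        "relations": np.array([0.0, 1.0])}
topics, m = cross_topic_matrix(reps, lambda a, b: float(np.dot(a, b)))
print(topics)  # ['relations', 'timeline', 'workload']
```

A dense matrix (few near-zero off-diagonal entries) corresponds to the honest pattern; a sparse one to the manipulative pattern.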
Fragmentation Topology
The structure of the cross-topic matrix reveals more than aggregate coherence. Honest communicators produce a single connected cluster — all topics are mutually informative. Manipulative communicators often produce fragmented clusters: groups of topics that are internally coherent but disconnected from other groups. These fragments correspond to independently constructed narrative modules — self-consistent stories about different aspects of events that fail to cohere with each other. Identifying these fragmentation boundaries pinpoints the joints in a fabricated narrative, revealing which parts were constructed together and which were improvised independently.
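The fragment structure described above can be recovered by thresholding the cross-topic matrix and taking connected components of the resulting graph. This is a minimal sketch; the threshold value and topic names are illustrative assumptions, and the hand-written matrix stands in for real estimates.

```python
import numpy as np

def narrative_fragments(topics, mi_matrix, threshold):
    """Connected components of the thresholded cross-topic MI graph.

    Topics whose pairwise MI exceeds `threshold` are linked; each
    component is one internally coherent narrative module, and the
    gaps between components are the fragmentation boundaries."""
    n = len(topics)
    seen, fragments = set(), []
    for start in range(n):
        if start in seen:
            continue
        stack, component = [start], []
        while stack:                      # depth-first traversal
            i = stack.pop()
            if i in seen:
                continue
            seen.add(i)
            component.append(topics[i])
            stack.extend(j for j in range(n)
                         if j != i and mi_matrix[i][j] > threshold)
        fragments.append(sorted(component))
    return fragments

# Four topics forming two internally coherent but disconnected modules.
topics = ["workload", "timeline", "relations", "duties"]
mi = np.array([[2.0, 1.5, 0.1, 0.0],
               [1.5, 2.0, 0.0, 0.1],
               [0.1, 0.0, 2.0, 1.4],
               [0.0, 0.1, 1.4, 2.0]])
print(narrative_fragments(topics, mi, threshold=0.5))
# [['timeline', 'workload'], ['duties', 'relations']]
```

A single returned component matches the honest pattern; multiple components mark independently constructed narrative modules.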