Essay

The Epistemology of Price Discovery

Philosophical foundations of the Continuum Trinity platform.

Tim Hannon · 22 min read

This essay articulates the intellectual foundations underlying Continuum Trinity. It is not a product description. It is an attempt to make explicit the epistemological and psychological premises upon which our approach rests.

We believe investment professionals deserve to understand not just what a platform does, but why it is built the way it is—what assumptions about knowledge, markets, and human cognition inform its design.

What follows draws on philosophy of mind, cognitive psychology, intelligence analysis methodology, and behavioural finance. These are not decorative references. They are the load-bearing structure.

01

The Problem of Knowing

The Map and the Territory

In 1931, the Polish-American philosopher Alfred Korzybski offered a deceptively simple observation: "The map is not the territory."

The statement seems obvious. Of course a map is not the thing it represents. But Korzybski's insight runs deeper. He was making a claim about human cognition itself: we never access reality directly. We access representations of reality—mental models, conceptual frameworks, narratives—and we routinely mistake these representations for the thing itself.

This confusion is not a failure of intelligence. It is a feature of how minds work. The representations are useful. They compress complexity into tractable form. They enable rapid decision-making. But they also distort, omit, and occasionally mislead.

Markets are territory. Narratives are maps. Price discovery is the process by which millions of participants, each holding their own map, converge on a number. That number reflects not the territory, but the aggregate of maps—the collective representation of reality, not reality itself.

This distinction is the foundation of everything that follows.

The Lens You Cannot Remove

The German philosopher Immanuel Kant argued that human beings never perceive "things in themselves" (noumena). We perceive things as they appear to us (phenomena)—filtered through the inherent structures of human cognition.

We might wish to see the world as it truly is. We cannot. We see through lenses we did not choose and often do not notice.

Richards Heuer, a CIA methodologist whose work on intelligence analysis remains foundational, applied this insight to analytical practice:

Analysts construct their own version of 'reality' on the basis of information provided by the senses, but this sensory input is mediated by complex mental processes that determine which information is attended to, how it is organised, and the meaning attributed to it.
Richards Heuer

Heuer identified the sources of these mediating processes: past experience, education and training, cultural values, role requirements, and organisational norms.

An analyst trained in value investing sees different patterns than one trained in momentum strategies. A sector specialist notices different signals than a generalist. A sell-side analyst operating under coverage pressure perceives differently than a buy-side analyst with a multi-year horizon.

None of these lenses is "wrong." Each illuminates certain features of reality while obscuring others. The danger lies not in having a lens—that is unavoidable—but in failing to recognise that one is wearing one.

The Persistence of Mental Models

Why do flawed mental models persist even when contradictory evidence accumulates?

Heuer's research identified a troubling asymmetry: confirming evidence is perceived more vividly than disconfirming evidence. Information that fits the existing model registers clearly. Information that contradicts it is rationalised ("that's a one-off"), dismissed ("the methodology is flawed"), or simply not noticed.

This is not motivated reasoning in the conventional sense. Analysts are not consciously ignoring inconvenient facts. The filtering happens at the level of perception itself—before conscious evaluation begins.

The implications are significant. An analyst who "knows" that confirmation bias exists is not thereby protected from it. Awareness of bias does not neutralise bias. Heuer was emphatic on this point:

Even increased awareness of cognitive and other 'unmotivated' biases does little by itself to help analysts deal effectively with uncertainty.
Richards Heuer

What helps is not awareness but method—structured techniques that force the mind to operate against its default tendencies.

02

Markets as Collective Sensemaking

The Narrative Layer

Markets are not information-processing machines. They are sensemaking systems.

The distinction matters. Information processing implies a mechanical operation: data enters, analysis occurs, output emerges. Sensemaking is different. It is the process by which communities construct shared meaning from ambiguous, incomplete, and often contradictory signals.

Karl Weick, the organisational theorist, described sensemaking as fundamentally retrospective. We do not understand events as they occur. We construct understanding afterward, fitting events into narrative frameworks that render them comprehensible.

Markets do this continuously. A stock falls. Participants construct explanations. A narrative emerges. The narrative shapes perception of subsequent events. It becomes "what everyone knows"—regardless of whether it is true.

Robert Shiller's concept of "narrative economics" formalises this insight. Economic narratives spread like epidemics—contagious, self-reinforcing, and often decoupled from underlying fundamentals. They persist not because they are accurate but because they are shared.

The Stickiness of Consensus

Consensus is not merely slow to update. It actively resists revision.

Consider the forces maintaining a shared narrative:

Cognitive forces: Confirmation bias filters incoming information. Anchoring fixes initial estimates too firmly. Availability heuristics overweight recent and vivid data.

Social forces: Conformity pressure suppresses dissent (Asch's experiments demonstrated that people will deny their own perceptions to match a group). Information cascades create apparent agreement that is actually herding. Preference falsification means that public statements diverge from private beliefs.

Institutional forces: Analyst incentive structures reward consensus (access preservation, career risk). Changing a rating is costly; maintaining it is the default. Organisational inertia resists updates to established views.

The result: narratives persist longer than they should. Contradictory evidence accumulates without triggering revision. The gap between map and territory widens—until something forces a reckoning.

The Three Layers

We propose a framework distinguishing three layers that may align or diverge:

Evidence · Narrative · Price
The Three Layers: Reality, Belief, and Price may align or diverge. Divergence creates opportunity.

Layer 1: Reality. Reality, in this framework, is not "objective truth"—an epistemically inaccessible concept. It is the best achievable picture given comprehensive evidence synthesis. Reality is hard to access because evidence is scattered (across filings, regulatory documents, academic papers, legal proceedings, competitor disclosures), requires interpretation, and is often ignored (analysts read headlines, not the 847-page inquiry).

Layer 2: Belief. Belief is the collective mental model. The narrative. What the market "thinks." It is distinct from reality because narratives spread faster than analysis, beliefs cascade through social proof rather than independent verification, and inertia favours the existing story. Belief is extractable. Analyst reports, management commentary, investor positioning, and price behaviour itself reveal what the market thinks. This layer can be mapped.

Layer 3: Price. Price crystallises belief into a number. It embeds implicit assumptions about growth, margins, duration, and risk. These assumptions can be reverse-engineered. If the market truly believed a risk was material, it would be reflected in valuation. If it is not reflected, the market does not believe it—regardless of what commentary says.

Divergence as Opportunity

Investment returns, in this framework, arise from identifying where the layers misalign:

Reality ≠ Belief: Evidence shows something the market narrative does not appreciate. The map is wrong. Often this involves cross-domain synthesis—the signal is in regulatory filings, academic research, or competitor data rather than the company's own disclosures.

Belief ≠ Price: The market says one thing but prices another. Analysts acknowledge a risk in reports but do not reflect it in valuation. Commentary is cautious, but ratings stay Buy. This gap reveals what the market is doing versus what it is saying.

Reality ≠ Price: Direct mispricing. Price implies assumptions (growth rates, margin trajectories, risk probabilities) that cannot be supported by evidence or base rates.

When layers align, the opportunity is different: understanding the belief structure, identifying the embedded assumptions, and monitoring for signals that alignment may be breaking.

03

Heuer's Methodology

The Limits of Intuition

Richards Heuer spent decades studying how intelligence analysts fail. His conclusion: intuition, expertise, and access to information are insufficient safeguards against error. The mind's default operations—pattern matching, hypothesis confirmation, rapid closure—are optimised for speed, not accuracy.

On complex issues characterised by incomplete information, ambiguity, and uncertainty, intuitive judgement systematically misleads.

Heuer's response was not to abandon intuition but to discipline it. He developed structured analytical techniques designed to counteract cognitive defaults. The most important of these was Analysis of Competing Hypotheses.

Analysis of Competing Hypotheses

ACH inverts the natural analytical process.

The default: form a hypothesis, seek evidence to support it, increase confidence as confirming evidence accumulates.

ACH: generate multiple hypotheses, seek evidence that disconfirms each, increase confidence only in hypotheses that survive attempts at disproof.

The methodology: Identify all plausible hypotheses. List the significant evidence. Prepare a matrix with hypotheses across the top and evidence down the side. For each piece of evidence, assess: is it consistent, inconsistent, or irrelevant to each hypothesis? Focus on disconfirming evidence—evidence that rules hypotheses out. Hypotheses that survive disproof attempts deserve higher confidence. Identify what information would discriminate between surviving hypotheses. Report conclusions with sensitivity analysis.
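As a concrete illustration, the matrix and its scoring rule can be expressed in a few lines of code. The sketch below uses hypothetical hypotheses and evidence items; the point is the tally, which ranks hypotheses by how little evidence contradicts them rather than by how much supports them.

```python
from collections import Counter

# Hypothetical hypotheses and evidence, for illustration only.
# Marks: "C" = consistent, "I" = inconsistent, "N" = not relevant.
hypotheses = {
    "H1": "Turnaround is on track",
    "H2": "Margins are in structural decline",
    "H3": "Earnings quality is deteriorating",
}
matrix = {
    "Gross margin fell for a third consecutive quarter": {"H1": "I", "H2": "C", "H3": "C"},
    "Management reiterated full-year guidance":          {"H1": "C", "H2": "N", "H3": "C"},
    "Receivables are growing faster than revenue":       {"H1": "N", "H2": "N", "H3": "C"},
    "A competitor reported the same margin pressure":    {"H1": "I", "H2": "C", "H3": "N"},
}

# ACH's scoring rule: confidence accrues to the hypotheses with the fewest
# inconsistent items, not to the ones with the most consistent items.
inconsistent = Counter({h: 0 for h in hypotheses})
for marks in matrix.values():
    for h, mark in marks.items():
        if mark == "I":
            inconsistent[h] += 1

for h, count in sorted(inconsistent.items(), key=lambda kv: kv[1]):
    print(f"{h} ({hypotheses[h]}): {count} piece(s) of disconfirming evidence")
```

In this toy example the turnaround hypothesis accumulates disconfirming evidence while the other two survive; the next task is to find evidence that would discriminate between the survivors.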

The discipline is subtle but transformative. Instead of asking "what supports my view?" the analyst asks "what would disprove it?"

Mirror-Imaging

Heuer identified a specific failure mode he called "mirror-imaging": projecting one's own mental framework onto the subject of analysis.

In intelligence work, this manifests as assuming foreign leaders share Western logic, values, and decision-making processes. Behaviour that appears "irrational" usually indicates the analyst has failed to understand the subject's actual incentive structure—not that the subject is acting irrationally.

The parallel in investment analysis is direct. When management behaviour seems irrational—"why would they do that?"—the analyst should interrogate their own model of management's incentives before concluding that management is acting irrationally.

Management operates under different incentives (career preservation, compensation structure, board dynamics), different information (they know things the market does not), different time horizons (their tenure versus the analyst's coverage period), and different cognitive biases (optimism, commitment escalation, sunk cost fallacy).

Understanding management's logic as they experience it—not as the analyst would experience it—is essential to anticipating behaviour.

Honouring Doubt

Heuer argued that analytical culture systematically undervalues uncertainty. Conclusions like "we don't know" or "the evidence supports multiple interpretations" are treated as failures rather than accurate assessments.

This creates pressure for false confidence. Analysts overstate certainty to meet institutional expectations. Caveats are buried. Alternative interpretations go unmentioned.

Heuer advocated inverting this norm:

Such conclusions as 'we do not know' or 'there are several potentially valid ways to assess this issue' should be regarded as badges of sound analysis, not as dereliction of analytic duty.
Richards Heuer

Investment analysis faces the same pressure. Clients want conviction. Ratings require a direction. "We're uncertain" satisfies no one. Yet false confidence is more dangerous than acknowledged uncertainty. The analyst who says "I don't know" is providing more useful information than the one who expresses false certainty.

04

The Catalysts of Correction

Beyond Information

If map and territory diverge, what forces reconciliation?

The obvious answer is new information. Data is released that contradicts the narrative. The market updates.

But information is neither necessary nor sufficient for correction. Narratives often persist despite contradictory data (it gets rationalised). And narratives often break without new data (something else changes).

We identify nine distinct catalyst types:

The nine catalysts group into three families:

  • Perception shifts: attention (new salience), frame change (same facts, new lens), personnel (fresh eyes).
  • Narrative dynamics: exhaustion (story runs out), social proof (consensus breaks), the Overton window (permission shifts).
  • External triggers: time revelation (predictions expire), reflexivity (loop exhausts), adjacent events (peer forces review).

The Nine Catalysts: New information is only one trigger. Understanding what forces market updates is essential to timing.

1. Attention Shifts

The information was always available. It sat in filings, footnotes, regulatory documents. No one was attending to it.

Attention is scarce. Markets process a fraction of available information. William James observed that our experience is what we agree to attend to. For markets, this is literally true: information that isn't attended to isn't priced.

What shifts attention: a journalist writes about it, a prominent investor mentions it, an adjacent event makes it suddenly relevant, a question on an earnings call surfaces it.

The catalyst is not new information. It is new salience.

2. Frame Changes

The facts remain constant. The interpretive frame changes.

Kahneman and Tversky demonstrated that identical information presented differently produces different judgements. "90% survival rate" and "10% mortality rate" are logically equivalent but psychologically distinct.

Market frames shift with macro context. "Platform investment" was visionary when rates were zero. The same spending became "unprofitable cash burn" when rates rose. No new facts—new frame.

3. Narrative Exhaustion

Narratives require continuous reinforcement. Each confirming data point extends the story's life. But narratives have natural lifespans.

Signs of exhaustion: price stops responding to good news (already believed), the bull case relies entirely on "optionality" and "when, not if," analysts struggle to write anything new.

The narrative hasn't been disproved. It has simply run out of energy. The pool of potential converts is exhausted. The story is priced in, with no marginal buyer remaining.

4. Social Proof Fracture

Consensus is self-reinforcing—until someone defects.

Asch's conformity experiments showed people suppress private beliefs when the group disagrees. But when one person breaks consensus, it gives permission to others. Private doubts become expressible.

In markets: a respected voice changes their view publicly, a large holder sells (revealed preference contradicting stated belief), one analyst downgrades with a detailed bear case. The cascade can reverse. What seemed like robust consensus reveals itself as fragile herding.

5. Time Revelation

Predictions have implicit time horizons. "The turnaround is coming" implies when.

As time passes without the predicted outcome, credibility erodes. No explicit disconfirmation occurs—the absence of the expected event is itself the catalyst.

"Margin recovery in H2" was the guidance. H2 passes. No recovery. No announcement of failure. But the prediction is falsified by non-occurrence. Time converts implicit predictions into testable propositions.

6. Reflexivity Unwinding

George Soros's reflexivity concept: beliefs influence the reality they are about. Rising prices validate the narrative that justified buying, which attracts more buyers, which raises prices further.

These self-reinforcing loops require continuous new entrants. When the last potential believer has bought, the loop exhausts its fuel.

The break often comes without external trigger. No bad news. Just... the music stops. The loop has consumed its inputs.

7. Personnel Turnover

Those who built the narrative are anchored to it. They have invested cognitive and social capital in its validity.

But people change. The analyst who has covered a stock for fifteen years is replaced. The new analyst arrives with fresh eyes, different anchors, no commitment to the existing story.

Kuhn observed that paradigm shifts often require generational turnover. The old believers don't convert; they retire.

8. Adjacent Events

Something happens to a peer, competitor, or adjacent sector. It forces re-examination.

"If it happened to them, could it happen here?" Assumptions shared across a sector—previously invisible because universal—suddenly become visible and questionable.

A competitor's write-down on automation investment. An industry peer's accounting restatement. A regulatory action against a similar business model.

9. The Overton Window

Some views are unsayable—not because no one holds them, but because expressing them carries unacceptable social or professional cost.

Timur Kuran's concept of preference falsification: people publicly express views that differ from their private beliefs when they perceive those beliefs to be in the minority or socially unacceptable.

When the Overton window shifts—when a high-status voice says the unsayable thing, when accumulated evidence makes denial untenable—previously suppressed views become expressible. What "everyone knew" privately becomes "what everyone says" publicly.

The catalyst is not new information. It is permission.

05

The Methodology

Structured Sensemaking

Heuer's central prescription: tools and techniques that force the mind to operate against its defaults. Our methodology applies this principle to investment analysis.

Step 1: Map the Narrative

Before assessing whether the market is right, understand what the market believes.

This requires extraction, not assumption. What is the consensus thesis? What growth is implied? What margins? What risks are acknowledged? What risks are dismissed?

The narrative exists at multiple temporal layers: long-term thesis ("defensive compounder," "structural grower"), medium-term expectations ("cost programme delivers by FY25"), and short-term positioning ("Q3 will show online acceleration"). Each layer has different rates of change and different tripwires.
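A sketch of what such an extraction might produce for a single hypothetical name is shown below; the structure and field names are illustrative, not a description of any particular schema.

```python
# Illustrative only: one name's extracted narrative at three temporal layers,
# each with a different expected rate of change.
narrative_map = {
    "long_term":   {"thesis": "Defensive compounder with structural online growth",
                    "revises_over": "years"},
    "medium_term": {"thesis": "Cost programme delivers margin recovery by FY25",
                    "revises_over": "quarters"},
    "short_term":  {"thesis": "Q3 will show online acceleration",
                    "revises_over": "weeks"},
}

for horizon, layer in narrative_map.items():
    print(f"[{horizon}] {layer['thesis']} (typically revises over {layer['revises_over']})")
```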

Step 2: Identify Embedded Assumptions

Every narrative rests on assumptions—often unstated, frequently unexamined.

For the narrative to hold, what must be true? Stable market share? Successful execution on cost initiatives? Manageable competitive intensity? Benign regulatory posture?

These assumptions are the linchpins. If one breaks, the narrative loses coherence.

Step 3: Reverse-Engineer Price

Price embeds implicit assumptions that can be extracted. What earnings growth rate does this multiple require? What margin trajectory? What duration of competitive advantage? What probability weight on tail risks?

Compare implied assumptions to stated narrative. Gaps between what's said and what's priced reveal where the market is positioned versus where it claims to be.
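As a worked illustration of one such extraction, assume a single-stage Gordon-growth relation, P/E = payout × (1 + g) / (r − g), and hypothetical inputs. Real valuation work would use multi-stage models, but the logic of backing an assumption out of a multiple is the same.

```python
def implied_growth(pe: float, required_return: float, payout_ratio: float) -> float:
    """Solve P/E = payout * (1 + g) / (r - g) for g.

    A deliberately simple, single-stage sketch: the perpetual growth rate
    that a trailing multiple implies, given a required return and payout.
    """
    return (pe * required_return - payout_ratio) / (pe + payout_ratio)

# Hypothetical inputs: a 20x trailing multiple, 8% required return, 50% payout.
g = implied_growth(pe=20.0, required_return=0.08, payout_ratio=0.50)
print(f"Implied perpetual growth: {g:.2%}")  # roughly 5.4% -- supportable by evidence and base rates?
```

If the evidence and base rates cannot support roughly 5% perpetual growth, the gap between what is priced and what is supportable is itself the finding.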

Step 4: Set Tripwires

For each key assumption, define what evidence would signal it is breaking.

These tripwires are not predictions. They are pre-commitments: "if I see X, I will revisit the thesis."

The discipline is psychological as much as analytical. Defining tripwires in advance counteracts the tendency to rationalise disconfirming evidence after it appears.
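One way to make those pre-commitments harder to rationalise away is to record them in structured form. The sketch below uses hypothetical assumptions, signals, and sources.

```python
from dataclasses import dataclass

@dataclass
class Tripwire:
    """A pre-commitment: if the signal is observed, the thesis is revisited."""
    assumption: str   # what must be true for the narrative to hold
    signal: str       # evidence that would indicate the assumption is breaking
    watch: str        # where that evidence would be expected to surface

tripwires = [
    Tripwire(
        assumption="Cost programme delivers margin recovery by FY25",
        signal="Restructuring charges raised or the timeline extended",
        watch="Half-year and full-year results, management commentary",
    ),
    Tripwire(
        assumption="Benign regulatory posture",
        signal="Formal inquiry opened or information request issued",
        watch="Regulator announcements and registers of proceedings",
    ),
]

for t in tripwires:
    print(f"IF {t.signal} THEN revisit: {t.assumption} (watch: {t.watch})")
```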

Step 5: Monitor Across Domains

Different sources have different relationships to truth.

Domain · Relationship to Truth

Corporate disclosures: Motivated communication—management controls the narrative
Regulatory filings: Adversarial scrutiny—different incentive structures
Legal proceedings: Testimony under oath—consequences for misrepresentation
Academic research: Peer-reviewed—though may lack commercial context
Competitor filings: Reveals what management will not say

When a company's investor presentation says one thing and a legal filing under oath says another, that is not merely "more information." It is information with different epistemic status.

Synthesis across domains reveals where stories conflict. Conflict is where insight lives.
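A minimal sketch of how that conflict-surfacing might look, assuming claims have already been extracted and normalised by topic; the domains, weights, and claims below are hypothetical.

```python
# Rough ordering of epistemic status, echoing the table above (illustrative weights).
EPISTEMIC_WEIGHT = {
    "corporate_disclosure": 1,   # motivated communication
    "regulatory_filing":    2,   # adversarial scrutiny
    "academic_research":    2,   # peer-reviewed
    "legal_proceeding":     3,   # testimony under oath
}

claims = [
    {"topic": "automation rollout", "stance": "on schedule",        "domain": "corporate_disclosure"},
    {"topic": "automation rollout", "stance": "materially delayed", "domain": "legal_proceeding"},
    {"topic": "market share",       "stance": "stable",             "domain": "corporate_disclosure"},
]

def conflicts(claims):
    """Yield topics on which domains disagree, with the conflicting claims
    ordered by epistemic weight (highest first)."""
    by_topic = {}
    for claim in claims:
        by_topic.setdefault(claim["topic"], []).append(claim)
    for topic, group in by_topic.items():
        if len({c["stance"] for c in group}) > 1:
            yield topic, sorted(group, key=lambda c: EPISTEMIC_WEIGHT[c["domain"]], reverse=True)

for topic, ranked in conflicts(claims):
    print(f"Conflict on '{topic}':")
    for c in ranked:
        print(f"  [{c['domain']}] {c['stance']}")
```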

Step 6: Apply Analysis of Competing Hypotheses

Generate multiple plausible interpretations. The bull case, the bear case, the variant perception.

For each hypothesis, assess: what evidence supports it? What evidence undermines it? What would have to be true for it to hold?

Focus on disconfirmation. The hypothesis that survives the most attempts at disproof deserves the highest confidence—not the one that feels most intuitive.

Step 7: Identify Catalysts

A divergence without a catalyst is academic. Markets can remain "wrong" indefinitely if nothing forces re-examination.

For each identified divergence, ask: what would force the market to update? Is it a scheduled event (earnings, regulatory decision)? A predictable but unscheduled development (management change, competitive response)? An unpredictable shock?

Monitor for early signals that catalysts are approaching.
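A sketch of how identified divergences might be paired with the catalysts that could close them, so that monitoring does not rely on memory; the entries and dates are hypothetical.

```python
from datetime import date

# Hypothetical watch-list: each divergence paired with what could force an update.
catalyst_watch = [
    {"divergence": "Margin risk acknowledged in commentary but not in valuation",
     "catalyst": "Full-year results and margin bridge",
     "type": "scheduled", "next_check": date(2025, 8, 20)},
    {"divergence": "Regulatory exposure dismissed by consensus",
     "catalyst": "Regulator's final report",
     "type": "scheduled", "next_check": date(2025, 11, 3)},
    {"divergence": "Reflexive buying sustaining the multiple",
     "catalyst": "Flow reversal or a large holder selling",
     "type": "unscheduled", "next_check": None},
]

# Scheduled items are reviewed in date order; unscheduled ones need continuous monitoring.
for item in sorted(catalyst_watch,
                   key=lambda c: (c["next_check"] is None, c["next_check"] or date.max)):
    when = item["next_check"] or "continuous monitoring"
    print(f"{item['divergence']}: watch {item['catalyst']} ({item['type']}, {when})")
```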

Step 8: Honour Uncertainty

Specify what you know, what you don't know, and what would change your assessment.

Uncertainty is not failure. It is an accurate characterisation of the epistemic situation. False confidence serves no one.

06

The Role of Artificial Intelligence

Comprehensive Synthesis at Scale

Humans cannot synthesise across all relevant domains. The cognitive load is prohibitive. The sources are too dispersed. The time required is incompatible with decision-making timelines.

This is the appropriate application of artificial intelligence: not replacing human judgement, but expanding the inputs available to it.

AI can:

  • Ingest and process documents at scale (regulatory filings, legal proceedings, academic literature, competitor disclosures).
  • Extract and structure the embedded claims, assumptions, and assertions.
  • Maintain a continuously updated model of the market narrative.
  • Monitor for signals that assumptions may be breaking.
  • Surface divergences between domains.
  • Identify what has changed versus the prior state.
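A deliberately simplified sketch of that loop is shown below: fold newly extracted claims into an evidence layer, compare against the standing narrative, and report what has changed. The claim format and logic are illustrative, not a description of the platform's pipeline.

```python
from dataclasses import dataclass, field

@dataclass
class State:
    narrative: dict = field(default_factory=dict)  # topic -> what the market believes
    evidence: dict = field(default_factory=dict)   # topic -> what the evidence now indicates

def ingest(state: State, new_claims: list) -> list:
    """Update the evidence layer with new claims and surface divergences from
    the standing narrative. Judgement about what a divergence means stays human."""
    alerts = []
    for topic, stance, source in new_claims:
        state.evidence[topic] = stance
        believed = state.narrative.get(topic)
        if believed is not None and believed != stance:
            alerts.append(f"{topic}: narrative says '{believed}', but {source} indicates '{stance}'")
    return alerts

state = State(narrative={"margin recovery": "on track for H2"})
print(ingest(state, [("margin recovery", "no recovery evident", "a regulatory filing")]))
```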

What AI cannot do—and should not attempt—is render judgement. The assessment of what a divergence means, whether a catalyst will materialise, and how to act remains human.

The appropriate relationship is augmentation: AI expands the evidence base; humans evaluate and decide.

Making the Implicit Explicit

Much of what Heuer advocated involves making implicit mental operations explicit. Assumptions that lurk unexamined in the analyst's mind are surfaced and tested. Alternative hypotheses that would otherwise be suppressed are generated and evaluated.

AI can serve this function systematically. Rather than trusting that the analyst has considered alternatives, the system generates them. Rather than hoping that disconfirming evidence will be noticed, the system surfaces it.

The result is not artificial intelligence in the sense of machine judgment. It is structured analytical discipline—Heuer's methodology operationalised at scale.

Conclusion: Epistemological Humility

The thesis underlying this platform is not that we can see reality while others see illusion. That would be epistemically arrogant and almost certainly wrong.

The thesis is more modest:

  • Everyone perceives through lenses. The lenses distort.
  • Markets aggregate beliefs, not facts. Beliefs are sticky.
  • Consensus reflects the collective map, not the territory.
  • Divergence between map and territory creates opportunity—when catalysts force reconciliation.
  • Structured methods can discipline cognition against its defaults.
  • Comprehensive synthesis can reveal what narrow analysis misses.

We do not claim to know the territory. We claim to help you compare more maps, examine your own lenses, and identify when the collective map may be due for revision.

The market is usually right. But when it is wrong, the error is often visible in the gap between what cross-domain evidence reveals and what the narrative assumes.

That gap is where we work.


Tim Hannon

Former Head of Equities at Goldman Sachs Australia. The methodology Continuum implements is the codification of what disciplined practice should be.