Chapter 11: Relational and Collective Knowing
Paul Falconer & ESA
Mar 19
When reality is shared (and when it isn't)
A few years ago, I sat in on a meeting of a small community group.
They were trying to decide whether to support a new local development project. On paper, it promised jobs and investment. In practice, it might raise rents and displace long‑time residents. Around the table were people with very different relationships to the issue: a shop owner, a renter, a local activist, a representative from the council, a young person hoping for work.
They had the same documents in front of them.
They did not have the same reality.
For one person, the key "fact" was "we desperately need more jobs; this will bring them." For another, it was "we've heard promises like this before; they never reach people like us." For a third, "the council's projections are reliable; they wouldn't sign off without good evidence." For a fourth, "the council has a track record of ignoring our concerns; their reassurances mean very little."
Watching them talk, it was clear that the disagreement was not just about data. It was about trust, testimony, and history. About who had been right or wrong in the past. About whose experience counted. About which institutions were treated as credible by default.
This chapter is about that layer of knowing.
So far, most of this book has focused on you as an individual: your questions, your claims, your evidence, your confidence. But you don't live in a vacuum. Your map is built, maintained, and revised in relationship—with other people and with institutions.
The question now becomes: How do you practice epistemological skepticism in a world where knowing is relational and collective? How do you stay open and trusting enough to learn from others, without collapsing into either naive deference or total rejection?
The myth of the solitary knower
There is an image of the ideal thinker that runs deep in Western culture: the solitary individual, reasoning alone, unswayed by others, arriving at truth through pure logic and direct observation.
It is a powerful image. It is also false.
You have never known alone.
From your first words, you relied on testimony. Your parents told you the names of things. Teachers told you how the world works. Books, articles, conversations, news—all of it depends on the word of others. Even the tools in this book—Null Hypothesis, Burden of Proof, falsifiability—you did not discover them yourself. You learned them from a lineage of thinkers, filtered through this text and your own engagement.
This is not a failure. It is the human condition.
The question is not whether you will rely on others. You will. The question is how you will do it—consciously or unconsciously, with calibration or without, in communities that sharpen your thinking or communities that seal it shut.
Testimony: most of what you "know" is second‑hand
Start with an obvious but easy‑to‑forget fact:
Most of what you believe about the world comes from testimony—what others have told you.
You have not personally:
Measured the distance to the sun.
Verified the existence of most countries you talk about.
Run the experiments behind medicines you take.
Sat in every courtroom, hospital, and laboratory whose outputs shape your life.
You rely on:
Teachers, books, journalists, scientists, friends, strangers.
Institutions like universities, courts, news organisations, religious communities, regulatory bodies.
Your epistemology, in other words, is already collective.
Epistemological skepticism is not about rejecting this. It is about making that dependence conscious and selective:
Whose testimony do you treat as default‑credible?
In which domains?
Under what conditions would you revise that trust?
These are not just intellectual questions. They are shaped by class, race, history, geography, and personal experience. If an institution has harmed you or people like you, your reluctance to trust it is not "irrational"; it is part of your evidence.
The task here is not to erase that. It is to bring it into view so you can work with it.
Calibrating trust
How much should you trust someone's testimony?
The answer depends on several factors. Here is a simple framework, building on the proportional scrutiny idea from Chapter 9.
1. Domain fit. Is this person actually an expert in the domain they are speaking about? A Nobel Prize in physics does not automatically make someone a reliable guide on nutrition or geopolitics.
2. Track record. Has this person or institution been broadly right, careful, and corrigible in the past? Do they correct errors when they occur, or double down?
3. Incentives. What do they gain if you believe them? If their interests align with yours (a doctor whose income doesn't depend on prescribing you drugs), that's one thing. If they profit directly from your belief (a salesperson, a political operative), that's another.
4. Transparency. Do they show their reasoning and sources, or only ask for trust? Can others check their work?
5. Independence. Is this testimony corroborated by other sources, especially sources with different incentives? If multiple independent experts agree, confidence rises. If the only source is a single person or institution with a vested interest, confidence should be lower.
6. Falsifiability. Are they willing to say what would count against their claim? If you ask "What would prove this wrong?" and they give a clear answer, that's a sign of good faith. If they dodge, the testimony is harder to trust.
You can't run this checklist on every claim you hear. That would be exhausting. But for claims that matter—health, finances, relationships, public policy—it's worth doing at least a quick mental scan.
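For readers who think in code, the quick mental scan can be caricatured as a checklist tally. This is a toy sketch, not anything from the chapter: the field names, thresholds, and verdict strings are all invented for illustration, and real trust calibration is qualitative rather than a score.

```python
# A toy formalization of the six-factor scan above. All names and cutoffs
# here are invented for demonstration; the point is only to make each
# factor an explicit yes/no question rather than a vague feeling.

from dataclasses import dataclass

@dataclass
class Testimony:
    domain_fit: bool        # speaking within their actual expertise?
    track_record: bool      # broadly right and corrigible in the past?
    clean_incentives: bool  # no direct profit from your belief?
    transparent: bool       # shows reasoning and sources?
    independent: bool       # corroborated by sources with different incentives?
    falsifiable: bool       # willing to say what would count against the claim?

def quick_scan(t: Testimony) -> str:
    """Count how many of the six factors a piece of testimony satisfies."""
    score = sum([t.domain_fit, t.track_record, t.clean_incentives,
                 t.transparent, t.independent, t.falsifiable])
    if score >= 5:
        return "lean toward trusting"
    if score >= 3:
        return "provisional trust; seek corroboration"
    return "treat with heavy scrutiny"

# Example: an expert salesperson with a direct financial stake in your belief.
pitch = Testimony(domain_fit=True, track_record=True, clean_incentives=False,
                  transparent=False, independent=False, falsifiable=False)
print(quick_scan(pitch))  # treat with heavy scrutiny
```

The tally is deliberately crude: it weights all six factors equally, which no careful reader actually does. Its only value is forcing each factor to be asked out loud.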
When collective knowing goes wrong: epistemic bubbles and echo chambers
Sometimes the problem is not an individual source, but the whole structure of your information environment.
Two related phenomena are worth distinguishing.
Epistemic bubbles are situations where you are missing relevant voices. You simply haven't been exposed to certain perspectives or sources. This can happen by accident—your social circle, your news feed, your professional networks all tend to cluster. The fix is relatively straightforward: seek out other sources, listen to people you disagree with, broaden your intake.
Echo chambers are more insidious. An echo chamber is not just missing voices; it actively discredits them. Inside an echo chamber, you are taught that outside sources are untrustworthy, biased, or evil. Any information that challenges the chamber's core beliefs is pre‑emptively dismissed as propaganda.
In an echo chamber, the very tools of skepticism are turned against themselves. Asking for evidence becomes evidence that you've been "corrupted." Seeking outside perspectives becomes proof of disloyalty. The chamber is self-sealing.
Breaking out of an echo chamber is hard. It requires more than just exposure to new information—it requires rebuilding trust in the possibility of trustworthy information outside the chamber. This is not something anyone can do for you. But the first step is recognizing the structure.
You can probably identify some of your own bubbles and chambers:
Media ecosystems where everyone shares similar assumptions.
Communities where disagreeing with certain claims carries social or moral penalties.
Online spaces where the same stories and sources circulate endlessly, with little outside input.
Epistemological skepticism here means:
Being willing to sample outside your bubble—not to adopt every view you encounter, but to see what you've been missing.
Noticing when a community treats all critics as bad actors by default, and asking whether that stance is earned.
This does not mean "both sides are always equal." Some sources really are unreliable or malicious. But if your group's story is "we are always the honest ones, they are always the liars," that is a red flag.
Practising skepticism without relational collapse
There is a danger on the other side.
Once you see how much your map depends on testimony and trust, you might be tempted to withdraw trust almost entirely. To treat every authority as suspect, every institution as corrupt, every community as an echo chamber.
That way lies loneliness and paralysis.
Humans do not do well in a world where no one is credible, nothing is solid, and every conversation feels like a potential manipulation. The point of epistemological skepticism is not to make you incapable of trusting; it is to help you trust more wisely.
A few practices help here:
Differentiate domains. You might trust a friend deeply about emotional matters, but not about epidemiology. You might trust a news outlet for local reporting, but not for foreign policy analysis. Trust can be specific; it doesn't have to be all‑or‑nothing.
Name your thresholds. For some decisions, "trusted friend plus one decent article" might be enough. For others, you might want "consensus among multiple independent experts" before you act. Being explicit about this lowers ambient anxiety.
Hold people and claims separately. You can maintain respect and warmth for someone while disagreeing with their epistemic stance in a specific area. "I care about you" and "I don't share your view about this" can coexist.
Invite co‑inquiry. Instead of arguing from opposite sides, you can sometimes say: "This matters to both of us. Shall we look at some sources together, and see what we each find persuasive or worrying?"
Relational knowing, at its best, is a joint project: "How can our maps, together, track the territory better than either of ours alone?"
A small practice: your epistemic circle
Here is an exercise you can try this week.
Take a blank page and draw yourself in the centre.
Around you, write the names of five to ten people or institutions whose voices significantly shape your map of the world. They might include:
A friend or partner.
A writer, podcaster, or thinker.
A news outlet or journalist.
A professional community.
A religious or philosophical tradition.
For each, jot down:
What do I tend to trust them about? (Domains.)
Why do I trust them? (Track record, identity, shared values, expertise, something else.)
Where have they been wrong, and did they correct?
What would make me revise this level of trust?
You are not trying to become suspicious of everyone.
You are aiming for conscious trust: seeing your epistemic circle clearly, so you can lean on it where it is strong and supplement it where it is weak.
You might discover, for example, that you have no one in your circle who reliably challenges you from a different political or cultural angle. Or that you have been granting a lot of epistemic authority to someone whose track record is more mixed than you realised.
Adjusting your circle—adding a voice here, downgrading unearned authority there—is one of the most powerful ways to improve your map without isolating yourself.
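If you keep notes digitally, the exercise above can also be recorded as structured entries. This is a hypothetical sketch; the field names are invented, and the pen-and-paper version works just as well.

```python
# A hypothetical structure for the epistemic-circle exercise. Each entry
# answers the four jot-down questions from the exercise; the example data
# is invented for illustration.

from dataclasses import dataclass

@dataclass
class CircleEntry:
    name: str
    trusted_domains: list   # what do I tend to trust them about?
    why: str                # track record, shared values, expertise...
    known_errors: str       # where were they wrong, and did they correct?
    revision_trigger: str   # what would make me revise this trust?

circle = [
    CircleEntry(
        name="local news outlet",
        trusted_domains=["local reporting"],
        why="solid track record on council stories",
        known_errors="misreported a planning decision; issued a correction",
        revision_trigger="repeated errors without corrections",
    ),
]

# A quick gap check: does anyone in the circle reliably challenge my defaults?
challengers = [e for e in circle if "challenges me" in e.why]
if not challengers:
    print("Gap: no one in this circle reliably challenges my defaults.")
```

One habit this makes easy: rereading the `revision_trigger` field when new evidence arrives, so that updating trust is a planned act rather than a crisis.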