Unpredictable Patterns #53: Webs of knowledge
Knowing, trusting, asking, answering and the debate on shared facts
Dear reader,
I hope the break - if you had one - has been great. I have had a wonderful break, albeit with a small spat with Omicron. This new stage of the pandemic is unfolding as we speak and I hope you all stay healthy and safe. The new year brings the opportunity to change things up, and I had been thinking about different new formats, but I am fond of the essay and will probably keep writing in this way. I may figure out interesting new formats along the way, though, we will see!
This week’s note is about knowing and builds on note #21 - which could be a good read to add to this if the subject intrigues you as it does me. There is also a side discussion about this in a recent podcast by Sean Carroll that I found absolutely fascinating and strongly recommend, where C. Thi Nguyen and Carroll discuss games, but also conspiracy theories and their simplifying allure.
Knowing the world
What do we know? The answer to this simple question is surprisingly complicated - because we know in many different ways. We know, in a general and shallow way, how a fridge works. We know in another way why the Second World War happened. And we know in yet another way that 11 is a prime. We also know that the sun will rise tomorrow and that the devil wants no good for mankind.
We know the world in a multitude of ways.
Often the question of knowledge is immediately reduced to a proposition. We look at knowledge and we say that we know p(1)…p(n) and then we discuss what it means to say that we know p(x). This way of treating knowledge - as propositional - is old, and dominates the way we think about knowledge. But is it true? Do we know propositions?
Think about how we test knowledge. We ask questions. What we know is the answer to a question, and so whenever we believe we can answer a question we also know something. Equally, not knowing something is simply being unable to answer a question about it.
The way we know, then, depends on the way we ask and answer questions.
Say we argue that we know that the world is not flat, but round. What we really know is the answer to the question ”what is the best crude approximation of the earth’s geometrical shape?”. Uncovering the questions that underpin the knowledge we have is a way to complete that knowledge and really understand it more deeply. We spent some time on this in #21 - On asking questions - and will not repeat that discussion further (if you are really interested, and read Swedish, Fredrik Stjernberg and I wrote a book about questions).
What we will do instead is to deepen the discussion on how we know. If we know by answering questions, and all our knowledge is a set of questions and answers, then we also need to examine how we arrive at our answers.
Every person you know could be described as a set of answers to different questions. They can answer questions for you in different ways. This may seem a strange way to describe your friends, but in a very real sense the friends you have form a network of questions and answers, and you are a part of that network.
This has far-reaching consequences.
A network of questions and answers
One consequence is that your knowledge consists of the answers that you can give, as well as the answers that your friends can or have given you before. When I ask you a question, I ask it of the web of knowledge of which you are a part.
This means that knowledge is relational. You know what your friends know and have told you, what you have heard from parents and teachers. But that knowledge is not empirical - it is not knowledge you have arrived at by careful examination of the world. It is networked knowledge.
Of what we know, probably more than 90% is networked knowledge. We know so much - about capitals in foreign countries, what happened on the Titanic, the landing on the moon - and very little of this knowledge is based in our personal experience.
Networked knowledge is a key to human progress, but it depends on something very specific and important: it depends on a special kind of trust.
Trust is another one of those words that have a vast family of meanings, but in this case we need to explore the special kind of trust that makes you answer a question with knowledge that is not yours. We will say that you trust a node in a knowledge network if you answer with the answers held by that node as if these answers were your own.
When we speak of a knowledge society or a knowledge economy, the key innovation is not the knowledge per se - not the fact that we have more and more information. The key element of progress is that we now know collaboratively and collectively.
Collectively because we accept certain kinds of knowledge in the network even though none of us has validated it for ourselves.
Collaboratively because we work to validate, explore and examine the answers to questions together.
And it is not only trust in friends or parents - it is also trust in certain institutions. Institutions are also possible to model as sets of answers. An astronomy research institute produces answers to complex questions about the universe - but its institutional set up also produces something else that is key to its value: a trust in the answers that the institution can give.
A set of answers in the knowledge network that we do not trust does not connect to the network at all. It never becomes knowledge, because we will not take those answers as our own to give.
Knowledge - then - is a vast network of answers given by individuals and institutions we trust.
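The model described above - people and institutions as sets of answers, connected by trust - can be made concrete. Here is a minimal sketch in Python; the class, the nodes and the example question are my own illustrations, not anything from the note itself:

```python
# A minimal sketch of a web of knowledge: each node holds answers to
# questions and trusts some other nodes. Asking a node a question
# returns its own answer if it has one; otherwise it adopts an answer
# from a trusted node as if it were its own. All names are illustrative.

class Node:
    def __init__(self, name, answers=None):
        self.name = name
        self.answers = dict(answers or {})  # question -> answer
        self.trusted = []                   # nodes whose answers we adopt

    def trust(self, other):
        self.trusted.append(other)

    def ask(self, question, seen=None):
        """Answer from our own knowledge, else from the trusted network."""
        seen = set() if seen is None else seen
        if self.name in seen:
            return None                     # avoid cycles in the web
        seen.add(self.name)
        if question in self.answers:
            return self.answers[question]
        for node in self.trusted:
            answer = node.ask(question, seen)
            if answer is not None:
                self.answers[question] = answer  # taken as our own to give
                return answer
        return None                         # not knowing = no answer to give

observatory = Node("observatory",
                   {"shape of the earth?": "an oblate spheroid"})
teacher = Node("teacher")
you = Node("you")
teacher.trust(observatory)
you.trust(teacher)

# Networked, not empirical, knowledge:
print(you.ask("shape of the earth?"))  # prints "an oblate spheroid"
```

Note that untrusted answers simply never enter a node's answer set - exactly the sense in which answers we do not trust never become knowledge.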
But what about truth?
Truth and trust
Is trust more important than truth?
The short answer is that trust precedes truth, for anything that we cannot validate ourselves. The truth available to us we can explore ourselves, but so much of the truth is at least one degree removed from our ability to explore it by our own efforts that we need to trust before we take something to be true.
Something interesting happens here that we have not yet explored enough. For all knowledge one degree removed from our personal experience, trust and truth become synonymous.
At the horizon of our personal knowledge, these two concepts melt together as one - we give the answers we trust as true.
The information society can only become a knowledge society if it upholds basic institutions of trust in answers. The way to a shared truth goes through a shared trust in the answers given.
This is what is so frustrating about two main elements in the current social debate about information, misinformation and knowledge.
First, there is an emphasis on information and facts that completely misses the fact that trust and truth are indistinguishable once we move outside of the realm of personal experience.
And this is not a weakness - this is the very strength that has allowed us to produce more information, and more knowledge, than any other civilization.
The division of labor is a small thing compared to the division of knowing.
But when we focus on persuading someone that p(x) is true we waste time if they do not trust the individual or institution that gives this answer to their questions. And we make it worse by condemning the people who refuse to believe something that we think is true as irrational or stupid - since it is nothing but deeply rational to base your knowledge on trust.
In fact, it is the opposite that is irrational. This is the second problematic element in our debate: the idea that rationality means validating for ourselves everything we claim to know.
As Elijah Millgram has argued in The Great Endarkenment, we are suffering the consequences of the overextended victory of the Enlightenment here. You will remember that we often cite Edward Luttwak in these notes, and his theory that defeat is born out of victory - and that is exactly what is going on in the debate about misinformation and alternative facts.
The Enlightenment demanded that we dare to know (”sapere aude!”) and that brought us out from the dark ages in which there was no truth but that which we were forced to trust by an oppressive society, a dominant church and the few educated classes. This was a great step in human progress - but as all great victories, it contained the seeds of its own destruction.
As knowledge grew, and information exploded, daring to know was conflated with daring to know everything. We should see for ourselves, we were told, and make up our own minds. At the same time, hyper-specialists emerged who knew things that no-one else could know in any reasonable amount of time - and so we end up caught between the individual’s claim to know and the sheer vastness of the knowledge available, and thus the impossibility of the task.
It is essentially impossible to know everything for yourself, or to refuse to trust any knowledge that is not your own - it is an inhuman imperative that would force us to give only those answers drawn from our own experience, and that would fracture the knowledge society completely.
The great democratization of the Enlightenment’s demand that we dare to know slowly evolved into the dissolution of that same democracy, as we all decided that we can know for ourselves.
The disdain for experts that has been so lamented is an echo of that same Enlightenment creed - a wish to become autonomously wise, an individual knower in a sea of knowledge.
It forces us to ask the question of what subjects we can realistically digest and make a part of our own personal set of answers. Can we understand the complex science of epidemiology and give personal answers about what should be done, or are we better off closely examining who we trust and follow their advice?
The debate about misinformation, then, often misses two important aspects of knowledge. The first is that the problem is not that people believe something that is not so, but rather that we have lost shared trust in the web of knowledge, and that it has fractured into many networks of knowledge - based on tribal trust patterns. The second is that we need to reintroduce trust in institutions - especially the institution of science - as an acceptable way to know, and limit the belief in the autonomously knowing individual.
We do not need a shared baseline of facts. We need a shared trust in the ways we answer our questions, in the individuals and institutions that answer them.
This is a much harder problem to solve, and trying to force a set of answers down our opponents’ throats will not succeed in anything other than reducing trust further in our ways of answering.
Competing webs of knowledge
Let’s say we do not succeed. That we end up in a scenario where we will have competing webs of knowledge in society. What, then, are the likely outcomes? We should expect - as we can witness - increased polarization and heated debate. There is a risk of contagion to even the most basic societal institutions - courts, police, doctors - and the dissolution of shared networks of answers even there. A society that has no shared answers at all is hardly even a society.
Is there an upside? Could we argue that competing networks of knowledge and trust will speed the production of new knowledge, and that an active choice to trust one network over another actually re-engages us in democracy? Could we argue that the polarization we are witnessing ultimately means we care more about our society?
No, not now. Perhaps never. But there is a caveat here - and that is that the stability and reach of a network of knowledge is a bad measure of its social value. That we all know the same, and have the same answers, is not a recipe for progress but stagnation.
This presents another problem: we want competition between questions and answers in a thriving society, and we want to ensure that this competition drives progress. But how can we find the balance between the kind of competition that fosters new ideas and innovation, and the kind that merely entrenches us in old, rehearsed positions?
And how do we balance the need to recognize that we are not autonomous knowers with the curiosity that inspires some people to find better answers to old questions? Is this not the innovator’s impulse? To see for themselves if something could really work, and not trust those who say that it cannot be done? We need to trust and to seek the truth for ourselves at the same time.
So what?
The take-aways here are likely not surprising.
The first is that facts need trust more than truth, and figuring out how to build trusted webs of knowledge is the best way to counter misinformation.
The second is that we need to consciously think hard about what we choose to know ourselves and the answers we trust others to give. Together, these two conclusions suggest that in addition to a knowledge graph, we need a trust graph that can allow us to think hard about how we help people build healthy networks of trusted knowledge with others - a known problem many are exploring now.
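What such a trust graph might look like can be sketched in code. The idea below - directed edges with weights expressing how much one party trusts another's answers, and acceptance of an answer depending on the strength of a trust path to its source - is my own illustration of the note's suggestion; all names, weights and the threshold are assumptions:

```python
# A toy trust graph: trust[a][b] is how much a trusts b's answers, in [0, 1].
# An answer from a source is accepted if some chain of trust to that source
# is strong enough (product of edge weights along the path).

trust = {
    "you":       {"neighbor": 0.9, "news site": 0.6},
    "neighbor":  {"research institute": 0.8},
    "news site": {"research institute": 0.5},
}

def path_trust(graph, src, dst, seen=None):
    """Strength of the strongest trust path from src to dst."""
    if src == dst:
        return 1.0
    seen = (seen or set()) | {src}
    best = 0.0
    for mid, weight in graph.get(src, {}).items():
        if mid not in seen:  # never revisit a node: keep paths simple
            best = max(best, weight * path_trust(graph, mid, dst, seen))
    return best

# Would "you" accept an answer given by the research institute?
strength = path_trust(trust, "you", "research institute")
accepted = strength >= 0.5  # arbitrary illustrative threshold
```

Here the strongest path runs through the neighbor (0.9 × 0.8 = 0.72), so the institute's answer is accepted; a fractured web is one where no such path is strong enough.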
The third is that the knowledge society is a set of questions and answers distributed over many, many individuals and institutions - and finding new ways to explore both questions and answers is key to increasing the pace of science and progress. It seems likely that things like scientific journals - a great tool for a world with fewer questions and even fewer answers - no longer suffice on their own. The way we question has changed, and so we need to organize our answers in new ways.
The fourth is that all victories reverse into defeats. Even the mighty Enlightenment contained a pathology that we are now exploring and coming to terms with - this idea of knowledge as an individual duty and right. An idea that was premised on a relative scarcity of answers and now undermines trust in shared answers.
Welcome to a new year, and thanks for reading - as always, let me know if you have questions, ideas or thoughts!
Nicklas
There is a too-extreme version of this view that we can simply delegate all knowledge production to a small class of experts and then trust the outcome implicitly, simply based on their status. But I think that would be bad and unsustainable.
A good question is *what* are you trusting, the person or institution or the process they are following? One recipe for a productive and sustainable kind of trust is for everyone to have some foundational first-order beliefs about how the world works *and* how one ought to empirically investigate it. Then we can mostly leave it to specialists to do that work following the process, but generally understand what they are doing. And it means we need to invest in educating everyone to the point of having a foundation to build trust on - ie don't throw out enlightenment thinking entirely, but we certainly don't all need to be polymaths.