Seeing Like a State, Metis and Legibility
Modern statecraft is largely a project of internal colonization, often glossed, as it is in its imperial rhetoric, as a 'civilizing mission'.
James C. Scott, Seeing Like a State
Even though I’ve never read it, Seeing Like A State, as described by SlateStarCodex, Samzdat and Ribbon Farm, has influenced me more than almost any book I have actually read. This book sets up a power struggle between the practical wisdom (metis) of the common people and the state’s imposition of legibility in order to facilitate governance.
The story goes that through experience the common people develop beliefs that are practically useful, but without really understanding why they work. Since there’ll always be annoying kids asking why, reasons are generated to satisfy them. So, if it’s discovered that a particular rotation of the crops works, the local storyteller will tell a tale about the Gods and how this plant honors Aphrodite and this one honors Demeter and only when they’re both happy will the crops grow. What else can they say? After all, it’s not like they know about the importance of fixing nitrogen in the soil.
Let’s imagine that the kingdom begins to modernize. It’s time to cast out all superstition! The local officials talk to the villagers and learn that the villagers make their crop-planting decisions to earn the favour of the Gods rather than on the basis of the latest science. Those backwards simpletons! As the experts know, crops grow perfectly well in the rest of the country without clover. It’s time to do away with these outdated practices and bring in a new age of prosperity!
The villagers are initially resistant, so the officials threaten to rip out any clover they find and charge them for the expense. With no choice but to comply, the villagers do as they are told and then come harvest time they suffer a devastating famine.
In the world of our story, they are still early enough in the modernization process that the officials have no way to measure nitrogen in the soil and so discover that the soil in the village happens to be extraordinarily low in it. In fact, the officials don’t even know nitrogen exists!
Even though the villagers couldn’t provide reasons that were scientifically legible - and their putative reasons were utter nonsense - in this instance they were correct to trust in the practical wisdom that had been passed down to them and had stood the test of time. The villagers had no ability to speak in a language that the officials understood (Miranda Fricker has called this Epistemic Injustice). It’s worth noting that the officials’ attempts to explain the science might not be legible to the villagers either. To them it’s just a string of long words.
Recommended: Book Review: Seeing Like a State; Scott Alexander
Read more: Book Review: The Secrets of Our Success
The Struggle
Conflict theorists think of free speech and open debate about the same way a 1950s Bircher would treat avowed Soviet agents coming into neighborhoods and trying to convince people of the merits of Communism. Or the way the average infantryman would think of enemy planes dropping pamphlets saying “YOU CANNOT WIN, SURRENDER NOW”.
Conflict vs. Mistake, Scott Alexander
An important distinction popularised by Scott Alexander is the difference between conflict and mistake theory. In mistake theory, you default toward believing that your political opponents act the way they do because of mistaken beliefs and that engaging in dialogue is crucial in order to correct these misconceptions. In conflict theory, you default towards believing your opponents are acting in bad faith. Any reasons your opponents provide are merely fig leaves to cover up their selfish pursuit of their own interests, and engaging in dialogue is pointless because they don’t care about the truth or what you say. It is normal for conflict theorists to paint their opponents as the devil, but in this case the villagers will simply be painted as too ignorant to understand their own interests.
In our hypothetical, it is very easy for both sides to end up in a conflict theory mindset. The officials may try to explain at first, but eventually give up and conclude that the villagers are just too dull-headed to understand anything. The officials are likely to become even more forceful in implementing policies and even less responsive to complaints. The villagers will become sick and tired of not being heard and of being told what to do by unresponsive officials following policies produced in the distant capital. Instead of just opposing specific policies, they may start to oppose all official action on the assumption that if it’s in the interests of the officials, it’s probably against their interests.
Eventually, instead of just rejecting official actions, they may even reject the language of the officials themselves. This may be because they feel that the officials’ prejudices and presumptions are deeply embedded in it, or because they feel that using this language means fighting their opponents on their own terms, a fight they can never win.
This may be a rejection of their literal language, or of their standardised units of measurement or of the use of scientific/rational thought. They may even start claiming that 2+2=5. This mirrors the concept of the “Language of the oppressors” from anti-colonial thought and critical race theory. As an example, consider the infamous graphic published by the National Museum of African American History & Culture which labelled objective, rational thinking as part of white culture.
As much as we can laugh about how this supposedly-progressive graphic accidentally managed to suggest that a minority can’t think rationally, there’s a point in here as well. African Americans have lower educational outcomes on average, and as long as this disparity exists, it will be harder for them to express themselves in a way that would be accepted as rational. After all, a certain level of education is required in order to provide a coherent argument even if you know it is true from experience (see the social justice concept of positionality). And beyond this, we often judge the rationality of an argument not strictly according to logical flow, but according to the ability of people to use sophisticated-sounding words.
(In the positionality argument, there is an interesting though possibly resolvable tension between the message which undermines the importance of educational qualifications and the proponents who tend towards being highly educated. It reminds me of, “All Cretans are liars” being uttered by a Cretan).
Another reason to reject the language or systems of the officials may be to become illegible so as to limit the power of officials to intrude. After all, you can’t police speech if you don’t understand the language. And it’s harder to assess taxes if instead of everyone having their own nice, little plots you have a slice of forest and a slice of marshland and a slice of plains.
Naturally, it is in the interest of the officials to impose legibility anyway, and any unintended consequences will likely be beneath their notice. As outsiders, they can’t understand how having a local dialect binds the community together and provides people with a sense of place in the world; or how having different types of land enabled self-subsistence.
The Uncrackable Code
Three can keep a secret, if two of them are dead.
Benjamin Franklin
Your conscious mind is more plausibly a press secretary. You’re not the president or the king or the CEO. You aren’t in charge. You aren’t actually making the decision, the conscious part of your mind at least. You are there to make up a good explanation for what’s going on so that you can avoid the accusation that you’re violating norms.
Robin Hanson
One of the biggest advantages the British had over the Germans in WW2 was that they cracked the Enigma codes, so they knew what the Germans were planning. Enigma was a fiendishly difficult code, but apparently not difficult enough.
But what if the Germans hadn’t even known what they were going to do themselves? What if there wasn’t actually anything for the Allies to crack? In fact, Robin Hanson has theorised that we’ve evolved to lie to ourselves about our motives or likely actions lest we accidentally reveal compromising information through a slip of the tongue or an inadvertent facial expression.
First you say, “Someone needs to hang for this” as a turn of phrase, and of course you don’t mean that literally. That would be horrific; it’s just a turn of phrase. Indeed you are genuinely horrified. Next it becomes “I wish we could just shoot him”, but you were just blowing off steam and of course you’d never actually do it. Again, you completely believe this. But before you know it, the palace is in flames and you’re getting ready to string up the king in his pajamas, but despite the illumination you’re still blind to your tendency to deceive yourself. Social justice advocates talk about dog-whistles and assume that you know exactly what you’re doing. But I suspect that plausible deniability is as much about denying it to yourself as it is about denying it to anyone else.
Similar unconscious dynamics can occur in a group. Activists and moderates can unconsciously be playing out a good-cop/bad-cop dynamic, with the moderates believing that they would in no way condone the extreme actions taken by the activists, whilst their revealed preferences demonstrate otherwise. This is effectively a distributed motte-and-bailey.
As another example, cult members may not understand how arbitrary shibboleths maintain group cohesion and prevent imposters through costly signalling, but act it out regardless. Journalists throwing around the term “tech-bros” might not understand that they are constructing a superweapon against a competing group of elites, but it has that effect regardless.