We’ve got lots of new subscribers (thank you!) and thought it might be useful to reprint our very first item from May 2022. Called Dysinformation: How the exaflood created an information sickness, it sets the stage for one of the biggest stories in recent times – the exposure over the last few months of the Censorship Industrial Complex, a term we coined.
Earlier this week, Jacob Siegel of Tablet wrote the most comprehensive summary of the disinformation enterprise – A Guide to Understanding the Hoax of the Century. It’s a must-read.
Siegel’s article follows several months of historic investigative journalism by Matt Taibbi and other independent journalists, who plumbed the internal communications of Twitter 1.0 (pre-Elon Musk). They found astonishing collaboration and/or coercion between the social media firm and dozens of federal agencies and think tanks, designed to suppress information inconvenient to preferred establishment storylines.
At the same time, Mike Benz, a former State Department cyber official, added even more detail and historical context on how the government agencies began and went about their work – the nuts and bolts and origins of the whole operation.
We have been writing about growing digital censorship for the past half-decade. But the depth and breadth of this Censorship Industrial Complex are still mind-blowing. Here’s our item from May 2022.
Dysinformation: How the exaflood created an information sickness
Elon Musk’s prospective purchase of Twitter launches a new phase in the information wars. Last week, Musk announced that 19 equity partners are supplying $7 billion for the deal, allowing the Tesla CEO to reduce his margin loan from $12.5 billion to $6.25 billion. This morning, amidst a broader market and tech downdraft, Musk put a temporary “hold” on the deal pending an audit of real users and fake bots. He remains committed to the acquisition, he says.
If the deal is consummated, faltering free speech will receive a crucial jolt. But why do free speech and open discourse need a jolt?
Despite raucous disagreements on nearly everything else, most Americans for most of our history shared a broad belief in free expression. And, like Musk, thought it was central to a functioning democracy. But now, far from welcoming Musk’s promise of online openness and his defense of a core American value, officialdom has reacted with horror.
As Mike Solana notes, the Washington Post has in the last month published more than 75 stories and op-eds on Musk’s gambit, nearly all warning against a more open mediasphere. For most of us, free speech has always been a given, or at least an unchallenged ideal. Now, a major world newspaper, whose motto is ‘Democracy Dies in Darkness,’ is going to war for more censorship on social media. What in the world happened?
Gurri’s Exaflood
One could do worse than look to Martin Gurri for an answer. Gurri’s book The Revolt of the Public: And the Crisis of Authority in the New Millennium, originally published in 2014 and updated in 2018, argues that despite much fanfare, we actually underestimated the epochal social shifts caused by the Internet. The Internet really did change everything. Including, it now appears, our pseudo-elites’ views on free speech.
A long-time CIA media analyst, Gurri noted that in the late-2000s, new waves of digital content suddenly and totally swamped his and his colleagues’ ability to analyze it. At the time, I was calling this digital content explosion the exaflood. Gurri argued that the 2011 Arab Spring protests, enabled by smartphones, Facebook, and Twitter, signaled a fundamental transformation of social hierarchy. Brexit and the surprising U.S. election of 2016 bolstered his thesis. Which boils down to this:
Beginning in the mid- to late-2000s, the volume and velocity of online content totally upended news and media. The ability for the public to see the mistakes of authorities and criticize them in real time – and the corresponding inability of established figures to control the narrative – forever changed the relationship between the governors and the governed.
A decade later, these technologies are even more pervasive and powerful. In its 2021 year-end review, Google reported one billion hours of YouTube video is consumed every day. Billions of smartphones mean every individual enjoys – enjoys? – access to (and the ability to create) unrelenting tweets, toks, pods, and stacks every waking minute. In the past, a billion people may have watched an hour or more of TV every day. But that was spread over a comparatively small number of relatively homogeneous content options. The billion daily hours of YouTube, on the other hand, span many thousands of diverse channels, topics, and content creators. The same goes for Instagram, Facebook, TikTok, Snapchat, Twitter, WeChat, Rumble, millions of websites, and hundreds of thousands of podcasts and Substacks, summing to hundreds of billions of info-bytes1 every day.
This exaflood of content entertains, teaches, and informs. It exponentially expands the avenues for speech and creativity. But it also tends to addict, confuse, rewire, and overwhelm.2
There’s more high quality content than ever – and more garbage, too.
I’ve argued the exaflood supercharged Professor Timur Kuran’s cycle of private truths and public lies by lowering the cost and friction of public speech. It thus accelerates Kuran’s cascades, in which private truths do not remain hidden for decades, as they did under Communism, but more quickly become public truths.
This is great news for the public. But not for authorities, who feel besieged by inexpert digital rabble. Radical transparency and unrelenting criticism threaten authorities in government, media, education, science, and medicine.3
They reacted ferociously. Often unable to respond substantively and specifically, they first unleashed a flurry of epithets. Over time, they refined their approach, landing on the all-purpose tactical accusation of ‘misinformation.’
This charge not only disparages new and inconvenient voices but has, increasingly over the last six years, served as a rationale for established outlets – in Big Media, Big Tech, Big Science, and Big Government – to throttle, intimidate, or censor disfavored views.4
In his new Atlantic article – ‘After Babel’ in print, or ‘Why the Past 10 Years of American Life Have Been Uniquely Stupid’ online – NYU professor Jonathan Haidt argues that intimidation by social media mobs causes people to self-censor. And that this leads to ‘structural stupidity.’ When we lack vigorous debate and viewpoint diversity, we don’t learn. We don’t correct errors. Confirmation bias reigns. Hubris sets in. We get dumb.
This can lead to what Kuran calls ‘knowledge falsification.’ When we hide or misrepresent a preference, we also hide the knowledge upon which that preference rests. The rest of the world is thus deprived of potentially important insight.
Haidt emphasizes self-censorship. But what about the silencing of those courageous enough to speak their minds? In the last few years, we’ve moved beyond self-censorship by the timid to outright censorship of the brave.
In a vicious circle, censorship then leads to broader waves of self-censorship and preference falsification. If, for example, a heroic physician gets thrown off Twitter and questioned by a state medical board, other physicians will surely self-censor or fib in public. This is not merely a result of organic social media dynamics. It is also a result of top-down erasure.
Descent into Dysinformation
In several recent essays, Matt Yglesias argues the refrains of misinformation! are merely a ‘cope’ – a short-term tactic of the establishment to paper over intramural disputes and policy failures. In other words, a way to avoid facing sinking political fortunes. I think there’s something far more systemic and strategic going on. A phenomenon that reveals both the unconscious discombobulation of the exaflood and a conscious plan to exploit the confusion.
Consider former President Barack Obama’s recent speeches at Stanford and the University of Chicago, where he thoughtfully described the info-confusion we all sense…but then only lamented the capacity of average Joes and people of one political party to get things wrong on the Internet. The president said regulation of tech platforms will be needed to enforce more aggressive content moderation (read: speech suppression). Nary a word about official Covid propaganda or censorship or the growing list of recent disinformation scandals where lie and truth were spectacularly reversed by official sources.
Introducing Obama at Stanford was Michael McFaul. A major speaker at UChicago’s big three-day ‘Disinformation and the Erosion of Democracy’ conference was Anne Applebaum. McFaul and Applebaum surely believe they are doing the Lord’s work. Yet both were chief spreaders of one of the boldest, most highly coordinated disinformation campaigns in memory – the Russia-collusion hoax.
Applebaum even waved away a student’s smart question about another massive and even more recent disinformation hoax by saying she didn’t find it “interesting.” The student was of course referring to 50 former intelligence officials, on the eve of the 2020 election, calling a true New York Post story “disinformation.” I imagine if those 50 intelligence officials had released a statement saying the New York Post story had “all the hallmarks of being uninteresting,” it wouldn’t have packed the same punch. Nor would it have given Twitter and Facebook the necessary excuse to censor the bombshell story.
The selective outrage – calling inconsequential anonymous tweeters a threat to democracy while brushing aside, and even participating in, coordinated mendacity by Big Media, Big Tech, and Big Government – is the big give-away.
Which brings us to ‘dysinformation’ as a disorder. At some point, the tactic becomes a strategy and then turns into addiction. The power of propaganda and censorship is seductive. Along the way, you mislead your followers over an epistemic cliff, and you lose touch with reality yourself.
I’m still chewing on a more concise description, but here’s what I’m seeing.
mis-in-for-ma-tion: false information that is spread, regardless of its intent to mislead.
dis-in-for-ma-tion: false information deliberately spread; manipulated narrative or facts; propaganda.
dys-in-for-ma-tion: 1. a disorder resulting from an inability to process information, grasp reality, or grapple with alternate views or inconvenient facts; 2. the elevation of grand narratives, often elaborately constructed; 3. resulting in the need to modify, falsify, or deny reality, to tell noble lies, and to suppress disfavored information at nearly any cost, in order to create or sustain a too-big-to-fail narrative; 4. vulnerability to social contagion based on related information disorders; 5. which may present as a phenomenon of self-delusion, also known as ‘high on their own supply’; 6. culminating in massive positive feedback loops of groupthink, confirmation bias, knowledge falsification, error amplification, projection, and ‘structural stupidity.’
Many will see a similarity to ‘mass formation,’ the far more comprehensive framework offered by Mattias Desmet to describe the trance that captured much of the world over the last two years during Covid. His new book The Psychology of Totalitarianism will be published in English in June. I think the two ideas are intertwined, but the exaflood adds an amplifying twist absent from previous totalitarian mass formations.
Neither concept assumes that all propaganda and censorship are necessarily malicious. Many of the censors and storytellers themselves get caught up in social waves, now intensified by rapid-fire digital media. They can become just as deluded as the masses they misinform.
The instincts of well-meaning but confused censors are understandable. More than ever in our complex world, we need truth tellers with judgment, perspective, and wisdom. The question is: what’s the best architecture of error correction? I’d argue the best ‘fact checking’ is performed by decentralized individuals and institutions in an open, adversarial, check-and-balance marketplace of ideas. Too many, however, want to move in the opposite direction.
Acappella Expurgator
When the exaflood began washing over us 15 years ago, up popped a set of new publications and mechanisms aimed at sorting out an increasingly fast-paced and complicated world. I’m thinking of the ‘explanatory journalism’ of Vox and ‘fact checking’ by a host of legacy and new media orgs. More recently, think tanks, news outlets, and even for-profit startups have launched a dozen or more ‘disinformation labs’ with the stated aim of combatting foreign info-threats. And of course all the social media platforms turbocharged their ‘content moderation’ programs.
In concept, each of these could have been useful tools to navigate the murky and psychedelic exaflood. With unceasing waves of info, the functions of editing, searching, curating, and criticizing become more important, not less. (Musk admits we need more and better content moderation of some things – meaning basic social media hygiene to clear out bots and spam, not core speech.) In a knowledge economy, moreover, it makes perfect sense that most of our arguments will be over data, information, and narratives, less so about ideology.5
In practice, however, fact checking, explanatory journalism, and content moderation mostly turned out to be a giant conceit, mere partisanship and ideology posing as high-minded truth. Many of the anti-disinformation labs, meanwhile, have in fact turned out to be disinformation factories linked to industry interests or government intelligence.
The concept of an anti-disinformation watchdog, originally aimed at foreign bad guys, has now been turned inward onto American citizens. In February, a Department of Homeland Security bulletin warned of domestic terrorism based on Covid-19 misinformation. Now, DHS has launched an entirely new bureau, the Disinformation Governance Board (DGB). The State Department and Pentagon have long spent resources and energy countering foreign disinformation and promoting our own view of the world. DHS, however, refers to the homeland. And DGB, as Gerard Baker of the Wall Street Journal notes, is ominously if coincidentally close to the acronym of the infamous Soviet security service.
The choice of Nina Jankowicz to head the DGB shows either how diabolical or fantastically clueless Washington has become. The Singing Censor bought into and promoted many of the worst info-scams of the last several years, often set to music. She also wants “verified” and “trustworthy” people to “edit Twitter in the same sort of way that Wikipedia is.” Social media of course already allows people to debate, correct, and debunk. It’s called a conversation. Wikipedia, meanwhile, has been overtaken by partisan editors, who’ve rendered useless any topic or entry with any political implication. What Jankowicz endorses is thus closer to censorship by ‘officials’ and ‘experts.’ Which is what Elon Musk seeks to avoid with a restoration of open, inclusive discourse.
World capitals have gone so batty, it’s hard to know if this retro Soviet mind-control is wicked cynicism or childlike naiveté. In the case of our new Disinformation Czar, the latter may apply and even prompts another silly coinage:
mys-in-for-ma-tion: the cozy feeling derived from spreading or indulging in mis/disinformation, perhaps unknowingly, allowing one to avoid unpleasant or inconvenient reality and luxuriate in fantasy. (In Swedish, the term ‘mys’ refers to a feeling of coziness and is often combined with other words as a prefix or suffix.)
Mysinformation might soothe journalists and political junkies. But these social media games and broader info-wars have turned deadly serious.
The Covid Calamity
Which brings us to Covid-19. No event better demonstrates just how devastating the new dysinformation disorder can be. Real world suffering and death, caused in large part by affirmative propaganda and suppression of life-saving truth.
We’ve detailed our views on the science and policy of Covid in many recent reports and articles. See, for example:
How the war on ‘misinformation’ propelled the Covid cataclysm – RealClearMarkets – February 4, 2022
A Pandemic Pivot Point: The Counterintuitive Dynamics of Covid-19 – Entropy Economics – November 2021
As rationale for total vaccination sputters, censorship soars – RealClearMarkets – September 20, 2021
In short, our analysis said: A central fact of Covid is its highly differential effect according to age and comorbidities, and our policy response should have focused on protecting the vulnerable. Lockdowns were a catastrophe with approximately zero benefits but massive costs for health, education, and the economy. Early treatment with cheap, widely available, generic anti-virals and anti-inflammatories could have prevented a large portion of morbidity and mortality and should have been aggressively promoted. Novel, untested vaccine platforms may have been appropriate for some high-risk groups but, if deployed en masse during a pandemic, could yield dangerous evolutionary dynamics and do more harm than good for low-risk people.
With constructive scientific debate, where the best ideas rise to the top, Covid would have been manageable. Yet the scientists and analysts who warned against lockdowns and comprehensive reliance on the vaccines, and who promoted early treatment with safe, cheap drugs, were erased and destroyed. I’m thinking about Drs. Scott Atlas, Jay Bhattacharya, Sunetra Gupta, Martin Kulldorff, Robert Malone, Peter McCullough, Pierre Kory, Aaron Kheriaty, and so many more.
Now, the kings and queens of Covid, who demonized these good people and pushed exactly the opposite strategies, are backtracking and coming close to admitting their disastrous errors.
In a May 5 interview at the 92nd Street Y, Bill Gates said that early on we “didn’t understand that it’s a fairly low fatality rate and that it’s a disease mainly of the elderly.”
We absolutely did know. In the early days, I was poring over the daily tables issued by the New York City Health Department. By April 22, 2020, I could see that just 0.54% of Covid deaths occurred in people under 65 years old with no obvious comorbidities. Within a month or two of the outbreak, we knew who was at risk and who was not. Kulldorff, from his epidemiology perch at Harvard, said so at the time and now says Gates “should stay away from public health.”
Even if we give Gates the benefit of the doubt – that he meant at the very very beginning we didn’t know these important facts but then quickly learned them – why did he spend the next two years stoking panic and promoting overly broad shutdowns, vaccine mandates, and early treatment bans?
Then there’s CDC director Rochelle Walensky. Speaking on March 4 at her alma mater Washington University in St. Louis, Walensky gently retreated from the 100% vaccine certitude the public health community had blared for the previous 15 months.
“Nobody said waning,” Walensky insisted. Nobody said it might “wear off,” she continued, referring to the extremely short durability of the vaccines and the new strategy of repeated boosters. She was just as surprised by Delta and Omicron and their ability to evade vaccine protection. “Nobody said the next variant…what if it’s not as potent against the next variant.”
Nobody? Nobody except the brave scientists, physicians, and analysts who predicted the vaccines’ short durability and their inability to effectively combat variants – indeed the likelihood that mass vaccination would select for and promote vaccine-evading variants. In two early March 2021 interviews [one] [two], for example, veteran vaccinologist Geert Vanden Bossche warned of vaccine-evading variants. On March 8, 2021, he correctly predicted that we’d first see vaccine waning/evasion in Israel, then the UK and U.S.:
For those who may have some difficulty in understanding how mass vaccination drives viral immune escape, it will suffice to watch infectivity and morbidity rates in those countries who have now succeeded in vaccinating millions of people in just a few weeks (e.g., UK, Israel, USA). Whereas these countries are now enjoying declining infectivity rates, they will undoubtedly start to suffer from a steep incline in Covid-19 cases....The steep decline we're seeing right now may be followed by a short-lived plateau but a subsequent steep incline of (severe) disease cases is inevitable.
When Walensky said “nobody,” she was perhaps being truthful. In the sense that nobody she knew or heard said these things. Virology, immunology, and vaccinology are wildly complex fields. It would be unsurprising if she herself were not expert in them, or if she didn’t understand the potential shortcomings and risks of the novel gene therapies. That’s what scientific interchange is all about. Yet the groupthink inside public health combined with the media blackout of any discussion or debate left her and thousands of her medical colleagues in the dark.
Gates, meanwhile, has been on the pandemic warpath for the last decade. Perhaps when Covid hit, all his presuppositions and plans about ‘the big one’ kicked in. Dreams of saving the world with lockdowns and new mRNA technology overwhelmed any analysis of real-world data and emerging medical and sociological facts. He was on a messianic mission and couldn’t or didn’t want to see alternate views. His presumably expert staff and scientist associates couldn’t tell him the truth and thus dash his dreams.
In these charitable interpretations of Gates and Walensky, we see the results of knowledge falsification and structural stupidity. In a less charitable interpretation, Gates and Walensky’s aggressive assertions and policies were guided more by avarice, ego, and politics. The ends justified their authoritarian means. They are now modestly backtracking to preserve some tiny measure of credibility for next time. Gates seems almost giddy about ‘preventing’ the next pandemic.
Either way, this looks like dysinformation disorder. Whether knowledge falsification makes supposed experts not so expert, or noble lies are needed to sustain a too-big-to-fail narrative, the obstacles to free flowing information prevented (1) the governors from making good decisions and (2) the governed from understanding better options were available and thus applying corrective pressure.
The elevation of (mis/dis/dys)information over ideology requires new thinking and new strategies. The dismal state of shared civic sense-making is dangerous. But it also opens doors to massive opportunity – for investors and builders of new institutions.
We’re still only in the middle of the beginning of the information wars.
Martin Gurri has weighed in – "Disinformation Is the Word I Use When I Want You to Shut Up." https://www.discoursemagazine.com/culture-and-society/2023/03/30/disinformation-is-the-word-i-use-when-i-want-you-to-shut-up/