The Censorship-Industrial Complex is new in its scale and expense, but in both language and ambition, a perfect repeat of our history’s darkest chapters
Matt Taibbi writes that history is repeating itself, down to the old fixation on material "taken out of context":
The two pieces just released by Racket place all of this in historical context. “A Century of Censorship,” by Matt Farwell (affectionately known here on Substack as The Hunt for Tom Clancy), is a great reference and a shocking read. Matt zeroes in on our incredible consistency across ten decades in reaching for the same delusional non-solutions to “hearts and minds” problems, both here and abroad.
As far back as 1965, CIA chief W.F. Raborn penned a long letter to Lyndon Johnson aide Clark Clifford, warning that the Soviets had an elaborate desinformatsia operation designed to “sow distrust” and “denigrate American leadership.” Raborn’s letter anticipated with miserable accuracy the current dubious definition of disinformation, worrying not so much about actual fake news — “the KGB is able to fabricate whatever material is needed,” he wrote — but about material “documented from Western books,” material “often taken out of context.” Then as now, our secret services were searching for ways to stifle inconvenient, domestically produced, true information that foreign villains could use to undermine faith in “democratic institutions” and “NATO governments.”
This effort has attracted “hucksters”:
With his background, Wyatt was able to talk to people from the ex-military and intelligence contracting scene, who told him that “anti-disinformation,” like any business, attracts its hucksters. They let him in on industry tricks for selling the new digital-censorship snake oil. “You can charge more if you call it information operations,” says one veteran, while another casually admits the tools being sold to taxpayers and tech platforms alike may not work terribly well. “I don’t even use my company’s products,” one IO expert tells Wyatt. “I don’t find any of it useful.”
The problem with the modern anti-disloyalty reboot is that today’s technology makes bad ideas far more powerful than before. As Wyatt explains, at least some experts making bank off anti-disinformation technology know it’s often not very good at achieving its stated ends, while more proactive strategies, like investing in media literacy, work better. Content-zapping tools sell, however, and that’s a problem, because, as he writes: “Healthy suspicion, the kind that keeps the evildoers at bay, sometimes turns toxic.” This can result in “operational paralysis and rampant mistrust, setting off a chain of reactionary measures run amok.”