The Resistance Endgame, Part II: Desired end-state

Defining what a victory over disinformation and factionalism might look like

Whether its origin is domestic and innocuous or foreign and malicious, disinformation’s fracturing effect on society is the critical harm we’ve identified. When it is boosted by nation-states and fed by a positive feedback loop of entrepreneurship and innovation from a global ecosystem of disinformation providers, disinformation becomes a genuine national-security problem.

However, a state of 100% freedom from disinformation and fake news is both unfeasible and undesirable, for many reasons: the maintenance of diverse viewpoints, cultural robustness, and the moral imperative to protect speech rights, among others. A country that is 100% disinformation-free is not a good country to live in for most people, and it’s definitely not America.

We are thus compelled to find, as a society, some way of dealing with disinformation that at least diminishes its effect to a tolerable level while still accepting its presence.

This problem structure gives us two questions to answer in defining what exactly we are working towards as a society. In short:

  1. How much of a problem do we accept having as a society that values and protects the freedom of speech?
  2. How effective a countermeasure do we want in a context where we are unwilling to regulate forcibly or coercively?

In detail, these questions could be phrased as:

  1. What amount of disinformation is tolerable? Suppose we go by percentages — is it acceptable if disinformation narratives and fake news make up 1% of all available information in a society? 49%? If we judge the necessity of acting on disinformation primarily by the prevalence of its effects, does reducing the 42% of Americans who still approve of Trump and are fully bought in down to 37%, Clinton’s lowest approval rating, constitute a meaningful victory? What about 17%, Nixon’s approval in his final days?
  2. What effect should disinformation have for most people? Should it be treated as a serious threat on the scale of terrorism or nuclear weapons proliferation? Should it be off-beat and ludicrous, like a Colbert Show that doesn’t get its own joke? Should it be treated as an appendage-like vestige of some atavistic, primitive fascist past? Should we apply a normative framework so that people feel bad about sharing and believing in fake news? What should fake news mean, going forward?

Let’s answer question 1 first. To define an end state for disinformation levels in objective, quantitative terms, we need to understand, at least in basic form, how disinformation operates, and to identify some easy-to-understand metric for gauging its effect.

Common misconceptions about disinformation

Disinformation, or “false information intended to mislead”, is most commonly assumed to be limited to factually false statements made with a specific intent to mislead. In practice the category is broader: manipulative statements built on factually true bases, “twistings” of facts to serve one narrative or another, and trivially factual narratives are a few examples of disinformation that is not, strictly speaking, false.

Because disinformation obeys different economics than “regular” news, entertainment, or lifestyle outlets, the audiences it serves and the ways it serves them differ as well. It follows different rules, aims at different objectives, and operates in ways that can not only be generally defined but also distinguished as distinct types.

Disinformation is not always based on the false. The intention to mislead is not always explicit, directly expressed or entirely up to the disinformer. Disinformation often uses a “kernel of truth” to generate and support a separate, larger lie being propagated.

An example of factually correct disinformation. The events being reported here are trivial; the terms in which this headline and news story are composed support an essentially disinformational victimhood narrative that Tucker Carlson took on after a protest at his home

Disinformation is not always factual. To the extent that every statement advances some truth claim, much disinformation advances only the trivial claim that “someone holds this view and it’s OK to do that”, but with such frequency and volume as to render that statement a conventionalizing force.

To communicate a claim like this, trash news and propaganda frequently provide non-factual opinions intended primarily to convey that other people hold the claim.

An example of non-factual disinformation, meant to reinforce negative stereotypes of LGBTQ individuals and use them to further “tu quoque” arguments of liberalism being anti-scientific and anti-traditional. The overall effect is to build a fallacious but compelling narrative — “liberals are the effeminate/threatening/anti-traditional other” — while citing pseudo-science and folk-psychology that reinforces that narrative

Disinformation is not consumed in the same manner as information. Rather than providing utility value by apprising consumers of novel or useful information they did not already know, disinformation focuses on presenting its consumers with things they already know: most likely facts they find completely useless in their daily lives.

Instead of novelty and value, disinformation focuses on spreading, and in doing so, tends to converge on simplicity and the presentation of a pose. Disinformation’s tendency to do so could be termed its postural nature.

An example of purely postural disinformation. This presents an implied, righteous authoritarian enactor of anti-protester violence; you get to feel like the cop, not the protester. By sharing this, you position yourself to get positive feedback from other people who also want to identify with that, creating a minimal group affiliation.

What adaptive value is sharing the above meme intended to provide — what does it say about the people sharing it? Going purely by the information presented, which is that the person identified with it is stating “I like joking about violence to protesters”, there is extraordinarily little. The person presented as making the statement isn’t even important; it’s someone using a pseudonym on a pseudonymous trash news page.

When you throw the ideas of “usefulness” and “meaning” out the window, what remains is what sharing it makes of the sharer: the kind of pose that it enacts upon them. This is the only paradigm in which this type of statement, by this kind of identity, in this context, makes sense; the point is to spread a divisive narrative by converging on simplicity and the presentation of a pose, the postural nature described above.

Disinformation does not always mislead. Disinformation operators frequently intermingle “good content” with their disinformation, or simply make anodyne, easy-to-agree-with statements that are not truth claims at all. This may be done to build confidence and trust among their followings, or to elicit statements from their audiences that deepen and rehearse existing disinformation narratives. This operational trust-building is the primary goal of what we might term contextual disinformation: non-fact-based or trivially true information mobilized to support a disinformational narrative.

This is an important divergence from traditional paradigms of propaganda and disinformation. From the traditional perspective, it makes no sense to simply pass on information from an opposing viewpoint unmodified and unfiltered. In a disinformation pseudo-community, however, contextual cues and symbology often override basic readings; this encourages participation and internalization of disinformation narratives, sometimes more powerfully than stating them directly ever could.

An example of purely contextual disinformation; a fact is presented by a prolific disinformation vendor in support of a dominant, community-driven misunderstanding: that Trump’s wall is actually getting built

Poses are not only offered for people to take on in the disinformation ecosystem; they are also offered as opportunities to define oneself by accepting or rejecting something.

Here, another trivial fact — “Trump tweeted this” — is mobilized to produce two disinformational narratives: the idea that the Democratic Party is corrupt, and a more basic, otherwise farcical idea that “Cadet Bone Spurs” Trump is a credible arbiter of the patriotic

Presenting the bare fact of this statement being made offers an opportunity for audience members to “pose” against or for the statement in replies and comments.

Disinformation is driven by groups. For a victim of disinformation, the feeling of entering a disinformation community is like finding a place full of people who don’t make you feel bad about your opinions. Unlike real news, disinformation is constantly crafted to support a consistent narrative. Inconvenient facts do not occur in disinformation-world; the omnipotent leader is never fallible and the disinformation source is never wrong.

As should be immediately obvious, this is by design. From the initial encounter with radicalizing disinformation, the victim essentially travels down a predetermined path of carefully engineered feedback designed to inculcate a carefully crafted, simple, usually emotionally negative idea.

By creating or simulating positive feedback, disinformation communities create a pseudo-community effect that reinforces and entrenches authoritarian, divisive or disinformation-based beliefs

The social media “game loop” accelerates the radicalization process. First popularized by James Hopson, the concept of a game loop helps conceptualize the cycle of invested input, reward, and anticipation of future reward that drives a social network.

When you use social media, you’re participating in a cycle of variable reward, social feedback and stimulated action designed to addict users and drive frequent, intense usage — a game loop.

Variable reward timing drastically deepens conditioning; rewards also translate into social capital that builds anticipation, in turn deepening the user’s commitment to the platform. As people post more disinformation, they get rewards that make them want to post still more.
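The game-loop dynamic described above can be sketched as a toy simulation. Everything here is assumed for illustration (the reward probability, the propensity-update rule, and the constants describe no real platform); the sketch only shows the mechanism in which intermittent social rewards that raise posting propensity produce a self-reinforcing loop:

```python
import random

random.seed(42)  # illustrative; any seed works

def simulate_game_loop(rounds=200, reward_prob=0.3, boost=0.15, decay=0.02):
    """Toy model of a social-media game loop.

    Each round, the user may post; each post may draw a social reward
    (likes/shares) on a variable-ratio schedule. Rewards raise the user's
    propensity to post again, while unrewarded posts decay it slightly.
    All parameters are hypothetical.
    """
    propensity = 0.5  # initial chance the user posts in a given round
    posts = rewards = 0
    for _ in range(rounds):
        if random.random() < propensity:      # user decides to post
            posts += 1
            if random.random() < reward_prob:  # variable-ratio reward
                rewards += 1
                propensity = min(1.0, propensity + boost)  # conditioning
            else:
                propensity = max(0.05, propensity - decay)  # mild extinction
    return posts, rewards

posts, rewards = simulate_game_loop()
print(f"posts: {posts}, rewards: {rewards}")
```

Because the boost from an occasional reward outweighs the decay from many unrewarded posts, the posting propensity tends to ratchet upward over time — the same property that makes variable-ratio schedules notoriously resistant to extinction in behavioral psychology.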

This isn’t fascism, it’s worse. When an addictive, interactive structure like social media is used as an armature for disinformation and propaganda, the result is a radicalization process that proceeds much faster and goes deeper, for more people, than any previous process in history.

A comparison of the timing of authoritarian and escalating fascist actions between Nazi Germany during the rise of Hitler and America during the rise of Trump

A systems view of disinformation resilience

A system of information consumption, production and access determines how we, as Americans, access the information we need to vote and participate in society.

  • We produce information for public consumption in a societal institution we call “media”, both traditional (i.e. newspapers and magazines) as well as social (e.g. Facebook, Instagram, Twitter).
  • Americans consume information by reading and watching media. Using the information gleaned from it, we make decisions on who we vote for, what we consent to and how we participate in civic society.
  • Our institutions of governance offer media (and citizens) access that provides them with data, which journalists shape into stories — literally, fragments of narrative embodied in journalists’ writing — that are consumed by Americans, restarting the cycle.

This is an information economy — a system within which information is produced, distributed, and consumed.

A normal flow of information, informed consent and institutional access in a democratic society

In most cases, the information economy model expresses the simplest configuration of government, media and citizenry for relatively trivial decisions:

  • Access: The US Federal Reserve says it is increasing or decreasing interest rates
  • Information: Financial media report on and analyze the Fed’s interest rate moves for their audiences
  • Consent: Audiences support or oppose the Fed’s existence and heed or ignore the Fed’s actions, depending on how they feel about the information they’ve received.

In a more complex but traditional case, the Dreyfus affair can be considered a classic example of the dynamics at play in the flow of data through an information economy:

  • Access: Alfred Dreyfus, a military officer, is convicted of espionage and imprisoned; after new evidence comes to light implicating another officer, Esterhazy, a show trial acquits Esterhazy and leaves Dreyfus condemned. Émile Zola learns of the case.
  • Information: In a purposefully provocative broadside entitled J’Accuse (“I Accuse”), Zola lays out a complete evidentiary account of the case, naming and implicating specific people, and accuses the military of covering up Dreyfus’ innocence; Zola is prosecuted for libel as a result
  • Consent: The polarization produced by the Dreyfus affair produces profound change in a number of early 20th century movements and political institutions; Herzlian Zionism itself arguably springs from the Dreyfus affair. The death of the French president in the middle of the affair produces a re-alignment of the political landscape along pro- and anti-Dreyfus lines; the result is essentially a radical re-casting of the political left and right that lasts until the World Wars. These new social institutions follow different rules of information production and access, in turn shaping the kind of information their actions can bring to light.

In the modern, non-traditional case that pertains today in America, applying an information-economical view makes the fundamental issues at play in propaganda and disinformation into patterns of the whole, rather than isolated dysfunctions of specific parts.

The effect of disinformation on the system of information flows in a democratic society
  • Access: The Trump administration took early steps to shut down official government data flows, reduce scientific cooperation on climate change and reduce government transparency around White House visits and the conduct of foreign policy (e.g. Trump’s secret, unrecorded meetings and phone calls with Putin). A coordinated pattern of diminishing and diluting media access followed: first giving White House press briefing access to trash news institutions like InfoWars and the Daily Caller, then proceeding through a series of decreasingly competent press secretaries, and eventually ending White House press briefings altogether.
  • Information: Media, both social and traditional, must cope with a diminished flow of government data to draw from — not only scientific data but also the raw data of press briefings, interviews and research. In order to maintain their commercial and social viability, media produce an increasing volume of non-factual claims, sensationalist news and appeals to emotion. The portion of the media chiefly concerned with this form of information over truthful, useful information comprises a disinformation ecosystem that exists alongside “real” media.
  • Consent: Based on misleading information provided by the disinformation ecosystem, an afflicted minority of the general populace provides manufactured consent for essentially nonsensical or illogical policy proposals intended to buttress or enrich compromised institutions; for instance, an impractical $20 billion wall, or a continued Federal government shutdown. When institutions enact policies that the un-afflicted majority doesn’t consent to, they experience diluted consent, often expressed as symbolic renunciations of institutional power: not my President, not my America, not my state or tribe.

With a systemic model of the downstream consequences and potential angles of attack on the problem, we can at last begin to define what we consider “victory”.

Desired end-state and strategic priorities

In a game with real stakes, including people’s lives, to start out responsibly we need to define, at least initially, the bad outcomes we want to avoid.

  • Ideological over-correction: An America that swings hard-left, with another authoritarian cult of personality destroying our international standing and diminishing our national readiness for disaster or war, is an equally bad if not worse outcome. A leftist Trump is still a Trump.
  • Totalitarian-control state: Speech restrictions, conduct restrictions and “thought policing” are not only potential popular overreactions to disinformation spread, they are also methods by which social networks and the regulatory bodies that control them may attempt to resolve the symptoms of disinformation spread without addressing its root causes.
  • Imbalance of separation of powers: Restricting Trump’s ability to appoint judges, interfere in legislative elections, or attack his own branch’s departments may also risk setting in place permanent constrictions on executive ability or Presidential authority with grave long-term consequences.
  • War: America’s nuclear command authority rests solely in the hands of the sitting POTUS; it is called “the President’s weapon” for this reason. Any scenario for de-escalation or removal of the threat that Trump poses to America must come to grips with whether or not it makes Trump — a psychologically unstable, physically unfit, emotionally vulnerable, impulsive and demonstrably rash decision-maker — more or less likely to use a “wag the dog” military exercise or even nuclear weapons deployment to salvage his presidency.

Our overall desired end-state is relatively obvious: we want disinformation not to be a problem anymore. Disinformation ceases to be a problem when two things happen — a social-psychological change, and a political change:

Disinformation ceases to be a problem when its adaptive value for society at large outweighs the negative consequences of belief in it. Once we accept that the news cannot be un-faked and that disinformation is a permanent feature of 21st-century life, the question becomes: does it help us figure things out, or talk things out, or make jokes, or feel better? Does it make things better or worse?

It becomes incumbent upon us as citizens, and thus consumers and potential targets of disinformation, to figure out where it fits into life.

Thus, if a non-negligible minority of the Republican party believes the Democratic party is an evil Jewish-led conspiracy to destroy America, and is so firmly rooted in that outlandish belief that it is willing to go to war over it, we cannot be said to have anything like an ideal state of affairs. In fact, in such a scenario we have arguably failed at citizenship, specifically the part that involves open, rational civic discourse with other citizens.

On the other hand, if the people who hold that belief are a negligible minority whom the greater majority define themselves by abjectifying or rejecting — for instance, rejecting David Duke to symbolically distance a white candidate from white supremacy — then even David Duke could be said to have some positive benefit from his contribution to public discourse. Like Al Pacino’s Scarface or a political effigy, David Duke is the bad guy whom people can burn symbolically to define themselves positively; even a white supremacist can serve a part in a kind of positive, low-entropy, easy-to-sustain default state of affairs in this way.

Figures like David Duke in a society where disinformation is effective are dangerous; in a society where disinformation is largely ineffective, David Duke takes on his proper place as a kind of discursive public toilet, or garbage can, into which people deposit the negative things they cast away to sustain themselves.

Disinformation is no longer a problem when it is no longer a significant causative factor in our politics: Disinformation could be said to have “lost”, as opposed to winning today, when a majority of one political party in our two-party system no longer bases political decisions on disinformation narratives.

Thus, if 70% of the Republican party still believes that a $20 billion wall in the middle of the desert is an idea worth fighting for, despite all empirical evidence to the contrary, we do not have a win state, especially when the majority of the Supreme Court and the Executive branch of the Federal government are held by Republicans. Winning, here, can take several forms:

  • Destroying the Trumpist base: creating a situation where fewer Republican voters hold disinformation-based beliefs
  • Destroying the Republican party: ensuring fewer Republicans are elected, so that the Republican base’s nonsensical beliefs matter less
  • De-linking Republicans from their base: maneuvering and controlling Republicans in power so that they no longer act based on the Trumpist beliefs of their bases

There are a number of ways to bring this state of affairs about — marginalizing disinformation producers, degrading disinformation spread and absorption as a cultural practice, even outright banning or prosecuting disinformation providers — but these are questions of means and their ethics; our ideal ends are essentially adaptive in nature.

Other parts of this series

Part I: Assumptions

Setting forth basic working assumptions for the conduct of information war countermeasures and resistance

