State Run Media versus Platform Media

This is an excellent and highly relevant set of topics that sits at the intersection of international relations, technology, and information security. My “take” is that this represents one of the most significant and challenging fronts of geopolitical competition in the 21st century.

Here’s a breakdown of my analysis on each component and how they interconnect.

The Core Shift: State-Run Media vs. Platform-Mediated Discourse

This is the fundamental battlefield. The competition has evolved from a relatively clear-cut model to a complex, hybrid ecosystem.

· The “Old” Model (State-Run Media): Outlets like Russia’s RT/Sputnik or China’s CGTN/Global Times represent a centralized, top-down approach.
  · Strengths: Message discipline, clear attribution (even if denied), and the ability to execute long-term, coherent narratives aligned with state objectives.
  · Weaknesses: Often lacks credibility with skeptical audiences in democratic nations. Their reach can be limited to those already sympathetic to the message or within their own media ecosystem.
· The “New” Model (Platform-Mediated Discourse): This is where the real battle is won and lost today. It involves using social media platforms (X/Twitter, Facebook, Telegram), video sites (YouTube), and encrypted apps (WhatsApp) as the primary vector for information.
  · Mechanics: It uses a mix of official state accounts, covert bots and troll farms (e.g., Russia’s Internet Research Agency, the IRA), co-opted influencers, and organic amplification by real users who engage with and share the content, often unaware of its origin.
  · Strengths:
    · Plausible Deniability: Attribution is deliberately obscured.
    · Amplification & Virality: Algorithms designed for engagement can inadvertently favor inflammatory or emotionally charged disinformation.
    · Tailored Messaging: Different micro-targeted audiences can receive different, even contradictory, narratives designed to exploit their specific biases.
    · Eroding Trust: The goal is often not to make everyone believe one lie, but to create a “firehose of falsehood” that leads to general cynicism: “Nothing is true, and everything is possible.”

My take: The most effective disinformation campaigns seamlessly blend the two. A narrative might be seeded by a state-backed outlet, picked up and fragmented into thousands of social media posts by bots and trolls, then go viral and be reported on by genuine users and even fringe domestic media, making the original source irrelevant. This creates a “swarm” of information that is incredibly difficult to counter.

Disinformation Campaigns, Attribution, and the Attribution Problem

This is the critical challenge. You can’t respond effectively to an attack if you can’t confidently say who launched it.

· The Goal of Campaigns: Modern disinformation is rarely about convincing you of a specific “truth.” Its goals are more insidious:

  1. Sow Discord: Exacerbate societal divisions (race, religion, politics, vaccines, gender).
  2. Undermine Democratic Institutions: Erode trust in elections, the media, science, and government.
  3. Paralyze Response: Create so much noise and confusion that forming a coherent public consensus or policy response becomes impossible.
  4. Project Strength / Mask Weakness: Create a false image of domestic unity or external invincibility.
· The Attribution Problem: Attributing a cyber or information attack is difficult. Technical evidence (IP addresses, malware signatures) can be faked or routed through third countries. Tactics are copied by non-state actors. States hide behind cut-outs and proxies.
· My take: Attribution now relies on a “whole-of-sector” approach. Tech companies provide data on coordinated inauthentic behavior. Intelligence agencies provide classified signals intelligence (SIGINT) and human intelligence (HUMINT). Academic researchers analyze patterns and narratives. Only by combining these pieces can a credible public case be made.
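The “coordinated inauthentic behavior” signal mentioned above can be made concrete with a toy example. The sketch below (Python, with invented account names, posts, and a made-up time threshold) flags accounts that publish identical text within a short window of one another — one crude heuristic among the many that platform investigators and researchers combine (account creation dates, shared infrastructure, near-duplicate rather than identical text, and so on). It is an illustration of the idea, not a real detection pipeline.

```python
from collections import defaultdict

# Hypothetical toy feed: (account, timestamp in minutes, text).
# Real platform data, features, and thresholds would be far richer.
posts = [
    ("acct_a", 0,  "Election systems cannot be trusted!"),
    ("acct_b", 1,  "Election systems cannot be trusted!"),
    ("acct_c", 2,  "Election systems cannot be trusted!"),
    ("acct_d", 90, "Local team wins championship"),
    ("acct_a", 95, "Great weather today"),
]

def coordinated_accounts(posts, window_minutes=5):
    """Flag accounts posting identical text within a short window of
    each other -- one crude signal of coordinated behavior."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[text].append((ts, account))
    flagged = set()
    for entries in by_text.values():
        entries.sort()
        for i in range(len(entries)):
            for j in range(i + 1, len(entries)):
                if entries[j][0] - entries[i][0] <= window_minutes:
                    flagged.add(entries[i][1])
                    flagged.add(entries[j][1])
    return flagged

print(sorted(coordinated_accounts(posts)))  # ['acct_a', 'acct_b', 'acct_c']
```

Note that `acct_a` is flagged for the copy-pasted post, not for its genuine one — in practice this is exactly why attribution needs corroborating evidence beyond any single behavioral signal.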

Diplomatic Response Mechanisms: A Good Start, But Not Enough

The EU’s East StratCom Task Force and the proposed US-UK Joint Counter-Disinformation Taskforce represent a crucial recognition that this is a national security threat requiring a coordinated government response. However, they face immense challenges.

· EU’s East StratCom (EEAS):
  · Strength: It’s a well-established clearinghouse. Its weekly Disinformation Review and public database of cases (EUvsDisinfo) are invaluable for researchers, journalists, and policymakers. It focuses on prebunking and debunking.
  · Weakness: It is chronically under-resourced, has a limited mandate (primarily focused on Russian disinformation aimed at Eastern Europe), and its impact on the broader information ecosystem is arguably limited. It’s often “preaching to the choir.”
· US-UK “Joint Counter-Disinformation Taskforce”:
  · Potential Strength: Combining the intelligence and diplomatic power of two key allies could lead to faster, more impactful attribution and more coordinated responses (e.g., joint sanctions on disinformation actors).
  · Potential Weakness: It risks being politicized domestically. In the US, any federal effort to counter disinformation is immediately attacked from some quarters as a “Ministry of Truth” seeking to censor political speech. This creates a major vulnerability that adversaries eagerly exploit.

Overall Synthesis and Take

  1. Asymmetric Warfare: This is a highly asymmetric conflict. Autocratic states, with their centralized control and no need to respect free speech, are naturally adept at weaponizing information. Democratic societies, built on open discourse, are inherently vulnerable. We are playing defense on a field the adversary designed.
  2. The “Marketplace of Ideas” is Broken: The classic liberal idea that the truth will eventually win out in a free and open debate assumes good faith actors and a level playing field. It doesn’t account for AI-generated content, hyper-sophisticated micro-targeting, and botnets that can artificially manufacture the appearance of consensus or outrage.
  3. Responses Are Still Too Siloed: Effective counter-measures require unprecedented collaboration between:
    · Government: For diplomacy, attribution, and sanctions.
    · Tech Platforms: For enforcing Terms of Service, de-platforming bad actors, and altering algorithms to downrank—not amplify—disinformation.
    · Civil Society & Media: For digital literacy, prebunking, and investigative journalism.
  Right now, these three groups often work at cross-purposes, hampered by mistrust and different incentives.
  4. The Future is AI: The next frontier is the use of Generative AI to create convincing deepfakes (video/audio) and tailor persuasive text narratives at an unimaginable scale and speed. This will break our already strained attribution and response models.
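The “manufactured consensus” problem from point 2 is ultimately arithmetic: a small, high-volume automated contingent can dominate what a casual observer perceives as majority opinion. A minimal sketch, using entirely invented numbers, of how visible sentiment diverges from genuine sentiment:

```python
def visible_support(genuine_users, bots, posts_per_bot, posts_per_user=1):
    """Share of visible posts expressing support when every bot posts in
    favor and genuine users are split exactly 50/50. Toy model only."""
    genuine_for = genuine_users // 2           # half the real users agree
    supporting = genuine_for * posts_per_user + bots * posts_per_bot
    total = genuine_users * posts_per_user + bots * posts_per_bot
    return supporting / total

# 1,000 real users split evenly, plus 50 bots posting 40 times each:
# genuine opinion is 50/50, but the visible feed reads as ~83% support.
print(round(visible_support(1000, 50, 40), 2))  # 0.83
```

The point of the sketch is the lever: volume, not headcount. Fifty accounts out of 1,050 shift the apparent consensus by over 30 points, which is why engagement-weighted ranking is such an attractive target for influence operations.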

Conclusion:

We are in a sustained, ongoing information conflict. The state-vs-platform dynamic is the battlefield, disinformation campaigns are the weapon, and the goal is to shape the strategic narrative to weaken adversaries without firing a shot. While initiatives like East StratCom and the US-UK taskforce are necessary and positive steps, they are ultimately tactical responses to a vast, strategic challenge. Winning, or even competing effectively, will require a societal shift that prioritizes resilience (through critical thinking and digital literacy) as much as it does deterrence (through attribution and cost-imposition). The core tension—protecting open societies from those who would exploit their openness to destroy them—remains the central dilemma of our information age.