How an Illuminati Conspiracy Theorist Became a Viral Geopolitical Authority
How a Beijing-based lecturer who believes the Illuminati runs the world became a viral authority on geopolitics—with help from pro-Kremlin ideologues and Turkish propaganda accounts
In late February 2026, as American military operations against Iran intensified, millions of social media users found themselves receiving geopolitical analysis from a man who believes the Illuminati runs the world and that Hannibal Barca—one of antiquity's most documented military commanders—never existed.
Jiang Xueqin—a Yale-educated, Beijing-based educator who runs a YouTube channel called “Predictive History”—had made three predictions in a 2024 lecture: Donald Trump would return to the presidency, he would initiate conflict with Iran, and the United States would lose. Two had come true. The internet declared him a prophet.
Within days, a Canadian teacher at a private high school in Beijing—someone whose YouTube lectures argue that Freemasons and Jesuits secretly control global events—had been transformed into one of America’s most viral foreign policy analysts. Major outlets profiled him. Influencers with millions of followers platformed him. His conspiracy-laden lectures were treated as serious strategic thinking.
The question isn’t whether his predictions were accurate. It’s how someone peddling theories about shadowy secret societies became treated as a credible authority—and what the amplification machinery that made it happen reveals about modern information warfare.
The Conspiracy Theorist as Analyst
In his “Secret History” lecture series, hosted on the same YouTube channel that produced his Iran predictions, Jiang lays out an elaborate conspiratorial worldview. Freemasons and Jesuits conspire to control global events. Secret societies pull the strings behind major geopolitical developments. Hidden hands orchestrate what appear to be organic political movements.
When Jiang looks at geopolitical events, he sees coordinated plots by shadowy cabals rather than the messy interplay of institutional interests, domestic politics, and strategic miscalculation that tends to characterize actual international relations.
His intellectual influences are revealing. Jiang publicly declared that Alexander Dugin is “the world’s greatest geopolitical strategist.” Dugin is the Russian ultranationalist philosopher whose 1997 book Foundations of Geopolitics explicitly advocates “introducing geopolitical disorder into internal American activity, encouraging all kinds of separatism and ethnic, social and racial conflicts, actively supporting all dissident movements—extremist, racist, and sectarian groups.” The book is taught at Russia’s General Staff Academy as a framework for undermining Western democracies.
The Amplification Architecture
A dedicated satellite account (@pred_history) appeared on Twitter with the sole purpose of reposting Jiang’s content. Created in Australia but claiming Beijing origins, it generated nearly 1.8 million views in the first week of the Iran conflict. The account was verified in March 2026—curious timing for an account whose entire purpose is amplifying a single conspiracy theorist’s lectures.
Social media listening tools confirm what happened next. Mentions of “Jiang Xueqin” exploded in March 2026, coinciding with the escalation of American military operations. In the first 11 days of the month alone, his name appeared in 131,400 posts generating 1.8 million engagements. Jiang’s lecture was recorded in 2024. It sat relatively dormant for over a year. Then, as American operations against Iran escalated, his content went supernova.
Data confirms the top influencers driving this surge: Clash Report, George Galloway, and Jackson Hinkle’s Legitimate Targets podcast. This isn’t a random assortment of accounts. It’s a coordinated network spanning Turkish propaganda operations, British anti-war activists with Kremlin sympathies, and American influencers whose content consistently aligns with anti-Western narratives.
Dugin himself began retweeting Jiang’s predictions. American pro-Kremlin influencer Jackson Hinkle, whose 3.7 million followers make him a mass-reach amplifier, featured Jiang on his podcast. After that, Hinkle posted sensational clips like “IRAN HITS U.S. BASES HARD.” Turkish account @clashreport published viral claims from Jiang in multiple languages, often based on single, unverifiable sources.
Analysis of engagement patterns by X commentator Green Beret Nap Time reveals that roughly a third of the accounts amplifying Jiang’s content claim to be U.S.-based, but large numbers post from Android devices in Indonesia, India, and Pakistan, locations commonly associated with bot farms.
This is the machinery that transformed a conspiracy theorist into a prophet. Not accuracy. Infrastructure.
What The Predictions Actually Were
Predicting in 2024 that Trump might return to office wasn’t exactly going out on a limb. He had led in polls for months and remained the dominant figure in Republican politics. Predicting potential conflict with Iran? Tensions had been building for decades; the possibility of military conflict had been discussed in foreign policy circles since the 1979 revolution.
The third prediction—that America would lose—remains speculative, drawing heavily on Vietnam analogies. The base rate for these kinds of predictions is quite high. If you predict major geopolitical tensions will escalate and conflicts will be difficult, you’ll be right fairly often simply because that describes most international crises.
The Five-Message Campaign
The layered amplification network pushed five core messages to different audiences: the U.S. is losing; American soldiers are dying for elite interests; civilians are being killed; the war represents religious extremism; America will be humiliated as it was in Vietnam. Together they created a narrative ecosystem designed to demoralize American audiences and fracture political coalitions: anti-war progressives, isolationist conservatives, Muslim audiences worldwide, and anti-establishment communities.
Modern influence operations don’t require every participant to be a conscious conspirator. A Beijing educator genuinely believes his conspiracy theories. A Russian philosopher sees geopolitical advantage. An American influencer gains followers. Academic leftists find ideological confirmation. Bot-like networks inflate metrics to trigger algorithmic promotion. Each has different motivations; together they create a self-reinforcing machine.
What We’re Actually Watching
The Jiang phenomenon reveals how easy it is to manufacture viral authority when you combine surface credentials, conspiratorial content that naturally aligns with state propaganda narratives, and sophisticated amplification infrastructure. Someone who denies Hannibal Barca existed and believes the Illuminati controls global events isn’t doing rigorous analysis—yet millions of people treated him as a credible authority.
The prophecy machine doesn’t run on accuracy. It runs on exploitation of how information spreads in fragmented media ecosystems where verification mechanisms have collapsed.