When Domino's Deliveries Predict Airstrikes
And what that tells us about both the past and future of warfare.
At 6:59 p.m. Eastern time on June 12, 2025, a Twitter account called Pentagon Pizza Report noticed something unusual. Four pizza establishments near the Pentagon had experienced sudden surges in Google Maps “popular times” activity—the kind of late-evening spike that suggests senior officials working through dinner. Hours later, news broke of major Israeli strikes against Iran. The pizza shops had predicted a geopolitical event before cable news.
When Defense Secretary Pete Hegseth recently joked on Fox News about ordering random pizzas “just to throw everybody off,” the comment drew mockery from critics who saw it as unserious trolling. But the remark reveals something more interesting: a sitting defense secretary now openly acknowledges that adversaries can monitor operational tempo through commercial data aggregation, and he’s thinking about countermeasures. This represents a remarkable shift in how institutions approach operational security—one that traces back thirty-five years to a Domino’s franchise owner who noticed something odd about late-night deliveries.
On August 1, 1990, Frank Meeks delivered twenty-one pizzas to CIA headquarters in Langley, Virginia. It was a one-night record. Hours later, Iraqi forces invaded Kuwait, launching what would become the Gulf War. Meeks later told the Los Angeles Times that delivery drivers operated as inadvertent intelligence collectors: “The news media doesn’t always know when something big is going to happen because they’re in bed, but our deliverers are out there at 2 in the morning.” The anecdote became Washington lore—a quirky story about how pizza orders betrayed crisis management. But it remained just that for decades, an anecdote requiring human observation and local knowledge.
The transformation from amusing anecdote to systematic intelligence collection happened gradually through the datafication of everything. Google Maps began aggregating anonymous location data to show users when restaurants were crowded. Delivery apps tracked order patterns to optimize driver routes. What companies designed for consumer convenience became exploitable for operational intelligence. In August 2024, an X account launched to systematically monitor this data—no longer requiring a chatty Domino’s owner, just algorithmic observation of publicly available commercial information. The Pentagon Pizza Report’s June 12 prediction wasn’t luck; it was pattern recognition applied to metadata exhaust.
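The kind of pattern recognition the account applies is not exotic. A minimal sketch, assuming invented hourly busyness numbers (Google does not publish a "popular times" API, so the data here are hypothetical stand-ins for whatever the account actually observes), might flag a late-evening surge like this:

```python
# Hypothetical sketch of spike detection on "popular times"-style data.
# The busyness values (0-100) are invented; this is not how Pentagon
# Pizza Report actually works, just the statistical idea behind it.

from statistics import mean, stdev

def spike_score(history, current):
    """Z-score of the current busyness reading against readings
    from the same hour on previous days."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return 0.0
    return (current - mu) / sigma

def is_anomalous(history, current, threshold=3.0):
    """Flag readings several standard deviations above baseline."""
    return spike_score(history, current) >= threshold

# Typical 7 p.m. busyness at one shop over prior weeks (invented).
baseline = [34, 41, 38, 36, 40, 37, 39]
print(is_anomalous(baseline, 92))  # sudden late-evening surge -> True
print(is_anomalous(baseline, 43))  # normal variation -> False
```

Nothing here requires access to anyone's phone: a single public time series per storefront, compared against its own history, is enough to surface an anomaly worth watching.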
The pizza problem represents a single thread in a much larger fabric. In 2018, a fitness tracking app called Strava revealed the locations and internal layouts of secret military bases worldwide because soldiers jogged with GPS-enabled devices. The company’s heat map, intended to showcase global athletic activity, lit up forward operating bases in Afghanistan and Syria like beacons against empty desert. Analysts identified troop movements, mapped facility perimeters, and even spotted a lone cyclist at Area 51. A 2023 investigation purchased individually identified health and financial data about active-duty military personnel from data brokers with minimal vetting—using Singapore-based servers to demonstrate how easily foreign actors could do the same.
More recently, journalists tracked more than 3 billion mobile phone location signals near U.S. military installations in Germany, mapping personnel movements from barracks to bars and identifying devices inside Büchel Air Base, where nuclear weapons are reportedly stored. They bought this data legally from American brokers—the same brokers who sell to anyone with money and minimal questions. The reporters weren’t Russian GRU officers or Chinese Ministry of State Security operatives, but they easily could have been.
What makes this particularly vexing is that even aggregated data poses security risks. A foreign adversary doesn’t need to identify specific individuals if they’re simply looking for peak and low activity periods at sensitive facilities. Parking lot satellite imagery analyzed by hedge funds to predict retail earnings can just as easily reveal when defense contractors are working overtime before weapons system deliveries. HVAC power consumption patterns, cell tower activity near government buildings, delivery app traffic—all of it generates metadata trails that sophisticated actors can analyze to understand operational tempo without ever penetrating classified networks.
The Defense Department’s 2018 policy restricting wearable fitness trackers in deployed settings represented early recognition of this problem, but it addresses only the most obvious vulnerability. Preventing soldiers from wearing Fitbits in combat zones does nothing about the dozens of apps on their phones that continuously broadcast location data, or about Google’s aggregate traffic information showing when Pentagon parking lots fill up, or about food delivery platforms revealing late-night work sessions. The practical solution cannot be “leave your phone at home” for everyone handling sensitive information—modern operations require connectivity.
This is where Hegseth’s pizza joke becomes more than a quip. His comment demonstrates awareness that traditional operational security—keeping information classified, limiting access, securing communications—no longer suffices when commercial surveillance capitalism generates exploitable intelligence as a byproduct of normal activity. The countermeasure he describes, ordering decoy pizzas to introduce noise that degrades signal quality, reflects actual information security tradecraft for the digital age. It’s the same logic behind why intelligence agencies sometimes conduct random equipment movements or vary routine schedules—not paranoia, but recognition that adversaries systematically monitor patterns.
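The statistical logic of the decoy countermeasure can be sketched in a few lines. This is a toy model with invented order counts, not a description of any actual Pentagon practice: dummy orders scattered across quiet nights inflate the baseline's variance, so a genuine crisis-night surge no longer clears a detection threshold.

```python
# Hypothetical sketch of the decoy-order logic: noise injected into
# the baseline degrades the signal. All order counts are invented.

from statistics import mean, stdev

def z_score(history, current):
    """How many standard deviations 'current' sits above baseline."""
    mu, sigma = mean(history), stdev(history)
    return (current - mu) / sigma if sigma else 0.0

honest_nights = [36, 39, 37, 40, 35, 38, 41]  # typical order counts
crisis_night = 90                             # the real surge

# Against a clean baseline, the surge is unmistakable (z around 24).
print(round(z_score(honest_nights, crisis_night), 1))

# Decoy orders placed on two otherwise-quiet nights widen the
# baseline so the same surge falls under a z >= 3 threshold.
decoyed_nights = [36, 39 + 45, 37, 40 + 40, 35, 38, 41]
print(round(z_score(decoyed_nights, crisis_night), 1))
```

The decoys don't hide the crisis-night orders; they make ordinary nights indistinguishable from extraordinary ones, which is the whole point.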
The challenge facing defense institutions is genuinely difficult. Democracies operating in open societies will always generate more observable metadata than authoritarian regimes that control commercial data flows and restrict civilian technology. Chinese military facilities don’t light up on Strava because Chinese citizens don’t have the same app ecosystem generating constant location broadcasts. Russian intelligence exploits open-source intelligence about U.S. operations while maintaining opacity around their own activities. This asymmetry creates a structural disadvantage.
Yet the alternative—comprehensive data controls that would prevent Google from aggregating location information or ban delivery apps from sharing traffic patterns—conflicts with both commercial interests and democratic norms around information freedom. Recent bipartisan legislation like the Protecting Americans’ Data from Foreign Adversaries Act represents early attempts to address these tensions, but comprehensive solutions remain elusive. The problem keeps expanding faster than policy can respond.
What’s striking about the evolution from Frank Meeks’ 1990 observation to Hegseth’s 2025 comment is the timeline itself. It took thirty-five years for the defense establishment to move from unawareness to publicly joking about countermeasures. That progression—from ignorance to recognition to active mitigation—suggests institutions are learning to operate in an environment where every Domino’s order potentially broadcasts intentions. Whether decoy pizzas actually work matters less than the acknowledgment that operational security now requires thinking about commercial metadata the same way previous generations thought about encrypted communications.
What’s untenable isn’t the task of keeping secrets but the pretense that they can be kept the old way. When Google Maps aggregate data predicts airstrikes and journalists can legally purchase troop movement patterns, the traditional boundaries between classified and open-source intelligence have collapsed. Hegseth’s joke, intentionally or not, signals that someone is paying attention to this new reality.



