Social Media and the Attention Economy

How social media platforms monetized human attention, creating algorithmic feedback loops that shape behavior, belief, and public discourse.

Social media did not simply give people a platform to speak — it built an economy around the act of paying attention. Facebook, Twitter, YouTube, and TikTok discovered that human engagement could be measured, predicted, and sold, and that the most reliable way to capture attention was not to inform but to provoke. Outrage, fear, and tribalism proved far more profitable than nuance or accuracy. By the mid-2010s, the attention economy had become the dominant business model of the internet, reshaping everything from journalism to politics to mental health.

In the Middle East and North Africa, the story of social media carries particular weight. The Arab uprisings of 2011 were initially celebrated as proof that social media could democratize political participation — that Twitter and Facebook could topple dictators. The decade that followed complicated that narrative considerably. The same platforms that enabled protest also enabled surveillance, disinformation, and state repression. Egyptian authorities used Facebook data to identify and arrest activists. Saudi Arabia deployed troll farms on Twitter to silence dissent. The algorithmic amplification of sectarian content contributed to polarization across the region.

What connects the attention economy to artificial intelligence is the underlying logic of optimization. Social media algorithms were early AI systems — recommendation engines trained to maximize engagement regardless of consequences. The AI systems now being deployed at scale — chatbots, content generators, search agents — inherit this logic. They are built to satisfy, to keep users engaged, to generate responses that feel helpful. Whether those responses are true, or whether they reinforce existing power structures, is secondary to the optimization function.
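To make that point concrete, here is a minimal, purely illustrative sketch of an engagement-maximizing ranker. The field names, weights, and example posts are hypothetical, not any platform's actual code; the detail that matters is that truthfulness appears nowhere in the objective being maximized.

```python
# Illustrative sketch only: a toy feed ranker whose objective is predicted
# engagement. Fields and weights are hypothetical; the point is that nothing
# in the scoring function rewards accuracy or penalizes harm.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float      # model's estimate of click probability
    predicted_dwell_secs: float  # model's estimate of time spent on the post
    is_accurate: bool            # known, but unused by the ranker

def engagement_score(post: Post) -> float:
    # The optimization target: a weighted blend of engagement signals.
    # Truthfulness never enters the function.
    return 0.7 * post.predicted_clicks + 0.3 * (post.predicted_dwell_secs / 60)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Items most likely to hold attention rise to the top,
    # regardless of whether they inform or mislead.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured policy analysis", 0.02, 45, True),
    Post("Outrage-bait rumor", 0.11, 90, False),
])
print([p.text for p in feed])  # the rumor ranks first
```

Swap in a chatbot and replace "clicks" with "user satisfaction" or "session length," and the structure of the objective is the same; only the interface changes.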

Shoshana Zuboff’s concept of “surveillance capitalism” captures this dynamic: the extraction of human experience as raw material for prediction and profit. Social media was the laboratory; AI is the factory.

As AI systems become the primary interface through which people encounter information, who controls the optimization function? And what happens to public discourse when the systems mediating it are designed not to inform but to engage?