Invisible Rulers: The People Who Turn Lies into Reality
Introduction: Power, Influence, Lies, and Truth
PART I
1. The Mill and the Machine
2. If You Make It Trend, You Make It True: Influencers, Algorithms, and Crowds
3. Gurus, Besties, and Propagandists: How Influencers Shape Culture, Politics, and Society
4. The Crowd: Contagion, Consensus, and the Power of the Collective
PART II
5. Propagating the Big Lie
6. Agents of Influence
7. Viruses, Vaccines, and Virality
8. The Fantasy Industrial Complex
9. The Path Forward
Acknowledgments
Notes
Index
INTRODUCTION
In late December 2014, Patient Zero walked in through the park gates of the Happiest Place on Earth, kicking off what would come to be known as the Disneyland measles outbreak. Over the next month, more than 120 people in California fell ill, including a dozen infants. Nearly half of the infected weren’t vaccinated. Dozens of the sick wound up hospitalized. The outbreak quickly spread to neighboring states, and by the end of the year, more than six hundred cases of measles were recorded across the country. It was a shockingly high number for a disease that had been declared eliminated in the United States fourteen years earlier.
I had no way of knowing, back then, that this incident would trigger one of the opening skirmishes in a war for power, influence, and reality itself—a conflict in which the ends justify any means, and one that has ripped America apart, threatened our very democracy, and consumed my own life for the past decade, leading to death threats, congressional subpoenas, lawsuits, and legal bills running into seven figures.
When the Disneyland outbreak started, I was the mom of a twelve-month-old. As it happened, just one month prior I’d been poring over a decade’s worth of California school immunization statistics. The effort had been inspired by an unfortunate San Francisco parenting rite of passage: getting my kid onto preschool waiting lists before his first birthday. While pregnant, I’d joined a few mommy boards and quickly gotten fed up with the tendency of anti-vax threads to rehash debunked nonsense. Links to the blogs of grifters selling homeopathic oils were presented as counterpoints to safety studies in medical journals. There was a whooping cough outbreak happening, even as celebrity pediatricians were telling their vaccine-hesitant clients that it was OK to “hide in the herd”: they knew the diseases were dangerous but assured these parents that other people were still vaccinating, so overall rates would keep the unvaccinated protected. I’d read about California schools with vaccination rates “lower than South Sudan,”² and I didn’t want my son to wind up at one.
And so I found myself scrutinizing tables that showed the rates of “personal belief” exemptions—opt-outs justified not by any medical or even religious concern but by the simple desire to opt out. More than ten years of data from across the state showed an unmistakable trend over time: a steady decline in classroom vaccination rates.
It had been nearly twenty years since a fraudulent study by Andrew Wakefield asserted a link between vaccines and autism, instilling fear in a generation. Since then, study after study had confirmed the safety of the measles-mumps-rubella (MMR) vaccine, and the media had long since stopped giving airtime to the vaccine-autism conspiracy theory. And yet the trend suggested that a small but growing minority of the California public was increasingly skeptical of the safety of routine childhood shots. The opt-outs, meanwhile, driven primarily by fear instilled by bad information, didn’t only impact the families of those who claimed them. They impacted everyone. They impacted other people’s kids.
I wrote a quick blog post about my frustration: “How California’s Terrible Vaccination Policy Puts Kids at Risk.” Then I did something I’d never done before: I called my local state assembly representative to ask whether the policy might be changed.
When I made that first call in November 2014, the assemblyman’s staffer said no: the anti-vaccine movement was a well-organized political force, and there was no appetite for a fight over school vaccine opt-outs. But when the Disneyland outbreak began, I called back, and the reaction I got was completely different: the outbreak had highlighted the extent to which anti-vaccine misinformation—increasingly visible on social media—had exacerbated hesitancy. The public was angry. The state senator from Sacramento, Dr. Richard Pan, happened to be a pediatrician, and he was introducing legislation to eliminate the personal-belief exemption in an effort to raise classroom vaccination rates (which would impact community coverage more broadly). The San Francisco staffer asked whether I might like to help support the legislation as a parent voice and suggested I call Senator Pan’s office. So I did.
I had no idea what I was getting into.
The next few months would upend my understanding of politics, propaganda, and social movements, as I got firsthand experience of how a new system of persuasion—influencers, algorithms, and crowds—was radically transforming what we paid attention to, whom we trusted, and how we engaged with each other. I went down the rabbit hole of the anti-vaccine movement, joining its groups, following its influencers, and spending hours each night looking at its memes, becoming an observer of what felt like an alternate reality… while simultaneously trying to grow a countermovement, in real time, to push back against all of it. It was obvious to me that this was a new battlefield, that the rules of engagement had changed. It was a war of memes, not facts. Winning was going to take more than quoting some vaccine safety statistics.
Dr. Pan’s staffer had connected me with a handful of other moms who’d also called asking if they could be useful. They included a former senate staffer, a graphic designer, a professor who specialized in vaccine law, a nurse and longtime community organizer, a public relations pro, and a conservative activist accustomed to generating bipartisan consensus in the overwhelmingly blue state of California. I had a fair bit of quantitative analysis and data science experience from years on Wall Street and a solid understanding of social media from my then current gig in venture capital. Together, we had to figure out how to grow a pro-vaccine parental countermovement virtually overnight. So we set up a Facebook page and a Twitter (now doing business as X) account under the name Vaccinate California. We wanted to make sure that people searching for information about the bill found us, so we started posting memes featuring cute kids (our own) with our logo, alongside arguments making the case for why school shots mattered.
A hashtag had begun to emerge around the number assigned to Senator Pan’s bill: #SB277. From the start it was largely dominated by anti-vaccine activists, many of whom got nasty in the replies to any comment in support of the bill. Some were from California, but what quickly became apparent to those of us waging this war was that many were from out of state; anti-vaccine advocacy, it seemed, was a core part of their identity, and they’d been at this for a while. They deluged our Facebook posts and Twitter replies, accusing us of being pharma-paid baby killers. That was just the beginning of the attack.
Some dug through our backgrounds, families, and past jobs. Others found and posted our addresses on social media (a tactic known as doxxing).
A few particularly delusional trolls took photos of my baby son that I'd shared on a personal Tumblr and Facebook the prior Halloween and began reposting them to Twitter, accusing me of not really vaccinating him, of being a devil worshiper (one possible interpretation of a Maleficent costume, I suppose), and other bizarre and creepy things.
I set my personal social media accounts to private, but they'd already taken the photos they thought would be most exploitable. Anti-vax activists made videos calling us "medical fascists," splicing us into footage of Adolf Hitler leading rallies; on two occasions, they followed outspoken supporters of the bill down the streets of Sacramento, posting their photos and location to incite both online and offline harassment. They barraged the practice ratings of physicians on our side with one-star reviews and called to scream at their receptionists. It quickly became clear that cute kid memes were not going to be enough; this was going to be bare-knuckle-brawl politics. They wanted to intimidate us into simply bowing out of the fight.
The fight to pass the bill, which began in January 2015, turned out to be a six-month battle that left me deeply concerned about the growth and reach of the anti-vaccine movement. But what fascinated and alarmed me most was the mechanics of how the battle was fought, because I could tell even as it was happening that the tactics and dynamics were applicable to far more than one small political fight in California. There were bots and trolls, coordinated harassment brigades and doxxing, and Facebook pages run by God-knows-who. There were YouTube channels and secret Facebook groups with tens of thousands of members, dedicated to mobilizing online armies, gaming trending algorithms, and selectively editing videos to recruit new adherents. There were precision-targeted ad campaigns—including ours!—that focused on reaching handfuls of constituents in specific zip codes; Facebook's powerful targeting tools made them possible, though the company did little to verify who was running the campaigns and offered no easy way for audiences to know who was behind them. There were viral memes and commandeered hashtags. These were the tools—sometimes weapons—that by 2020 would be the norm for networked activism in an increasingly polarized America, but only five years prior were hardly known at all. It is sometimes easy to forget how rapidly our online world has changed.
It felt like I was seeing the future, while simultaneously participating in it: virality determined what people talked about, and virality was a function of how well influencers, algorithms, and crowds could make something capture the public's attention.
Public opinion on any given issue would be shaped by content currents that flowed across high-speed, frictionless social networks that curated what their users saw and, increasingly, whom they connected with. The accuracy of the message, or whether it came from a credible source, was largely irrelevant. What mattered was whether the content captured user attention and generated engagement: whether it made audiences want to participate, by liking, sharing, or joining a group around whatever the cause happened to be. "Public opinion" no longer even seemed like the right phrase—it implied a need to persuade a mass group of people, a majority of society, even as that seemed increasingly impossible. The online world was rapidly devolving into factions.
Three points quickly became clear to me. First, the anti-vaccine opposition was well networked and really understood social media. Second, this relatively small group—most of whom sincerely believed that vaccines caused autism and the government was covering it up—had influence that belied its size. While the pro-vaccine position was still the dominant opinion in the "real world"—approximately 85 percent of kids in California were vaccinated—we were a very small minority in the online conversation. Third, public health officials and institutions absolutely did not understand the importance of the internet in shaping social movements... or community beliefs.
So, to understand how the opposition organized and attracted new members, I began to study their content and networks online. I wanted to understand who was influential in their community—what kind of rhetoric and content they used, what topics they talked about, how they engaged with the broader crowd of activists and persuaded them to act. I followed Robert F. Kennedy Jr. and Del Bigtree and the other propagandists for the movement, as well as the contrarian doctors who sold medical exemptions and "homeopathic vaccines" to the fearful. I observed the automated bots that incessantly posted anti-vaccine memes to the #SB277 hashtag on Twitter, allowing the anti position to dominate share of voice in the conversation. I noticed that longtime die-hard "vaccine truthers"—the most conspiratorial members of the community, deeply convinced that the government and "Big Pharma" were colluding to cover up a link between vaccines and autism—had suddenly begun to conceal those deeply held beliefs and instead emphasize new talking points that refocused the conversation around "parental rights" and "medical freedom." School immunization requirements were government tyranny and an affront to "health choice," they claimed, as they began to reach out to libertarian and Tea Party activists to grow their movement. The anti-vaccine faction also, unfortunately, liberally relied on harassment: sending online brigades of angry activists from all over the world to barrage California legislators whom they saw as being on the fence or as having disrespected their movement; targeting doctors who testified in legislative hearings in favor of the bill with negative reviews and threats to staff; and doxxing ordinary parents advocating for their own "parental rights." It was certainly my first experience being doxxed and harassed for expressing a political point of view.
I began to write about these dynamics, posting on my little blog and on Twitter, sending notes to the California representatives considering the bill—many of whom were trying to figure out why they were being harassed and whether the overwhelming amount of anti-vaccine activism online meant that the public was, in fact, opposed to the bill. It was polling well among their constituents, they observed, but the social media conversation was overwhelmingly toxic. How could those two things be reconciled?
And that, it turned out, was one of the key challenges of my second point: there is an asymmetry of passion on social media. There is some scholarly debate about whether conspiracy theories are on the rise—they have always existed, after all, as we shall see—but in today's information environment, relatively small groups of true believers (a few thousand people) can leverage the tactics of networked digital activism to produce the kind of sensational content and attract the initial engagement that algorithms subsequently boost. Their zealotry, their constant posting and fervent commentary, gives the impression that they are far more numerous than in fact they are.
By contrast, the overwhelming majority of people in California—millions of us—vaccinated our children and then simply went on about our lives; we didn't get on social media to tweet or post about it. Vaccinate California had to figure out how to make people do just that. We had to pull the silent majority off the fence and into the public conversation. However, while we were slowly growing an audience willing to call their representatives and post about the importance of shots for school to Facebook friends, there were no obvious figureheads or influencers with massive audiences who could blast the messages out to their followers. Getting reach was difficult.
The anti-vaccine side had charismatic leaders who spoke like revivalist preachers, people like Robert F. Kennedy Jr. but also wellness gurus with large Instagram followings who produced beautiful lifestyle content (and occasionally argued that vaccines had no place in “authentic, holistic health”). They understood what resonated on social media; meanwhile, the pro-vaccine side got an occasional boost from the rare positive comment by a celebrity.
As we learned the ropes of online activism—building the plane while flying it—it also became clear that public health institutions were markedly uninterested in what was happening on social media. Even as savvy conspiracists were expanding their movement (while we were just beginning to grow ours), authoritative institutions such as the Centers for Disease Control and Prevention (CDC) seemed completely flat-footed. They very obviously did not understand how social media worked or what got engagement; while the anti-vaxxers were going viral with slick YouTube content and emotional first-person testimonials about vaccines causing allergies (false), SIDS (false), and autism (false), scientists were posting statistics-heavy PDFs and blog posts to get accurate information out. The public health experts believed that the public would ultimately trust what they said because they were, after all, the authority figures—the people with PhDs and MDs. They believed that what happened on social media didn't matter. As a CDC employee put it to me at the time, "Those are just some people online."
Meanwhile, the Southern California crunchy moms who had long been the pillars of California's anti-vaccine movement were bringing entirely new groups—from the Tea Party to the Nation of Islam to the Church of Scientology and its celebrities—into the fold with their liberty rhetoric and social network outreach. Within a few years, this big tent of strange bedfellows would include local militias, sovereign citizens who believed the US government itself was illegitimate, and conspiracy theorists convinced that elected officials were secretly members of satanic pedophilic cabals. The growth of this movement and the recasting of public health interventions as government tyranny may have seemed fringe at the time, but it would have profound implications as the world fell into a global pandemic in 2020.
The pro-vaccine side did ultimately win passage of Senate Bill 277, and California moved to a medical-exemption-only policy for school immunizations. But the way that the fight had played out—the tactics for shaping public opinion, the actors involved, the influencer-algorithm-crowd interplay that drove virality and commanded attention, the harassment brigades—was a harbinger of a profound shift and more upheaval to come.
The anti-vaccine movement's loss left its adherents more convinced than ever that they had to continue growing their numbers and dominating the online conversation. They began fund-raising on GoFundMe and canvassing wealthy donors, soliciting money for ad campaigns to drive people to their Facebook groups. They created networked state-level "Medical Freedom" pages committed to boosting each other's messaging and growing a movement. Meanwhile, Vaccinate California continued to exist, but as a side project—our volunteers had lives to get back to and kids to raise. Some were burned out by the toxicity. There was no obvious source of funding. No one was responsible for continuing to grow a pro-vaccine parent movement.
I was captivated by the tactical dynamics of what I'd just seen and participated in. Although I also had a full-time job and my baby son, I began to spend increasing amounts of time at night examining how influential figures grew their audiences on Twitter and Instagram, how information moved across networks of Facebook pages. I tracked how conspiratorial communities seemed to be cross-pollinating with each other through nudges from platform-recommendation algorithms.
As I would soon come to learn, others had begun to notice similar dynamics playing out around some pretty high-stakes issues. The Islamic State was growing a virtual caliphate. Fake news stories were increasingly going wildly viral. Russia, it seemed, was using something called the Internet Research Agency to create mass confusion and spread propaganda to advance the Kremlin's interests, particularly in Ukraine. It was also creating bots and fake accounts, taking the old Soviet Cold War strategy of "agents of influence"—people who tried to nudge public opinion in a direction that benefitted the cause or country on whose behalf they worked—into the virtual realm.
And yet, even as demonstrably adversarial actors gained a foothold on social networks, it was unclear whose responsibility it was to stop them. Tech platforms, which controlled the infrastructure upon which this was occurring, wanted to build products and make money. They didn't want to have to decide who was real or fake or how to balance user expression against harmful content and harassment campaigns. While a handful of people in government, academia, tech, and civil society saw the crisis coming, most did not take these early signs seriously. It was, after all, just some people online. It wasn't real life.
By 2015, we were already entering a world in which truth was determined by popularity, viral nonsense trended daily, and partisan polarization was actively exacerbated by armies of trolls, foreign and domestic. I felt that things were going off the rails: for example, because I'd been watching anti-vaccine videos, YouTube's recommendation engines began to suggest I might also be interested in chemtrails, flat Earth, and 9/11 conspiracy theories. The trending algorithms on Twitter could reliably be gamed with bots, and the trending feature on Facebook was promoting outright false articles. But perhaps most disturbingly, our social norms seemed to be resetting. Sensational and extreme ideas were increasingly visible, while speaking against them seemed to require spinning up a counterfaction capable of resisting personal harassment intended to push opponents out of the conversation. Ordinary people began to treat this as normal partisan behavior, even as we had more power than ever to shape public opinion by clicking on the "like" and "share" buttons, pushing outrageous stories into the feeds of friends and family. We could all increasingly retreat into online groups in which we saw only what we wanted to see, or what platform algorithms thought we wanted to see, and very little else.
In late December 2019—five years after the Disneyland measles outbreak started—Patient Zero walked out of the Huanan Seafood Market in Wuhan, kicking off what would come to be known as the COVID-19 global pandemic. Or so we thought. Four years after that, we would be debating whether he'd really walked out of the Wuhan Institute of Virology, and every single facet of a pandemic that killed over six million people—with over one million of them within the United States—would be a matter of irreconcilable debate contested by identitarian factions that could agree on only one thing: the others were lying to you.
The same California anti-vaccine groups and influencers that Vaccinate California had fought against in 2015 were early to pick up on the COVID story and played prominent roles. The pandemic that emerged five years after the Disneyland measles outbreak revealed the full extent to which the influencer-algorithm-crowd trinity—the new means of shaping public opinion—has reallocated power, transformed our understanding of the world, and impacted our capacity for collective action. A global rumor mill propelled theories about the vaccine's safety, efficacy, and toxicity to mass audiences, exposing the challenge of telling rumor from fact in the era of virality. Theories traveled far faster than facts could be known. Although the tropes were nearly identical to those that had long circulated about diseases and vaccines past, effective counterspeech was largely absent. Platforms—the curators and recommenders of stories—had to decide what to surface. COVID laid bare and accelerated the public's loss of trust in institutions, media, and authority figures.
Creating narratives is no longer solely the purview of elites. This power — for centuries, almost wholly dominated by media, institutions, and authorities — has been upended. Ordinary people can influence what their friends see, what their communities talk about, and what their country focuses on more easily than ever before. We are no longer the passive recipients of mass media messages. Instead, we all have the power to shape public opinion, to wield the tools that engender virality, to spread messages that reach and potentially influence millions. In this book I will investigate what that means.
Some of the change is positive: a proliferation of voices, an abundance of creativity, the formation of new communities that bring people together independent of geography. But there are also serious consequences. A new system of incentives has given rise to novel forms of propaganda and tools for manipulation. Shared reality has splintered into bespoke realities, shaped by recommendation engines that bring communities together, filled with content curated from the media and influencers that the community trusts. Very little bridges these divides. This has profound implications for solving collective problems or reaching the kind of consensus on which democracies depend.
This is not a book about social media. There are enough of those.
Rather, my focus is on a profound transformation in the dynamics of power and influence, which have fundamentally shifted, and on how we, the citizens, can come to grips with a force that is altering our politics, our society, and our very relationship to reality. For sure, companies and governments must bear their burden of figuring out how to regulate this new space, and how to restore trust and shore up institutions, but we as citizens have a responsibility to understand these dynamics so we can build healthy norms and fight back. This is the task of a new civics.
Understanding the forces driving a revolution requires a theory. And so, in Part I we set forth a theory about how the new power dynamic works, examining how what we may see as discrete influencers, algorithms, and crowds have a combinatorial power that enables them to intersect and build on each other to create not only powerful social movements but bespoke realities. We'll delve into a menagerie of influencers, the modern opinion leaders who not only process news and information for their audiences but also work to shape public opinion directly by creating their own social media content for clout and profit. We'll look at the curation and recommender algorithms of social networks that decide what we see and shape what we create. We'll consider the public itself: the virtual crowds, factions, and fandoms that are now active participants in online rumor mills and propaganda machines. And we'll explore how the three members of this influencer-algorithm-crowd trinity influence and respond to one another, with platforms offering the capacity for attention and reach—influence-as-a-service—to creators, who produce content carefully tailored to both the algorithm and their niche audiences.
Part II then examines how the rise of this new system of influence has transformed politics, war, Great Power propaganda campaigns, your local government, and even your relationships with your friends and neighbors. It takes a clear-eyed look at the role of experts and institutions and why they are now so distrusted. And, rather unexpectedly, it tells the story of how studying this system and exposing its worst manifestations led to multiple congressional subpoenas and lawsuits for me personally—acts of political retaliation by elite lawmakers, political influencers, and niche propagandists angry at the academics who laid bare how the invisible rulers of a new communication ecosystem attempted to delegitimize a free and fair election.
Today, who controls the internet controls reality. We have all been given the capacity to influence. We all have the ability to persuade communities, amplify messages, create viral conspiracy theories, and instigate real-world protests. We have profound power but no commensurate responsibility. Very few of us have reckoned with what that power means.
Before delving into the why of the theory and the how of the way our society is being changed and undermined, let's first look at the way one person—let's call him "Guitar Guy"—descends step by seemingly inconsequential step down the rabbit hole. (While Guitar Guy is not modeled on any particular person and is admittedly drawn as a caricature, his journey illustrates the way that many have been drawn into bespoke realities.)
Guitar Guy is in his early forties and lives in a suburb of a midsize American city. He learned to play guitar as a kid, mostly through taking in-person classes, reading books, and making friends with others who jammed in the basement at the local music store. He had a band and toured for a bit. Today, he's a music teacher at a few local elementary and high schools and supplements his salary by teaching some private lessons as well.
When Guitar Guy was just starting out, he advertised his lessons with fliers posted in the local coffee shop and library. But as the internet took off, he started posting ads on Craigslist to reach potential new students and set up a blog on Blogger. The content was nothing fancy, just some tips on becoming a professional guitarist, funny stories about being a teacher, and discussion about shows he'd seen. Honestly, though, no one really read the blog, and he gradually abandoned it. However, through his writing, leaving comments on other people's blogs, and joining some message boards, he did manage to connect with other musicians, all over the world. That's the best part of the internet, he thinks; it's helped him meet others who share his passion for guitars and fingerstyle. In 2006, shortly after YouTube launched, he decided to create an account and began posting videos of himself playing in his living room at night.
In 2009, Guitar Guy joined Twitter, a “microblogging” platform that let him post 140-character bursts of thought. In 2011, he created an account on the then photo-sharing, now also video-sharing app Instagram (which would later be acquired by Facebook/Meta). In 2020, he joined TikTok, drawn by its effortless “duet” video-stitching technology, which let him add his own riffs directly on top of videos posted by other musicians—asynchronous jam sessions facilitated by nothing more than an app... magic. He now regularly posted content to TikTok, Instagram, and YouTube, sometimes Facebook and Twitter too, and over the years he managed to attract something of a following: tens of thousands of guitar-loving people followed him on each platform.
Guitar Guy didn’t aspire to be an influencer as he set up his accounts. Making money wasn’t on his agenda. He was just a guy who wanted to post his art and chat about it with people who shared his interests. But it turned out he was pretty relatable, and people liked both him and his content. And, of course, the social media platforms made it so easy to get started: a laptop, a webcam, a good microphone, and a niche. He created a little setup in his garage and would go out to it for an hour each day after dinner to sit and play. At first he uploaded short videos and then checked back later to see what other people had to say about them. As social media platforms added new features—and as his audience grew—he got into livestreaming. He posted his streaming schedule to his YouTube profile—Monday, Thursday, Friday, 7 to 8 p.m. PT. His followers would get a little notification telling them to come watch his performances and demos. They could chat with him, and he would respond between songs. He could also cross-post the content—upload a copy of the same stream—to other platforms later on. He could make short clips of the best parts, creating his own little hit reel for platforms with audiences who preferred shorter videos.
Those hours spent livestreaming were some of the best of his week. He made real connections with fans who often felt like friends. He could say whatever was on his mind—even if it wasn’t music related. He found a community and a culture. And the icing on the cake was that he was also making some extra money: his stream watchers could tip him, and he was earning some revenue from ads that ran before his videos played. He'd started getting approached for sponsorships by companies who wanted him to tout their pedals or straps; even a local brewery had reached out, hoping he'd talk about their beer on Instagram. Some of the companies offered a few thousand dollars for a post.
Then one night, as Guitar Guy was getting ready to do his regular YouTube livestream, he decided—quite innocently!—to discuss a roiling controversy in guitar land: rock idol Eric Clapton's latest comments about COVID vaccines. Clapton had described experiencing some personal side effects after getting the jab and appeared on conspiracy theorist YouTube channels claiming that the government and pharmaceutical companies were using mass hypnosis and subliminal messages to make the public consent to getting the shots. That part was a bridge too far for Guitar Guy, but some of Clapton's other complaints—like refusing to perform at venues requiring his audience to show proof of vaccination—had kind of resonated. That argument, about freedom of choice, felt compelling. Guitar Guy had gotten vaccinated but didn't love the idea of mandates. Adults could decide if they wanted to risk getting sick in a bar. Musicians did have to earn a living, and those who didn't have social media, who relied solely on in-person gigs, were really struggling.
A lot of his friends, in fact, had a hard time making ends meet as the pandemic dragged on. He decided to chat with his audience about it, get their take. Maybe play Clapton's protest song.
As he started streaming, chatting about Clapton's arguments about hypnosis and lockdowns, he noticed a significant jump in the number of likes, comments, and reactions from his audience. A few were complaining, but others were cheering him on, using the "tipping" feature to drop him a few bucks in real time.
Hmm, he thought. This is interesting.
To keep the discussion flowing, hopefully maintaining the high rate of likes and comments (and tips!), Guitar Guy kept talking about the controversy, going into related stuff he'd seen on TikTok. He'd been on the fence about the boosters thing, he confessed, and influencers he followed, particularly some of the fitness people, were also hesitant. In fact, he added, some of them thought mRNA vaccines were quite dangerous. After all, we just didn't know enough about them—this was the first time the technology had been used on humans!
Guitar Guy had no medical credentials or scientific expertise, and neither did the fitness accounts he paid attention to and cited. Neither did Eric Clapton, for that matter. But his forty thousand fans were now hearing a perspective on vaccines from someone they had come to like and, more importantly, to trust. They could speak back to him, have a conversation about their own feelings on the matter, and he listened. Some shared the livestream with their friends: "Check this guy out, follow him, he's saying the kind of stuff you never hear on TV." Others tried to push back—"come on, not you too with the conspiracy theories!"—but as debates got heated in the comment threads, they eventually gave up and just left.
The controversy Guitar Guy unwittingly kicked up, it turned out, was great for his engagement numbers, both during the livestream and after. A lot of people shared the video, and he got a bunch of new followers. He continued skirting controversy—he didn't want to get outright banned from the platforms, obviously, and he knew they had policies against vaccine misinformation. But it wasn't that hard to pivot ever so slightly to other controversial topics, culture war stuff that platforms didn't have policies against: parents' rights, LGBTQ commentary, gun control, abortion, even Big Tech's control of information. Being provocative kept the engagements rolling in and the follower counts climbing up. And the higher the count went, the better he did financially.
Guitar Guy now had a big decision to make. It didn't take a rocket scientist to figure out that engagement was highest when he was talking about controversial political topics and weird online theories—not guitars.
His livestreams were now regularly getting reposted to other platforms, sometimes in hashtags that he honestly didn't love—conspiracy theory stuff that creeped him out a bit—but this still translated into high view counts and more income. The problem was that the right-wing and left-wing crowds each seemed to be looking for stronger ideological commitment, probing in the comments to see if he really was "one of them." Truth be told, he'd never really been an ideologue, never fit neatly in one bucket, but he was feeling pressure to align more vocally with some online political faction.
He decided to keep leaning into controversy. His content shifted: instead of playing for most of the stream, he now played one or two songs and then talked, tying the lyrics into some outrageous thing that had happened that day.
Over the course of the next few weeks, his YouTube and Instagram follower counts spiked. He checked his YouTube channel analytics dashboard again and saw that a higher percentage of viewers were finding his videos from the platform's suggested videos.
Recommendation algorithms, perhaps noticing his sustained high engagement, were suggesting his content. More and more of his videos were getting high view counts, particularly ones where he was adding controversial—but not highly moderated!—keywords. And people were following him, but not the same kind of people who used to follow him. No, judging by the comments, he now had an increasingly large percentage of followers who knew him as the "guy who wasn't afraid to show some support for Eric Clapton" or "the guy who made the TikTok sea shanty about Big Tech censorship." He used to have a fairly nice fan community that wanted to talk about what was still his main love: guitars. But now he had a big and still-growing following—he'd just crossed 480,000 YouTube subscribers!—and those new folks were there for a reason. Sure, they liked music, but they were more likely to share the provocative stuff. If Guitar Guy wanted to keep his engagement rate up, he had to give them what they wanted.
The influence-as-a-service machine was humming: Guitar Guy was producing content, which recommender algorithms in turn boosted and made even more popular among highly specific audiences, delivering more followers and engagement (which equals money) to the creator, who in turn produced more content. As Guitar Guy produced highly engaging—and increasingly inflammatory—content in response to these incentives, his audience wasn't just getting larger; it was changing. It was being skewed away from the original community of guitar enthusiasts and toward those who came for the controversy. It had become pretty clear that they weren't there for a nuanced discussion of the topics he'd been posting and writing little ditties about. They wanted red meat.
Guitar Guy was actually a little bit concerned about continuing down this rabbit hole, which is what, deep down, he knew it was. Sure, he had some honest concerns about the COVID restrictions, the kind of stuff he talked about in his first few controversial videos, but his new followers were now regularly alluding to more far-out theories in the comments: Jeffrey Epstein, microchips, Bill Gates, the New World Order. He may have accidentally stumbled into becoming a culture war influencer, but these followers would spell out what staying in this new territory entailed. He was a bit worried. Would they turn around and attack him if he pushed back against the most extreme voices, tried to restore some sanity or nuance to the conversation? He was not wrong to be nervous: he was now beloved by a particular faction, and expressing views counter to their other politics very well might lead to his being harassed. But if he said something they wanted to hear, well... they'd give him unconditional support and would share his content far and wide, helping him to continue to grow his follower count—and his bank account.
You might be wondering what happened to Guitar Guy's original audience. What must they be thinking? Have they fled his channel, or are they still watching?
Some dropped off after realizing that the vague conspiracizing was more than a passing thing. Some pushed back, trying to get at the nuance inherent in controversial culture war topics, but got fed up with constantly having to do battle with the influx of new supporters and left. But some simply listened. They too got into conversations with the new supporters. And they were not necessarily turned off by what Guitar Guy or the new community members had to say; some perhaps even found them persuasive.
It is possible that if you are reading this book, you have at some point in recent years asked yourself, “Is it just me or has everyone else lost their mind?” or “How did we get here?” It doesn’t matter if you are conservative or progressive, because the phenomenon has affected people across the political spectrum.
The aim of this book is to guide you through the evolution of this carnival hall of mirrors, this world of echo chambers, grifters, and keyboard culture warriors, to explain just how we got here and how we might get out.