The Diaries of A Digital Detox
Dopamine Detox Week One
Shawn Klemme - April 14th, 2025
At a glance, with the light catching them perfectly, one could easily be forgiven for thinking the ridges in the glass ran inches deep; yet, to the touch, they felt smooth as wet slate. I had begun paying significantly more mind to the cracks on my phone’s screen in the days since deleting social media from it. What seemed like a relatively simple decision had bounced around in my head for months, each pro and con lining up like dominoes in what felt like an anxious trap. The dominoes stood there, stable but stretching back, a line so long I couldn’t even weigh the case for parting from my algorithms.
On one hand, the 187 grams in my pocket was arguably the most influential tool ever invented: capable of connecting people, regardless of distance, in a matter of seconds, in ways that most people never even bother to understand. A functional Library of Alexandria, hugged tightly to your thigh while you walk; a supercomputer more powerful than those used to land a man on the moon! The device responsible for communicating with my boss at work, his boss, the company execs in Dallas, my family, old classmates, strangers from bars, and everyone in between had become as ingrained in my daily life as eating, to the point where merely being physically separated from it, even in an environment that didn’t permit its use, gave me anxiety.
The silence of everyday life had been replaced with a dull, constant stream of dopamine, pumped directly into one ear (and out the other). This constant replacement of information with newer information came in more forms than I could count. Whenever a stillness came about, any number of icons on the screen would transport me into an algorithm capable of showing me things I wasn’t even sure I wanted. By perfectly balancing content I wanted to see, content they thought I was most likely to interact with (positively or negatively), and content that paid their bills, the tech companies that had once made me feel so connected with the world were now making me feel like a slave to it. I was letting every piece of new information be fed to me rather than taking the initiative to seek it out myself. This problem was pointed out to me most acutely by a documentary that would later become the deciding factor in my separation from most algorithms.
I remember finishing Netflix’s The Social Dilemma and sitting for a moment in stunned silence. This 2020 documentary-drama hybrid peels back the curtain on how social media algorithms hijack our attention and manipulate our behavior – all while we blissfully scroll. It’s not just another tech scare story; it’s a collective confession from the very designers and executives who built these platforms. As I watched former Facebook, Google, and Twitter insiders describe the monster we’ve unwittingly invited into our lives, I felt a mix of validation and alarm. The film’s core message lands with a personal resonance: the apps we love are ingeniously engineered to love us back – or more precisely, to hook us, keep us, and profit from us, often at the expense of our well-being.
Early in The Social Dilemma, ex-Google design ethicist Tristan Harris poses a question that frames the entire film: “Is there something that is beneath all these problems that’s causing all these things to happen at once?” The “problems” he refers to include tech addiction, data theft, fake news, polarization, and more – a cacophony of digital-age crises that seemingly erupted in unison. The documentary argues that yes, there is a unifying cause: the toxic business model of algorithm-driven social media.
At the heart of the film is the revelation that social media platforms aren’t passive tools but active influencers. Every swipe and notification is designed to pull you deeper into the app. Harris bluntly describes it as “a race to the bottom of the brain stem” – a competition between platforms to exploit the most primitive parts of our psychology. By triggering dopamine hits through likes, tags, and endless feeds, these apps create habit-forming loops that are incredibly hard to escape. Tech companies have essentially turned our own biology into a lever for their profit: as one observer notes, “Your brain doesn’t have a dopamine switch” to simply shut off this craving on command.
Crucially, the film makes it clear this isn’t an accident or mere byproduct – it’s deliberate. A former Facebook engineer, Justin Rosenstein, admits, “We’re all living in a feedback loop that’s designed to keep us there.” Every feature, from the Like button (which Rosenstein himself co-invented) to auto-play videos, serves one main goal: maximize user engagement. In the documentary, several insiders outline how their companies’ entire growth strategy boiled down to three objectives:
Engagement – keep users scrolling as long as possible.
Growth – induce users to invite friends and spread the platform to new users.
Advertising – monetize that engagement by showing as many ads as possible.
Everything you see on a social feed is curated by algorithms relentlessly optimized for those goals. If a tweak in the feed keeps us staring at the screen five minutes longer, that design change wins. As Tristan Harris explains, we’ve shifted “from a tools-based technology environment to an addiction- and manipulation-based technology environment. Social media isn’t a tool waiting to be used. It has its own goals, and it has its own means of pursuing them by using your psychology against you.” Hearing this, I couldn’t help but reflect on my own habits – the times I’ve opened an app intending just to check one thing, only to find myself spellbound by an endless scroll. It’s disconcerting to realize how often my mind’s course has been quietly rerouted by an AI that knows what will catch my eye.
One of Harris’s most striking observations in the film was about how we’ve started using our phones as digital pacifiers. “We’re training and conditioning a whole new generation of people that when we are uncomfortable or lonely or uncertain or afraid, we have a digital pacifier for ourselves,” he says, “kind of atrophying our own ability to deal with that.” That line hit home. How many times have I instinctively reached for my phone in a moment of boredom or anxiety, swiping for comfort? The Social Dilemma forces us to confront the possibility that these impulses aren’t purely self-soothing behavior – they’ve been encouraged by design. The apps provide immediate relief from negative feelings (a funny video to lift sadness, a flurry of notifications to quell loneliness), reinforcing the habit loop. Over time, like any addiction, our baseline ability to tolerate discomfort erodes. We become dependent on the quick fix the phone provides. The documentary brings a sobering clarity: in the attention economy, our attention is the product, and there’s always an algorithm ready to lure it back whenever it wanders.
Several quoted experts drive this point home with chilling clarity. As early Facebook investor Roger McNamee puts it, social media creates “2.5 billion Truman Shows” – each of us trapped in a personalized reality curated by algorithms. In these tailored worlds, we’re fed content that confirms our biases or keeps us emotionally engaged, and “over time you have the false sense that everyone agrees with you because everyone in your news feed sounds just like you. Once you’re in that state, you’re easily manipulated.” I found that metaphor incredibly powerful. It’s easy to recognize other people as being in a bubble, but the film made me pause and wonder: what if I’m in my own bubble too, unaware it’s even there? Each of us lives in a feed-backed echo chamber where the algorithm shows what it thinks we want (or what will provoke us). The result is an environment where truth becomes subjective and outrage becomes common currency – fertile ground for misinformation and division.
Psychological and Societal Costs: From Self-Worth to Democracy
The Social Dilemma doesn’t just warn in abstract terms; it illustrates the human toll of these algorithms on both personal mental health and society at large. One storyline in the film follows a teenage girl whose self-esteem crumbles under the pressures of social media. She posts a selfie and anxiously awaits the dopamine rush of likes. When a cruel comment about her appearance appears, we see the sting it leaves. This fictional vignette reflects a very real phenomenon: young people (and, let’s be honest, adults too) chasing approval online and feeling empty or “not enough” when the feedback falls short. Former Facebook executive Chamath Palihapitiya candidly summarized this cycle: “We curate our lives around this perceived sense of perfection because we get rewarded in these short-term signals – hearts, likes, thumbs up – and we conflate it with value, we conflate it with truth. And instead what it is is fake, brittle popularity, that’s short-term and that leaves you even more vacant and empty than before you got it.” Those words “vacant and empty” encapsulate the hollow aftertaste of a social media binge that I, and likely many of us, have experienced. You get the hit of validation, but it never lasts. In fact, it feeds a deeper insecurity – a need to constantly go back for more external validation. The film links this to alarming trends in mental health: anxiety, depression, and even self-harm among teens have skyrocketed since social media and smartphones became ubiquitous. (At one point, Harris notes a staggering statistic: suicides among preteen girls (ages 10–14) rose by 151% in the last decade.) When an industry insider flat-out says “These services are killing people and causing people to kill themselves,” the gravity of that statement is enough to make anyone stop and think.
To the man who said it – Tim Kendall, a former Facebook and Pinterest executive – it was plain as day that something about the social media experiment had gone horribly wrong in terms of human well-being.
On the societal level, the documentary leaves no doubt that the stakes are equally high. What do you get when billions of people live in algorithmically curated realities? You get an erosion of shared truth and a fragmented society vulnerable to manipulation. One expert in the film, data scientist Cathy O’Neil, reminds us that “algorithms are opinions embedded in code… optimized to some definition of success. If a commercial enterprise builds an algorithm, it’s optimized for their definition of success – usually profit.” This means our interests – like seeing truthful news or balanced viewpoints – take a backseat to what keeps us engaged. The result has been the viral spread of misinformation and conspiracy theories with unprecedented ease. Renée DiResta, a misinformation researcher, points out, “The platforms make it possible to spread manipulative narratives with phenomenal ease, and without very much money.” We’ve all seen how one outrageous rumor or fake story can rack up millions of shares before facts have a chance to catch up. The film directly connects this to real-world crises: political polarization, hate groups, and even mob violence can incubate in the petri dishes of our feeds.
Perhaps the most jarring revelation is how aware the platforms themselves are of this dynamic. The Social Dilemma cites an internal Facebook report from 2018 that found 64% of people who joined extremist groups on Facebook did so because the algorithm steered them there. Let that sink in – not because a friend invited them or they sought it out, but because Facebook’s recommendation engine (the “Suggested Groups” sidebar, for example) nudged them toward extreme content. This isn’t a passive mirror of society’s existing divisions; the algorithm is actively fueling radicalization. In the film, you see a dramatized portrayal of this: a teenage boy gets pulled from watching standard videos into increasingly extreme propaganda, guided at each step by an AI persona deciding what to show him next. It’s unsettling how the dramatization visualizes the algorithm as a trio of virtual avatars analyzing the boy’s every click – unsettling because it’s not far from reality. As tech insiders acknowledge, “there’s only a handful of people at these companies who understand how these [algorithm] systems work, and even they don’t fully understand what’s going to happen with a particular piece of content.” In other words, we’ve created machines that even their creators struggle to control, and those machines are shaping the information environment of society. The documentary draws a line from this to the breakdown of civic discourse and trust. When everyone is fed a different version of the truth, how do we even begin to agree on basic facts, or address collective challenges?
For me, one of the most haunting quotes came from Jaron Lanier, a pioneer of virtual reality and prominent critic of social media. In The Social Dilemma, Lanier laments: “We’ve created a world in which online connection has become primary, especially for younger generations. And yet, in that world, anytime two people connect, there’s always a sneaky third person who’s paid to manipulate those two people. So we’ve created an entire global generation of people who were raised within a context where the very meaning of communication – the very meaning of culture – is manipulation.” Hearing that sent shivers down my spine. It suggests that an entire generation might normalize the idea that whenever we talk or share something online, there’s always an unseen influence, an algorithmic puppeteer in the middle. It made me question: What does authentic connection even mean in an age when our interactions are intermediated by profit-driven algorithms? If culture is built on communication, and communication is being manipulated, where does that leave our culture in the long run?
Enter Surveillance Capitalism: We Are the Product
After laying out how social media algorithms exploit human psychology, The Social Dilemma essentially asks why these systems exist in the first place. Why would companies design technology that addicts and divides us? The answer, which the documentary drives home and which scholar Shoshana Zuboff explicitly articulates, is simple: because it’s ridiculously profitable. Zuboff – a Harvard professor and author of The Age of Surveillance Capitalism – appears in the film to explain that the dominant business model of Big Tech is built on trading human behavior as a commodity. In her words, “It’s a marketplace that trades exclusively in human futures.” Tech companies have figured out how to sell certainty to advertisers – the certainty that you will behave in a predictable way – and in turn, advertisers pay handsomely for the chance to influence your next click, view, or purchase. Zuboff observes, “This is what every business has always dreamt of – a guarantee that if it places an ad, it will be successful. They sell certainty. In order to be successful in that business, you need great predictions. Great predictions begin with one imperative: you need a lot of data.”
That line unveils the engine driving all this surveillance: the hunger for data. The more information a company can gather about us – our clicks, dwell time, location, friends, biometric data, everything – the better it can predict what we’ll do next and how we can be nudged in one direction or another. Zuboff coined the term “surveillance capitalism” to describe this system. As she writes in her book, “Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data.” Our lived experiences, in other words, are being harvested as data – what we feel, what we fear, where we go, who we talk to, how long we look at something – all of it is captured as grist for the mill of prediction. She continues: “Digital connection is now a means to others’ commercial ends. They accumulate vast domains of new knowledge from us, but not for us.” In the context of social media, this translates to a kind of quiet user surveillance happening 24/7. As The Social Dilemma chillingly recounts, “everything they’re doing online is being watched, tracked, and measured” – every single action. The film underscores that these companies know far more about us than we realize: “They know when people are lonely, depressed, looking at photos of your ex at 2AM. They know what you do in secret. They know you better than you know yourself.” It’s a disconcerting thought that somewhere in a server farm, an AI might have a model of me more accurate than my own self-image.
The goal of all this data collection isn’t just to predict our behavior, but increasingly to shape it. The Social Dilemma drives this point home by explaining that platforms don’t sell our data outright (it’s too valuable to them). What they sell is the power to influence our behavior that comes from their data-driven models. One expert in the film notes, “It’s not about our data being sold. They build models that predict our actions, and whoever has the best model wins. All the clicks we’ve ever made, every linger, every like – it all goes into making a more accurate model. With that model, they can predict which videos will keep you watching, which emotions will trigger you.” This is a critical insight: the product is behavioral prediction. Your news feed isn’t just showing you what is – it’s crafting an environment to elicit a response that aligns with someone else’s goal (whether that’s an advertiser wanting you to buy something or a political group wanting you to adopt a viewpoint). Zuboff describes this as companies “tuning” human behavior as if we were instruments, through a combination of omniscient data tracking and clever algorithmic nudges. The documentary gives the example of how simply tweaking the timing or frequency of notifications can significantly change user behavior. It’s analogous to a lab experiment on a massive scale, with us as the unwitting subjects.
What struck me is how The Social Dilemma and Zuboff’s analysis together reveal a kind of invisible hand directing human behavior – not the Adam Smith kind that leads markets to efficiency, but a digital hand that leads our attention and actions toward profit. It’s a new species of power. In the past, a company could advertise to you, but it was a one-way proposition – you still had plenty of agency to decide. Now, through constant A/B testing and AI optimization, companies can iteratively learn what makes you specifically click or hesitate or binge, and adjust in real time to exploit that. This happens beneath our awareness; as one quoted line goes, “The greatest exercise of power is when people don’t realize it’s being exercised.” That is the crux of surveillance capitalism – it works best when it doesn’t feel like coercion at all. We all like to think we’re immune to advertising or that we use social media how we want, but when I learned about these mechanisms, I had to wonder: how much of my own thinking and behavior is truly mine, and how much has been engineered by these digital systems?
Reflecting on Our Algorithmic Dependencies
By the end of The Social Dilemma, I felt two things equally strongly: enlightened and unsettled. Enlightened, because I finally had a framework to understand those nagging feelings I’d had about social media – why it’s so hard to put down, why it sometimes makes me anxious, why people I know seem to live in alternate political realities online. Unsettled, because the problem feels so vast and deeply embedded in our lives. The documentary doesn’t let us off the hook easily; it shines a light not only on Big Tech’s manipulations but also on our own complicity. We are, after all, the ones who keep scrolling, keep clicking, keep allowing our behavior to be tracked.
Writing this, I found myself checking my phone every few minutes, ironically tempted by the very distractions I’m describing. It’s almost comic, except it points to how reflexive this behavior has become. As I reflect on my algorithmic dependencies, I have to ask: At what point did I give up chunks of my free will in exchange for convenient feeds of entertainment and validation? The answer, it seems, is buried in countless tiny concessions – every time I clicked “Allow notifications,” every time I chose the recommended video, every time I glanced at my phone because it buzzed with some trivial alert. Each of those moments was a thread, and together they’ve woven a web that’s hard to untangle from.
Yet there’s a silver lining in awareness. Tristan Harris and others aren’t luddites calling for us to abandon technology altogether; they’re calling for humane technology that respects users. And Zuboff’s insights, while alarming, also empower us with knowledge. If this is the age of surveillance capitalism, understanding it is the first step to challenging it – as citizens, as consumers, and as individuals with agency. The film ends with some practical suggestions (like turning off notification alerts, fact-checking before sharing, and not blindly clicking on recommended content) to regain a measure of control. I’ve started adopting a few of these: I purged a bunch of apps from my phone, tamed the endless pings, and deliberately set aside phone-free time to retrain my brain that it’s okay not to be connected to a feed every second. These are small acts of resistance, but they feel meaningful.
More than anything, The Social Dilemma left me with a lingering question that I now pass on to you: How do we want our relationship with technology to look? If the status quo isn’t serving our mental health or our society, what part can each of us play in forging a healthier digital future? The documentary, paired with Zuboff’s eye-opening exposition of surveillance capitalism, is a wake-up call – one that urges us to step back from the algorithmic slot machine and reclaim our attention, our data, and maybe even our destiny. It’s not comfortable to confront these dilemmas, but personally, I’d rather face them head-on than continue down a path of imperceptible, algorithm-driven change in my behavior. After all, now that I know what’s happening behind the screen, the real question is: What will I do about it?
Still, I won’t pretend this has been easy. The boredom creeps in hardest during work hours, those long stretches of dead air I used to fill effortlessly with a podcast, a YouTube rabbit hole, or some half-thought meme feed. Now, in the absence of that noise, I’m forced to sit with the silence and let my mind pace the room. It’s uncomfortable. Restless. But it’s also… clarifying. I’m grateful I’m not facing it alone. My partner, brilliant, grounded, and just as committed to this detox, is walking this path beside me. Her struggle manifests differently. Where I wrestle with the lull of workdays, she finds herself most tempted during quiet evenings at home. We’ve had long conversations about it, trading strategies, calling out each other’s cheat moments with the kind of tenderness only love can manage. In a weird way, the difficulty has brought us closer. We’re not just unplugging; we’re choosing to be present. And that’s a whole different kind of connection.
Sources: The Social Dilemma (Netflix, 2020); Shoshana Zuboff, The Age of Surveillance Capitalism (2019); James Governale, What is The Social Dilemma (Medium.com, 2020).