The Social Architecture of Digital Power
How social platforms govern us through code, and why “using them better” isn’t the answer
Have you ever thought about what happens in the second between opening the app and seeing the first post?
You haven’t. No one has. It’s invisible. It’s instant. It feels natural.
And that’s exactly where they get you.
Because in that second—in that infinitesimal fraction of time—an act of power takes place: more sophisticated than any censorship, more effective than any propaganda, more pervasive than any surveillance you can imagine.
Call it architecture. Call it code. Call it an algorithm. But call it what it is: governance.
Lawrence Lessig said it decades ago: “Code is law.” Today, code also scripts desire.
The Social Architecture That Decides For You
Social platforms aren’t tools. They aren’t spaces. They are environments.
And like any environment, they have an architecture. But it’s not one you see. It’s one that passes through you. It draws cognitive spaces, lays down paths of attention, rewards specific emotions. It offers the feeling of choice while pre-configuring what is thinkable, visible, and desirable.
Take the feed. It seems simple, right? A list of content. But it’s not a list. It’s a cascade of decisions made on your behalf, thousands of times per second, across billions of people simultaneously.
The system gathers signals. How long you paused on that video. Which posts you watched without reacting. What you searched at 2 a.m. when you couldn’t sleep. Then it computes probabilities: who will react, how they’ll react, how long they’ll stay. Then it ranks: this up, this down, this never. Only at the end—after all these decisions—does it hand you the content.
It feels like choosing. You’re only choosing among what’s already been chosen for you.
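To make the cascade concrete, here is a deliberately tiny sketch in Python. Every signal name and weight is invented for illustration; real ranking systems use learned models over thousands of signals, but the decision has the same shape: predict, score, sort, truncate.

```python
# A minimal sketch of the ranking cascade. All names and weights are
# hypothetical; real systems use learned models over thousands of signals.
from dataclasses import dataclass

@dataclass
class Candidate:
    post_id: str
    p_click: float  # predicted probability you engage
    p_dwell: float  # predicted probability you keep watching
    p_share: float  # predicted probability you amplify it

def score(c: Candidate) -> float:
    # The weights encode what "good" means (time spent, reactions),
    # chosen by the platform before you see anything.
    return 0.5 * c.p_dwell + 0.3 * c.p_click + 0.2 * c.p_share

def build_feed(candidates: list[Candidate], k: int = 20) -> list[str]:
    # "This up, this down, this never": everything below rank k
    # simply does not exist for you.
    ranked = sorted(candidates, key=score, reverse=True)
    return [c.post_id for c in ranked[:k]]
```

Notice what never appears in the code: a parameter for what you set out to find.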
Langdon Winner asked in the ’80s whether artifacts have politics. The answer is yes. Autoplay isn’t a “feature”—it’s a political choice about attention control. Algorithmic vs. chronological feeds isn’t a technical issue—it’s a redistribution of power. Every default is an implicit constitution. Every interface is a hidden manifesto.
And the twist—if we can call it that—is that it works. It doesn’t force you. It makes you more likely. It’s not a cage with bars. It’s a soft cognitive cage whose walls are built from your own habits.
Michel Foucault described this kind of power: not disciplining from above, but producing subjectivity from within. The algorithm doesn’t tell you what to do. It makes you become someone who wants to do it.
Social Roulette
But why? Why build machines this sophisticated just to “show you content”?
Because they aren’t showing you content. They’re extracting you.
Shoshana Zuboff calls it surveillance capitalism. The term is almost too gentle. It evokes cameras, spies, someone watching you. The reality is subtler—and total. They’re not watching you. They’re shaping you.
Every click, every pause, every scroll, every moment when you do nothing—it’s all signal. It’s all data. It’s all behavioral surplus: raw material to mine for patterns, predictions, certainties.
Platforms don’t sell attention to advertisers. They sell something far more valuable: behavioral certainty. The capacity to say, “if I show this to this kind of person, there’s a 73% chance they’ll respond like this.”
You’re not the customer. You’re not even the product. You’re the mine itself, stripped for raw material, free of charge, 24/7, forever.
And the loop is self-reinforcing. The more data they collect, the more accurate the model. The more accurate the model, the longer they retain you. The longer you stay, the more data you generate. Nick Srnicek calls it platform capitalism: control the platform → control the data → control the market → extract endless rents without producing real value.
But there’s a deeper layer.
Julie Cohen calls it “modulation,” not persuasion—modulation. Continuous micro-adjustments to your behavior, so small you don’t notice them, so constant they become you. Algorithmic power doesn’t argue. It optimizes. It doesn’t convince. It modulates.
Every time the system shows you something and you react, it’s testing a hypothesis. Every time you react as predicted, the hypothesis strengthens. The profile converges. After enough iterations, the system knows you better than you know yourself. Not because it’s magical, but because it’s seen millions of people like you react to millions of stimuli like that.
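How fast does the profile converge? A toy model makes it visible. Assume the simplest possible version: the system tracks a single hypothesis about you as a Beta distribution and updates it with every impression. The math below is illustrative, not any platform’s real model.

```python
# A Beta-Bernoulli update: each shown stimulus is a test, each reaction
# (or non-reaction) sharpens the hypothesis. Numbers are illustrative.
class ResponseHypothesis:
    def __init__(self) -> None:
        self.alpha = 1.0  # prior pseudo-count of reactions
        self.beta = 1.0   # prior pseudo-count of non-reactions

    def observe(self, reacted: bool) -> None:
        if reacted:
            self.alpha += 1
        else:
            self.beta += 1

    @property
    def p_react(self) -> float:
        # The "73% chance" kind of claim: the posterior mean.
        return self.alpha / (self.alpha + self.beta)

h = ResponseHypothesis()
for reacted in (True, True, False, True):  # four impressions
    h.observe(reacted)
print(f"predicted response rate: {h.p_react:.0%}")  # 67%
```

Multiply this by thousands of hypotheses and millions of look-alike users, and “knows you better than you know yourself” stops sounding like magic.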
Frank Pasquale calls it the “black box society.” The asymmetry is total: they see everything about you plus millions like you. You see only what they decide to show you. They can predict. You can only react.
It’s not surveillance. It’s predictive architecture. Surveillance looks at what you did. Predictive architecture knows what you will do.

Inside the Social Web
Attention is only the beginning.
Platform competition has moved beyond “share of time.” Now it targets share of identity. Because a user who identifies with the platform is more predictable, more loyal, more monetizable. They don’t just want to use the service. They want to be part of it.
Byung-Chul Han is clear: we’ve shifted from the disciplinary society to the performance society. The disciplinary society said “you must.” The performance society says “you can.” But “you can” is more oppressive than “you must,” because it makes constraint feel like freedom.
On social media you’re not forced to perform. You’re made to want to perform—because you’ve internalized the metric. Visibility becomes existence. Algorithmic recognition becomes social recognition.
Jodi Dean calls it communicative capitalism: value isn’t in what you say but in the fact that you keep saying. Content is irrelevant. Engagement is everything. And engagement is unpaid labor disguised as self-expression.
Franco “Bifo” Berardi describes cognitive and affective exploitation: platforms continuously extract emotional labor. Every post is labor. Every like is labor. Every comment is labor. It doesn’t feel like labor because it feels like you.
Identity becomes compulsory performance. Erving Goffman said social life is theater, but at least there were backstages to remove the mask. Today, there are no backstages. The stage is everywhere. The performance never ends.
And then there’s time.
Hartmut Rosa writes about “social acceleration.” Jonathan Crary about “24/7: Late Capitalism and the Ends of Sleep.” Platforms don’t compete for market share. They compete for life share. Time is the final extractive frontier. Unlike attention, time doesn’t regenerate. When it’s gone, it’s gone.
Infinite scroll isn’t a bug. It’s the product. It removes the natural stopping point. No conclusion. No satisfaction. Only the promise that the next swipe might be worth it.
Natasha Dow Schüll studied slot machines for years. The pleasure isn’t in the win; it’s in the “machine zone”—that hypnotic flow state where time disappears and only the mechanical gesture remains. The feed is a perfect slot machine. Pull-to-refresh is the lever. Variable reward is the content. The machine zone is that state where you scroll without even knowing what you’re looking for.
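The schedule is simple enough to write down. In this sketch the payoff probabilities are invented; what matters is the structure: a variable reward keeps the gesture alive long after a fixed one would have extinguished it.

```python
# Pull-to-refresh as a variable-ratio reward schedule. Probabilities
# are made up; the mechanism is the point.
import random

def pull_to_refresh() -> str:
    roll = random.random()
    if roll < 0.05:
        return "something you love"            # rare jackpot
    if roll < 0.35:
        return "something mildly interesting"  # small win
    return "nothing much"                      # near miss: pull again

for _ in range(5):
    print(pull_to_refresh())
```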
Why You Can’t “Use Social Better”
Someone always says: “Just use it mindfully.”
No. That’s not enough.
This is what most people miss. It’s not a problem of use. It’s a problem of architecture. And architecture always beats individual will.
Tristan Harris and the Center for Humane Technology have documented the techniques: intermittent notifications, variable rewards, infinite scroll, autoplay, counters engineered to trigger social anxiety. It’s not “accidental addiction.” It’s designed dependence. Every interface element has been A/B-tested on millions to maximize time spent.
You’re not using social wrong. Social is using you right.
The problem isn’t “willpower.” It’s that you’re up against teams of hundreds of engineers, billions in R&D, and ML models trained on billions of people to find exactly which button in your brain keeps you there five minutes longer.
You’ll win a few battles—set limits, disable notifications. The architecture adapts faster than your resistance.
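What “A/B-tested on millions” means, reduced to its skeleton. The data here is fake and the statistics deliberately crude; notice the only question the experiment is even capable of asking.

```python
# A schematic A/B test: two interface variants, one metric (minutes
# per session). Real pipelines add significance testing at scale;
# none of them add a column for your well-being.
from statistics import mean

def winning_variant(sessions_a: list[float], sessions_b: list[float]) -> str:
    return "A" if mean(sessions_a) >= mean(sessions_b) else "B"

# Fake session lengths, in minutes, under each variant.
print(winning_variant([12.1, 9.8, 14.3], [13.0, 15.2, 11.7]))  # -> B
```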
David Lyon calls it “surveillance culture”: no longer coercive, but seductive. It offers service in exchange for data. It “knows” you. It “anticipates” you. It “helps” you. Meanwhile, you’re normalized to tracking; privacy becomes a fair price for convenience.
Self-surveillance turns voluntary—desired, even—because surveillance is packaged as care. The system “cares” for you. The system “protects” you. The system “knows” what’s best.
But the system has no human interests. It has metrics. Metrics don’t measure your well-being. They measure your engagement. Maximum engagement rarely aligns with maximum happiness. Often, it’s inversely correlated.
Whoever Controls the Protocols Controls Everything
So where is power, really?
Not in content—content is interchangeable. Power lives in protocols, in the data-rails—the invisible infrastructures deciding what can be seen, who can speak, how success is measured.
Benjamin Bratton’s “The Stack”: platforms as stacked layers of governance—hardware, software, protocol, interface. Control the lower layers and you control everything above. Control the standards and you wield a power no government has ever held.
Code has no democracy. No appeal. No interpretation. Code executes.
Manuel Castells saw it early: in the network society, power is the capacity to control flows. The controllers are the “switchers”—the actors managing inter-network junctions. Platforms are the global switchers of information.
Same pattern in finance. Who controls the indices? Who sets S&P 500 inclusion? Who defines settlement protocols? Small private actors performing governance while seeming not to. Infrastructural power is invisible. Susan Leigh Star: infrastructures are invisible until they break.
And that’s exactly what’s happened with social platforms. They’ve become infrastructure—essential, taken for granted—so their power is uncontestable.
Timothy Mitchell shows how measurement systems create the realities they claim to measure. Metrics—engagement, reach, virality—don’t “measure” success; they define it, creating a world where only certain kinds of content can succeed.
Geoffrey Bowker and Susan Star call it “infrastructural ethics”: ethics embedded in code, not just use. How the algorithm classifies is a moral choice. Those outside its categories become invisible. Algorithmic invisibility is the new censorship.
That’s why “use social well” discourse fails. The unit of analysis isn’t individual use; it’s architecture. That’s where proprietary interests become systemic: metrics → ranking → exposure → consent.
Tokenization in finance looks like “efficiency”: faster settlement, fewer frictions. But it moves rules into code. Control the code—operational standards, oracles, protocols—and you control capital flows. Human intermediaries become redundant. Intermediation is in the protocol.
Same with social. Distribution is encoded in APIs and models. Power shifts from content to ranking protocols. Control the standards, metrics, training datasets, and brand-safety criteria—control the visibility market.
Vitalik Buterin and crypto-governance literature suggest alternatives: transparent governance, fork-ability, low-friction exit. The contrast is sharp. Web2: costly exit, ineffective voice, de facto monopolies. Web3: exit via fork, on-chain governance, protocol transparency.
It’s not “blockchain yes or no.” It’s about portability of rules, interoperability, and being able to migrate identity, social graph, and history without losing everything.
Primavera De Filippi calls it “lex cryptographica”: when rules live in code, those who control code make the law. Powerful—and dangerous. Code doesn’t forgive. No recourse. No interpretation.
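What “no recourse” looks like in miniature. This is an illustration, not real contract code: the point is that the rule has no exception path, no hardship clause, no judge.

```python
# Lex cryptographica in miniature: a rule that executes unconditionally.
def transfer(balances: dict[str, int], src: str, dst: str, amount: int) -> None:
    if balances.get(src, 0) < amount:
        # No appeal, no interpretation: the condition fails, the
        # transaction dies. Code executes.
        raise RuntimeError("insufficient funds")
    balances[src] -= amount
    balances[dst] = balances.get(dst, 0) + amount
```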
Hence the need for a constitutional layer: minimal, non-negotiable principles above the protocol layer—rights code must respect even when it could violate them.
The Response: Constitutionalizing Code
So what do we do?
It’s not about “punishing” innovation. It’s about making it accountable when it shapes the collective cognitive environment—just like other high-impact domains: climate, finance, pharma. When systemic externalities exist, we need standards, audits, rights.
Jürgen Habermas taught that rational communication needs structural preconditions: informational symmetry, absence of coercion, transparent rules of the game. Platforms violate all three. They pose as “town squares” yet are proprietary spaces with total informational asymmetries and pervasive mechanisms of cognitive coercion.
We don’t need “digital agoras.” We need to constitutionalize digital infrastructure. An algorithmic constitution rests on three pillars.
First: structural transparency.
Not “open all the code”—that’s useless or harmful. Instead, transparency about classes of signals that influence ranking: engagement, social proximity, recency, brand safety. We don’t need exact weights—those change constantly. We need to know what is measured and why.
Public model cards: model purpose, signals used, known limits, failure modes. Impact assessments on protected classes, misinformation risk, effects on minors. Third-party audits with privacy-preserving data sandboxes.
The right to know “why am I seeing this?” Not a dense technical dump—an honest explanation: “because people like you responded well to content like this,” or “because someone paid to show it to you,” or “because the algorithm predicts it will keep you here longer.”
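Such an explanation could travel as a plain, machine-readable record. The schema below is a proposal sketch, not any existing platform’s API; every field name is hypothetical.

```python
# A hypothetical "why am I seeing this?" payload: classes of signals,
# not exact weights; honest about paid placement and predicted effect.
from dataclasses import dataclass

@dataclass
class ExposureExplanation:
    post_id: str
    signal_classes: list[str]  # e.g. ["engagement", "social_proximity", "recency"]
    paid_placement: bool       # did someone pay for this exposure?
    predicted_effect: str      # what the ranker expects this to do to you

example = ExposureExplanation(
    post_id="abc123",
    signal_classes=["engagement", "social_proximity"],
    paid_placement=False,
    predicted_effect="likely to extend your session",
)
```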
Second: interoperability.
Portability of social graph, preferences, and history. If you switch platforms, you take your connections with you—no starting from scratch. This reduces lock-in and power concentration.
Open baseline APIs. Verifiable identity. Common protocols for social relations. Not “one platform to rule them all”—that’s worse. Open standards enabling real competition.
Andrew Russell and Laura DeNardis show how technical standards are invisible constitutions. Those who control HTML, TCP/IP, OAuth govern without appearing to. Social platform standards must become public, open, and multi-stakeholder governed.
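Concretely, portability can be as plain as an open export any competitor could import. The format below is hypothetical, not an existing standard; the `did:` identifiers gesture at W3C-style decentralized IDs without committing to them.

```python
# A hypothetical social-graph export: identity, graph, and history in
# one open, machine-readable document.
import json

def export_social_graph(user_id: str, follows: list[str],
                        history_refs: list[str]) -> str:
    return json.dumps({
        "version": "0.1",
        "subject": user_id,       # portable, verifiable identity
        "follows": follows,       # the graph you take with you
        "history": history_refs,  # pointers to exportable posts
    }, indent=2)

print(export_social_graph("did:example:alice",
                          ["did:example:bob"], ["post:001"]))
```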
Third: user rights.
The right to control pace: settable time budgets; soft-locks with conscious override; slow defaults for vulnerable groups—autoplay off, no infinite loops, queue limits.
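In code, that right is nothing exotic: user-owned settings the client is obliged to respect. Names and defaults below are illustrative only.

```python
# A sketch of pace controls as user-owned configuration.
from dataclasses import dataclass

@dataclass
class PaceSettings:
    daily_budget_minutes: int = 45  # settable time budget
    soft_lock: bool = True          # pause at budget; conscious override allowed
    autoplay: bool = False          # slow default: off
    infinite_scroll: bool = False   # feeds end
    queue_limit: int = 25           # cap on items per session

def should_pause(settings: PaceSettings, minutes_used: float) -> bool:
    # A soft lock interrupts; it does not forbid. Overriding is a
    # deliberate act, not an automatic scroll.
    return settings.soft_lock and minutes_used >= settings.daily_budget_minutes
```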
The right to explicit modes: chronological timelines, thematic timelines—persistent choices, not buried toggles. Freedom you must hunt for isn’t freedom.
The right to provenance: know whether what you see is human, synthetic, or hybrid—device-side attestations, robust watermarking, consistent cross-platform labels. Not to “ban AI,” but to know.
Julie Cohen calls it “informational due process”: procedural rights when algorithms decide for you—like in court, the right to know the evidence, challenge it, and appeal; when an algorithm decides your visibility, you deserve equivalent rights.
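Sketched as a data structure, due process means the decision arrives with its evidence and an appeal route attached. Again a proposal, not an existing API; every field is hypothetical.

```python
# A hypothetical visibility-decision record with built-in due process.
from dataclasses import dataclass

@dataclass
class VisibilityDecision:
    account: str
    action: str          # e.g. "downranked", "excluded_from_recommendations"
    evidence: list[str]  # the signals the decision rests on
    appeal_url: str      # a working route to contest it
    human_review: bool   # was a person ever involved?

decision = VisibilityDecision(
    account="@example",
    action="downranked",
    evidence=["classifier: borderline_policy, score 0.81"],
    appeal_url="https://platform.example/appeals/789",
    human_review=False,
)
```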
Operating Principles
Proportionality: heavier obligations where impact is systemic.
Output-oriented: measure observable effects.
Co-regulation: multi-stakeholder standards and independent audits.
Immediate Practices for Creators & Institutions
Two-track content, healthier metrics, and friction-positive design.
Conclusion: The Political Task of Our Time
Evgeny Morozov warned about “technological solutionism.” Platforms frame disinformation and polarization as bugs to be fixed by better algorithms. They aren’t bugs; they are features of the business model.
The solution is to rewrite the architecture: protocols, governance, interoperability, audits, provenance, and rights over time.
“Power no longer forbids you from doing something. It makes you desire exactly what serves it. And when desire is architected, freedom is an illusion.”