The Algorithmic Eye
FOLLOW THE ALGORITHM. Imagine entering a room full of clocks where all the hands move chaotically but show the same time: EVERYONE ON PLATFORM, EVERYONE SCROLLING, EVERYONE POSTING, EVERYONE LIKING, EVERYONE CHASING… FEEDBACK FROM THE SYSTEM…
To gauge content performance, we tend to read it linearly: counting interactions within a given time window, measured against the number of users reached. That’s why we post during peak hours… yes, we all keep the same schedules and (often poorly) use our time in the same way.
We post at peak moments, creating the very peaks we seek. And no, it is not just to reach more people: we are subconsciously driven to please the very system that constantly monitors us, even when we do not want it to. At those times, content is exposed more strongly to the algorithm’s selective effect, and in a spiral we are not only pushed onto the platform but pushed to post in the form and style best adapted to the algorithm, which grows stronger as a result, reducing the non-linear virality of content.
This system not only pushes us to create peak moments on the platform; it gives an extra boost to the most “algorithm-friendly” content and lowers the viral index, favoring the authorities already present in the network, those with large followings and heavy traffic. Over time we become, often unconsciously, subservient to an algorithm that shapes our feeds, our sharing results, our tastes, and our creative activity. A cycle that chases itself, ever more closed, ever more homogeneous.
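As a toy illustration of this rich-get-richer dynamic, here is a minimal sketch in Python. It is not any platform’s actual ranking system: the accounts, the conformity bonus, and the follower-conversion rate are all assumptions, chosen only to show how attention concentrates when reach is tied to existing audience and to “algorithm-friendly” formats.

```python
# Toy model of "rich get richer" ranking, not any real platform's algorithm.
# Assumptions (hypothetical): reach is proportional to existing audience,
# multiplied by a bonus for "algorithm-friendly" (conforming) content.
import random

random.seed(42)

accounts = [
    {"name": f"account_{i}",
     "followers": random.randint(100, 10_000),
     "conforms": random.random() < 0.7}   # 70% post in the "safe" format
    for i in range(20)
]

IMPRESSIONS_PER_ROUND = 1_000_000
CONFORMITY_BOOST = 1.5   # assumed extra weight for algorithm-friendly posts

for round_ in range(10):
    # Each account's ranking score: audience size times the conformity bonus.
    scores = [a["followers"] * (CONFORMITY_BOOST if a["conforms"] else 1.0)
              for a in accounts]
    total = sum(scores)
    for a, s in zip(accounts, scores):
        reach = IMPRESSIONS_PER_ROUND * s / total
        a["followers"] += int(reach * 0.01)   # 1% of reached users follow

top = max(accounts, key=lambda a: a["followers"])
share = top["followers"] / sum(a["followers"] for a in accounts)
print(f"Largest account now holds {share:.0%} of all followers.")
```

Under these assumptions, the conforming, already-large accounts absorb a growing share of the impressions each round, while everyone else competes for what is left.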

The Hidden Rhythm of Visibility
The feed is a form of algorithmic subjectivation: it shapes us as we feed it, teaches us its language while we believe we’re expressing our own.
Posting at the “right” times seems like a rational strategy: exploiting the peak user flow for visibility. However, these peak moments are crowded traps, where visibility becomes a war of attrition and every post fights for a fraction of attention. So we replicate proven formats, shorten texts, start stories with hooks, compress narrative time.
The platform does not ask for originality. It asks for adaptation—like a conservative society. Any content that dares to deviate, any untested format, any off-script tone gets penalized. Content outside the “safe” parameters is ignored. The feed rewards the standard and cuts out what it cannot already measure.
This strategy, seemingly logical, hides a trap: homogenization. Algorithmic conformity is a form of invisible homogenization: we comply not because we want to, but because it works. The feed becomes a market of predictability, where creativity is treated as an error, and the unexpected is discarded before even being recognized.
Panopticon, Algorithm and Surveillance
Michel Foucault, in Discipline and Punish, describes the Panopticon as a device of control that induces self-surveillance: each individual behaves as if always observed. Online, this becomes a kind of self-surveilling citizenship, a society of the paranoid. On social media, the algorithm is the invisible warden: it rewards certain forms, styles, and times, forcing users into constant self-control and adaptation. This “society of control” (Deleuze) turns virality into a rule: posting at the right time is no longer a free choice but an obligation for anyone who hopes to be seen.
Sherry Turkle describes this process as an externalization of identity: we are no longer who we are, but who we can appear to be. Erik Erikson spoke of identity as a coherent narrative over time. In the digital world, that narrative is fragmented into a series of optimized “self-presentations.” The social profile becomes a broken mirror: many reflections, no depth. Subjectivity dissolves into a series of visible performances, calibrated for validation.
Algorithm, Feedback and the Death of the Unexpected
Norbert Wiener, the father of cybernetics, taught that every digital system lives on feedback: every action generates a signal, and that signal influences the next action. Social platforms work exactly this way. The like is a confirmation; silence, a warning. Content is evaluated the moment it is born. So we adapt, not to reality, but to the algorithm.
Couldry and Mejias talk about “structural dependency”: we do not understand the algorithm, but we feel its judgment. Every communicative act is conditioned by the fear of underperforming, of disappearing from the feed, of being discarded before even being seen. The result is a narrative ecosystem where the unexpected finds no space, where creativity is not encouraged, but managed as an anomaly to be normalized. The algorithm is a filter and a mold, an invisible editor and proofreader. Virality becomes a confirmation of the status quo.
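The feedback loop Wiener describes can be sketched in a few lines of Python. This is a toy model, not any real ranking function: the hypothetical feed_score rewards a single “safe” style, and the creator keeps only the variations that score better, so the voice drifts toward whatever the signal rewards.

```python
# Toy cybernetic feedback loop: each post's "style" is nudged by the signal
# the previous post received. feed_score is a hypothetical stand-in for a
# ranking function; it is not any platform's real algorithm.
import random

random.seed(0)

def feed_score(style: float) -> float:
    """Assumed ranking signal: highest for styles near the current norm (0.5)."""
    return max(0.0, 1.0 - abs(style - 0.5)) + random.uniform(-0.02, 0.02)

style = 0.95  # a creator who starts with an unusual voice

for post in range(40):
    candidate = style + random.uniform(-0.1, 0.1)  # try a slightly different voice
    # Feedback: keep the variation only if the signal rewarded it more.
    if feed_score(candidate) > feed_score(style):
        style = candidate

print(f"final style: {style:.2f}")  # drifts toward 0.5, the rewarded norm
```

Nothing in the loop asks what the creator wanted to say; the only input that shapes the next post is the score the last one received.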
Algorithm and the Feed as a Nervous Membrane
Marshall McLuhan foresaw that media would become extensions of our senses. The feed is now our digital skin. It stimulates us, notifies us, excites us, punishes us. Every scroll is a micro-shock. Every click is a neurochemical response.
But the feed is also an emotional territory. It regulates our attention, segments our time, and turns our days into windows of visibility. Publishing is no longer a creative act but a reflex gesture, almost Pavlovian: seeking a response, some feedback, a small sign of presence.

Algorithm, Visibility and Performative Identity
Roger Silverstone defined mediatization as an existential condition. We are inside the medium. We are made of it. On social networks, every gesture is a positioning. Every post is a symbolic act of existence. But visibility, as a limited resource, is subject to rules we do not control.
This performative model is insidious: identity risks depending on numbers, reach, rhetorical effectiveness. We stop saying what we want, and start saying what works. Thus, the subject is no longer an author, but content. A form to be optimized, a narrative to be packaged.
The network, born as a horizontal space, has become vertical. Access is no longer a right, but a concession. Cyberspace, once a democratic promise, has become privatized territory, subject to surveillance, extraction, and profit logics.
Algorithm, Cognitive Mutation and Sensory Extension
In the midst of this transformation, our cognitive functions are also redistributed. We increasingly entrust memory, research, and sense-making to digital tools. Knowledge is no longer the result of learning, but of access. Memory is no longer constructed, but delegated.
Even at a surface level, the decline in real-world interaction is eroding our ability to read one another’s behavior, blurring the empathic imprint of our species. Without indulging in dystopian visions of “genetic architecture obsolescence”, it is undeniable that delegating our cognitive abilities to software is changing the very role of human experience, reducing its function as a historical, collective, biological archive.
Algorithm, Digital Geographies and Infrastructural Power
Behind the illusion of an open and horizontal network lies an increasingly centralized geography of power. Manuel Castells warns us: the network is made up of nodes and connections, and those who own these nodes hold real power.
The major cloud and platform providers (Amazon Web Services, Google Cloud, Meta, Microsoft) not only manage infrastructure but decide what is accessible, how much is distributed, and what remains visible. Platforms do not just filter content: they define the very boundaries of the communicative space.
CDNs, servers, submarine cables, data centers: today culture travels on private highways, and access to visibility depends on infrastructural permissions, not merit or relevance.
Algorithm and Resistance Strategies
In a system where everything is calculated, true rebellion is to become incalculable. Resisting the algorithm does not mean disappearing. It means writing offbeat, posting the unexpected, creating content that breaks the feed’s syntax.
It means reclaiming complexity, error, doubt, slowness. It means not confusing feedback with value, nor optimization with expression.
At a time when identity is codified in formats, recognized by metrics, and rewarded by graphs, to resist is to remember that we are more than our profile. Subjectivity is not a data point to be sold, but a field to be defended.
Decoding the Algorithm
The desire for visibility has made us dependent on a system that returns our algorithmic reflection. But disconnecting is not giving up: it is regaining control.
Follow the Algorithm is not an invitation to obey. It is an invitation to follow it in order to unmask it. Decode to defuse. Analyze to transform.
Only in this way can we reclaim our voice. Only in this way can we remain human in the middle of the feed.