The Texture of Digital Memory, Pt. I: Living Against Data
- yuvalkh
- 2 days ago
- 15 min read
Author: Yuval Klein

In Others in Mind: Social Origins of Self-Consciousness, developmental psychologist Philippe Rochat writes:
“I start with the premise that the tendency to approach others is what drives social animals at the core. It forms a primal attractor and from this core derives the basic fear of losing social proximity, a deep universal anxiety about getting separated from others. This fear is captured by the generalized anxiety attached to separation and even worse, at least in humans, the anxiety about being actively rejected by others… [I intend to] shed a new light on humans’ unique expression of such a tendency and its counterpart.”
For my part, I will continue with the premise that the digital ecology changes the ways in which these tendencies are expressed and, importantly, expands their sphere of influence. One often reflexively punishes or reinforces others’ choices, leveraging the social capital of collectively internalized norms. I’m reminded of a scene in the film All We Imagine as Light in which one of the protagonists, Prabha, is informed by a colleague that Anu (her younger roommate and colleague) is having an immodest affair of which no one approves. Soon after, Prabha is in a room with Anu and a friendly doctor; Anu’s demeanor towards him is casual and tender—akin to flirting. Having left the room, Prabha rebukes Anu, bringing to her attention the possibility of being slut-shamed and ostracized. When Anu returns to their apartment, Prabha immediately wavers in her role as a stringent enforcer of moral norms and apologizes to Anu. A voice of social consciousness ricocheted off of Prabha like an echo chamber, passing through her-self to Anu.
Social media expands the capacity for policing norms, texturing paranoia with an abundance of vivid details, i.e. copiously produced and widely circulating data. I will argue that this enhanced capacity to leverage the lurid suggestion of social alienation further inhibits people in how they conduct and construct themselves. There are simply more ways to approach, more ways to actively go about rejecting other people. Digital hyperconnectivity bypasses many of the constraints on approaching someone, namely the necessity of temporal and spatial alignment, the precondition of situated correspondence. And naturally, an enhanced ability to approach is followed by new heights of self-consciousness, as we wrestle with more numerous and ominously coordinated “others in mind.”
There are two books that have significantly aided me in my exploration of self-consciousness in the digital age: Rogers Brubaker’s Hyperconnectivity and Its Discontents and Kate Eichhorn’s The End of Forgetting: Growing Up with Social Media. Others in Mind provided the opportunity to sediment my ideas and theirs with a psychological framework on self-consciousness. Additionally, Neil Postman’s Amusing Ourselves to Death offers rich commentary on a historical antecedent, the television, which will serve to ground my commentary in a wider comparative context.
Two decades ago, Facebook capitalized on the human desire to objectify and be objectified by others. Self-consciousness, according to Rochat, is derived from the human capacity to view oneself from the outside as an object to be scrutinized. The desire to know more about others synergized with the social media algorithms’ incentive to compile information and sell the data to advertisers. As people gained visibility, they became more vulnerable to various forms of surveillance. For the sake of this article, I will primarily focus on its interpersonal strand. So let’s, for a moment, scale the panoptic tower of social media and direct our eyes towards these willingly surveilled prisoners. After all, do they not want us to leer at them? Do they not desire to be watched? This paradox, of the disjunction between the simultaneous vulnerability and thrill that a stimulated self-consciousness entails, will be at the center of this first part of The Texture of Digital Memory.
In social media, people represent themselves through “digital objects,” which are, as Brubaker notes, “searchable” and “can be accessed anywhere and any time in a private and unobtrusive manner.” This reality necessarily entails an asymmetrical visibility, a digital panopticism. Though “terms like ‘Facebook stalking’ indicate a certain unease about this surreptitiousness… social media surveillance is not surreptitious in another important sense: the targets know perfectly well–indeed ardently hope–that they are being watched, even if they don’t know who may be watching at any given moment.” The suggestion of surveillance can thus arouse our most gleefully voyeuristic instincts, as well as the biologically and experientially ingrained propensities for shame and self-doubt.
The threat of alienation is a primal whisper recast through the vicissitudes of human history. We are born remarkably premature (due to the constraints of the female pelvis) and thus become wholly dependent upon others. We’re not merely social animals; we’re weak and needy animals. A horse can run from predators a couple of hours after birth. Conversely, Homo sapiens requires constant attention in order to subsist with grossly insufficient bodily equipment. The maturation of the human brain, particularly its uniquely capacious cortex, is also remarkably protracted, thus perpetuating the impetus for close parental guidance. All of this is to say: others are the anchor and impetus of individual identity and self-conception. In the words of Mikhail Bakhtin, “To be means to be for another, through the other, for oneself.”
One’s behavior is modulated–and, of course, sometimes negotiated–in a way that fallibly but quite prodigiously anticipates one’s social role. But social media disrupts this function, allowing deceptive and decontextualized data to misrepresent (or as we will see, overrepresent) oneself, and induce one to miscomprehend others. After all, our capacity for affective relations developed during a prolonged period of direct, situated communication within a finite network of consociates. We, therefore, rely on a variety of paralinguistic cues that are rendered unintelligible in the parameters of text or decontextualized images/visual representations. Furthermore, as Brubaker notes, though people “... are no doubt aware on some level that the picture they form of the ‘lives of others’ is skewed by selective and carefully curated self-presentation… Social media users constantly see – rather than simply hearing about – what others have been doing, where they have been, how they look, and with whom they are hanging out. They are constantly confronted with their own and others’ metrics – numbers of friends, followers, likes, retweets, and so on – and they cannot help interpreting these metrics in comparative perspective (italics mine).” Our means of appraisal effectively collapse when faced with the algorithm’s filtration of reality.
Forgetting Data
Amusing Ourselves to Death, Neil Postman:
“The telegraph made a three-pronged attack on typography's definition of discourse, introducing on a large scale irrelevance, impotence, and incoherence. These demons of discourse were aroused by the fact that telegraphy gave a form of legitimacy to the idea of context-free information; that is, to the idea that the value of information need not be tied to any function it might serve in social and political decision-making and action, but may attach merely to its novelty, interest, and curiosity. The telegraph made information into a commodity, a "thing" that could be bought and sold irrespective of its uses or meaning… We may say then that the contribution of the telegraph to public discourse was to dignify irrelevance and amplify impotence. But this was not all: Telegraphy also made public discourse essentially incoherent. It brought into being a world of broken time and broken attention, to use Lewis Mumford's phrase. The principal strength of the telegraph was its capacity to move information, not collect it, explain it or analyze it. In this respect, telegraphy was the exact opposite of typography. Books, for example, are an excellent container for the accumulation, quiet scrutiny and organized analysis of information and ideas. It takes time to write a book, and to read one; time to discuss its contents and to make judgments about their merit, including the form of their presentation. A book is an attempt to make thought permanent and to contribute to the great conversation conducted by authors of the past. Therefore, civilized people everywhere consider the burning of a book a vile form of anti-intellectualism. But the telegraph demands that we burn its contents. The value of telegraphy is undermined by applying the tests of permanence, continuity or coherence. The telegraph is suited only to the flashing of messages, each to be quickly replaced by a more up-to-date message. 
Facts push other facts into and then out of consciousness at speeds that neither permit nor require evaluation (italics partially mine).”
It seems to me that digital circulation incorporates both manifestations of content–the telegraphic and typographic–to form an information chimera: an abundance of gratuitous information travels because it can be quickly and systematically commodified, but information that isn’t meant to be scrutinized simply won’t “burn.” Content that is meaningless, like a telegraph’s, now has a permanence that lends itself to scrutiny in the same way as peer-reviewed articles or Plato’s Dialogues. Postman writes about how the fixity of the written word allows for the edification of posterity, but when information (and politics) takes the form of ephemeral entertainment, it eludes the rigorous scrutiny that reading long-form typographic works necessarily entails. Conversely, Eichhorn is concerned with the fact that words and images which ought to be forgotten keep circulating because data hoarders suspend their natural decay. Postman writes that in the age of television–and previously, the telegraph–we cease to remember relevant information long enough to sufficiently reflect upon it, whereas Eichhorn writes that our faculty of memory is so remarkably enhanced that we cannot cease to reflect upon certain fragments of information–information that perhaps ought to be forgotten.
Should we “Never Forget”? Are we in a perpetual collective struggle against forgetting? Vanquishing forgetting serves an oft-acknowledged purpose: to arrive at truth, and therefore to allow for the sort of widespread consensus that makes coordination in times of plague or injustice possible. Does human memory herald a documentary impulse or a self-protective one? To Eichhorn, it is the latter. There is, for example, what Freud calls “screen (or concealing) memories” in his 1899 essay of the same name, which Eichhorn summarizes as such:
“Freud’s essay outlines three different types of screen memories. The first are those that conceal a different event that happened at the same time. For example, one might recall a tree branch falling. Although the incident did happen, it is taking the place of another more significant event that happened during the same timeframe (e.g., being whacked over the head with a wooden spoon). In the second type, a later recollection replaces the memory of a childhood event. Since few people recall anything at all before the age of four or five, they frequently transpose a memory from a later period onto this earlier period. The third type, which Freud mentions only in passing, is the “retrogressive screen memory.” Here, an earlier memory comes to represent a later concern. What all three categories of screen memories suggest is that memories of childhood are especially malleable.”
Indeed, childhood memories are particularly vulnerable to entropy. This allows one to disentangle, reinvent, and individuate to one’s heart’s content. In the age of social media, however, representations of past selves remain in circulation, remain present. Falsified childhood and adolescent memories can be inconveniently overwritten by an extraneous faculty of memory, i.e. the circulating documentation of moments and past selves. There is a general consensus in the cognitive sciences that humans tend to integrate information that coheres with their prevailing self-narrative while scorning that which challenges it. Basically, people don’t tend to incorporate information that isn’t helpful, that challenges their basic assumptions about themselves and others. At first glance, this may appear to be a vile, selfish instinct–and to an extent, of course, it is. In science and scholarship, there are invaluable standards in place to categorically circumvent this self-serving cognitive mechanism; in democracy and the rule of law, truth cannot be sidelined in service of one’s agenda.
If we're merely in a perennial struggle against forgetting, then perhaps these enhanced faculties of memory should be a welcome development, to which I respond: ehh, too simple. Collectively, humanity must consider which truths are to be preserved and which jettisoned. To ignore the fact that memory is finite, and hence selective, isn’t productive. In the words of historian and fellow Yuval, Yuval Noah Harari, “The most pertinent question in historical research is the question, ‘What’s important?’ Much more so than the question, ‘What’s correct?’ There are many things that are true but aren’t important.” This is not, of course, to say that a historian must distort memory; the historian stands to benefit from robust and durable documentation (digital storage fails on the latter account). Neither I nor the more eminent Yuval consider relevant but harmful historical evidence to be disposable; that said, many truths are and should be disposable. Even the most historically minded Yuvals have an incentive to forget; alas, so should the layman!
When documentation exists, when a moment is lingering in the annals of public knowledge, it can be recalled by anyone at any time. It is a substantive object imploring one’s attention, and as other people have the capacity to re-mind one of its existence, one cannot very well forget it! Regrettable moments from past selves–if documented and effectively circulating–can be hoisted up to the present moment, recalling the past with its empirical, disinterested claim to truth. People who have something to gain from putting past selves to rest are most negatively affected by this, claims Eichhorn. For example, people whose traumas–as refugees, victims of sexual violence, or even social awkwardness–are made into digital objects have often regrettable and/or painful moments on loop. I might add that self-consciousness is not merely activated retroactively; the very possibility of such a nuisance can be gripping. People already spend inordinate amounts of time in what Rochat calls a “reflective-loop,” recalling past events and recursively envisioning future ones. These repetitive patterns are precisely the destructive tendencies that cognitive behavioral therapy seeks to alleviate, and when the ability to “let go” is outsourced to a disinterested world of digital artifacts, the traumatic bases for these memories can hardly be contained. In other words, memory maintenance is intractable; the proliferation of digital objects, troves of data commodified by data “miners,” interferes with the helpful–if sometimes overly selfish and tendentious–censorship of the past.
Eichhorn considers Erik Erikson’s notion of a psychosocial “moratorium” at length. It is a period during childhood and adolescence wherein one is exempt from the consequences of one’s actions. It is an ingrained, though often unspoken, rule in various cultures. For example, many nations, excluding the United States, prohibit the publication of young offenders’ names. Minors are, of course, rarely sent to prisons in most places. Adolescents are allowed to experiment, zigzag between “phases,” and break societal rules without enduring consequences–they do not have to carry these mistakes into adulthood. Technology companies grapple for data in order to produce more data, and naturally, young users are implicated in this anti-moratorial incentive structure. As Eichhorn writes, “The real struggle [of our age] is no longer one between forgetting and memory, but rather between forgetting and the rising value of data, even data that once carried no intrinsic value whatsoever.”
Nietzsche has a metaphor which elegantly renders the virtue of forgetting: memories are like windows, and sometimes you have to pull the curtains over them in order to have necessary respite from the view outside. Poignantly modifying his metaphor, Eichhorn writes:
“In the twenty-first century, even Nietzsche’s metaphor takes on a profoundly different meaning. The word “window,” after all, is now more likely to conjure up the window on a screen than the architectural feature to which he originally alluded. But closing the windows on our devices is profoundly different from closing the windows in our home. Closing the windows and drawing the curtains prevents us from seeing out, but it also prevents the world from peering in. When we close the windows on a digital device we can no longer see out (what’s online is closed off), but this act does not prevent others from peering in. While we take a rest, others can and do continue to monitor us (even taking note of our inactivity).”
Our innate mechanisms of pulling the curtains, as it were, are uniquely challenged in the age of rampant data extraction. Faced with a dynamic, technologically mediated, extrinsic faculty of memory threatening to overwrite our in-built mechanisms for healthy amnesia, should our ability to forget be protected, or should we fight against the sickness of forgetting with science, history, and honest sobriety? Is there anything inherently valuable about the preservation of truth? To answer these questions, it is first necessary to further sift through the texture of digital memory.
The Data-ed Self
There’s a scene in Haruki Murakami’s The Wind-Up Bird Chronicle that struck me recently: The protagonist is running away from a group of people who had just seen someone resembling him beat and hospitalize his public intellectual brother-in-law. He has no recollection of such an event. An enigmatic character reveals himself and helps him escape; he informs the protagonist that, “Those people are always glued to the television set. That is why you are so greatly disliked here.” In other words, you aren’t liked in that world, so you’re being pursued in this one. In the age of what I call digital memoryprints, almost everyone lives with digital duplicates/“data doubles” that can be interacted with and scrutinized on their own terms. There’s an extraneous narrative that is partially manipulated by the individual, and partially constructed in dialogue with people and algorithms.
Postman writes of “the capacity of photographs to perform a peculiar kind of dismembering of reality, a wrenching of moments out of their contexts, and a juxtaposing of events and things that have no logical or historical connection with each other. Like telegraphy, photography recreates the world as a series of idiosyncratic events.” When external narratives of oneself are composed of idiosyncratic, decontextualized representations, the pendulum of reputation can swing either way. That is, one can curate these representations in a self-aggrandizing way, in effect becoming what Alice Marwick calls a “micro-celebrity” (this is separate from another paradigmatic case of online curation of the self, to wit the social media influencer, who engineers his own idiosyncratic product of self), or conversely, one’s public image can be eclipsed by the darkness of a single decontextualized (mis)representation of a past self. Brubaker notes that the pendulum of dismembered representations of the self skews particularly positive for those self-entrepreneurs called social media “influencers” who have “cultivated a large social media following not by virtue of preexisting celebrity status but by producing and enacting a digital self that succeeds in engaging – and being consumed by – his or (more often) her followers. They acquire their influence by producing and marketing themselves; the fortunate few can then monetize their influence by promoting others’ products.”
Another marketable product in social media is humor, of which Rochat writes:
“For some, the property of Man is its ability to laugh. Maybe. But at the origins, what causes laughter and humans’ sense of humor are the ridicule and the grotesque that derive from Man’s desperate, often clumsy attempts at asserting and promoting its own person. We do laugh with Moliere or the Marx brothers at the pompous, the snob, the inflated, and even the overly deflated, unassuming individual. Humor derives primarily from a preoccupation with the self in relation to others. It derives from what we are as members of a uniquely self-conscious species.”
It is, therefore, often clips in which people fail to normatively represent themselves that end up surfacing in the social media algorithm, circulating objects of shame which appeal to a universal sense of humor. For example, Eichhorn mentions the subject and victim of one of the first viral internet videos, in which a teenage boy passionately enacts a Star Wars lightsaber duel. The awkward moment of private experimentation became immortalized when a school bully found the clip and posted it online. A moment that would have otherwise been forgotten became a prolonged source of shame for the actor. Documentary humor often involves exploitation, selectively forwarding the most excessive and discordant moments; the moments that one would most desire to contain move quicker and farther, whereas the placid moments remain solitary.
Consider two awful shows of which I was shown clips in multiple (junior) high classrooms: the reality TV show What Would You Do? and the street interviews on Jimmy Kimmel’s show. In the former, concealed cameras film strangers in public alongside cognizant actors who manipulatively inhabit the roles of victim and perpetrator(s) of racism, homophobia, pedophilia, or other social taboos; in the latter, strangers are asked trivia questions by a smug reporter, and their most remarkably uninformed responses are compiled to assemble a montage of stupidity. It seems to me that the price of entertainment in both ostensibly innocuous shows is unreasonable; foisting millions of views upon unsuspecting strangers and casually humiliating hordes of people are, respectively, methodologies for generating cheap and viral humor, producing an asymmetrical trail of shame and laughter. It is precisely behavior deviating from the norm–the uncouth, or conversely, the unambiguously better (prettier, more talented, etc.)–that appeals to consumers of entertainment. Therefore, the further the pendulum of the subject’s representation swings, the greater its buoyancy throughout folk entertainment algorithms.
The digital selves of public figures (and most people are public figures to varying extents) are perched on a pendulum between the extremes of antagonism and heroism. Years of accumulated reputation can quickly (or latently) be overwhelmed by a digitally documented moment of excess, be it transgression or sanctification of a social norm. After all, excess is by definition more evocative, more noticeable, more culturally invigorating than the mundane. This fact necessarily entails that we’re vulnerable to being defined by our excesses, tyrannized into placating or inhibiting them, or in some cases, employing them in a self-aggrandizing way. In any case, the stakes of conformity are higher in the age of mass approaching.
People aren’t merely more vulnerable to their excesses, they’re more attentive to them. Summarizing Lewis Mumford’s Technics and Civilization, Postman notes that the invention of the clock–with its capacity to illustrate the modular moment to moment–did not merely make us into time-keepers, but also time-savers, and contemporaneously, time-servers. In this vein, Brubaker reflects upon the movement of self-quantification. Social media gamifies interaction by quantifying, and therefore, inducing comparison between objectified interactions. “Numbers are not only a means of knowing the self,” he writes, “they are a means of governing the self. Social media metrics do not simply reflect the world.” People naturally tend to reflect upon themselves and others as objects, comparatively juxtaposing them–and with metrics, there is a substantial amount of detail to texture these accounts. Narcissism and insecurity can always be substantiated by ever-deepening numerical data. “The quantified self is therefore a restless and insatiable self,” Brubaker writes.
The internet initially offered a terrain amenable to self-exploration and spontaneity, especially for youth, as users could retreat behind spectral cyber identities. In time, reality fatefully superimposed itself. The sanctuary of the domestic and intimate ceases to serve its function when the barriers between its confines and the public collapse. A private space with a curtainless window is not private; a fleeting moment that is captured loses its protective plaque of ephemerality. Social media is a place of openness, in which family albums circulate widely and biographies are exposed; in the spirit of Nietzsche and Eichhorn, I will conclude with my own relevant analogy: the doors of the house are unlocked, permeable to any and every one. Anyone can enter, interact with the objects strewn around the living room, and lurk about. These objects may also be photographed and circulated. The point being, in an age in which most people are orbited by troves of data, others have an eerie ability to encroach upon the formerly private. Privacy often requires truth to be subdued, a principle that the presently incentivized order of rogue data precludes.