charlie deck

@bigblueboo • AI researcher & creative technologist

Lurking: How a Person Became a User

Author: Joanne McNeil
Tags: internet history, technology, sociology, digital culture
Publication Year: 2020

Overview

In this book, I trace the history of our lives online to understand a fundamental shift in identity: the moment a ‘person’ became a ‘user.’ This isn’t just a change in terminology; it’s a story about power. The early internet, a ‘cyberspace’ of relative anonymity and creative expression, offered a different kind of existence. Identity was something you could construct with words, a username, and your imagination. Communities were smaller, more idiosyncratic, and often ephemeral. Lurking, in this context, was not a sinister act of surveillance but a quiet, respectful form of participation—of listening and witnessing.

I wrote this book to chronicle how that world gave way to the one we inhabit now: a centralized web dominated by a few mega-platforms that demand our ‘real’ names, our faces, and our constant, quantifiable engagement. The word ‘user’ itself, coined by developers, implies a power imbalance. We are subjects in a system we did not design and cannot control, our data and relationships converted into assets.

This book is for anyone who feels the friction of this arrangement—the tech professionals building these systems, the everyday person navigating them, and anyone who remembers a different kind of internet or hopes for a better one. It’s a history of the web from the user’s perspective, examining the shift from [[anonymity]] to [[visibility]], from creation to ‘sharing,’ and from community to clash. My goal is to restore the humanity to the ‘user’ and to question the structures that seek to define, predict, and monetize our lives, urging us to imagine an internet built on principles of care, accountability, and genuine community.

Book Distillation

0. Introduction

Lurking is not a sneaky act of reconnaissance but a natural state of being on the internet—a quiet, one-sided connection that respects boundaries. The central tension of our online lives is the transition from being a person to being a ‘user,’ a term that hides the power imbalance between developers and the people on their platforms. The internet has evolved from a space of imaginative anonymity to one of obligatory, authenticated presence, where our humanity is extracted by [[algorithmic modeling]] and served back to us as content.

Key Quote/Concept:

Lurking. This is the act of reading, observing, and being present online without performing or demanding interaction. It is a state of quiet, warm, and indirect connection, a crucial but undervalued mode of participation in digital spaces.

1. Search

The act of searching online, dominated by Google, has transformed our relationship with memory and information, creating an expectation that everything is classifiable and retrievable. This process has shifted from technical queries to conversational interactions, personifying the internet. Google’s ambition to [[mirror the world]] was a phase of vast data collection, often at the expense of user privacy and context, turning human curiosity into a commodity and a training set for its [[AI systems]].

Key Quote/Concept:

Personification of the internet. The shift from using Boolean search strings to asking conversational questions (‘Dear internet…’) has changed our perception of the internet from a place to a person. This metaphor reveals our emotional investment but also reduces diverse user communities to a single, easily commodified identity.

2. Anonymity

The early internet, or [[cyberspace]], was a place where identity was constructed through words and a chosen username, offering a form of anonymity that was distinct from privacy. This ephemeral world, experienced on platforms like BBSs, Usenet, and services like Echo, allowed for deep, intimate communities to form, often shielded from the broader world. This era valued creative self-presentation and contextual privacy over the singular, verifiable identity demanded by later platforms.

Key Quote/Concept:

Cyberspace. This term differentiates the internet’s past from its present. It was an earnest concept representing a notional, abstract territory entered through a screen, where one’s identity was fluid and constructed, contrasting sharply with the later internet that became an extension of the physical world.

3. Visibility

The rise of the first social networks like Friendster and Myspace marked a fundamental shift from the anonymity of cyberspace to a new era of [[online visibility]]. Identity was no longer just a username but a curated profile with a real name and photo, turning personal relationships into a legible, navigable map. This new visibility created a performance of the self, where users edited their lives into a more appealing, two-dimensional version for public consumption.

Key Quote/Concept:

Fakesters. These were users on Friendster who created profiles for fictional characters, concepts, or cultural icons. They represented a resistance to the platform’s push for ‘authenticity’ and a holdover from the more playful, identity-fluid era of cyberspace, highlighting the tension between user creativity and platform agendas.

4. Sharing

The smartphone era transformed online participation into constant ‘sharing,’ a casual, immediate documentation of life that blurred the lines between online and offline worlds. Platforms like Twitter and Instagram became the primary venues for this activity, where the ephemeral stream of updates formed a user’s identity. This shift created new social dynamics, including [[context collapse]], the asymmetry of attention (‘followers’), and the commercialization of the self.

Key Quote/Concept:

#nofilter. This hashtag on Instagram began as a user-generated proclamation of authenticity, a pushback against the platform’s initial gimmick of vintage photo filters. It ironically became a brand identity for the platform, signifying a performance of realness that is often just as curated as any filtered image.

5. Clash

Centralized platforms like Twitter became arenas for high-stakes social and political conflict. While [[hashtag activism]] empowered marginalized communities to organize and bypass gatekeepers, the same mechanics of visibility and amplification enabled coordinated harassment campaigns like Gamergate. This clash is often monetized by platforms, which prioritize ‘engagement’ over user safety, turning interpersonal conflict into revenue.

Key Quote/Concept:

Outrage. The media often dismisses online collective grievance as ‘outrage,’ a term that frames it as an irrational mob reaction. This framing ignores the power dynamics at play and serves to delegitimize genuine expressions of harm and calls for accountability from marginalized users pushing against the status quo.

6. Community

The word ‘community’ is used by platforms like Facebook to create a flattened, universalist ideal that masks a reality of [[algorithmic sorting]] and control. True online communities are often ad-hoc, specific, and voluntary, whereas Facebook’s ‘global community’ is a coercive construct. The platform’s systems, like the early EdgeRank algorithm, don’t reflect relationships but actively shape them, turning human connection into a calculable, manageable asset.

Key Quote/Concept:

The Shiny Shiny thread. A years-long comment thread on a gadget blog where young users collaboratively reverse-engineered Facebook’s opaque algorithms for ranking friends. This represents a true, ad-hoc user community forming to understand and resist the top-down control of a major platform.

7. Accountability

A more accountable internet requires structures that prioritize something other than growth and profit. Wikipedia, for all its flaws, serves as a key example with its non-profit model, its focus on culling information rather than endlessly generating content, and its transparent, community-debated rules. True accountability comes from [[knowledge activism]] and community-led initiatives that actively work to correct bias and create more inclusive spaces, rather than relying on platforms to self-regulate.

Key Quote/Concept:

The 1–9–90 rule. An observation that in collaborative online spaces, roughly 90% of users are lurkers (readers), 9% are contributors (editors), and 1% are creators. This highlights that lurking is the default, necessary mode of participation for large-scale projects like Wikipedia to function, challenging the platform-driven imperative that all users must constantly ‘engage.’

8. Closing: End User

The final stage of becoming a ‘user’ is to have one’s lived experience, shared in good faith, treated as raw data for exploitation. Our digital lives are scraped for [[facial recognition]] training sets and other projects without our consent, splitting our content from its context and our humanity. A better internet would function like a public library—a civic institution with ‘librarians’ dedicated to care, maintenance, and serving patrons, rather than a commercial space that exploits users.

Key Quote/Concept:

Users need users, people need people. This is the core argument for a better internet. Technology and platforms are not enough. What is needed are noncommercial, localized systems of feedback, mutual aid, and accountability—digital commons shaped and maintained by the communities themselves, guided by principles of consent and privacy.


Generated using Google GenAI

Essential Questions

1. How has the concept of ‘lurking’ evolved, and what does this change reveal about the internet’s transformation?

In the early internet, the era I call [[cyberspace]], lurking was a natural and respectful mode of participation. It was the act of reading, observing, and being present without the pressure to perform or engage. It was a quiet, one-sided connection, a way to witness a community before deciding to join. This form of lurking respected boundaries and was a cornerstone of how many people experienced platforms like BBSs and Usenet. However, the modern, centralized web has reframed lurking. Platforms built on the economy of engagement and [[visibility]] see lurking as a passive, non-monetizable behavior. The imperative to ‘share,’ ‘like,’ and comment has turned lurking from a form of quiet participation into something suspicious, akin to surveillance. This evolution mirrors the internet’s broader shift from a space of creative, anonymous communities to a landscape of quantifiable, data-extractive ‘users.’ The demonization of lurking is a symptom of a system that demands constant performance and converts human connection into measurable data points for [[algorithmic modeling]], fundamentally altering the nature of being online.

2. What is the significance of the shift from a ‘person’ to a ‘user’ in our online lives?

The transition from ‘person’ to ‘user’ is the central theme of my book because it represents a fundamental shift in power and identity online. In the early web, individuals were ‘people’ who could construct fluid identities through usernames and text, participating in communities with a degree of anonymity. The term ‘user,’ however, was coined by developers and inherently codifies a power imbalance. It implies two classes of people: the creators of the system (developers) and the subjects within it (users). This terminology reduces a person’s complex humanity to a set of predictable behaviors and data points that can be managed, analyzed, and monetized. As a ‘user,’ our relationships, expressions, and data become assets for the platform. We are conscripted into performing free micro-labor, like solving reCAPTCHAs to train [[AI systems]], and our online presence becomes collateral held by mega-corporations. This shift is not merely semantic; it is the story of how our digital lives were enclosed within systems we did not design and cannot control, where our personhood is systematically extracted and served back to us as content.

3. How did the transition from the anonymity of ‘cyberspace’ to the mandatory visibility of social media fundamentally alter online identity?

The shift from the anonymity of [[cyberspace]] to the [[visibility]] demanded by early social networks like Friendster was a pivotal moment. In cyberspace, identity was constructed through words and a chosen username; it was fluid, creative, and context-dependent. This allowed for intimate communities to form, shielded from the wider world. With the advent of social networks, identity became tethered to a ‘real’ name and photo, creating a curated, two-dimensional profile for public consumption. This wasn’t just about being seen; it was about performing a version of the self. Relationships were transformed into a legible map of connections, and personal life became a continuous act of self-editing for an audience. This new era of visibility introduced new social pressures and dynamics, such as the performance of authenticity (e.g., ‘#nofilter’) and [[context collapse]], where different social circles merge into one audience. The playful, identity-fluid nature of cyberspace, exemplified by Friendster’s ‘Fakesters,’ gave way to a more rigid, authenticated, and ultimately commercialized version of the self, laying the groundwork for the data-driven platforms that dominate today.

Key Takeaways

1. The Term ‘User’ Codifies a Power Imbalance Between Developers and People.

My book argues that the word ‘user’ is not a neutral descriptor but a term that reveals a power dynamic. As artist Olia Lialina noted, calling everyone ‘people’ hides the existence of two distinct classes: developers and users. The term ‘user’ frames individuals as subjects within a system they did not create and cannot control. Their actions, data, and relationships are resources to be used by the platform for its own ends, primarily profit. This is evident in practices like reCAPTCHA, where users perform free micro-labor to train Google’s machine-learning corpora, or in how our personal data is harvested for [[algorithmic modeling]]. Recognizing this power imbalance is the first step toward questioning the structures that define, predict, and monetize our lives. It restores humanity to the equation and challenges the idea that we are merely cogs in a machine, urging us to demand systems built on principles of care and accountability rather than exploitation.

Practical Application: An AI product engineer should critically examine the language used in product design and internal communications. Instead of defaulting to ‘users,’ consider terms like ‘participants,’ ‘members,’ or ‘community.’ This linguistic shift can foster a mindset focused on building with people rather than for them. In practice, this means prioritizing features that give people more agency and control over their data and experience, such as transparent algorithmic controls or clear data-export options, rather than designing systems that primarily serve the platform’s data-gathering objectives.
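A clear data-export option, one of the agency-restoring features mentioned above, can be surprisingly small in code. The sketch below is a minimal, hypothetical illustration (the function name, field names, and data shapes are all assumptions, not part of any real platform’s API): it bundles everything a member has contributed into a portable JSON document they can download and take elsewhere.

```python
import json
from datetime import datetime, timezone

def export_member_data(profile: dict, posts: list[dict]) -> str:
    """Serialize a member's profile and contributions into a single
    portable JSON document. A hypothetical sketch of a data-export
    feature, not any platform's actual implementation."""
    bundle = {
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "profile": profile,
        "posts": posts,
    }
    return json.dumps(bundle, indent=2)
```

The design choice worth noting is that the export is self-describing and human-readable: the member, not the platform, decides what happens to it next.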

2. The Shift from Anonymity to Visibility Transformed Self-Expression into a Performance for Data Extraction.

The early internet, or [[cyberspace]], was a realm of relative anonymity where identity was a creative construction. Platforms like BBSs and Usenet allowed for expression that was detached from one’s offline, verifiable self. The rise of social networks like Friendster and later Facebook mandated [[visibility]], requiring real names and photos. This fundamentally changed online life from a mode of being to a mode of performing. The self became a curated profile, a product to be presented to an audience. This performance is not just for social capital; it is the primary mechanism for data extraction. Every ‘share,’ photo tag, and status update becomes a data point that feeds [[algorithmic modeling]], shaping our feeds and being sold to advertisers. The pressure to maintain a consistent, appealing online persona creates a constant, low-level anxiety and turns the nuances of human life into a simplified, machine-readable narrative.

Practical Application: When designing AI-driven social products, an engineer should consider the psychological impact of mandatory visibility. Instead of designing for maximum data capture through public performance, explore features that support contextual privacy and ephemeral communication. For example, a product could allow for different profiles for different social circles (like family vs. professional colleagues) or prioritize end-to-end encrypted messaging. This approach respects the user’s need for different forms of self-expression and reduces the pressure of performing a single, monolithic identity, potentially leading to more authentic and less stressful engagement.

3. An Accountable Internet Requires Non-Commercial Models That Prioritize Care and Maintenance.

The internet’s current problems—harassment, misinformation, exploitation—are not bugs but features of a system designed for profit and perpetual growth. My book posits that true accountability cannot come from platforms that are fundamentally driven by engagement metrics. I point to Wikipedia as a flawed but important alternative. As a non-profit, its goal is not to generate endless content but to cull and maintain information. It operates on community-debated rules and values something other than profit. A better internet would function more like a public library than a shopping mall. It would be a civic institution staffed by ‘librarians’—people dedicated to care, maintenance, and serving community needs. This requires building digital commons and non-commercial, localized systems of feedback and mutual aid, where principles of consent, privacy, and genuine community take precedence over [[algorithmic sorting]] and monetization.

Practical Application: An AI product engineer working on a new platform could advocate for business models beyond advertising. Consider subscription models, cooperative ownership structures, or public-benefit corporation status. In the design process, this means allocating resources not just to growth-hacking features, but to robust moderation tools, community management teams, and transparent governance mechanisms. For example, instead of optimizing an algorithm solely for engagement, it could be designed to prioritize user well-being, perhaps by detecting signs of harassment or limiting the spread of sensationalist content, even if it means lower short-term metrics.
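The idea of ranking for well-being rather than raw engagement can be sketched concretely. The following is a toy illustration under stated assumptions — the `Post` fields, the `sensationalism` signal, and the weighting are all hypothetical, standing in for whatever classifier and moderation signals a real system would provide — showing how an engagement score might be blended with a well-being penalty instead of being optimized alone.

```python
from dataclasses import dataclass

@dataclass
class Post:
    engagement_score: float   # predicted engagement, 0..1 (hypothetical model output)
    sensationalism: float     # 0..1 score from a hypothetical content classifier
    harassment_flagged: bool  # signal from moderation tooling

def rank_score(post: Post, wellbeing_weight: float = 0.5) -> float:
    """Blend predicted engagement with a well-being penalty,
    accepting lower short-term metrics for flagged or sensational content."""
    if post.harassment_flagged:
        return 0.0  # never amplify content flagged for harassment
    penalty = wellbeing_weight * post.sensationalism
    return max(0.0, post.engagement_score - penalty)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed by the blended score, highest first."""
    return sorted(posts, key=rank_score, reverse=True)
```

Under this scheme, a highly sensational post with strong predicted engagement can rank below a calmer post — exactly the trade-off described above, made explicit as a tunable weight rather than buried in an engagement objective.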

Suggested Deep Dive

Chapter: Community

Reason: This chapter is crucial for an AI product engineer as it deconstructs how platforms like Facebook co-opt the word ‘community’ to mask a reality of [[algorithmic sorting]] and control. It contrasts the platform’s coercive ‘global community’ with the ad-hoc, voluntary, and truly human communities that form organically online, like the ‘Shiny Shiny thread’ users who reverse-engineered Facebook’s algorithms. Understanding this distinction is key to building products that foster genuine connection rather than simply managing relationships as a calculable asset.

Key Vignette

The Shiny Shiny Thread

On a London-based gadget blog called Shiny Shiny, a 2010 post questioning a new Facebook feature—a sidebar of ten friends’ avatars—sparked a comment thread that lasted for years. The commenters, mostly young users, collaboratively investigated the opaque algorithm determining which friends appeared. They formed a true, ad-hoc community, sharing theories and running experiments, like creating alt-accounts, to reverse-engineer the system. This vignette perfectly illustrates the clash between top-down platform control and the bottom-up impulse of people to understand and resist the [[algorithmic sorting]] that shapes their social lives.

Memorable Quotes

Humanity is the spice, the substrate, that machines cannot replicate. At its worst and at its best, the internet extracts humanity from users and serves it back to other users.

— Page 9, Introduction

The personification of the internet was always there for the taking (‘Hello world’), but it took users more than a decade of search and social media to activate it.

— Page 15, Search

Cyberspace does not have the power to make us anything other than what we already are.

— Page 34, Anonymity

Sometimes what looks like narcissism online is more a matter of privacy-keeping.

— Page 62, Visibility

Users need users, people need people. To borrow a slogan often expressed by health and disability activists, there should be ‘nothing about us without us.’

— Page 175, Closing: End User

Comparative Analysis

My work in Lurking offers a distinct perspective when compared to other critical examinations of our digital lives. While Shoshana Zuboff’s The Age of Surveillance Capitalism provides a rigorous, systemic analysis of the economic logic driving data extraction, my focus is more on the cultural and experiential history from the ground up—the lived texture of being a person who became a ‘user.’ I trace the emotional and social shifts that accompanied technological changes, from the creative anonymity of [[cyberspace]] to the performed visibility of Instagram. Unlike Sherry Turkle’s psychological focus in Alone Together, which examines how technology reshapes intimacy and solitude, my book is a historical narrative of power. I chronicle the specific platforms, design choices, and user-led movements that created the internet we have today. Where others diagnose the present condition, I excavate its past, showing how the ideals of early online communities were systematically enclosed and commodified. My unique contribution is this user-centric history, arguing that the friction we feel today is a direct result of a decades-long process that stripped people of their agency and turned them into a manageable, predictable resource.

Reflection

In writing Lurking, I aimed to provide a history of the internet from the perspective of the person sitting at the keyboard, watching the world change through a screen. My strength, I believe, lies in chronicling the cultural shifts that are often overlooked in purely technical or economic histories—the feeling of the early web, the social dynamics of the first social networks, the way language itself changed. However, a skeptical reader might argue that I romanticize the early internet. While I acknowledge it was ‘never peaceful, never fair, never good,’ my focus on the creative potential of [[cyberspace]] could be seen as downplaying the toxicity and exclusion that existed even then. My perspective is undeniably shaped by my own journey as a participant, and my opinions diverge from a purely objective account; this is a history with a point of view. I argue that the centralized, commercialized internet was not an inevitable outcome but the result of specific choices that prioritized profit over people. For the AI product engineer, the book’s significance is not as a technical manual but as a cautionary tale. It is a call to remember the humanity of the ‘user’ and to build systems with an awareness of the power they wield, striving for an internet based on principles of care, accountability, and genuine community, not just engagement metrics.

Flashcards

Card 1

Front: What is the author’s definition of ‘lurking’ in the context of the early internet?

Back: A quiet, respectful form of participation—listening and witnessing without performing or demanding interaction. It was a state of warm, indirect connection, not a sinister act of surveillance.

Card 2

Front: What is the significance of the term ‘user’ according to the author?

Back: It signifies a power imbalance between developers and the people on their platforms. It hides the ‘existence of two classes of people—developers and users,’ reducing a person to a subject in a system they did not design or control.

Card 3

Front: What does the author mean by the ‘personification of the internet’?

Back: The shift from using technical search queries (Boolean strings) to asking conversational questions, which changed our perception of the internet from a place to a person. This metaphor reduces diverse user communities to a single, commodified identity.

Card 4

Front: What was [[cyberspace]] as distinct from the modern internet?

Back: A notional, abstract territory where identity was fluid and constructed through words and a username. It valued creative self-presentation and contextual privacy over the singular, verifiable identity demanded by later platforms.

Card 5

Front: What is [[context collapse]]?

Back: The phenomenon on large social platforms where diverse audiences (e.g., friends, family, colleagues) are flattened into a single group, making it difficult to manage self-presentation and leading to social friction.

Card 6

Front: What is the ‘Shiny Shiny thread’ an example of?

Back: A true, ad-hoc user community forming to collaboratively reverse-engineer and resist the top-down algorithmic control of a major platform (Facebook).

Card 7

Front: What alternative model for the internet does the author propose in the closing chapter?

Back: An internet that functions like a public library—a civic institution with ‘librarians’ dedicated to care, maintenance, and serving patrons, rather than a commercial space that exploits users.


I used Jekyll and Bootstrap 4 to build this.