Act I — Fragmentation

You are not one person online.
You are many approximations.

A lender does not want your humor. An ad platform does not care about your private grief. A recommender system does not need your full soul. Each system only needs enough of you to sort, price, rank, or predict. A rough month can look like risk to one system, opportunity to another, and drift to a third.

Edit the trace. Watch a life break into machine-readable fragments.
Fragment.
A machine does not meet you in full. It assembles you from what leaks.
Act II — Assembly

A machine doesn't meet you.
It assembles you.

Turn data sources on and off. Watch how a silhouette sharpens, not because it understands a person, but because each stream narrows uncertainty. This is how a data-double gets built: not with insight, but with accumulation.

Location
Purchases
Searches
Contacts
Photos
Browser

You become legible to the extent that your life is repetitive, measurable, and connected to consequences someone cares about.
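The accumulation above can be sketched in a few lines. This is a toy illustration, not any real system: the segments, streams, and likelihood numbers are all invented. Each enabled stream multiplies into a belief over candidate segments, and the entropy of that belief drops as streams turn on, which is the "silhouette sharpening" described above.

```python
import math

# Hypothetical sketch: three made-up segments and three made-up
# per-stream likelihoods P(observation | segment). None of these
# numbers come from any real product.
SEGMENTS = ["homebody", "commuter", "traveler"]
STREAMS = {
    "location":  [0.2, 0.7, 0.1],
    "purchases": [0.5, 0.4, 0.1],
    "searches":  [0.3, 0.6, 0.1],
}

def posterior(enabled):
    """Multiply the enabled streams into a normalized belief over segments."""
    belief = [1.0] * len(SEGMENTS)
    for name in enabled:
        belief = [b * l for b, l in zip(belief, STREAMS[name])]
    total = sum(belief)
    return [b / total for b in belief]

def entropy(p):
    """Shannon entropy in bits: lower means a sharper silhouette."""
    return -sum(x * math.log2(x) for x in p if x > 0)

h0 = entropy(posterior([]))                                  # no streams: uniform
h1 = entropy(posterior(["location"]))                        # one stream
h2 = entropy(posterior(["location", "purchases", "searches"]))  # all streams
assert h0 > h1 > h2  # each stream narrows uncertainty
```

No single stream "understands" anything; the sharpening comes purely from multiplication and normalization.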

Assembled Signals
Routine
Stability
Social density
Price sensitivity
Novelty appetite
A real week

To you, it was just a hard stretch.

Maybe your rent went up. Maybe someone got sick. Maybe you were quietly trying to leave one life and begin another. The systems below do not see that story. They see changed probabilities.

Tuesday · 11:48 PM
Searches for cheaper apartments near a hospital across town.
Wednesday · 7:06 PM
Grocery bill drops. Coupon use rises. Two small subscriptions get cancelled.
Thursday · 12:31 AM
Resume edits. Job applications. A long pause on one salary comparison page.
Friday · 9:14 PM
Three calls home. Longer than usual. One train ride to a neighborhood never visited before.
How systems read it

To you, this might be stress, care, transition, money, worry, love. In documented consumer systems, the same week can become ad intent, a shifting interest graph, or a new recommendation neighborhood.

Google Ads: Someone showing moving, budgeting, local-service, or job-transition intent.
Instagram Explore: Someone whose recent engagement now resembles people who like relocation, caregiving, or career-change content.
YouTube: Someone whose next satisfying session may now live near apartment tours, budgeting explainers, and job-search videos.
Translate.
The same life can be compressed into different machine selves, depending on who wants something from it.
Act III — Lenses

Same traces.
Different machine selves.

Switch products. The raw behavior stays the same. The objective function changes. That is enough to produce different machine selves: one built for ad targeting, one for ranking what you discover on Instagram, one for deciding what YouTube should place next.
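The idea that the objective function alone produces different machine selves can be made concrete. Everything here is invented for illustration: the trace fields, the weights, and the three lens functions bear no relation to any real product's scoring.

```python
# Hypothetical sketch: one fixed trace, three made-up objectives.
trace = {
    "apartment_searches": 4,
    "coupon_use": 3,
    "late_night_sessions": 2,
    "video_minutes": 55,
}

def ad_intent(t):
    """An advertising lens: weight purchase-adjacent signals."""
    return 2 * t["apartment_searches"] + 3 * t["coupon_use"]

def explore_similarity(t):
    """A discovery lens: weight engagement-pattern signals."""
    return 2 * t["late_night_sessions"] + t["video_minutes"] // 10

def next_watch(t):
    """A session lens: weight watch time above everything else."""
    return t["video_minutes"] + t["late_night_sessions"]

lenses = {"ads": ad_intent, "explore": explore_similarity, "watch": next_watch}
selves = {name: fn(trace) for name, fn in lenses.items()}
# Same trace in, three different numbers out: three machine selves.
```

The raw behavior never changes between the three calls; only the weights do, and that is enough to produce three distinct versions of the same week.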

Model walkthrough

How the current system turns traces into an output.

This is a step-by-step educational reconstruction of the published pipeline. It uses the visible trace on this page and, if enabled, your live browser context. It does not store any information.

Step 1: Signals

Experimental / educational only. No information is stored or sent anywhere by this page. The real systems are proprietary; this interaction mirrors the public stages and signal families they describe.
Google Ads / My Ad Center

Your ad-targetable self.

This version of you is built from the signals Google publicly says it may use for personalized ads and audience segments: activity, YouTube history, approximate areas used, account info, and category-level interests.

Current output: 74
High likelihood of responding to urgency and convenience framing.
A hard week can become a marketable mood.
This version of you is not false. It is partial, instrumental, and designed for a goal you did not choose.
Documentation basis: Google Ads audience segments and My Ad Center controls; Meta transparency system cards for Instagram Explore and Feed Recommendations; YouTube recommendation research and engineering explanations.
Closeness, to a system, is often just distance in a space you never see.
Act IV — Geometry

You are being placed near people like you.

Drag the point. This is not literally you. It is a simplified embedding: a way of turning behavior into coordinates. The unsettling part is not that systems do this badly. It is that they often do it well enough to act on.

Similarity map
More novelty
More routine
More frugal
More expressive
Nearest segment
Reliable commuter
To a model, you sit nearest people with stable routes, regular timing, and moderate price sensitivity. That does not define you. It does influence what systems expect from you.
Model translation
Human identity feels narrative. Machine identity often feels geometric: you are whatever cluster your behavior falls closest to.
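That geometric identity can be stated in a few lines. The axes, centroids, and labels below are invented for this sketch; real embedding spaces have hundreds of dimensions and learned, unlabeled centroids, but the assignment rule is the same: you are whichever centroid you fall closest to.

```python
import math

# Hypothetical sketch: 2-D behavior space with made-up axes
# (routine, novelty) and made-up named centroids.
CENTROIDS = {
    "reliable commuter": (0.8, 0.3),
    "deal hunter":       (0.5, 0.2),
    "night explorer":    (0.2, 0.9),
}

def nearest_segment(point):
    """Return the centroid label at minimum Euclidean distance."""
    return min(CENTROIDS, key=lambda name: math.dist(point, CENTROIDS[name]))

print(nearest_segment((0.75, 0.35)))  # -> reliable commuter
```

Nothing in this rule asks why the point sits where it does; nearness alone decides what the system expects from you.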
Incomplete.
There are rooms in a person that data can circle, but not enter.
Act V — Limits

What a system can infer.
And what it cannot keep.

Click the observable trace. Read the missing interior beside it. A machine can often infer the shape of your life. But inference is not presence. Precision is not intimacy.

Observable traces
What remains interior
It can tell where you paused.
Not what you were postponing.
Behavior leaves a shadow. Meaning does not always travel with it. Systems are powerful because the shadow is often enough to classify, rank, or intervene. But a shadow is still not a person.
Mirror

The danger is not that a machine becomes you.

It is that an approximation of you becomes actionable. Good enough to rank you. Good enough to price you. Good enough to decide what reaches you, what suspects you, what invites you, what quietly passes you by. You remain larger than those versions. But they already exist, and they already matter.