You are not one person online.
You are many approximations.
A lender does not want your humor. An ad platform does not care about your private grief. A recommender system does not need your full soul. Each system only needs enough of you to sort, price, rank, or predict. A rough month can look like risk to one system, opportunity to another, and drift to a third.
A machine doesn't meet you.
It assembles you.
Turn data sources on and off. Watch how a silhouette sharpens, not because it understands a person, but because each stream narrows uncertainty. This is how a data-double gets built: not with insight, but with accumulation.
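The "sharpening silhouette" can be sketched as a toy fusion model. This is my illustration, not any platform's actual system: treat each data stream as a noisy observation of one latent trait, fuse them Gaussian-style, and watch the uncertainty shrink as streams switch on. All numbers are invented.

```python
def fuse(streams, prior_mean=0.0, prior_var=100.0):
    """Gaussian fusion: combine noisy observations of one latent value.

    Each stream is (observation, noise_variance). Every stream adds
    precision, so the posterior standard deviation can only shrink --
    accumulation, not insight.
    """
    precision = 1.0 / prior_var
    weighted = prior_mean * precision
    for observation, noise_var in streams:
        precision += 1.0 / noise_var      # each stream narrows uncertainty
        weighted += observation / noise_var
    mean = weighted / precision
    return mean, precision ** -0.5        # estimate, remaining uncertainty

# Turning hypothetical streams "on" one at a time:
streams = [(0.8, 4.0), (1.1, 2.0), (0.9, 1.0)]  # (signal, noise variance)
for k in range(len(streams) + 1):
    est, std = fuse(streams[:k])
    print(f"{k} streams: estimate={est:.2f}, uncertainty={std:.2f}")
```

With zero streams the uncertainty is the prior's; each added stream tightens it, regardless of whether the estimate is "right" about the person.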
You become legible to the extent that your life is repetitive, measurable, and connected to consequences someone cares about.
To you, it was just a hard stretch.
Maybe your rent went up. Maybe someone got sick. Maybe you were quietly trying to leave one life and begin another. The systems below do not see that story. They see changed probabilities.
To you, this might be stress, care, transition, money, worry, love. In documented consumer systems, the same week can become ad intent, a shifting interest graph, or a new recommendation neighborhood.
Same traces.
Different machine selves.
Switch products. The raw behavior stays the same. The objective function changes. That is enough to produce different machine selves: one built for ad targeting, one for ranking what you discover on Instagram, one for deciding what YouTube should place next.
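Structurally, "same traces, different objective function" is just a re-weighting. A minimal sketch, with invented feature names and weights (real ad, feed, and recommendation models are vastly larger, but the point survives the simplification):

```python
# One behavioral trace, three hypothetical objectives. The weights are
# illustrative inventions -- the structural claim is only that a scoring
# function, not the person, defines each machine self.

trace = {"late_night_searches": 9, "new_city_checkins": 3,
         "gift_purchases": 1, "playlist_repeats": 14}

objectives = {
    "ad_targeting": {"late_night_searches": 0.1, "new_city_checkins": 0.4,
                     "gift_purchases": 0.9, "playlist_repeats": 0.0},
    "feed_ranking": {"late_night_searches": 0.5, "new_city_checkins": 0.2,
                     "gift_purchases": 0.0, "playlist_repeats": 0.6},
    "next_video":   {"late_night_searches": 0.2, "new_city_checkins": 0.0,
                     "gift_purchases": 0.0, "playlist_repeats": 0.9},
}

def machine_self(trace, weights):
    """The score is the only 'self' this objective ever sees."""
    return sum(weights[k] * v for k, v in trace.items())

for name, weights in objectives.items():
    print(name, round(machine_self(trace, weights), 1))
```

The input never changes; only the weights do. Each product reads the same week of your life and arrives at a different number to act on.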
How the current system turns traces into an output.
This is a step-by-step educational reconstruction of the published pipeline. It uses the visible traces on this page and, if enabled, your live browser context. None of it is stored.

Signals
Your ad-targetable self.
This version of you is built from the signals Google publicly says it may use for personalized ads and audience segments: activity, YouTube history, the approximate areas where you use Google services, account info, and category-level interests.
You are being placed near people like you.
Drag the point. This is not literally you. It is a simplified embedding: a way of turning behavior into coordinates. The unsettling part is not that systems do this badly. It is that they often do it well enough to act on.
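"Near people like you" has a literal meaning here: once behavior becomes coordinates, neighborhood is just distance. A minimal embedding sketch, with invented users and counts (no real model reduces you to four numbers, but the mechanics are the same):

```python
import math

def cosine(a, b):
    """Cosine similarity: how closely two behavior vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# The same trace, now just coordinates (hypothetical feature counts).
you = [9, 3, 1, 14]
others = {
    "user_a": [8, 2, 0, 15],
    "user_b": [1, 9, 7, 0],
    "user_c": [10, 4, 2, 12],
}

# "People like you" = whoever lands nearby in this space.
neighbors = sorted(others, key=lambda u: cosine(you, others[u]), reverse=True)
print(neighbors)
```

The system never needs to understand why your vector points where it does. Proximity alone is enough to recommend, target, and rank.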
What a system can infer.
And what it cannot keep.
Click the observable trace. Read the missing interior beside it. A machine can often infer the shape of your life. But inference is not presence. Precision is not intimacy.
It can see that something kept being put off. Not what you were postponing.
The danger is not that a machine becomes you.
It is that an approximation of you becomes actionable. Good enough to rank you. Good enough to price you. Good enough to decide what reaches you, what suspects you, what invites you, what quietly passes you by. You remain larger than those versions. But they already exist, and they already matter.