Trying to leave Facebook: why your data still isn’t really yours

By Chris Justice

The idea that “your data is yours” has become part of the digital common sense.

Platforms emphasise transparency. They offer download tools. Privacy dashboards are more visible than ever. The message is reassuring: you are in control.

I wanted to test that assumption in a simple way.

So I tried to leave Facebook properly. Not by simply deleting my account, but by downloading my data to see whether it was truly portable. Could I reuse it? Could I rebuild elsewhere? Could I meaningfully take it with me?

The experience was revealing.

The promise of ownership

When people hear that they “own” their data, they usually picture something practical: a set of information that is structured, transferable, and usable outside the service where it was created. In other words, something that can move with them.

In Europe, the GDPR reinforces that expectation: Article 15 gives individuals the right to access their personal data, and Article 20 gives them the right to receive it in a structured, commonly used and machine-readable format, so that it can be transferred to another provider. The intent is not merely to allow people to view their data, but to reduce lock-in and increase genuine user control.

That is why the language around data access often ends up sounding like ownership. Not ownership in the legal sense of property rights, but ownership in the everyday sense: if this information is about me, and created through my activity, I should be able to take it with me and use it elsewhere.

Facebook does offer a mechanism that appears to support this. You can request an archive of your account, and after a short wait you receive a substantial download containing years of posts, photos, comments, and activity history.

From a compliance perspective, that might look like the end of the story.

From a user perspective, it is only the beginning.

What you actually receive

The archive is comprehensive in volume. It is also, in a very important way, inert.

You can browse it in your web browser. You can click through posts, look at images, and search within the exported files. You can confirm that the information is there and that the platform is, technically, giving you access.

What you cannot do is move the thing that gives the platform its value: your network of connections and interactions. Your friends, groups, shared history, and accumulated context.

And that is not a small omission. For most users, Facebook is not valuable because of any single post or photograph. It is valuable because of what those posts and photographs connect you to: other people.

The archive does not give you a way to take that living network somewhere else. You cannot seamlessly import it into another service. You cannot recreate it without enormous effort. You cannot re-establish the dynamics that made the data meaningful in the first place.

So while the data is accessible, it is not operational.
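You can see that gap in a few lines of code. As a rough sketch, here is what "accessible but not operational" looks like in practice: the export's JSON files can be parsed easily enough, but what comes out is content, not connections. (The field names and layout below are assumptions modelled on a typical Facebook JSON export, not a documented schema; the sample entry is invented for illustration.)

```python
import json
from datetime import datetime, timezone

# An illustrative entry modelled on the shape of a Facebook JSON export.
# Field names ("timestamp", "data", "post") are assumptions, not a
# published schema; real exports vary by account and export version.
sample = json.loads("""
[
  {"timestamp": 1577836800,
   "data": [{"post": "Happy new year, everyone!"}],
   "title": "Chris shared a post."}
]
""")

def extract_posts(entries):
    """Pull (ISO date, text) pairs out of export-style entries."""
    posts = []
    for entry in entries:
        when = datetime.fromtimestamp(entry["timestamp"], tz=timezone.utc)
        for item in entry.get("data", []):
            if "post" in item:
                posts.append((when.date().isoformat(), item["post"]))
    return posts

print(extract_posts(sample))  # the words survive the export...
```

The text of each post is trivially recoverable. What the files do not contain is any portable representation of the social graph: friends and interactions appear only as display names inside free text, with no identifiers another service could use to re-establish the connections.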

That is the gap between access and ownership.

Friction as structure, not accident

None of this necessarily breaks any rules. The download exists. The information is there. The obligation is met.

Yet the barriers are substantial, and they are not random. The archive is structured in a way that makes it readable, but not reusable. It is easy enough to browse, but difficult to repurpose. It is technically complete, but practically resistant to migration.

In other words, it does not need to block exit explicitly. It only needs to make exit expensive in time, effort, and uncertainty.

At some point, friction stops looking incidental and starts looking structural.

In response to my experiment, Manuel Singeot captured the issue succinctly:

“Surface compliance is not the same as meaningful portability. If the experience makes it practically impossible for users to exercise their rights, regulators will eventually look beyond the checkbox.”

His observation goes to the heart of it. A system does not need to forbid exit. It only needs to make exit impractical.

That is a very different form of control.

Why this matters to regulators

In France, the CNIL (Commission nationale de l'informatique et des libertés) is the national data protection authority: the public body responsible for enforcing data protection rules. It can investigate complaints, audit companies, impose fines, and require changes.

Similar authorities exist across Europe, and together they are increasingly attentive to the difference between formal compliance and practical outcomes.

More recently, European legislation has gone further. The Digital Services Act focuses on platform responsibility and systemic risks. The Digital Markets Act targets the largest “gatekeeper” platforms and aims to reduce unfair lock-in and barriers to switching.

Both reflect a growing recognition: formal compliance is not the same as practical freedom.

It is no longer enough to provide a download button if users remain effectively locked in.

What this means beyond Facebook

This is not simply a Facebook story.

It is a broader question about digital design and digital power.

Are we building systems that technically satisfy requirements, or systems that genuinely empower users? Are we comfortable relying on friction as a retention strategy? How sustainable is that in an environment where trust, competition and regulatory expectations continue to rise?

Users may not frame it in legal language, but they recognise when control feels real and when it feels symbolic.

Over time, that perception shapes both brand trust and policy response.

The conversation continues

The question of portability is ultimately a question of power: who gets to decide where your digital life can exist, and under what conditions you are allowed to move.

For policymakers, this increasingly connects to a wider debate about digital sovereignty. Not only in the national or geopolitical sense, but in the everyday sense of whether individuals and organisations can operate independently of a small number of dominant platforms.

In our peer groups and conferences, this is exactly the kind of issue we explore together:

  • What does meaningful portability look like in practice, beyond “download your archive”?

  • Where is the line between legitimate product design and artificial friction?

  • How do emerging rules around platform responsibility and competition change what “good” looks like?

  • What does digital sovereignty mean for digital leaders making real decisions today?

If you’re wrestling with these questions, you’re not alone. The conversation continues, and it is one worth having in the open.