If you can’t beat them, jam them


I have been pondering data, privacy and all the scandals surrounding Facebook, Google, Amazon and the other tech giants. We all know they're collecting vast amounts of information about us, using their monopolistic powers to pervert markets and lay waste to whole industries at their whim.

However, this piece is not about them. It is about us. What’s wrong with us? We’re contained inside surveillance capitalism, yet we seem perfectly content to carry on. I read the other day that Facebook posted record profits despite ALL the stuff that’s been going on with them in 2018. Don’t we care at all? Maybe.

Adam Curtis, filmmaker

Maybe we just don’t know how to. Maybe Adam Curtis is right. Maybe the world feels too complex and chaotic. He uses the word “HyperNormalisation” - a phrase coined by a Russian historian writing about the last years of the Soviet Union. In the 80s everyone from the top to the bottom of Soviet society knew that it wasn’t working. They knew it was corrupt, knew that the bosses were looting the system, knew that the politicians had no alternative vision. Everyone knew it was fake, but because no one had any alternative vision for a different kind of society, they just accepted this sense of total fakery as normal.

Sometimes this is exactly what the modern tech and media landscape feels like to me. We know it's not working for us, we know we're being treated as commodities, and yet, in a state of perfect paralysis, we keep on checking Instagram and bingeing Netflix or YouTube.

Escaping the surveillance economy doesn't feel like an actual option to most people. So are we simply lost in a giant digital coliseum, housing billions of people who can get any kind of entertainment they like, 24/7? What can we do to break out?

From actual privacy to practical privacy

Maybe to start getting somewhere we have to embrace the idea that actual privacy is dead, a utopian dream in our current technology and media landscape. But what about practical privacy? What if you were still being tracked, but nobody could tell whether the data was real or synthetically generated on your behalf to obscure it? You would in effect reclaim privacy as a practical concept.

A friend once mentioned that every superhero has a kryptonite - their super power always has a fatal flaw. So what is the flaw in Google and Facebook’s near-perfect algorithms and data collection capabilities?

Data itself, of course.

Their super power is also their greatest weakness. It relies on automated data gathering and processing at an almost incomprehensible scale. To quote Adam Curtis himself:

Using feedback loops, pattern matching and pattern recognition, those systems can understand us quite simply. That we are far more similar to each other than we might think, that my desire for an iPhone as a way of expressing my identity is mirrored by millions of other people who feel exactly the same. We’re not actually that individualistic. We’re very similar to each other and computers know that dirty secret. But because we feel like we’re in control when we hold the magic screen, it allows us to feel like we’re still individuals. And that’s a wonderful way of managing the world.

We're jammin'

However, all of this power relies on the idea that the data can be trusted. What if we all passively, at a large scale, began to generate data that looked and felt real? Data that was generated by us - but created sarcastically. Fake, but real. What better way to show your human objection to constant data gathering than by throwing a spanner in the works at the source? Let's call it digital activism.

Parasite, by Project Alias

I think we are seeing signs of it already. For example, while gathering thoughts for this piece I came across the Parasite.

Bjørn Karmann and Tore Knudsen recently created Project Alias, a teachable “parasite” that is designed to give users more control over their smart assistants, both when it comes to customisation and privacy.

It sits on your Amazon Echo or Google Home and lets you block them and control them in various ways.

I then found Noiszy. Their tagline is brilliant.

Noiszy's homepage

Noiszy is a browser plugin that creates meaningless web data - digital noise. It visits and navigates around websites, from within your browser, leaving misleading digital footprints around the internet. It obscures you, but if enough people did it, could it actually turn Google or Facebook's greatest asset into a giant liability?

What if more than 51% of the data they received was nonsense generated by humans in their own name? Would it break the matrix and implode the universe?

Maybe, if enough of us did it, it could actually inspire change. Facebook, Google and Amazon all rely on this data. They package and sell the very same data that we're talking about obscuring. Maybe we could expose the system with the very same clicks, likes and shares that they crave.

The positive version

When explaining this idea to people I often get the feedback that "I wouldn't want that because it would make the suggestions I get worse and ruin my feeds". That's an interesting observation, given that the tech giants programmed their recommendation systems the way they did not for your benefit, but because they view your attention as a resource to extract and sell. But despite this initial objection, I felt compelled to also include the positive version of my proposed digital activism.

We could also use these data generation techniques for our own personal good. A friend of mine keeps two YouTube accounts. One of them is only used for TED talks, stuff for work and other things that represent his best self. On the other one he watches football highlights, music, gaming and other entertainment. But he won't let that "pollute" his other account, because then it instantly takes over and dominates. Recommender systems often crave the easiest fix; that's how their masters trained them. Working in the world of recommender systems myself, that made me think.

If we can build systems to obscure data, we could also build bots that can train these algorithms to show us more of what we’d like, not more of who we are. Until my YouTube account has an “exploration” setting I can switch on, I might employ a bot to be more aspirational on my behalf - so to speak. If left to themselves, these algorithms maximize engagement and time spent, but is that really what we want?
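What might such an "aspirational bot" look like? A toy sketch, under heavy assumptions: the topic list is made up, and the `visit` helper is a placeholder - a real version would have to drive a logged-in browser session (for instance via Selenium) so the platform attributes the activity to your account.

```python
import random
import time

# Illustrative topics representing my "best self" viewing habits.
ASPIRATIONAL_TOPICS = [
    "TED talk machine learning",
    "documentary history of science",
    "lecture behavioural economics",
]

def search_url(topic):
    """Build a YouTube search-results URL for a topic."""
    return "https://www.youtube.com/results?search_query=" + "+".join(topic.split())

def visit(url):
    # Placeholder: a real bot would load this URL in a logged-in
    # browser session so the visit counts toward your profile.
    print("visiting", url)

def nudge_recommendations(topics, rounds=5, pause=(30.0, 120.0), go=visit):
    """Periodically search for aspirational topics so the recommender
    sees deliberate interest, not just impulse."""
    for _ in range(rounds):
        go(search_url(random.choice(topics)))
        time.sleep(random.uniform(*pause))
```

Run in the background, something like this would quietly out-vote your impulsive clicks with deliberate ones, the mirror image of jamming: feeding the algorithm signal you chose instead of noise.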

Maybe digital activism is just a first step. We could employ bots to make sure our YouTube and Instagram feed stays balanced, even if that content is not always our first impulsive choice. Maybe if we hid our worst tendencies from ourselves, then actually getting value from all the amazing content on these platforms would be easier too.

I’d pay 3.99 a month for that.
