No Such Thing as Anonymised Data

By Claire Snook | March 23rd, 2020 | Uncategorized

CEO and Co-Founder, Paul Hague, shares his thoughts on why the term ‘anonymised data’ is a misnomer, and what the industry doesn’t want you to know about your personal data. 

We hear about anonymised data all the time. We’re promised by companies that harvest our information, even those that we willingly share it with, that they will anonymise those data points to protect us. 

Even those of us who don’t work with data day in and day out know about laws such as GDPR which are specifically designed to restrict how data is used. 

But the trouble is that anonymised data can still be used to identify someone, and many of the devices becoming everyday parts of our lives are intrinsically linked to our identities. How can a smart doorbell such as Ring be anonymous? It’s literally stuck on your front door!
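That re-identification risk is not hypothetical: even with names stripped out, records can often be matched back to individuals by joining them with a public dataset on shared quasi-identifiers such as postcode, birth year and gender, a technique known as a linkage attack. A minimal sketch, using entirely made-up data and field names, looks like this:

```python
# Illustrative sketch with hypothetical data: "anonymised" device records
# have names removed but keep quasi-identifiers, which can be joined
# against a public register (e.g. an electoral roll) to recover names.

anonymised_records = [
    {"postcode": "LS1 4AP", "birth_year": 1982, "gender": "F",
     "doorbell_events_per_day": 14},
    {"postcode": "LS6 2QH", "birth_year": 1975, "gender": "M",
     "doorbell_events_per_day": 3},
]

public_register = [
    {"name": "Jane Doe", "postcode": "LS1 4AP", "birth_year": 1982,
     "gender": "F"},
    {"name": "John Smith", "postcode": "LS6 2QH", "birth_year": 1975,
     "gender": "M"},
]

def reidentify(anon_rows, public_rows):
    """Join the two datasets on their shared quasi-identifiers."""
    keys = ("postcode", "birth_year", "gender")
    # Build a lookup from quasi-identifier tuple to name.
    index = {tuple(p[k] for k in keys): p["name"] for p in public_rows}
    matches = []
    for row in anon_rows:
        name = index.get(tuple(row[k] for k in keys))
        if name is not None:
            matches.append({**row, "name": name})
    return matches

for match in reidentify(anonymised_records, public_register):
    print(match["name"], "->", match["doorbell_events_per_day"], "events/day")
```

With only a handful of quasi-identifiers, most people in a real dataset are uniquely identifiable, which is why "we removed the names" falls far short of genuine anonymisation.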

Consumer tech is a Trojan horse

Our modern lives have normalised having consumer technology around us all the time. Asking Siri for directions, your smart speaker to play your favourite tune, your home hub what the weather is going to be like…we’re living in connected worlds with devices purposefully designed to collect data to understand us better. 

But those very same devices are Trojan horses, hoovering up more and more data. The manufacturers can say it’s about product improvement all they like, but the truth is that it’s about monitoring and behavioural analytics: creating problems you don’t have and selling solutions you don’t need.

Why use smart locks or smart doorbells anyway?

The Ring smart doorbell is a great example of this. Has anyone taken the time to think whether we actually need this technology? Do we honestly believe we have burglars patrolling our house when we are out? 

Or is this instead about trying to shift a business model and reduce cost by being able to deliver when you are out?

Smart locks can create more problems than they solve: is a key really so bad? When I recently went on holiday, the villa we arrived at had a smart lock…that did not work. We had to wait an hour for someone to come along and let us in, with a key!

Baby monitors are continually vulnerable to hacking. When did we start needing internet-connected baby monitors? What problem do they actually solve? As a parent, I would rarely be out on the town leaving my toddler at home alone!

Creating a surveillance world


All of these devices are collecting data every single moment of the day, and although that data may be nominally ‘anonymised’ by the companies collating it, it’s simply not possible to entirely anonymise it. That leaves us open to surveillance and monitoring, driven by believing the hype that we ‘need’ these products. 

It’s time for us to start questioning this assumption that a smart house is the only house worth living in. As individuals and as a society, we need to look at the evidence, and what do we see?

No data is safe for long. 

Smart watches designed for children have serious security flaws and very dubious processes when it comes to storing and sharing their data. Do we want complete strangers knowing where our children are at every minute of the day? What risk does that pose to our children, especially those who are vulnerable? 

The Ring doorbell is a great example of how not to do smart home tech. Its databases have more in common with sieves than a locked filing cabinet, with news emerging every month about fresh vulnerabilities in how its data is handled. From handing your data straight over to Google and Facebook to the sheer volume of data it logs, it’s a frightening product that many are now starting to avoid. 

Even technology that is designed to keep your loved ones safe is being exploited by those who may wish them harm. When I saw the news about how baby monitors are being hacked, in some cases by criminals with the absolute worst motives, it made me realise just how dangerous a position we are putting ourselves in. 

The public doesn’t know how to protect itself from data crime

These are just a few examples of how smart technology is collecting, and in some cases releasing, data about you that won’t stay anonymised for very long. Most people outside the cybersecurity world have little idea how technology hides in plain sight, watching your every move and taking notes. 

While buying the latest gadget might sate our appetite for all things consumer, we must begin to consider the wider implications. We need to start asking ourselves hard questions that we may not like the answers to. 

  • Why is this being sold, sometimes as a clear loss leader?
  • What data could they get from me?
  • What are they going to do with that data?
  • Do I trust this company to look after my data?
  • Do I know which other companies this one has data sharing agreements with?
  • Would I be upset if a criminal gained access to this data?
  • Would I want my family or business to grow more vulnerable because I have this technology?
  • Can I live without it?

If our data is important to us, we need to treat it carefully. Data is quickly becoming a currency and like any currency, if you do not guard it, then you are liable to lose it. 

If smart technologies cannot be trusted, then maybe it is time for the other parts of the industry to step forward. Operators should be offering services that can be trusted, creating a secure environment to stop this constant surveillance and cyber-attacks.

Disagree with Paul? Let us know in the comments below, or email him directly at Paul.Hague@blackdice.io 
