Fitness trackers and even Samsung televisions are becoming more advanced, and the data they collect can inadvertently reveal sensitive things we never meant to make public
Would you take to Twitter and announce that you’d just had sex? Including details of “intensity” and duration?
No, same here.
But there was a minor scandal in 2011 when it emerged that thousands of people were doing just that without even realising it.
Back then fitness trackers were still novel and one of the leading manufacturers, Fitbit, made all data public by default.
It also provided dozens of different categories for activity – the simple motion sensor could tell that you were moving, but not what you were doing. So you simply told it whether you were skiing, running or having sex, and it worked out how many calories you’d burned.
Unfortunately, because this data was made public, it could be Googled. Use the right search term and you could view long lists of amorous encounters and pore over stats such as intensity and duration.
It was voyeurism-cum-data analysis, and all very embarrassing for Fitbit.
Similar messes will occur time and time again as we learn to harness data from new and increasingly sophisticated trackers which interact with each other in complex ways.
Currently I’m wearing one which tracks my heart rate and logs the data on a graph. If I share that information publicly, people can tell exactly when and for how long my heart rate rises and falls.
On the surface, that seems harmless. But what if a client sees my nerves before a big meeting and spots an opportunity to renegotiate a contract in her favour? What if my life insurance provider spots signs of diabetes and decides to up my premiums?
Add GPS tracking into the mix and it gets even more dangerous. What if my boss spies me visiting the offices of a rival company on a day off and discovers that I’m job-hunting? Or burglars wait until I’m out before targeting my home (similar crimes have already been reported).
Clearly there are lessons to be learned, for us and for the companies which collate this data.
Often they are small start-ups with limited resources, yet we still trust them with precious and sensitive data. Security experts have warned that these companies may focus on developing new features to keep up with a rapidly changing market, rather than ironing out security issues, yet we barely think twice before signing up.
A recent FTC study looked at 43 apps which logged health data. It found that 20 sent that data on to third parties (to 70 companies, in total) and only around one in ten bothered to encrypt data.
Clearly it becomes hard to ensure the safety of our data as soon as it leaves our hands.
This sharing can be hugely beneficial, if done responsibly. Strava sells anonymised data on cycling patterns to local councils so that they can better plan new bike lanes, for example. A clever revenue stream which also promises to make life better for commuters.
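Strava has not published the details of its pipeline, but the general idea of selling usage patterns without selling identities can be sketched. In this hypothetical illustration (the salt, field names and data shapes are all invented), rider identities are replaced with salted one-way hashes before routes are aggregated into the per-route counts a council would actually buy:

```python
import hashlib

# Hypothetical sketch: pseudonymise rider IDs before aggregating route data.
# The salt, field names and record shapes are invented for illustration --
# this is not Strava's actual pipeline.
SALT = b"rotate-me-periodically"

def pseudonymise(rider_id: str) -> str:
    """Replace a real identifier with a salted one-way hash."""
    return hashlib.sha256(SALT + rider_id.encode()).hexdigest()[:16]

rides = [
    {"rider": "alice@example.com", "route": "A-road bridge"},
    {"rider": "bob@example.com", "route": "A-road bridge"},
]

# Aggregate by route, keeping only pseudonyms and trip totals.
aggregated = {}
for ride in rides:
    entry = aggregated.setdefault(ride["route"], {"riders": set(), "trips": 0})
    entry["riders"].add(pseudonymise(ride["rider"]))
    entry["trips"] += 1

print(aggregated["A-road bridge"]["trips"])  # 2
```

Worth noting: simple hashing is pseudonymisation, not true anonymisation — with a small ID space or a leaked salt, hashes can sometimes be reversed, which is exactly the kind of subtlety a small start-up might miss.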
But other companies may simply sell that data on to advertisers for a quick buck.
A recent survey by PwC found that 82 per cent of Americans were worried about the privacy implications of wearable devices, while 86 per cent said that wearables would make them more vulnerable to security breaches. It’s not just who we grant access to our data that we should be worried about, but the people who’ll take it regardless: the hackers.
“Perhaps it is time to just give up thinking that your life is private,” says a blog post on the Kaspersky website.
“It is well known that your health is someone’s business. Nothing personal. The good news is that at the moment, law-abiding citizens are not forced to use trackers. So all you have to do to keep this particular bit of your life private is to avoid this kind of stuff.”
One way to boost security would be to create a centralised hub to which all wearable devices could send data, as Apple is currently trying to do with HealthKit. That way, devices from any manufacturer could send stats on steps walked, heart rate or blood pressure to a central app with enough resources behind it to be made genuinely robust.
The benefit is that Apple has the scale and the experience to build a more secure tool than small device manufacturers can.
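HealthKit itself is an Apple-specific framework, so as a language-neutral sketch of the hub idea, here is a hypothetical central store that normalises readings from different manufacturers into one schema (the device names, metrics and `Reading` fields are all invented for illustration):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical sketch of a centralised health-data hub in the HealthKit
# mould. Device names, metric names and the schema are invented.
@dataclass
class Reading:
    source: str          # which device reported it
    metric: str          # e.g. "heart_rate", "steps", "weight"
    value: float
    unit: str
    timestamp: datetime

class HealthHub:
    """One hardened store that any manufacturer's device can write to."""

    def __init__(self):
        self._readings = []

    def ingest(self, reading: Reading):
        self._readings.append(reading)

    def latest(self, metric: str) -> Reading:
        matches = [r for r in self._readings if r.metric == metric]
        return max(matches, key=lambda r: r.timestamp)

hub = HealthHub()
now = datetime.now(timezone.utc)
hub.ingest(Reading("AcmeWatch", "heart_rate", 72, "bpm", now))
hub.ingest(Reading("BetaScale", "weight", 80.4, "kg", now))

print(hub.latest("heart_rate").value)  # 72
```

The design point is that security effort concentrates in one place: harden `HealthHub` once, and every device that writes to it benefits, rather than each manufacturer rolling its own storage.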
Unfortunately, though, the hardware itself would likely remain vulnerable, as would the connection between it and the smartphone. And manufacturers do love to create their own sites, regardless of the presence of a better, central option, so the data would likely end up in two places anyway.
Currently the market is confused and fragmented. When I go for a run my watch uploads the data to Garmin Connect, but Strava automatically downloads it from there and adds it to my profile. Similarly, there are apps which record the number of calories I eat each day and can share it via APIs with all manner of fitness tracking sites. My Wi-Fi bathroom scales create a graph of my changing weight which can be integrated into other sites.
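The real Garmin and Strava APIs differ in their details, but the shape of this service-to-service sharing can be sketched. In this hypothetical example (the field names and the scope mechanism are invented, not any vendor's actual contract), one service exports an activity as JSON, stripping out everything the receiving service's access grant does not cover:

```python
import json

# Hypothetical sketch of tracker-to-tracker sharing: an activity record
# is exported as JSON, filtered down to the fields the receiving
# service's access grant covers. Field names and the scope set are
# invented for illustration.
activity = {
    "user_id": "u-123",
    "type": "run",
    "distance_km": 5.2,
    "avg_heart_rate": 151,
    "gps_trace": [(51.5074, -0.1278), (51.5080, -0.1270)],
}

# This grant covers only distance and activity type, so heart rate and
# the GPS trace never leave the first service.
GRANTED_SCOPE = {"type", "distance_km"}

def export_for(scope: set, record: dict) -> str:
    """Serialise only the fields the downstream service is allowed to see."""
    return json.dumps({k: v for k, v in record.items() if k in scope})

payload = export_for(GRANTED_SCOPE, activity)
print(payload)
```

Whether a given integration actually filters this aggressively is exactly the kind of thing buried in each service's T&Cs — which is the problem the surrounding paragraphs describe.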
My data is flying all over the world, and it’s hard to keep track.
When our fitness trackers are so spider-webbed with APIs and data-sharing tools, who has the time to research the security precautions they all take, or even read all their T&Cs?
In essence, the concerns over wearables are no different to those we have over our computers: we don’t fully understand them, and therefore can’t fully trust them or protect them from hackers. After all, these devices are computers in their own right, just as modern televisions, cars and even heating thermostats are.
This article was written by Matthew Sparkes Deputy Head of Technology from The Daily Telegraph and was legally licensed through the NewsCred publisher network.