Wearables data challenges beyond security and privacy

The wearable technology revolution faces challenges in communicating data to and from these devices while keeping it secure and private, both in transit and at each end.

As I mention in my recent Nature article (doi:10.1038/525022a), privacy concerns are one of the key issues around the growth of wearables – indeed, it was the concern most picked up on in the Twitter discussions around the article too.

While the article reports some of the measures being taken to address security and privacy, there are a couple of other factors of importance that I’d like to highlight here.


Deconstructing the Google Cardboard


Privacy of health data is particularly pertinent today after breaking news that one of Britain’s leading sexual health clinics has accidentally released the HIV status of 780 patients, bringing home the magnitude of privacy breaches, even if inadvertent. As wearable technologies gather increasingly nuanced data about our bodies, we need to look at concerns around where the data goes and who owns it – and how they deal with it.

It’s a concern that has already been raised around the broader family of devices that comprise the Internet of Things, eloquently and vehemently discussed by Bruce Sterling in “The Epic Struggle for the Internet of Things”, in which he frames the technological giants as vying for control over our real lives as they have done over our digital ones. Buying into proprietary devices has serious implications for the commodification of our bio-data, as discussed in the excellent “Open Sourcing Wearables” chapter by Romano and Cangiano. By open sourcing wearables, and tying the data collection in with individually owned health data repositories, the privacy and commodification issues around wearables could potentially be somewhat mitigated.

Of course, the stakes get even higher when it comes to devices that not only collect our data but also start to act upon us. The Muse, for example, is a headband that professes to train your brain. And the video that streams to your headset could end up being controlled by your physiological sensors. The technology is there: four years ago, tech company Sensum premiered a responsive horror film experience that varied how horrifying the scenes were according to how stimulated the audience was – something determined by galvanic skin – or sweat – responses collected using wearable sensors. Hook that up to a Google Cardboard AR or VR app, and you suddenly have a whole new level of impact for entertainment – or advertising.
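The feedback loop behind such a responsive film is simple in principle. Here's a deliberately toy sketch in Python of how smoothed galvanic skin response readings might pick a scene variant – the function names, thresholds and microsiemens values are all my own illustrative assumptions, not Sensum's actual system:

```python
# Hypothetical sketch of biofeedback-driven scene selection.
# Thresholds and values are invented for illustration.

def smooth(readings, window=3):
    """Moving average to damp out sensor noise."""
    out = []
    for i in range(len(readings)):
        chunk = readings[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def pick_scene_variant(gsr_microsiemens):
    """Map a smoothed skin-conductance level to a scene intensity."""
    if gsr_microsiemens < 2.0:
        return "intense"   # audience too calm: escalate the horror
    if gsr_microsiemens < 5.0:
        return "standard"
    return "mild"          # audience already aroused: back off

readings = [1.2, 1.4, 1.9, 3.1, 5.6, 6.0]   # raw sensor samples
variants = [pick_scene_variant(x) for x in smooth(readings)]
```

The same loop, pointed at an advertising engine rather than a horror film, is exactly the prospect raised above.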

But these solutions rely on us, the wearers of wearables, being more engaged with understanding the implications of how we treat our data, and how we interact with technology in general. All the privacy and security measures that can be built into our technology can be undone by one stray acceptance of overly broad permissions when we download a new app. It’s something security guru Sean Newman warns about: our existing preconditioning to click “yes” to permissions is likely to be exacerbated as wearables become more ubiquitous. As Steve Mansfield-Devine reports in this interview with Newman:

“When a user installs an app, the operating system pops up a dialogue detailing the various permissions the app needs to work, such as access to the camera, ability to read or send SMS messages and so on. Whenever an app is updated and the new version requires additional permissions, this process is repeated. The problem is, does the user pay any attention to this warning?

“As the average user, you just say okay, because you want to use the app,” says Newman. “And it’s a bit like accepting the terms and conditions of any software – nobody reads it all, because you want to actually use the software, and you just trust that someone’s done enough diligence on it. You click okay, and you’re away. Now, if someone has managed to get a malicious app onto the store past whoever’s policing it, then obviously you do have that risk.”
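The escalation problem Newman describes is easy to state in code: the dangerous moment is when an update quietly requests permissions beyond what was originally granted. A few lines of Python make the point – the permission names here are illustrative, not any platform's real identifiers:

```python
# Toy sketch of permission escalation on an app update.
# Permission names are illustrative placeholders.

def new_permissions(granted, requested):
    """Return the permissions in an update that were never approved."""
    return sorted(set(requested) - set(granted))

granted = {"CAMERA", "INTERNET"}
update_requests = {"CAMERA", "INTERNET", "READ_SMS", "SEND_SMS"}

# These two extra permissions are exactly what the dialogue warns
# about – and exactly what most users click straight through.
escalated = new_permissions(granted, update_requests)
```

The computation is trivial; the human problem, as Newman says, is that nobody reads the result.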

This widespread carelessness with permissions is a symptom of a wider issue around our attitudes to our digital lives: the risks are invisible, the mitigations are time-consuming and confusing, and we have an optimistic view of our own attention to the problem. A common security method for understanding levels of risk is the threat model – a rigorous way of looking at the danger posed by a security threat, and the likelihood of someone perpetrating it. There’s a trade-off here between how difficult it is to overcome security measures and the payoff for whoever succeeds – it’s the answer to the question: “is it worth it?” In terms of the risks with wearables, threat models are few and far between, according to Eva Galperin and Parker Higgins, responding to my question after their talk What the Hell is Threat Modelling Anyway at re:publica this year.
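That “is it worth it?” question can be caricatured in a few lines – a deliberately crude expected-value sketch from the attacker's point of view, with every figure invented for illustration:

```python
# Crude "is it worth it?" calculation in the spirit of threat
# modelling. All figures are invented for illustration.

def worth_attacking(payoff, attack_cost, success_probability):
    """True if the attacker's expected gain exceeds their cost."""
    return payoff * success_probability > attack_cost

# e.g. harvesting bio-data from a hypothetical poorly secured
# fitness API: large payoff, cheap attack, decent odds.
print(worth_attacking(payoff=10_000, attack_cost=500,
                      success_probability=0.3))
```

Real threat models weigh far more than two numbers, of course – but even this caricature is more analysis than most wearables currently get, per Galperin and Higgins.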

Even were we to fully understand the threats, we might not take any measures to mitigate them – after all, we tend to have at least a vague understanding of the privacy risks of social media, and most of us do little to change our behaviour or privacy settings on these sites. According to Ricarda Moll in her presentation at re:publica, a pervasive “I’ve got nothing to hide” get-out clause meets our own misconceptions, leading us to believe that we have more control over our privacy than everyone else, all of whom take less care.

As with all our interactions with technology, the devil is in the detail. As Anupam Joshi, Director of the Centre For Cyber Security at the University of Maryland Baltimore County, says, “Privacy always costs in money, time or convenience. Do people care enough about it?”

Aside from these issues, though, what I find most fascinating are the even more hidden implications of wearables – the effect that the data have on our sense of self, and how representative that data really is. Deborah Lupton, a sociologist at the University of Canberra, has been looking at precisely what implications quantification has for how we perceive ourselves. “The emphasis on digital data is producing this ethos which detracts from other forms of understanding our wellbeing, and that’s a real problem because it becomes very reductive,” says Lupton. She told me of multiple fitness tracking enthusiasts who became so reliant on the quantification of their activity that if they happened to exercise without measuring it, that exercise failed to “count” in their heads – their digital persona ironically now counting for far more than the physical presence they initially wanted to focus on. Hearteningly, ethnographic research by Sara M Watson at the Oxford Internet Institute suggests that at least some members of the “Quantified Self” community, who generate suites of data to shine a light on their lives, are aware that their being can’t be reduced to these binary representations:

People in the QS community are actively questioning the construction of fact. While the Latin derivation of “data” – “to give” – suggests a fact taken as given, the QS community is not necessarily accepting numbers, words, and images as they stand. They are critiquing the firms who support the self-tracking process. They are challenging the notion of a “step” or “fuel points” to which commercial activity trackers assign meaning.

There are further systemic issues around this reliance on wearables, though, which may not be so self-reflective. In the 1980s there was a phase shift in thinking about behaviour change and health promotion with the WHO’s Ottawa Charter, which looked at environmental as well as individual factors. The pervasiveness of wearables in the fitness arena, and their increasing use by companies to make their employees “fitter, happier, more productive”, as Radiohead would put it, seems like a step back, focusing health promotion only on individuals and ignoring the broader factors, such as the nutritional content of available foodstuffs, working hours and unhealthy buildings. For more see Health Promotion International, Vol. 29 No. 4, doi:10.1093/heapro/dau099.

With strong implications for our sense of identity, and for the judgement of others, the accuracy of the data we collect could have serious repercussions. As Mark Twain pointed out, there’s nothing in the world so flexible as statistics. There’s a wealth of research out there looking at the algorithms and measurement methods (see further reading below) that give rise to the graphs and numbers on a fitness tracker dashboard. There are nuances in the way that data is processed on the device that mean the results are not always reflective of reality: for example, processing interruptions in accelerometer data can have an impact on how a device counts steps and on the overall outcomes, as can location and the duration of activity (Heil DP et al. Res Q Exerc Sport. 2009;80(3):424–33).
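To see how sensitive a step count can be to processing choices, here is a toy step counter over accelerometer magnitudes – not any real tracker's algorithm, and with a made-up signal – showing how merely moving the peak threshold changes the total for the same walk:

```python
# Toy step counter: counts upward threshold crossings as steps.
# The signal and thresholds are invented for illustration only.

def count_steps(magnitudes, threshold):
    """Count each upward crossing of the threshold as one step."""
    steps = 0
    above = False
    for m in magnitudes:
        if m > threshold and not above:
            steps += 1
            above = True
        elif m <= threshold:
            above = False
    return steps

signal = [0.9, 1.3, 0.8, 1.1, 0.7, 1.4, 0.95, 1.2]

# The same "walk", counted with two plausible thresholds,
# yields two different step totals.
print(count_steps(signal, threshold=1.0))    # more lenient
print(count_steps(signal, threshold=1.15))   # stricter
```

Real devices use far more sophisticated filtering, but the principle stands: the dashboard number is a product of processing decisions, not a raw fact.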

This kind of data is reaching far beyond our own understanding of ourselves, however. The people of the new smart-city development Hudson Yards, New York, can volunteer to add their personal data to that already collected about their environment, in an attempt by NYU’s Centre for Urban Science and Progress to create a quantified community and provide evidence-backed urban optimisation. If you’re interested in hearing more about evidence-backed policy-making, I recently wrote about this for the European Commission. And fitness tracking data recently made it into court as evidence for the first time.

Will wearables become a bigger part of our lives in the future? Probably. If they do so, these details about how accurate their data is, who has control of it, who has access to it and where it is stored, will become increasingly important. We need to have more, and more public, discussions about the meaning of wearables in our lives – and how to manage it – alongside all the excited projections of what glitzy gadgets might be able to do for us.

Further reading:

Lee IM, Shiroma EJ. Using accelerometers to measure physical activity in large-scale epidemiological studies: issues and challenges. Br J Sports Med. 2014 Feb;48(3):197-201. DOI: 10.1136/bjsports-2013-093154

Müller C, Winter C, Rosenbaum D. Aktuelle objektive Messverfahren zur Erfassung körperlicher Aktivität im Vergleich zu subjektiven Erhebungsmethoden [Current Objective Techniques for Physical Activity Assessment in Comparison with Subjective Methods]. Dtsch Z Sportmed. 2010;61(1):11-8.

Strath SJ, Pfeiffer KA, Whitt-Glover MC. Accelerometer use with children, older adults, and adults with functional limitations. Med Sci Sports Exerc. 2012 Jan;44(1 Suppl 1):S77-85. DOI: 10.1249/MSS.0b013e3182399eb1

The big tech companies bringing AI to medical data and cloud-based healthcare

Review article comparing accelerometers to self-reporting in epidemiological studies

Smart watch that monitors how vibrant your workplace is and gamifies high fives

Gamified personal touch


New NSA leaks reveal spyware can be implanted in your tech en route for delivery

The internet is broken. And so is all your stuff. According to newly leaked NSA documents, the agency has the capacity to read your screen and keystrokes remotely using tiny chips, which they might insert while your gear is being shipped to you.

That might make you think twice about buying hardware off the internet. But never fear: they have other ways of compromising your computer. It’s also just been revealed that they can masquerade as other websites – say, your Yahoo! mail or CNN – to send malware to your machine. Once their malware is there, they’ve got unlimited access to you and your doings.

A room full of shocked hackers is a rarity, especially when the topic of conversation is cyber security. But that’s exactly what security expert Jacob Appelbaum got with new revelations about the NSA’s tactics as part of their Tailored Access Operations – TAO – programme at hacker congress 30c3 yesterday.

Referencing the simultaneous publication of the information in German magazine Der Spiegel, during his talk “To Protect and Infect Part II” Appelbaum showed slides of some of the technical specs, with some chips – largely undetectable bugs that are read by being bombarded with continuous waves – the size of a thumbnail. You can see the details of all these NSA toys in a marvellous infographic from Der Spiegel. What’s quite chilling is that most of this tech dates from 2007–8, which means we are only now finding out about tech that’s five years old.

The scope of the TAO project is truly vast, encompassing foreign (to the US) embassies and high-level politicians, and Der Spiegel reports a projected target of 85,000 computers infected with TAO trojans by the end of this year.

None of this would be possible without some serious geekery. The revelations about TAO activities add weight to previous calls by high-profile security specialists and activists at the same conference, including at a joint talk by Appelbaum and Julian Assange on Sunday night, for geeks and nerds to step back and query their consciences about how they earn their crust.

Of course, this might all seem a world away. A few things bring revelations like this home, though, including the fact that the NSA are calling all of us iPhone users – yes, I’m one too – “Zombies”. There’s a price to pay for seamless design and the increasing invisibility of computers, and that price is knowing what is being done to you and your devices.

Information overload: the ‘scape coat solution

Sometimes it feels good just to get away from it all. With a constant information flow – whether it be advertising, social media, traffic noise – sometimes we just want to switch off. But in most urban environments that’s not very easy.

That’s the problem I set out to tackle in last Saturday’s speculative biology workshop at Art Laboratory Berlin, using speculative design to address biologically oriented problems. Run by artist Pinar Yoldas, the workshop aimed at finding innovative ways to address problems that run deep within our world. I teamed up with Claudia Manningel, Oliver Connew and Dustin Carlson for a crack at a design solution.

Our design grew out of two ideas. The first was choreographer Ollie’s penchant for public toilet cubicles, which he liked because he could have his own personal space away from the pressure of public interaction while still being in the midst of things. The second was artist Dustin’s response to overburdened senses: the idea of android-esque volume buttons on the ears and nose to dampen the amount of stimulus to which one is subject.

We morphed through many iterations until we came to the idea of a hood that cut out unwanted noise and communicated how you were feeling through non-verbal messages – namely light. But a hood can have negative connotations, despite efforts to counter them, so instead we decided to go for its inverse – the collar.

The result? The ‘Scape Coat. It’s a large collar that uses noise-cancelling tech and directional microphones to control your auditory environment, and coloured lights to convey your mood.

The collar, as you can see from our cardboard mock-up, also acts a little as an enclosure, and it does make you feel safer, but also quite cut off. I have to say that I rather wanted to have one on over the last few days, so I think it’s a winner.