Network Defekts at Internet Yami-ichi, Ars Electronica

Kat will be running Network Defekts, a stall at Internet Yami-ichi (internet black market) at Ars Electronica this year.

Privacy is routinely compromised for the benefit of easy communication on platforms with friendly user interfaces but user-unfriendly Ts and Cs. For Network Defekts, Kat will sell encryption tokens that elaborate on the level of encryption of different platforms and the barriers to secure communication.

Ars Electronica
7th September 2019

Wearables data challenges beyond security and privacy

The wearable technology revolution faces challenges around getting data to and from these devices while keeping it secure and private, both in transit and at each end.

As I mention in my recent Nature article (doi:10.1038/525022a), privacy concerns are one of the key issues around the growth of wearables – indeed, it is the concern most picked up on in the Twitter discussions around the article too.

While the article reports some of the measures being taken to address security and privacy, there are a couple of other factors of importance that I’d like to highlight here.


Deconstructing the Google Cardboard


Privacy of health data is particularly pertinent today after breaking news that one of Britain’s leading sexual health clinics has accidentally released the HIV status of 780 patients, bringing home the magnitude of privacy breaches, even if inadvertent. As wearable technologies gather increasingly nuanced data about our bodies, we need to look at concerns around where the data goes and who owns it – and how they deal with it.

It’s a concern that has already been raised around the broader family of devices that comprise the Internet of Things, eloquently and vehemently discussed by Bruce Sterling in “The Epic Struggle for the Internet of Things”, in which he frames the technological giants as vying for control over our real lives as they have done over our digital ones. Buying into proprietary devices has serious implications for the commodification of our bio-data, as discussed in the excellent “Open Sourcing Wearables” chapter by Romano and Cangiano. By open sourcing wearables, and tying the data collection in with individually owned health data repositories, the privacy and commodification issues around wearables could potentially be somewhat mitigated.

Of course, the stakes can get even higher when it comes to devices that not only collect our data but also start to act upon us. The Muse, for example, is a headband that professes to train your brain. And the video that streams to your headset could end up being controlled by your physiological sensors. The technology is there: four years ago, tech company Sensum premiered a responsive horror film experience that varied how horrifying the scenes were according to how stimulated the audience were – something determined by galvanic skin – or sweat – responses collected using wearable sensors. Hook that up to a Google Cardboard VR app, and you suddenly have a whole new level of impact for entertainment – or advertising.
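To make the feedback loop concrete, here’s a toy sketch of sensor-driven content selection. The function name, scene labels, baseline figure and thresholds are all invented for illustration – this is not how Sensum’s actual system works, just the general shape of the idea:

```python
# Toy sketch of biofeedback-driven content selection.
# All names and thresholds are hypothetical, for illustration only.

def pick_scene_intensity(gsr_microsiemens, baseline):
    """Map a galvanic skin response reading to a scene intensity.

    Higher skin conductance relative to baseline suggests higher
    arousal, so the player dials the horror down (or up) accordingly.
    """
    arousal = (gsr_microsiemens - baseline) / baseline
    if arousal > 0.5:
        return "mild"      # viewer already very stimulated: ease off
    elif arousal > 0.2:
        return "medium"
    return "intense"       # calm viewer: ramp up the scares

# Example: assumed baseline conductance 5.0, current reading 8.0
print(pick_scene_intensity(8.0, 5.0))  # arousal 0.6, prints "mild"
```

The same few lines could just as easily tune an advert rather than a horror scene, which is exactly why who receives this sensor stream matters.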

But these solutions rely on us, the wearers of wearables, being more engaged with understanding the implications of how we treat our data, and how we interact with technology in general. All the privacy and security measures that can be built into our technology can be undone by one stray acceptance of overly broad permissions when we download a new app. As security guru Sean Newman argues, our existing preconditioning to click “yes” to permissions is likely to be exacerbated as wearables become more ubiquitous. As Steve Mansfield-Devine reports in this interview with Newman:

“When a user installs an app, the operating system pops up a dialogue detailing the various permissions the app needs to work, such as access to the camera, ability to read or send SMS messages and so on. Whenever an app is updated and the new version requires additional permissions, this process is repeated. The problem is, does the user pay any attention to this warning?

“As the average user, you just say okay, because you want to use the app,” says Newman. “And it’s a bit like accepting terms and conditions of any software – nobody reads it all, because you want to actually use the software, and you just trust that someone’s done enough diligence on it. You click okay, and you’re away. Now, if someone has managed to get a malicious app onto the store past whoever’s policing it, then obviously you do have that risk.”

This widespread, wanton carelessness with permissions is a symptom of a wider issue around our attitudes to our digital lives – the risks are invisible, the mitigations are time-consuming and confusing, and we have an optimistic view of our own attention to the problem. A common security method for understanding levels of risk is the threat model – a rigorous way of looking at the danger posed by a security threat, and the likelihood of someone perpetrating it. There’s a trade-off here between how difficult it is to overcome security measures and the payoff for whoever succeeds – it’s the answer to the question: “is it worth it?” In terms of the risks with wearables, threat models are few and far between, according to Eva Galperin and Parker Higgins, responding to my question after their talk What the Hell is Threat Modelling Anyway at re:publica this year.
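The weighing-up that a threat model performs can be sketched in a few lines. The threats, scores and scoring formula below are invented examples to show the shape of the exercise – real methodologies (STRIDE, attack trees and the like) are considerably more involved:

```python
# Minimal, illustrative threat-model scoring sketch.
# Threats, scores and formula are made up for illustration only.

from dataclasses import dataclass

@dataclass
class Threat:
    name: str
    likelihood: int     # 1 (rare) .. 5 (expected)
    impact: int         # 1 (trivial) .. 5 (severe)
    attacker_cost: int  # 1 (cheap) .. 5 (expensive): "is it worth it?"

    @property
    def risk(self):
        # Risk rises with likelihood and impact, and falls as the
        # attack gets more expensive relative to its payoff.
        return self.likelihood * self.impact / self.attacker_cost

threats = [
    Threat("Overbroad app permissions leak fitness data", 5, 3, 1),
    Threat("Interception of unencrypted sensor sync", 3, 4, 2),
    Threat("Targeted firmware compromise of one device", 1, 5, 5),
]

# Rank threats: cheap, likely attacks float to the top of the list
for t in sorted(threats, key=lambda t: t.risk, reverse=True):
    print(f"{t.risk:5.1f}  {t.name}")
```

Even this crude version makes Galperin and Higgins’s point: the mundane, cheap threat (a nosy app with broad permissions) dwarfs the cinematic one (a targeted hardware attack).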

Even were we to fully understand the threats, we might not take any measures to mitigate them – after all, we tend to have at least a vague understanding of the privacy risks of social media, and most of us do little to change our behaviour or privacy settings on these sites. According to Ricarda Moll in her presentation at re:publica, in our minds a pervasive “I’ve got nothing to hide” get-out clause meets our own misconceptions, which lead us to believe that we have more control over our privacy than everyone else, all of whom take less care.

As with all our interactions with technology, the devil is in the detail. As Anupam Joshi, Director of the Centre For Cyber Security at the University of Maryland Baltimore County, says, “Privacy always costs in money, time or convenience. Do people care enough about it?”

Aside from these issues, though, what I find most fascinating are the even more hidden implications of wearables – the effect that the data have on our sense of self, and how representative that is. Deborah Lupton, a sociologist at the University of Canberra, has been looking at precisely what implications quantification has for how we perceive ourselves. “The emphasis on digital data is producing this ethos which detracts from other forms of understanding our wellbeing and that’s a real problem because it becomes very reductive”, says Lupton. She recounted multiple instances of fitness-tracking enthusiasts who became so reliant on the quantification of their activity that if they happened to exercise without measuring it, that exercise failed to “count” in their heads – their digital persona ironically now counting far more than the physical presence they initially wanted to focus on. Hearteningly, ethnographic research by Sara M Watson at the Oxford Internet Institute suggests that at least some members of the “Quantified Self” community, who generate suites of data to shine a light on their lives, are aware that their being can’t be reduced to these binary representations:

People in the QS community are actively questioning the construction of fact. While the Latin derivation of “data”, “to give”, suggests a fact taken as given, the QS community is not necessarily accepting numbers, words, and images as they stand. They are critiquing the firms who support the self-tracking process. They are challenging the notion of a “step” or “fuel points” to which commercial activity trackers assign meaning.

There are further systemic issues around this reliance on wearables, though, which may not be so self-reflective. In the 1980s there was a phase shift in looking at behaviour change and health promotion with the WHO’s Ottawa Charter, which looked at environmental as well as individual factors. The pervasion of wearables in the fitness arena, and increasingly by companies to make their employees “fitter, happier, more productive”, as Radiohead would put it, seems like a step back, focusing only on individuals in terms of health promotion and ignoring the broader factors, such as the nutritional content of available foodstuffs, working hours and unhealthy buildings. For more see Health Promotion International, Vol. 29 No. 4 doi:10.1093/heapro/dau099.

With strong implications for our sense of identity, and the judgement of others, the accuracy of the data we collect could have serious repercussions. As Mark Twain pointed out, there’s nothing in the world so flexible as statistics. There’s a wealth of research out there looking at the algorithms and measurement methods (see further reading below) that give rise to the graphs and numbers on a fitness tracker dashboard. There are nuances in the way that the data is processed on the device that mean they are not always reflective of reality: for example, processing interruptions in accelerometer data can have an impact on how it counts steps and the overall outcomes, as can location and the duration of activity (Heil DP et al. Res Q Exerc Sport. 2009;80(3):424–33).
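To see how processing choices shape the numbers, here’s a deliberately naive step counter: it counts upward crossings of a threshold in an accelerometer trace. The threshold and the way a gap in the data is handled are invented choices, not how any real tracker works, but they show how such choices change the count:

```python
# Toy step counter: count peaks in accelerometer magnitude above a
# threshold. Illustrative only - threshold and gap handling are
# invented to show how processing choices change the result.

def count_steps(magnitudes, threshold=1.2):
    """Count upward threshold crossings in an accelerometer trace."""
    steps = 0
    above = False
    for m in magnitudes:
        if m is None:        # interruption in the data stream
            above = False    # the reset policy here changes the count
            continue
        if m > threshold and not above:
            steps += 1
            above = True
        elif m <= threshold:
            above = False
    return steps

trace = [1.0, 1.3, 1.0, 1.3, 1.0, 1.3, 1.0]
print(count_steps(trace))   # 3 steps

gappy = [1.0, 1.3, None, 1.3, 1.0, 1.3, 1.0]
print(count_steps(gappy))   # also 3 with this reset policy; keeping
                            # `above = True` across the gap would give 2
```

Two equally defensible ways of handling the same interrupted trace, two different step counts – and only one of them ends up on your dashboard.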

This kind of data is reaching far beyond our own understanding of ourselves, however. The people of the new smart-city development Hudson Yards, New York, can volunteer to add their personal data to that already collected about their environment, in an attempt by NYU’s Center for Urban Science and Progress to create a quantified community and provide evidence-backed urban optimisation. If you’re interested in hearing more about evidence-backed policy-making, I recently wrote about this for the European Commission. And fitness tracking data recently made it into court as evidence for the first time.

Will wearables become a bigger part of our lives in the future? Probably. If they do so, these details about how accurate their data is, who has control of it, who has access to it and where it is stored, will become increasingly important. We need to have more, and more public, discussions about the meaning of wearables in our lives – and how to manage it – alongside all the excited projections of what glitzy gadgets might be able to do for us.

Further reading:

Lee IM, Shiroma EJ. Using accelerometers to measure physical activity in large-scale epidemiological studies: issues and challenges. Br J Sports Med. 2014 Feb;48(3):197-201. DOI: 10.1136/bjsports-2013-093154

Müller C, Winter C, Rosenbaum D. Aktuelle objektive Messverfahren zur Erfassung körperlicher Aktivität im Vergleich zu subjektiven Erhebungsmethoden [Current Objective Techniques for Physical Activity Assessment in Comparison with Subjective Methods]. Dtsch Z Sportmed. 2010;61(1):11-8.

Strath SJ, Pfeiffer KA, Whitt-Glover MC. Accelerometer use with children, older adults, and adults with functional limitations. Med Sci Sports Exerc. 2012 Jan;44(1 Suppl 1):S77-85. DOI: 10.1249/MSS.0b013e3182399eb1

The big tech companies bringing AI to medical data and cloud-based healthcare

Review article comparing accelerometers to self-reporting in epidemiological studies

Smart watch that monitors how vibrant your workplace is and gamifies high fives

Gamified personal touch