Surveillance Capitalism in the Internet of Loos: Can the General Data Protection Regulation Counter Digital Dispossession?

Yay I’ve finished the 6th chapter of my book on Internet of Things and the Law (Routledge, forthcoming)!

The IoT constitutes an unprecedented challenge to privacy for a twofold reason.[1] First, it is progressively eroding the area of what can be regarded as private. Traditionally, the home and the body were the most sacred of private spaces.[2] That boundary is being overcome by a world in which smart homes and IoT health devices are becoming commonplace. The IoT is normalising the idea that ubiquitous cameras, microphones and sensors track citizens’[3] behaviour and transform it into structured data flows that are sent back to our Things’ manufacturers. This is perhaps best illustrated by Amazon’s Echo Spot and Echo Look – respectively an alarm clock and a style assistant – that are equipped with cameras and are designed to be used in the bedroom and even in the bathroom. This has practical consequences. Indeed, to bring an action for breach of the right to privacy,[4] as stated in Campbell v MGN, ‘the touchstone of private life is whether in respect of the disclosed facts the person in question had a reasonable expectation of privacy’.[5] To assess whether, at the time of the alleged breach, there was a reasonable expectation of privacy, the question to be asked, as phrased in Murray v Big Pictures,[6] is: what would a reasonable person of ordinary sensibilities feel if placed in the same situation as the subject of disclosure and faced with the same publicity? Among the factors to take account of, the place where the intrusion happens and the absence of consent play an important role.[7] If one buys a Thing whose ‘smartness’ is intrinsically connected to its sensing and tracking capabilities, and this Thing is designed to be deployed in the bedroom or even in the bathroom, does one retain a reasonable expectation of privacy? The situation is worsened by the principles, set out in Spycatcher,[8] that limit the duty of confidentiality.
One of them is that there is no such duty when the information is trivial.[9] IoT companies may argue that the information about when one wakes up or when one goes to the loo is trivial. However, the IoT allows the combination of data from multiple sources in such a way that information that might seem trivial if considered in isolation becomes personal and valuable once combined with other data, enabling inferences about an IoT user’s preferences and vulnerabilities.[10] Finally, IoT companies may claim that information about our e-commerce habits is not really private, but as seen in Arkansas v Bates,[11] Amazon’s users are concerned about the company revealing sensitive purchases, as these are considered private information whose disclosure could harm their reputation or career. Overall, this erosion of the private sphere is alarming because it does not allow for that ‘intellectual privacy’[12] that is necessary to be ourselves and express ourselves freely and creatively. In this sense, the IoT can be regarded as an attack on the self, that self that is – according to Seneca’s famous ‘recede in te ipse’ (‘self-retreat’)[13] – the safe space where virtue, wisdom and happiness are given the chance to grow.[14]

The IoT challenges the right to privacy also for a second reason, which will be the more modest focus of this chapter. Since the IoT ‘could undermine such core values as privacy,’[15] this chapter will critically assess whether the GDPR, the most advanced European privacy law so far,[16] can tackle the data protection issues in the IoT. The core features of the IoT render GDPR compliance difficult, if at all possible. An illustration of this is the conflict between the principle of purpose limitation and IoT’s repurposing. As seen in Chapter 2, ‘repurposing’[17] is a critical characteristic of IoT systems, dependent on their (inter)connectivity and system-of-systems dimension.[18] ‘Repurposing’ can be understood as the phenomenon whereby an IoT system ends up being used for purposes other than those originally foreseen in two scenarios:

  • The communication within the relevant subsystem and among subsystems can lead the system to perform actions and produce information of which the single Thing was incapable, or which could not be foreseen by its manufacturers; and
  • Under certain conditions (e.g. an emergency) the system may reconfigure itself, either automatically or at the user’s initiative.

IoT’s repurposing runs counter to the purpose limitation principle, whereby personal data has to be ‘collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes.’[19]

This chapter will centre on the tensions between the GDPR and the IoT. After an introduction to the GDPR, which will be framed as a ‘data control’ law (VI.2), the chapter will present the main data protection issues in the IoT (VI.3). It will then focus on one of them that is usually overlooked: ‘digital dispossession.’[20] This refers to IoT companies’ (ab)use of intellectual property rights (especially trade secrets) to appropriate citizens’ data and prevent them from exercising their data subject rights, including the right of access.[21] This is part of a wider phenomenon whereby the new data economy relies on the commercialisation of data.[22] This is leading to the privatisation of ownership of both the IoT’s infrastructure and IoT data.[23] Digital dispossession will be analysed as a tenet of the theory of surveillance capitalism[24] (VI.4). To understand what practically happens to IoT users’ data, the chapter will move on to analyse Alexa’s data practices by means of a subject access request, engagement with customer support, and text analysis of its privacy policy (VI.5). Finally, it will consider whether the GDPR is fit for the IoT. To carry out this fitness check, the chapter will explore whether the rights to access, to portability, to be informed, and not to be subject to solely automated decisions can be successfully invoked to counter IoT companies’ digital dispossession practices, or whether, conversely, trade secrets may give these companies a weapon to nullify the GDPR rights (VI.6).

[1] The relationship between IoT and privacy can be and has been analysed from manifold perspectives. See e.g. Lilian Edwards, ‘Privacy, Security and Data Protection in Smart Cities: A Critical EU Law Perspective’ (2016) 2 EDPL 28; Guido Noto La Diega, ‘Clouds of Things: Data Protection and Consumer Law at the Intersection of Cloud Computing and the Internet of Things in the United Kingdom’ (2016) 9(1) Journal of Law & Economic Regulation 69; Sandra Wachter, ‘Normative Challenges of Identification in the Internet of Things: Privacy, Profiling, Discrimination, and the GDPR’ (2018) 34 Computer Law & Security Review 436; Lachlan Urquhart, ‘White Noise from the White Goods? Privacy by Design for Ambient Domestic Computing’ in Lilian Edwards, Burkhard Schafer and Edina Harbinja (eds), Future Law (EUP 2019).

[2] On how the implementation of human-implantable tracking systems creates a serious risk to privacy see Ian Kerr, ‘The Internet of Things? Reflection on the Future Regulation of Human-Implantable Radio Frequency Identification’ in Ian Kerr, Valerie Steeves and Carole Lucock (eds), Lessons from the Identity Trail. Anonymity, Privacy and Identity in a Networked Society (OUP 2009) 335 and esp. 347 ff.  On the traditional concept of privacy, see also Judee K Burgoon and others, ‘Maintaining and Restoring Privacy through Communication in Different Types of Relationships’ (1989) 6 Journal of Social and Personal Relationships 131; Rosamund Scott, Rights, Duties and the Body: Law and Ethics of the Maternal-Fetal Conflict. (Bloomsbury Publishing 2002).

[3] This chapter refers to citizens and users rather than consumers because, unlike the consumer laws analysed in the previous chapters, data protection law does not apply only to consumers, but to all natural people.

[4] Whilst there is no distinct cause of action for breach of privacy in the UK, there is a tort of misuse of private information that is commonly used in privacy cases. It evolved out of the common law of breach of confidence, but it is now regarded as a stand-alone action. See PJS v News Group Newspapers Ltd [2016] UKSC 26; Wainwright v Home Office [2003] UKHL 53; Google Inc v Vidal-Hall and others [2015] EWCA Civ 311. Unlike the breach of confidence, the misuse of private information does not require a pre-existing relationship of confidence between the parties, as was the case in Campbell (Naomi) v Mirror Group Newspapers [2004] 2 AC 457 (HL).

[5] Campbell (n XX) [21]. The unlawful use of information in respect of which the claimant has a reasonable expectation of privacy is the so-called confidentiality component of the misuse of private information. The other component is ‘intrusion’; indeed, claimants need to prove unwanted intrusion in their private lives or harassment. For this framing of the misuse of private information see PJS (n XX).

[6] Murray v Big Pictures (UK) Ltd [2008] EWCA Civ 446.

[7] Murray (n XX) [35]-[36].

[8] Attorney General v Guardian Newspapers (No 2) [1990] 1 AC 109.

[9] There are contrasting authorities on the matter. On the one hand, in Ambrosiadou v Coward [2011] EWCA Civ 409 [30], the court held that not all information that relates to a person’s private life is protected: ‘the information may be of slight significance, generally expressed, or anodyne in nature’ (similarly, see R (Wood) v Commissioner of Police of the Metropolis [2009] EWCA Civ 414). Conversely, in McKennitt v Ash [2005] EWHC 3003 (QB) it was observed that the mere fact that information was of a relatively trivial or anodyne nature would not necessarily mean the non-engagement of ECHR, art 8.

[10] See e.g. Paramasiven Appavoo and others, ‘Efficient and Privacy-Preserving Access to Sensor Data for Internet of Things (IoT) Based Services’, 2016 8th International Conference on Communication Systems and Networks (COMSNETS) (2016).

[11] State of Arkansas v James A Bates, Memorandum of Law in support of Amazon’s Motion to Quash search warrant, Case No CR-2016-370-2 (Benton County Court 2017).

[12] Neil Richards, Intellectual Privacy: Rethinking Civil Liberties in the Digital Age (Oxford University Press 2015).

[13] Seneca, Epistulae Morales ad Lucilium, vol I, VII.8.

[14] For this interpretation of Seneca see Christine Richardson-Hay, First Lessons: Book 1 of Seneca’s Epistulae Morales– a Commentary (Peter Lang 2006) 264.

[15] William H Dutton, ‘Putting Things to Work: Social and Policy Challenges for the Internet of Things’ (2014) 16 info 1.

[16] This is not to suggest that privacy and data protection are synonyms. There is private information that may be breached despite compliance with data protection laws and, equally, personal data also includes information that is not related to one’s private life. See e.g. Michèle Finck, European Parliament and Directorate-General for Parliamentary Research Services, Blockchain and the General Data Protection Regulation: Can Distributed Ledgers Be Squared with European Data Protection Law? (2019) 15, accessed 11 June 2020. Nonetheless, it cannot be denied that a major policy goal of the GDPR is to increase the protection of privacy (see e.g. GDPR, recital 4).

[17] Noto La Diega, ‘Clouds of Things’ (n 2).

[18] On the repurposing of big data drawn from the IoT in smart cities, see Edwards (n 2).

[19] GDPR, art 5(1)(b).

[20] Another way of looking at it is ‘data appropriation’, as we called it in Guido Noto La Diega and Cristiana Sappa, ‘The Internet of Things at the Intersection of Data Protection and Trade Secrets. Non-Conventional Paths to Counter Data Appropriation and Empower Consumers’ (2020) REDC. This chapter draws on that paper.

[21] For a reflection on whether, and to what extent, the concept of ownership can be applied to personal data in the context of the IoT see Václav Janeček, ‘Ownership of Personal Data in the Internet of Things’ [2018] Computer Law & Security Review 1039.

[22] Josef Drexl, ‘Designing Competitive Markets for Industrial Data. Between Propertisation and Access’ (2017) 8 JIPITEC 257.

[23] Edwards (n 2).

[24] Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (First edition, PublicAffairs 2019).

Published by guidonld

I am Associate Professor of Intellectual Property Law and Privacy Law at the University of Stirling, Faculty of Arts and Humanities, where I lead the Media Law and Information Technology Law courses. I am an expert in the legal issues of the Internet of Things, Artificial Intelligence, cloud computing, robotics, and blockchain. Holder of a PhD (Unipa), a postdoc (QMUL), and an HEA Fellowship, I have a strong publication and bidding record, and my works on Intellectual Property, Data Protection, Information Technology Law, Consumer Protection, and Human Rights have been cited by the EU Court of Justice’s Advocate General, the House of Lords, the European Commission, and the Council of Europe. Outside of the University of Stirling, I am Director of ‘Ital-IoT’ Centre of Multidisciplinary Research on the Internet of Things, Visiting Professor at the University of Macerata, Fellow of the Nexa Center for Internet and Society, Fellow of NINSO Northumbria Internet & Society Research Group, and I serve on the Executive Committee of the Society of Legal Scholars, the oldest and largest society of law academics in the UK and the Republic of Ireland. Most of my publications can be downloaded for free on SSRN, ResearchGate, and LawArXiv.
