Yay I’ve finished the 6th chapter of my book on Internet of Things and the Law (Routledge, forthcoming)!
The IoT constitutes an unprecedented challenge to privacy for a twofold reason. First, it is progressively eroding the area of what can be regarded as private. Traditionally, the home and the body were the most sacred of private spaces. That assumption is being overturned in a world where smart homes and IoT health devices are becoming commonplace. The IoT is normalising the idea that ubiquitous cameras, microphones and sensors track citizens’ behaviour and transform it into structured data flows that are sent back to the Things’ manufacturers. This is perhaps best illustrated by Amazon’s Echo Spot and Echo Look – respectively an alarm clock and a style assistant – which are equipped with cameras and are designed to be used in the bedroom and even in the bathroom.

This has practical consequences. To bring an action for breach of the right to privacy, as stated in Campbell v MGN, ‘the touchstone of private life is whether in respect of the disclosed facts the person in question had a reasonable expectation of privacy.’ To assess whether, at the time of the alleged breach, there was a reasonable expectation of privacy, the question to be asked, as phrased in Murray v Big Pictures, is: what would a reasonable person of ordinary sensibilities feel if placed in the same situation as the subject of disclosure and faced with the same publicity? Among the factors to take into account, the place where the intrusion happens and the absence of consent play an important role. If one buys a Thing whose ‘smartness’ is intrinsically connected to its sensing and tracking capabilities, and this Thing is designed to be deployed in the bedroom or even in the bathroom, does one retain a reasonable expectation of privacy? The situation is worsened by the principles, set out in Spycatcher, that limit the duty of confidentiality. One of them is that there is no such duty when the information is trivial.
IoT companies may argue that information about when one wakes up or goes to the loo is trivial. However, the IoT allows the combination of data from multiple sources, so that information which might seem trivial in isolation becomes personal and valuable once combined with other data, enabling inferences about an IoT user’s preferences and vulnerabilities. Finally, IoT companies may claim that information about our e-commerce habits is not really private, but as seen in Arkansas v Bates, Amazon’s users are concerned about the company revealing sensitive purchases, as these are considered private information whose disclosure could harm their reputation or career. Overall, this erosion of the private sphere is alarming because it does not allow for the ‘intellectual privacy’ that is necessary to be ourselves and to express ourselves freely and creatively. In this sense, the IoT can be regarded as an attack on the self – that self which, according to Seneca’s famous ‘recede in te ipse’ (‘retreat into yourself’), is the safe space where virtue, wisdom and happiness are given the chance to grow.
The IoT also challenges the right to privacy for a second reason, which will be the more modest focus of this chapter. Since the IoT ‘could undermine such core values as privacy,’ this chapter will critically assess whether the GDPR, the most advanced European privacy law so far, can tackle the data protection issues raised by the IoT. The core features of the IoT render GDPR compliance difficult, if possible at all. An illustration of this is the conflict between the principle of purpose limitation and the IoT’s repurposing. As seen in Chapter 2, ‘repurposing’ is a critical characteristic of IoT systems, dependent on their (inter)connectivity and system-of-systems dimension. ‘Repurposing’ can be understood as the phenomenon whereby an IoT system ends up being used for purposes other than those originally foreseen, in two scenarios:
- The communication within the relevant subsystem and among subsystems can lead the system to perform actions and produce information that the individual Thing was incapable of, or that could not be foreseen by its manufacturer; and
- Under certain conditions (e.g. an emergency), the system may reconfigure itself, either automatically or at the user’s initiative.
The IoT’s repurposing runs counter to the purpose limitation principle, whereby personal data must be ‘collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes.’
 The relationship between IoT and privacy can be and has been analysed from manifold perspectives. See e.g. Lilian Edwards, ‘Privacy, Security and Data Protection in Smart Cities: A Critical EU Law Perspective’ (2016) 2 EDPL 28; Guido Noto La Diega, ‘Clouds of Things: Data Protection and Consumer Law at the Intersection of Cloud Computing and the Internet of Things in the United Kingdom’ (2016) 9(1) Journal of Law & Economic Regulation 69; Sandra Wachter, ‘Normative Challenges of Identification in the Internet of Things: Privacy, Profiling, Discrimination, and the GDPR’ (2018) 34 Computer Law & Security Review 436; Lachlan Urquhart, ‘White Noise from the White Goods? Privacy by Design for Ambient Domestic Computing’ in Lilian Edwards, Burkhard Schafer and Edina Harbinja (eds), Future Law (EUP 2019).
 On how the implementation of human-implantable tracking systems creates a serious risk to privacy, see Ian Kerr, ‘The Internet of Things? Reflection on the Future Regulation of Human-Implantable Radio Frequency Identification’ in Ian Kerr, Valerie Steeves and Carole Lucock (eds), Lessons from the Identity Trail: Anonymity, Privacy and Identity in a Networked Society (OUP 2009) 335 and esp 347 ff. On the traditional concept of privacy, see also Judee K Burgoon and others, ‘Maintaining and Restoring Privacy through Communication in Different Types of Relationships’ (1989) 6 Journal of Social and Personal Relationships 131; Rosamund Scott, Rights, Duties and the Body: Law and Ethics of the Maternal-Fetal Conflict (Bloomsbury Publishing 2002).
 This chapter refers to citizens and users rather than consumers because, unlike the consumer laws analysed in the previous chapters, data protection law does not apply only to consumers, but to all natural persons.
 Whilst there is no distinct cause of action for breach of privacy in the UK, there is a tort of misuse of private information that is commonly used in privacy cases. It evolved out of the common law of breach of confidence, but it is now regarded as a stand-alone action. See PJS v News Group Newspapers Ltd [2016] UKSC 26; Wainwright v Home Office [2003] UKHL 53; Google Inc v Vidal-Hall and others [2015] EWCA Civ 311. Unlike breach of confidence, misuse of private information does not require a pre-existing relationship of confidence between the parties, as was the case in Campbell (Naomi) v Mirror Group Newspapers [2004] 2 AC 457 (HL).
 Campbell (n XX). The unlawful use of information in respect of which the claimant has a reasonable expectation of privacy is the so-called confidentiality component of the misuse of private information. The other component is ‘intrusion’: claimants need to prove unwanted intrusion into their private lives or harassment. For this framing of the misuse of private information, see PJS (n XX).
 Murray v Big Pictures (UK) Ltd [2008] EWCA Civ 446.
 Murray (n XX).
 Attorney General v Guardian Newspapers (No 2) [1990] 1 AC 109.
 There are contrasting authorities on the matter. On the one hand, in Ambrosiadou v Coward [2011] EWCA Civ 409, the court held that not all information that relates to a person’s private life is protected: ‘the information may be of slight significance, generally expressed, or anodyne in nature’ (similarly, see R (Wood) v Commissioner of Police of the Metropolis [2009] EWCA Civ 414). Conversely, in McKennitt v Ash [2005] EWHC 3003 (QB) it was observed that the mere fact that information was of a relatively trivial or anodyne nature would not necessarily mean the non-engagement of ECHR, art 8.
 See e.g. Paramasiven Appavoo and others, ‘Efficient and Privacy-Preserving Access to Sensor Data for Internet of Things (IoT) Based Services’, 2016 8th International Conference on Communication Systems and Networks (COMSNETS) (2016).
 Neil Richards, Intellectual Privacy: Rethinking Civil Liberties in the Digital Age (Oxford University Press 2015).
 Seneca, Epistulae Morales ad Lucilium, vol I, VII.8.
 For this interpretation of Seneca, see Christine Richardson-Hay, First Lessons: Book 1 of Seneca’s Epistulae Morales – A Commentary (Peter Lang 2006) 264.
 William H Dutton, ‘Putting Things to Work: Social and Policy Challenges for the Internet of Things’ (2014) 16 info 1.
 This is not to suggest that privacy and data protection are synonyms. There is private information that may be breached despite compliance with data protection laws and, equally, personal data also includes information that is not related to one’s private life. See e.g. Michèle Finck, European Parliament and Directorate-General for Parliamentary Research Services, Blockchain and the General Data Protection Regulation: Can Distributed Ledgers Be Squared with European Data Protection Law? (2019) 15 <http://publications.europa.eu/publication/manifestation_identifier/PUB_QA0219516ENN> accessed 11 June 2020. Nonetheless, it cannot be denied that a major policy goal of the GDPR is to increase the protection of privacy (see e.g. GDPR, recital 4).
 Noto La Diega, ‘Clouds of Things’ (n 2).
 On the repurposing of big data drawn from the IoT in smart cities, see Edwards (n 2).
 GDPR, art 5(1)(b).
 Another way of looking at it is ‘data appropriation’, as we called it in Guido Noto La Diega and Cristiana Sappa, ‘The Internet of Things at the Intersection of Data Protection and Trade Secrets: Non-Conventional Paths to Counter Data Appropriation and Empower Consumers’ (2020) REDC. This chapter draws on that paper.
 For a reflection on whether, and to what extent, the concept of ownership can be applied to personal data in the context of the IoT, see Václav Janeček, ‘Ownership of Personal Data in the Internet of Things’ (2018) 34 Computer Law & Security Review 1039.
 Josef Drexl, ‘Designing Competitive Markets for Industrial Data. Between Propertisation and Access’ (2017) 8 JIPITEC 257.
 Edwards (n 2).
 Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (First edition, PublicAffairs 2019).