Remote learning and digital wellbeing: data protection and copyright issues

Today with Dr Giulia Schneider (Sant’Anna, Pisa), we presented our research on the data protection, privacy, and copyright issues in remote teaching. What happens when US-based proprietary platforms dispossess our students and teachers of their data and content? Can the platforms’ dodgy T&Cs prevail over the law?

We are thankful to the Nexa Center for Internet & Society (Turin) and its Directors Professor Marco Ricolfi and Professor Juan Carlos de Martin for the kind invitation.

Our key conclusions are:

  1. We need better and more balanced governance of data and contents to maximize students’ and teachers’ digital wellbeing
  2. We are running the risk of a de facto privatisation of universities through (i) the dispossession of data and learning materials by private platforms, and (ii) the prevalence of contracts over the law (including over the GDPR).
  3. E-proctoring and re-use of data for advertising purposes well illustrate the privacy and data protection issues in remote teaching
  4. The upload filter and the limitations of the teaching exception confirm the tilting of copyright in favour of rightsholders, to the detriment of freedom of expression, the right to education, and academic freedom
  5. Collective bargaining and outsourcing could improve T&Cs, privacy policies, and licenses (that are used to put in place a form of private ordering of data and contents)
  6. Other solutions include certifications and investments in public, European, and open infrastructures. This could provide the backbone for open-source in-house platforms developed at national, collective, or university level. The reliance on US proprietary platforms is likely to be illegal and needlessly expensive, even leaving fundamental rights considerations aside.

The video of the presentation (in Italian) can be watched here.

Our published research on this topic:

Pascault, Léo and Jütte, Bernd Justin and Noto La Diega, Guido and Priora, Giulia, Copyright and Remote Teaching in the Time of Coronavirus: A Study of Contractual Terms and Conditions of Selected Online Services (June 15, 2020). European Intellectual Property Review (Forthcoming), Available at SSRN: https://ssrn.com/abstract=3652183 or http://dx.doi.org/10.2139/ssrn.3652183

Angiolini, Chiara and Ducato, Rossana and Giannopoulou, Alexandra and Schneider, Giulia, Remote Teaching During the Emergency and Beyond: Four Open Privacy and Data Protection Issues of ‘Platformised’ Education (November 13, 2020). Opinio Juris in Comparatione, vol. 1 (2020), http://www.opiniojurisincomparatione.org/opinio/article/view/163/171, Available at SSRN: https://ssrn.com/abstract=3779238

Emergency Remote Teaching: a study of copyright and data protection terms of popular online services (Part I)

Emergency Remote Teaching: a study of copyright and data protection policies of popular online services (Part II)

Call for abstracts: «Are We Owned? A Multidisciplinary and Comparative Conversation on Intellectual Property in the Algorithmic Society»

I am chuffed! The Modern Law Review is funding a one-day conference to be held in Stirling (IN PERSON!) on Friday 8th October 2021.

Theme: «Are We Owned? A Multidisciplinary and Comparative Conversation on Intellectual Property in the Algorithmic Society»

Intellectual Property (IP) plays a crucial role in allowing uses of new technologies that are detrimental to society and in preventing beneficial ones. IP is everywhere and lends itself to monopolising virtually anything. We may think we own ’our’ phone, but in fact it belongs to the holders of the copyright on the code running on it, to the manufacturers owning its design and the patents on how it works, as well as the trademarks on its logos, on the way we swipe, etc. What happens when it is no longer just computers and phones that are embedded with software and other IP-protected digital content? In an Internet-of-Things world, these proprietary smart objects are everywhere: in our bedroom, in our bathroom, in our body. Our behaviour becomes heavily restricted by the Terms of Service, Privacy Policies, End-User License Agreements, etc. that cover every aspect of the things we thought we owned. We have become digital tenants, not owning or controlling any object or data around us. To the point that, one can argue, we no longer own: we are owned (Mulligan 2015; Fairfield 2017).

Thanks to the Modern Law Review grant, there will be no registration fee and the Review has the right of first refusal for papers presented at the conference. Confirmed speakers include Professor Christina Mulligan (Vice Dean of Brooklyn Law School), Professor Marco Ricolfi (Co-Director of the Nexa Center for Internet & Society, University of Turin), Professor Joshua Fairfield (William Donald Bain Family Professor of Law, Washington and Lee University School of Law), Associate Professor Dr Matthew David (Department of Sociology, Durham University), and Amy Thomas (Research Associate at CREATe / University of Glasgow).

We welcome 300-word abstracts on any topic related to IP in the algorithmic society, including:

  • Can AI create art and other copyright materials? Is there anything else beyond the copyright-public domain binary? Is Brexit an opportunity to abandon the “author’s own intellectual creation” originality standard?
  • One of the fundamental principles in IP law is that software “as such” cannot be patented. If every physical object becomes embedded with software, would this mean that all software becomes patentable?
  • A combination of IP rights, contracts, and technological protection measures is allowing companies to ‘own’ our data. Is this justified? How can we counter ‘data dispossession’? Is antitrust the solution?
  • Under the General Data Protection Regulation, data subjects do not have a right to access their personal data if this adversely affects third parties’ IP rights. How will this provision play out in practice?
  • Can a strategy centred on the ‘commons’ and on openness be the solution to the problems of IP in the algorithmic society?

Postcolonial, queer, feminist, posthuman, and critical perspectives are particularly welcome. In order to ensure diversity of speakers, three “widening access” bursaries are available to cover the travel and accommodation costs of colleagues from underrepresented groups (BAME), colleagues from remote communities, early career academics (including PhD students), and other colleagues from disadvantaged groups (e.g. precarious workers). Please do specify in your submission whether you qualify for a bursary.

Deadline for the abstracts: Monday 1st March 2021

Date of the conference: Friday 8th October 2021

Venue: Stirling Court Hotel

Abstracts can be sent to me

Virtual conference “Hate Speech, Digital Discrimination, and the Internet of Platforms”

I’m pleased to share that, on behalf of GenIUS, Italy’s journal of gender, sexuality, and law, I’ve organised the annual conference of the journal, this year dedicated to “Hate Speech, Digital Discrimination, and the Internet of Platforms.”

The conference will take place on Friday 26th March 2021, from 1pm to 5pm, and will be hosted on Teams Live.

Speakers include:

Prof. Kaori Ishii, Professor, Faculty of Global Informatics, Chuo University (Japan): The Japanese legal approach to digital discriminations on platforms

Prof. Enrico Camilleri, Chair of Private Law, Università degli Studi di Palermo (Italy): Hate speech and social network: duty of care and liability

Dr Kim Barker, Senior Lecturer in Law, Open University (England) – former Stirling Law School: Online Misogyny as a Hate Crime: An Obstacle to Equality?

Prof. Luciana Goisis, Associate Professor of Criminal Law, Università degli Studi di Sassari (Italy): Hate Crimes, Social Media and Criminal Law: Reflections on the Recent Italian Legislative Proposal Against Incitement to Discrimination and Hate

Prof. Ann Bartow, Professor of Law, University of New Hampshire (United States): Online Sex Trafficking

Prof. Giovanni Ziccardi, Associate Professor of Legal Informatics, Università degli Studi di Milano (Italy): Algorithms that hate

Prof. Alexandre de Streel, Professor of EU Law at the Université de Namur and Director of the Centre de recherche information, droit et société (Belgium): Towards a better EU law for the moderation of illegal and harmful online content

Spokesperson of the Council of Europe (tbc), Strasbourg (France)

Please register here.

Brochure of the 2021 international conference of GenIUS

Launch of the Scottish Law and Innovation Network

On Wednesday 31 March 2021, from 4pm to 5pm, the Scottish Law and Innovation Network (SCOTLIN) will be launched online. SCOTLIN is a research network funded by the Royal Society of Edinburgh that brings together academics, practitioners, industry, and civil society with expertise in law and innovation and a link to Scotland.

Logo of the Scottish Law and Innovation Network

I will introduce the network and the keynote speech will be delivered by Professor Hector MacQueen on “Law and innovation in Scotland: some impressionistic thoughts”.

Please email Zihao Li (z.li.6@research.gla.ac.uk) for the Teams link.

The IoT issue of the European Journal of Consumer Law is out!

I am delighted to share the news that the IoT issue of the European Journal of Consumer Law / Revue européenne de droit de la consommation is out!

Cover of the IoT issue of the European Journal of Consumer Law

It was a pleasure to coordinate this special issue (EJCL 2020/3), collaborate with the Editor-in-Chief Ellen Van Nieuwenhuyze (CJEU), and with the authors Ugo Mattei (University of California, Hastings College of the Law), Rolf H Weber (Universität Zürich), Megan Richardson (University of Melbourne, Australia), Eliza Mik (Singapore Management University-Melbourne Law School), my fabulous co-author Cristiana Sappa (IÉSEG School of Management), Mirjam Eggen (Universität Bern), Siobhan McConnell (Northumbria University), Federica Giovanella (Università degli Studi di Udine), Damian Clifford (Australian National University-IALS), Rachelle Bosua (University of Melbourne-Open Universiteit), Karin Clark (University of Melbourne), Blanka Vítová (Palacký University Olomouc), David Lindsay and Evana Wright (University of Technology Sydney).

The table of contents can be viewed below and downloaded here

COVID contact-tracing app in England and Wales: functionalities, incentives, risks

Unitelma Sapienza invited me and Dr Shaira Thobani to talk about COVID-19 contact tracing apps in Italy, England, and Wales. This was part of the YouKey Talks organised by Dr Roberto Sciarrone and Mr Stefano Oporti to better understand the relationship between our countries in the time of Brexit.

How does the app work?

The app used in England and Wales has four main functions. Firstly, it uses the Apple/Google exposure notification API to notify you if you have been near a risky contact. This is the same API that the Italian app Immuni uses. The idea here is that no central database of individuals and their connections to other people is maintained.

The Exposure Notification system is a ‘decentralised’ contact tracing system, based on the DP-3T protocol developed by a consortium of universities including Turin. 
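To make the decentralised idea more concrete, here is a minimal Python sketch of the kind of flow a DP-3T-style system uses. It is a toy illustration under simplifying assumptions (the key sizes, rotation periods, and function names are made up), not the actual Apple/Google or DP-3T code.

```python
# Toy sketch of decentralised exposure notification (illustrative only).
import os
import hashlib

def daily_key() -> bytes:
    """Each phone generates a fresh random daily key; it never leaves the
    device unless the user tests positive and consents to upload it."""
    return os.urandom(16)

def rolling_ids(key: bytes, periods: int = 144) -> list[bytes]:
    """Derive short-lived rotating identifiers from the daily key.
    Observers cannot link them to each other or to the phone."""
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(periods)]

# Phone A broadcasts its rotating identifiers over Bluetooth; phone B stores
# whatever it overhears, locally, with no central database of contacts.
key_a = daily_key()
heard_by_b = set(rolling_ids(key_a)[:3])   # B was near A for a few periods

# If A later tests positive, only A's daily key is published. B re-derives
# the identifiers and checks for a match entirely on the device.
published_keys = [key_a]
exposed = any(rid in heard_by_b
              for k in published_keys
              for rid in rolling_ids(k))
print("Exposure detected:", exposed)
```

The point of the design is visible in the sketch: nothing identifying is ever uploaded about anyone else, and the matching happens on the user’s own phone.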

The second function is a QR code system for checking in to locations. This system is partially centralised and is based on code from New Zealand.

When you check in to a location that has printed a QR code from the government website, your phone stores that location in memory. Every few hours, it downloads a list of compromised places and compares them against the locations stored on your phone.
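As a rough illustration of that local comparison (the venue identifiers, the flagged list, and the time window below are hypothetical placeholders, not the real NHS data formats), the logic looks something like this:

```python
# Toy sketch of the venue check-in flow described above (illustrative only).
from datetime import datetime, timedelta

visited = []  # visits are stored only on the phone

def check_in(venue_id: str) -> None:
    """Scanning a venue's QR code records the visit locally."""
    visited.append((venue_id, datetime.now()))

def check_exposure(compromised: set[str], window_days: int = 14) -> bool:
    """Every few hours the phone downloads a list of venues flagged as risky
    and compares it against the locally stored visits."""
    cutoff = datetime.now() - timedelta(days=window_days)
    return any(v in compromised and t >= cutoff for v, t in visited)

check_in("CAFE-1234")                # hypothetical venue ID from a QR code
flagged = {"CAFE-1234", "GYM-9876"}  # hypothetical list published centrally
print("Alert user:", check_exposure(flagged))
```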

While over one million QR codes have been generated by UK companies and venues, this system has barely been used in practice: the government has not triggered many venues in the system at all (after a month of operation, only two alerts had been sent out, which may not even have corresponded to real places).

The third service is a way to order a coronavirus test, and receive the results, along with a countdown timer.

The fourth service is an indicator of the risk level of the area your postcode is currently in.

How is the uptake incentivised?

Uptake is incentivised by the legal requirements around the QR code system. Scanning a code is an alternative to writing details down. As many companies had a very cumbersome login process, the QR code system is ostensibly much easier.

Furthermore, companies offering certain services are obliged to put up QR codes and to have them present as an alternative to written contact tracing details (which must also remain possible).

There is also an advertising campaign.

In earlier iterations of the app, it was hard to claim self-isolation compensation when notified by the app, as opposed to by manual contact tracers over the phone. This has been fixed.

Is the app effective?

We are still awaiting a full assessment of the effectiveness of the app.

Initially, people thought that you would need 80% of smartphone users to install the app, but that number has been revised downwards with further modelling. It is a difficult thing to model, because you also need people who know each other to have the app, so a high prevalence among young people is already valuable. The idea of the app is to stop less vulnerable people from ever visiting older, vulnerable people, not to notify those older people when it is too late.

The main challenge the app faces is that the testing system has not been well integrated with the authentication needed to trigger a positive status in the app for somebody who has tested positive. That is the most difficult part, and it has not received the focus it should have from governments.

Privacy risks

The earliest version of the app used a different protocol, which was centralised. This meant that your phone would constantly emit an identifier that did not rotate, meaning anybody could track you across space once they had seen you once, and that the identifier could be decrypted into your phone’s ID by the central authority with the master key, which was managed by a combination of GCHQ (Government Communications Headquarters) and NHS England/DHSC (Department of Health and Social Care).

This would also mean that users of the app would find themselves located in a centralised social network of who saw whom. This could be used to deduce political groups, affairs, family ties, or more. Other people would be uploading data about whether or not they were near you, and that would allow very sensitive country-level data to be constructed. There is no good reason to believe, given the epidemiological features of the virus, that this data would be useful for tracking the disease.

Lastly, the government could install small Bluetooth sensors at places like supermarkets or train stations to enforce quarantine for anyone whose phone was emitting such an identifier, as they would know who was walking past.

However, in June, the government switched to a decentralised model which does not suffer from these problems, as the numbers emitted are random and rotate, and no one ever uploads information about anyone else.

The main risk that remains with a decentralised system, and which is also present with a centralised system (as it is an inherent feature of any mobile-phone-powered contact tracing system), is that if you go around sniffing people’s identifiers and work out to whom they may correspond (e.g. get them alone and know how to identify them in person while also carrying a Bluetooth sensor), you will be able to find out whether they test positive during that period. This is because the identifiers you collected will be sent to your phone for checking later on, once they test positive. However, this is a difficult attack to carry out, because there are phones everywhere and they confuse the signal, and it requires specialist hardware. Furthermore, you only learn something you might have found out anyway through any social interaction with that person.

Risks beyond privacy

While most of the discussion around contact tracing systems has focused on privacy and data protection, their use also has wider implications for individuals and communities, particularly in the case of mobile apps. These concern legality, moral responsibility and community, autonomy, and democracy, which even expansive conceptions of privacy and data protection may not fully accommodate. (Pila 2020)

Datafication (Brown and Duguid 2000) threatens democracy: when people become the object of technology, and everyday life and experience become grist for capitalist and political mills, important questions arise about what is humanly desirable, what it means to be human, and who gets to decide (Jonas 1979).

It is the very nature of advanced technologies to generate new centres of formal and actual power that elude democratic control and remain largely inaccessible to citizens (Somsen 2009). The result is precisely the types of power asymmetries that breed corporate and political authoritarianism and indifference to individuals’ lives.

Covid-19 provides new opportunities for governments and organisations to consolidate their power at the expense of citizens (Pila 2020).

The importance of civic engagement

Winner (1992) argued that, for a mix of intellectual and social reasons, the design and development of new technologies is an insufficiently democratic activity.

The proposal for greater civic involvement in the design and development of new technologies is compelling given the interests and values at stake.

The second version of the Covid-19 app was developed with input from ‘diverse stakeholders’, including public health and data protection authorities, civil society organisations, and ‘volunteers who provided a patient and public point of view.’

By contrast, the development of version 1 was attributed to ‘a team of world-leading scientists and doctors’, drawing ‘on expertise from across the UK government and industry’, and involving ‘experts from the National Cyber Security Centre’.

These are questions that require a different kind of expertise and wider opportunities for public involvement in social choices regarding technologies.

Brexit

Brexit precluded the app from becoming internationally interoperable, as the European Commission decided to copy all the data across borders rather than interoperate in a more minimalist way (as we suggested).

Because they are paranoid that this might be personal data, or health data (even though there is a strong argument that it is not, based on its technical characteristics and its inability to identify people), they did not establish the interoperability agreement through their “gateway” with the United Kingdom. They also did not do so with Switzerland, for the same reason.

Artificial Intelligence and Intellectual Property: The View of The British and Irish Law, Education and Technology Association (BILETA)

Between September and November 2020, the UK Intellectual Property Office (UKIPO) ran a consultation about the implications Artificial Intelligence (AI) may have for Intellectual Property (IP) policy, as well as the impact IP may have on AI.

This response was prepared on behalf of the British and Irish Law Education and Technology Association (BILETA) by Prof Dinusha Mendis, Dr Felipe Romero-Moreno, Dr Hiroko Onishi, and myself.

BILETA was formed in April 1986 to promote, develop and communicate high-quality research and knowledge on technology law and policy to organisations, governments, professionals, students and the public.

This submission focuses on the main copyright and trade mark issues in AI.

You may download our submission here.

Countering the Platformization of Education

On 17 December 2020, I will present “Countering the Platformization of Education” with Rossana Ducato at Shifting education from classrooms to online platforms – smooth as silk?, a virtual event organised by Legal Hackers Luxembourg.

Due to the spread of Covid-19 in the first months of 2020, the activities of most universities and schools across Europe had to migrate online. This rapid shift towards online education has been characterized by the use of third-party service providers (like Zoom, MS Teams, Skype, etc.).

The “platformization” of education, however, raises many concerns about copyright and data protection.

Agenda
18.00 – 18.15: opening by Chris Pinchen (Bee Secure ambassador and organizer of the Privacy Salon)
18.15 – 19.00: “Countering the Platformization of Education” by Rossana Ducato (University of Aberdeen) and Guido Noto la Diega (University of Stirling)
19.00 – 19.30: online apéro

Registration and more information here

AI inventions, AI-assisted inventions, AI-generated inventions and patent law

On 4 December 2020, I talked about AI and patent law at the two-day workshop Neuroni Artificiali e Biologici (Artificial and Biological Neurons), organised by the University of Trento. The workshop is part of the ERC-funded project BACKUP, led by Professor Lorenzo Pavesi, Chair of Experimental Physics. BACKUP addresses the fundamental question of what role neuron activity and plasticity play in information processing and storage in the brain.

This workshop assessed the legal, philosophical, and ethical issues in AI, robotics, and biotechnologies. Its goal was to start a multidisciplinary conversation and set up a community where we can identify paths for future collaborative research.

The workshop consisted of talks, including mine, roundtable conversations, and a white paper drafted by the PhD students in attendance.

Key takeaways of my talk

We must foster an inclusive and diverse public debate on AI and ethics. This also has practical consequences, because immoral AI applications cannot be patented.

The widespread use of AI applications to carry out research and produce inventions creates a risk of over-monopolisation of ideas, thus stifling innovation. One way to address the issue is to change the standard for assessing the inventive step, a requirement that must be met for an invention to be patentable. The inventive step is currently assessed from the perspective of the person skilled in the art. This low threshold should be replaced with the higher threshold of the ‘AI-enhanced multidisciplinary team’.

The European Patent Office, the UK Intellectual Property Office, and the US Patent and Trademark Office agree that AI cannot be an inventor. Fair enough, but how do we prevent applicants from simply making up a human inventor? How do we verify that an invention presented as human-made is not actually AI-generated? Could an AI system verify humanity?

Not only Zoom. Remote Teaching Digital Platforms, Copyright, and Data Protection

On 25th November 2020, Dr Rossana Ducato and I presented our research on remote teaching, copyright, and privacy at the 83rd Nexa Lunch Seminar, a research event hosted (virtually) by the Nexa Center for Internet & Society, a joint venture of the University of Turin and the Polytechnic of Turin.

The rapid spread of COVID-19 in March 2020 shut down universities in most European countries. Teaching moved online and most universities are currently planning to deliver at least part of their teaching in the coming academic year in a blended form. With the online shift of teacher-student interactions, the choice of the teaching medium has never been more important (Ducato et al. 2020).

The post-pandemic university will have to make a responsible choice with regard to which tools to use to deliver its courses. Digital tools developed and operated by third parties significantly affect teachers’ and students’ fundamental rights and freedoms, including IP rights. Our research sheds light on the copyright issues arising from the use of some popular remote teaching platforms (e.g. Zoom) and critically assesses whether these concerns remain pertinent in a post-COVID blended learning environment (Pascault et al. 2020).

Our project has so far analysed the terms and conditions, privacy policies, and community guidelines of a sample of nine online services used across Europe, in order to assess whether the needs of teachers and students are met. The analysis investigates whether sufficient and clear information is provided to enable teachers to carry out educational activities and interact with their students without uncertainty as to the potential legal consequences of their use, and without concerns regarding the protection of their content.

Key takeaways

The shift to online learning exacerbated existing problems, including digital dependency

The system of public HE does not have an infrastructure that can guarantee the fundamental right to education in remote learning

We observe a trend whereby HE institutions adopt third-party platforms that do not have learning as their core mission.

The COVID emergency left little time for scrutiny: now is the time to stop and think again

It is crucial to invest in a public infrastructure that allows better control over data and learning content

You can find our research on remote teaching, copyright, and privacy here
