20-year Anniversary Report: Sandra Leaton Gray

Sandra Leaton Gray: Biometrics in schools – adoption and privacy concerns

21st century schools can be complex, difficult places to manage and attend, not least because schools have grown substantially in size over the last couple of generations, creating the need for multiple systems of control and regulation. One of the primary issues for governing bodies and local education authorities is reconciling the need for bureaucratic efficiency with their duty to act in loco parentis – ensuring that the children in their care are where they should be, engaged in appropriate activities at the right time, and fed at appropriate intervals. Within this space, developers have sought to support schools (and monetise solutions to all manner of management problems) by providing digital products that streamline and enhance school processes, for example attendance monitoring, assessment and accounting/audit. It is within this commercial framework that we find biometrics proliferating, and with it the associated privacy concerns.

Biometrics started to be adopted by mainstream schools around fifteen years ago, mainly in countries such as the US, the UK, the Netherlands, Belgium and France, with products aimed at monitoring expenditure on school meals and access to library books, or occasionally for building access control. Early adoption figures for these functions are hard to come by, but it is estimated that by 2014 at least 40% of the UK school population had been fingerprinted for such purposes, or registered for palm vein readers or facial recognition systems (Darroch, 2011; Big Brother Watch, 2014). Some jurisdictions banned the use of biometrics in schools early on, among them US states such as Arizona, Illinois, Iowa, Maryland, Michigan and Florida, as well as countries such as Germany. Elsewhere there were protests: as early as 2005, the French 'Group Against Biometrics' went so far as to smash palm readers in schools. The adoption of biometrics in schools has therefore always met with controversy (Andrejevic and Selwyn, 2020).

Privacy concerns are generally twofold. Basic systems used for school meals, library book loans and room access engender concerns centred on the misuse of personal data, the likelihood of mission creep in which data collected for one purpose end up being used for another, and the dangers of mosaic identification, in which databases are illicitly cross-referenced to identify individuals. This is technically possible but difficult, as so few data points are collected for fingerprint or facial recognition purposes (although the number has increased in recent years). The limited data points are also one reason why biometric systems prove so unreliable in schools, with many children's accounts being confused (meal credits and debits frequently applied to the wrong transactions, for example). This happens because schools are poorly educated about the importance of setting the system's False Accept and False Reject Rates properly, both to ensure fairness and to avoid the same 10-20 children being regularly confused (much to their chagrin).
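To make the trade-off concrete: a matching system's decision threshold determines both the False Accept Rate (FAR, impostors wrongly accepted) and the False Reject Rate (FRR, genuine users wrongly rejected), and lowering one raises the other. The following is a minimal, purely illustrative Python sketch of that evaluation step, using synthetic score distributions rather than data from any real school system:

```python
import numpy as np

def far_frr(genuine_scores, impostor_scores, threshold):
    """Compute FAR and FRR at a given match-score threshold
    (higher score = stronger claimed match)."""
    far = float(np.mean(impostor_scores >= threshold))  # impostors wrongly accepted
    frr = float(np.mean(genuine_scores < threshold))    # genuine users wrongly rejected
    return far, frr

# Synthetic, illustrative score distributions (not from any real system)
rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.1, 10_000)   # scores when the right child is matched
impostor = rng.normal(0.4, 0.1, 10_000)  # scores when children are confused

for t in (0.5, 0.6, 0.7):
    far, frr = far_frr(genuine, impostor, t)
    print(f"threshold={t:.2f}  FAR={far:.4f}  FRR={frr:.4f}")
```

Choosing and reviewing this threshold is precisely the configuration step that, as the paragraph above suggests, schools are rarely trained to perform.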

The second form that privacy concerns take relates to higher-stakes applications, where a failure of a biometric system is likely to have serious personal consequences, usually of a disciplinary kind. This has been greatly amplified by the introduction of behavioural biometrics to educational settings in recent years. One example is virtual proctoring software for the remote monitoring of pupils sitting exams at home, which came to prominence during the COVID-19 pandemic. These systems track the most minute eye movements, amongst other things, and use a proprietary algorithm to diagnose 'cheating'. With their opaque analysis and limited training populations, they are routinely perceived by examinees as oppressive and unfair. Another example of high-stakes biometrics is the introduction of 'emotional' biometrics for behaviour management in the classroom, usually in an experimental capacity, as with a system tested in a Chinese middle school by Hikvision Digital Technology. That system was designed to assess whether pupils were paying attention in class by classifying their facial expressions as happy, sad, angry, surprised or neutral. It met resistance from parents and was quickly withdrawn. In the academic literature, the introduction of such technologies focused on the audit of the physical body has been described by Swauger (2020) as representative of a 'punitive pedagogy', with power and control at the centre of the product design rather than, say, the growth of knowledge or human flourishing.
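The opacity problem is easy to illustrate. The sketch below is a deliberately crude, hypothetical flagging heuristic, not any vendor's actual algorithm: a single hard-coded threshold on off-screen gaze events decides whether a pupil is labelled a cheat, with no context taken into account and no reasoning visible to the examinee.

```python
from dataclasses import dataclass

@dataclass
class GazeEvent:
    timestamp: float   # seconds into the exam
    on_screen: bool    # whether gaze was estimated to be on the screen

def flag_cheating(events: list[GazeEvent], max_off_screen: int = 5) -> bool:
    """Toy heuristic: one opaque number decides the outcome."""
    off_screen = sum(1 for e in events if not e.on_screen)
    return off_screen > max_off_screen

# A pupil who glances away every few seconds is labelled a 'cheat'
events = [GazeEvent(float(t), t % 7 != 0) for t in range(60)]
print(flag_cheating(events))  # True
```

Real systems are far more complex, but the examinee's position is the same: an unexplained score crosses an unexplained threshold.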

In the light of recent developments, the biometrics industry now stands at something of a crossroads. It can continue to develop and test products on what is, in schools, a captive population, paying minimal attention to the social consequences of their long-term use. Alternatively, it can involve stakeholders much more closely in the development of products, through collaborative approaches with significantly less commercial secrecy, so that proper scrutiny can take place and products can be revised and adapted as appropriate. 'Stakeholders' here should mean pupils, teachers and parents, rather than the finance departments or senior management teams involved in high-level procurement. Training populations also need to be more extensive, in order to take into account diverse cultural and racial backgrounds, as well as any special educational needs. This builds on the Biometrics Institute's policy of appropriate use, providing for an ethical approach to technological tools that are having increasingly profound social consequences.

References

Andrejevic, M. and Selwyn, N. (2020) Facial recognition technology in schools: critical questions and concerns. Learning, Media and Technology, 45(2), 115-128. DOI: 10.1080/17439884.2020.1686014

Big Brother Watch (2014) Biometrics in schools: the extent of biometrics in English secondary schools and academies. London: Big Brother Watch. https://bigbrotherwatch.org.uk/wp-content/uploads/2014/01/Biometrics-in-Schools.pdf

Darroch, A. (2011) Freedom and biometrics in UK schools. Biometric Technology Today, 2011(7), 5-7.

Swauger, S. (2020) Our bodies encoded: algorithmic test proctoring in higher education. In Stommel, J., Friend, C. and Morris, S. (Eds) Critical Digital Pedagogy. Denver: Hybrid Pedagogy Inc.

University College London Institute of Education
Sandra Leaton Gray
Member of the Privacy Expert Group, Biometrics Institute
s.leaton-gray@ucl.ac.uk

20-year Anniversary Report: Terry Aulich

Terry Aulich: Thoughts as Chair of the Biometrics Privacy Experts Group, on the 20-year anniversary of the Institute

As Chair of the Biometrics Institute's Privacy Experts Group, I have been amazed at the development of biometric technology and the biometrics industry over the last twenty years. Initially, biometrics had a public image of sci-fi semi-reality, and few understood what would come to pass. Now more than thirty-five countries issue biometrically enabled passports, mobile phones and doors are secured by biometrics, refugee medicines and food are distributed with biometrics ensuring security and fairness in the process, social media uses biometrics, and biometrics have a significant role to play in areas as diverse as marketing and military applications.

At the centre of this rapid development has been the Biometrics Institute. As a strategic advisor to the Board in those days, I watched three very important decisions being made. The first was the decision to ensure that the Institute was an independent and respected organisation. This was done in several ways: the most important was to ensure that constitutional control remained in the hands of the users, and that the Institute was the source of credible information and ethical practice.

The second was to ensure that privacy was front and centre of everything the Institute did. In practical terms this meant the creation of the Privacy Guidelines, which are updated every two years in line with social, technical and commercial challenges.

The third was the establishment of expert committees, conferences and training programmes, which have ensured that the Institute is a source of credible, trusted expert advice and information.

None of those three developments would have been possible without the thousands of hours given by volunteer experts, a seriously good succession of Board members from many sectors and countries and, most important of all, the long-term guidance and drive of the Institute's CEO, Isabelle Moeller.

Isabelle especially understood how events needed to be managed, and has had the golden touch that turns an international organisation into a biometric family in which sharing ideas, knowledge and, in many cases, friendships has become the norm.

The Institute has also worked with many international organisations, such as the UN and INTERPOL, and with leading universities. Much of this work has been on technical matters, anti-crime and anti-terrorism policy, human rights, privacy, and immigration.

Like all new technologies, biometrics can be abused out of greed, authoritarianism or plain stupidity. This is why the Institute's work has been so heavily concentrated on ethical considerations, where the human comes before the machine. This approach has been richly rewarded as more and more organisations around the world have joined the Institute.

From humble beginnings in Sydney, Australia, the Institute has grown with the industry and become multinational, eventually basing itself in London while retaining its office in Sydney. It has been one of the great unsung stories of the modern IT era, and we look forward to many more years, with more organisations realising that fact and taking advantage of all the Institute offers.

Aulich & Co.
Terry Aulich
+61 407 106 836
aulichterry8@gmail.com
Head of the Privacy Expert Group, Biometrics Institute

20-year Anniversary Report: UK Information Commissioner's Office

UK Information Commissioner's Office: Biometrics: data protection by design and default

With any new biometric technology, building public trust and confidence is essential to ensuring that its benefits can be realised. Where more sensitive categories of personal data are processed, for example via Live Facial Recognition (LFR) for the purposes of unique identification, the public must have confidence that its use is lawful, fair, transparent and meets the other standards set out in data protection law.

Biometric data extracted from a facial image can be used to uniquely identify an individual in a range of different contexts. It can also be used to estimate or infer other characteristics, such as age, sex, gender or ethnicity. The UK courts, in R (Bridges) v Chief Constable of South Wales Police and Others, concluded that, "like fingerprints and DNA", a facial biometric template is information of an "intrinsically private" character.[1] LFR in particular can collect this data without any direct engagement with the individual, which means it has greater potential to be used in a privacy-intrusive way.
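In engineering terms, the biometric data at issue is typically a template: a fixed-length numeric vector computed from the facial image by an embedding model. The minimal Python sketch below shows only the comparison step between two such templates; the function name, dimensionality and threshold are illustrative assumptions, and the embedding model itself is assumed rather than shown.

```python
import numpy as np

def templates_match(a: np.ndarray, b: np.ndarray, threshold: float = 0.6) -> bool:
    """Decide whether two facial templates belong to the same person,
    using cosine similarity against an illustrative threshold."""
    sim = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return sim >= threshold

# Hypothetical 128-dimensional templates from an assumed embedding model
rng = np.random.default_rng(1)
t1 = rng.normal(size=128)
t2 = t1 + rng.normal(scale=0.1, size=128)  # same person, slightly different image
t3 = rng.normal(size=128)                  # a different person

print(templates_match(t1, t2))  # True: near-duplicate templates
print(templates_match(t1, t3))  # False: unrelated templates
```

Because such a template can be computed and compared without the subject's participation, the "no direct engagement" point above is not incidental but inherent to the technique.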

The UK GDPR requires controllers to take a data protection by design and default approach, which helps them comply with the law's fundamental principles and requirements and forms part of its focus on accountability. This approach is explored further below. It is particularly important in the context of LFR because many issues of fairness, necessity and proportionality need to be addressed during the planning and design stage of a system.

LFR: The Information Commissioner’s opinion

The Information Commissioner previously published an Opinion on the use of LFR in a law enforcement context. It concluded that data protection law sets a high standard that must be met for the use of LFR in public places to be lawful. The Information Commissioner's Office (ICO) has built on this work by assessing and investigating uses of LFR outside law enforcement, covering controllers who use the technology for a wider range of purposes and in many different settings.

This work has informed the ICO’s view on how LFR is typically used today, the interests and objectives of controllers, the issues raised by the public and wider society, and the key data protection considerations. The Information Commissioner has published a separate Opinion to explain how data protection law applies to this complex and novel type of data processing.

Key requirements under data protection law  

Any use of personal data must be lawful, fair, necessary and proportionate. These are key requirements set by data protection law. Where the personal data in question is particularly sensitive, such as biometric data, there are stronger legal protections. Where the processing is automatic and there is a lack of choice or control for the individual, again there are stronger protections. This means that when LFR is used in public places for the automatic and indiscriminate collection of biometric data, there is a high bar for its use to be lawful.

The Information Commissioner has identified a number of key data protection issues which can arise where LFR is used for the automatic collection of biometric data in public places. These have been identified through the ICO's investigations, its work reviewing data protection impact assessments (DPIAs) and wider research. The issues include:

  • the governance of LFR systems, including why and how they are used;
  • the automatic collection of biometric data at speed and scale without clear justification, including the necessity and proportionality of the processing;
  • a lack of choice and control for individuals;
  • transparency and data subjects’ rights;
  • the effectiveness and the statistical accuracy of LFR systems;
  • the potential for bias and discrimination;
  • the governance of watchlists and escalation processes;
  • the processing of children’s and vulnerable adults’ data; and
  • the potential for wider, unanticipated impacts for individuals and their communities.

Controllers must perform a DPIA for any processing of biometric data that is likely to result in a high risk to individuals. This covers the broad spectrum of biometric characteristics that may be used to uniquely identify an individual, and includes monitoring publicly accessible places on a large scale.

Data protection by design and default

Data protection by design and default is about embedding data protection into everything controllers do, throughout all processing operations. Controllers should therefore consider privacy and data protection when procuring, purchasing or developing any biometric system. They should also ensure that biometric products or services adopted from vendors have been designed with appropriate data protection and privacy-preserving features built in. Controllers, not technology vendors, are responsible for this under the law.

It is important that controllers do not deploy "off-the-shelf" solutions without adequate due diligence to understand the technical processing and its privacy implications. Controllers should also consider whether a biometric system such as LFR is designed with data subjects' rights in mind. For example, they should have the capability to isolate and extract an individual's personal data in response to a subject access request, the exercise of other individual rights, or a disclosure to authorised third parties, unless valid exemptions apply.
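As a design illustration only (the names and structure below are assumptions, not an ICO specification or any vendor's API), a system built with data subjects' rights in mind might key every stored template to a subject identifier, so that one person's data can be isolated, exported or erased on request:

```python
from dataclasses import dataclass, field

@dataclass
class BiometricRecord:
    subject_id: str
    template: bytes    # encoded biometric template
    captured_at: str   # ISO 8601 timestamp
    purpose: str       # documented purpose of collection

@dataclass
class SubjectRightsStore:
    """Toy store in which every record is retrievable and erasable per subject."""
    records: list[BiometricRecord] = field(default_factory=list)

    def export_subject(self, subject_id: str) -> list[BiometricRecord]:
        # Supports subject access requests: isolate one person's data
        return [r for r in self.records if r.subject_id == subject_id]

    def erase_subject(self, subject_id: str) -> int:
        # Supports erasure: remove one person's data, report how many records went
        before = len(self.records)
        self.records = [r for r in self.records if r.subject_id != subject_id]
        return before - len(self.records)
```

A system that only stores templates in an undifferentiated matching index, by contrast, may be unable to honour such requests at all.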

In the specific context of LFR, it is especially important that controllers set out clear processes and policies governing its use, including the following (sketched as a configuration structure after the list):

  • the circumstances in which the controller may activate the LFR system;
  • clear criteria and governance for any watchlists;
  • well-defined procedures for intervention in the event of a match, and clear escalation measures;
  • how data subjects are informed, how complaints will be handled, and how the public's data protection rights will be fulfilled; and
  • processes to continually monitor the impact of the LFR system and assess whether it remains fair, necessary and proportionate.
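These policy elements can be made concrete at design time, for example by requiring every deployment to carry a machine-readable record of the governance decisions behind it. The sketch below is purely illustrative; the field names are assumptions, not ICO terminology.

```python
from dataclasses import dataclass, field

@dataclass
class LFRDeploymentPolicy:
    """Illustrative record of the governance decisions behind one LFR deployment."""
    activation_criteria: list[str] = field(default_factory=list)  # when the system may be switched on
    watchlist_criteria: str = ""     # who may be added, by whom, and when entries are reviewed
    match_procedure: str = ""        # what staff must do when the system reports a match
    escalation_contact: str = ""     # who decides in ambiguous or contested cases
    subject_notice: str = ""         # how data subjects are informed of the processing
    complaints_process: str = ""     # how complaints and data protection rights are handled
    review_interval_days: int = 90   # scheduled reassessment of fairness, necessity, proportionality
```

Writing these decisions down in one place also gives the DPIA, discussed below, something concrete to assess.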

The Information Commissioner recommends integrating data protection considerations at the very start of any biometric processing and documenting the decisions made in a DPIA. This will help controllers comply with their legal obligations under data protection law, and will also enable them to adopt the data protection by design and default approach.

[1] R (Bridges) v Chief Constable of South Wales Police and Others [2019] EWHC 2341, paragraph 59

UK Information Commissioner's Office
Steven Wright
Member of the Future Direction Group, Biometrics Institute

Joined in 2020
