Africa Studio/Shutterstock
In conversation with my teenage daughter last week, I pointed out a news report which flagged concerns over the use of facial recognition technology in several school canteens in North Ayrshire, Scotland. Nine schools in the area recently introduced the practice as a way to take payment for lunches more quickly and minimise COVID risk, though they have since paused rolling out the technology.
When I asked my daughter whether she would have any concerns about the use of facial recognition technology in her own school canteen, she casually replied: "Not really. It would make things a lot faster at checkout though."
Her words validate the concern that children are much less aware of their data rights than adults. And although there are specific provisions and safeguards for children under a range of data protection legislation, the use of facial recognition technology on children could pose unique privacy risks.
Facial recognition technologies identify and authenticate people by detecting, capturing and matching faces to images from a database. These technologies are powered by artificial intelligence (AI), specifically the technique known as machine learning.
Machine learning predicts outcomes based on historical data that has been fed into the system. For facial recognition, it predicts the identity associated with a digital representation of a person's face, or "face print", based on a database of facial images. The software adapts with experience, in time learning to generate predictions more accurately.
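The matching step can be sketched in miniature: represent each enrolled face as a numeric "face print" (an embedding vector a trained model would produce) and identify a new face by finding the closest enrolled vector. This is an illustrative toy under those assumptions, not the code of any real system; the names, vectors and threshold below are invented, and real systems use embeddings with hundreds of dimensions.

```python
import math

# Hypothetical enrolled "face prints": toy 3-dimensional embedding
# vectors standing in for what a trained face-recognition model
# would produce for each known person.
DATABASE = {
    "student_a": [0.9, 0.1, 0.3],
    "student_b": [0.2, 0.8, 0.5],
}

def cosine_similarity(a, b):
    """Similarity between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def identify(face_print, threshold=0.9):
    """Return the closest enrolled identity, or None if no match is confident."""
    best = max(DATABASE, key=lambda name: cosine_similarity(face_print, DATABASE[name]))
    if cosine_similarity(face_print, DATABASE[best]) >= threshold:
        return best
    return None

# A face print close to student_a's matches; an unrelated one does not.
identify([0.88, 0.12, 0.31])
identify([0.0, 0.0, 1.0])
```

The threshold is the key design choice: set it too low and strangers are misidentified as enrolled students; set it too high and legitimate users are rejected, which is one source of the error and bias concerns discussed later in this article.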
Facial recognition technology is now used in a variety of ways, such as to verify the identity of employees, to unlock personal smartphones, to tag people on social media platforms like Facebook, and even for surveillance purposes in some countries.
Facial recognition technology by itself is not the problem. Rather, the issue is how it's used and, in this instance, the fact the technology has now infiltrated school corridors and targeted a vulnerable demographic: children.
So what are the privacy issues for children?
Your face print is your data, so for any facial recognition system it's essential to know how the image databases are collated and stored. Although I might grudgingly agree to the use of facial recognition technology to enter a concert venue, I wouldn't be thrilled if my face print was retained for "other commercial purposes of the company" (a phrase that appears quite commonly in the fine print of ticket sales regarding the use of personal data).
If facial recognition technology is used in school settings, we'll need clear information as to whether and how students' images will be used beyond the purpose of the lunch queue. For example, are they going to be shared with any third parties, and for what purpose? Issues could arise, say, if face prints are linked to other data on the child, like their lunch preferences. Third parties could theoretically use this data for marketing purposes.
We'd also need information as to how the images will be protected. If students' face prints aren't properly secured, or the system isn't robust enough to fend off hackers, this creates cyber-security risks. It may be possible for hackers to link children's face prints to other data about them, and to track them.

Privacy concerns are key when it comes to facial recognition technologies in schools.
Monkey Business Images/Shutterstock
The heightened privacy risk surrounding the use of facial recognition technologies in schools also relates to informed consent. Although UK data protection law specifies that children aged 13 and over can consent to the processing of their personal data, this doesn't mean they fully understand the implications. For example, one survey found children between the ages of eight and 15 had difficulty understanding the terms and conditions of Instagram.
Children, parents and guardians should be provided with nothing less than full information, couched in language children can easily understand. Any data subject, including a child, has the right to know exactly how their personal data will be processed, shared and stored, and can specify the conditions under which their consent applies. Anything less than prudence and transparency will risk jeopardising children's privacy.
Normalising the surveillance of children?
These are just some of the questions the use of facial recognition technologies in schools raises. Facial recognition technology also carries other risks, such as errors, which could, for example, lead to students being charged incorrectly. And as with any AI system, we should be concerned about whether the algorithms and data sets are free from bias, and have clear, complete and representative training data.
Importantly, the use of facial recognition technologies in schools also goes some way towards normalising the surveillance of children. It's possible the knowledge that they're being tracked in this way could affect some children's wellbeing.
It's not surprising that the UK's data watchdog, the Information Commissioner's Office, has stepped in to investigate the use of facial recognition technologies in school lunch queues. And in light of the inquiry, it's pleasing to see that North Ayrshire Council has paused rolling out the practice.
But as we move further into the digital age, it's possible the use of facial recognition technologies among schoolchildren will resume, or even be taken up more widely. If this is to happen, the use of facial recognition must yield significantly more benefits than risks, taking into account the specific circumstances of using the technology on children.

Pin Lean Lau is affiliated with the Interest Group on Supranational Bio-Regulation of the European Association of Health Law (EAHL).