Some Australian schools, including at least one Catholic girls' school, are reportedly trialling facial recognition technology that takes the place of teachers doing the morning roll call. The cameras will know if each and every child is present … or not.
The cameras keep scanning through the day to ensure the students haven’t bolted for the fish and chip shop or bailed early for the day.
Presumably, too, they’ll know who has been throwing paper planes behind the teacher’s back – and who decorated the toilet block like a Christmas tree with toilet paper.
“This takes away a huge sense of freedom for these children,’’ says Dr Niels Wouters, at the Microsoft Research Centre for Social Natural User Interfaces at the University of Melbourne.
“Always being surveilled, always being on camera: We’ve all done things behind a teacher’s back and that will no longer exist I suppose,” he says. “It’s an important thing in childhood to explore boundaries.”
He also notes that the identity traits a child uploads on social media or exhibits in the playground can differ from who that child really is. “It’s a good thing in a child’s development that they might present themselves differently on social media, as a form of play, but also as an experimental thing they can explore.”
A child trying out who they are under the constant gaze of an intelligent camera could mean they become forever judged for pulling weirdo faces.
“I think we’ve reached a really crucial point when it comes to facial recognition,” Dr Wouters says. “I think there’s an urgent need to talk about the social impacts.”
His main concern is who will own the data when facial recognition technology becomes ubiquitous.
“I suspect it won’t be the school itself. Not even the parents. I worry that all the analyses will be sent through to a third-party service provider – and from there it will fall into the hands of the big players of the technology market.” Meaning the likes of Google, Facebook and Microsoft.
This sort of data is used by big companies to retrain their deep learning technologies, to further refine recognition and analysis accuracy.
“It creates revenue for them, but ultimately society gets very little in return,” Dr Wouters says.
Does society care that much? “We did a number of studies where we asked young people what they thought of facial analysis. They say: ‘Do you know what? I don’t really care. I know I’m constantly on camera’.”
In that case, sit up straight, kiddies, and smile. Your future may depend on it.
Dr Wouters offers this scenario: What if your facial analysis data ends up with a potential employer?
“What if they can look back and see that in 2018 you were a little bit tired in class. Suddenly there are assumptions that are used to form a hiring decision and you’re not suitable for a management position.”
The technology itself suffers from subjective bias and inaccuracies.
Dr Wouters and fellow University of Melbourne researcher Professor Frank Vetere developed an artificial intelligence (AI) algorithm they call Biometric Mirror – a program that provides a character analysis from a single photograph. They trialled it at street level as a public education exercise. “Out in the wild, instead of the lab,” Dr Wouters says.
Random passersby saw themselves on a screen – along with information about their age, gender, emotional state, attractiveness, aggression levels and weirdness. “We asked people to imagine that our AI thinks you’re aggressive and has forwarded this information to law and order authorities.”
But as Dr Wouters wrote in a university online article: Biometric Mirror isn’t a reliable tool for psychological analysis. Rather, “it only calculates the estimated public perception of personality traits based on facial appearance”.
However, as Matthew Warren, a professor of cyber security at Deakin University, told The New Daily: Facial analysis technology, similar to that trialled by Dr Wouters, is already making hard judgements about student attentiveness and other behaviours in schools in China.
“In that context … look at what’s happening in China with the social mobility index,” Professor Warren says. “Every citizen is evaluated and given a score out of 100 and this affects job prospects and travel options.”
Professor Warren notes that young people don’t understand the concept of privacy, because they live in a culture of sharing themselves freely online. But just as old Facebook posts have ruined careers years after they were posted, a poor moment captured by facial recognition can be equally ruinous.
Dr Lisa McKay-Brown is a senior lecturer in learning intervention at the Melbourne Graduate School of Education. She says the facial recognition experiment in Australian schools might actually deter some students from attending.
“There may be students who are having difficulty attending school due to various factors … and thinking that facial recognition will be used to monitor their attendance will be distressing,” Dr McKay-Brown says.
“For some of the young people we work with, just walking through the school gate on any given day is challenging. And for them to know that this technology is being used may be enough for them to refuse to attend.”