A young black man hoping to renew his passport online was stunned when the automated photo checker mistook his lips for an open mouth.

The facial detection system informs applicants when it thinks an uploaded photo may not meet the strict requirements, which include a neutral expression and a closed mouth.

Joshua Bada used a high-quality photo booth image to apply for his passport, noting down the booth’s digital code and entering it on gov.uk.

The 28-year-old from west London told PA that it was not the first time technology had issues with the size of his lips.

Gov.uk passport renewal’s automated check told Mr Bada ‘it looks like your mouth is open’ (PA)

“When I saw it, I was a bit annoyed but it didn’t surprise me because it’s a problem that I have faced on Snapchat with the filters, where it hasn’t quite recognised my mouth, obviously because of my complexion and just the way my features are,” he explained.

“After I posted it online, friends started getting in contact with me, saying, it’s funny but it shouldn’t be happening.”

When asked by the system if he wanted to submit the photo anyway, Mr Bada was forced to explain why in a comment box, writing: “My mouth is closed, I just have big lips.”

The incident is not an isolated case – in April, a black woman described similar struggles in a post on Twitter.

Cat Hallam, an educational technologist from Staffordshire, stressed that she does not believe it amounts to racism, but thinks it is a result of algorithmic bias.

She became frustrated after the system told her it looked like her eyes were closed and that it could not find the outline of her head.

“The first time I tried uploading it and it didn’t accept it,” she told PA.

“So perhaps the background wasn’t right. I opened my eyes wider, I closed my mouth more, I pushed my hair back and did various things, changed clothes as well – I tried an alternative camera.”

Ms Hallam said she begrudged paying extra to have an image taken in a photo booth when free smartphone photos work for others.

“How many other individuals are probably either spending money unnecessarily or having to go through the process on numerous occasions of a system that really should be able to factor in a broad range of ethnicities?” she explained.

Ms Hallam proceeded with one of the images and received her passport without any further problems.

When she posted about the issue on Twitter at the time, the Passport Office tweeted back to say it was sorry the photo upload service had not “worked as it should”.

Ms Hallam was told by the system that it looked like her eyes were closed and that it could not find the outline of her head (PA)

The Home Office responded to PA, saying: “We are determined to make the experience of uploading a digital photograph as simple as possible, and will continue working to improve this process for all of our customers.

“In the vast majority of cases where a photo does not pass our automated check, customers may override the outcome and submit the photo as part of their application.

“The photo checker is a customer aid that is designed to check a photograph meets the internationally agreed standards for passports.”

Noel Sharkey, professor of artificial intelligence and robotics at the University of Sheffield, believes a lack of diversity in the workplace and an unrepresentative sample of black people are among the reasons the error may have occurred.

The online photo checker system tells users whether an image meets passport requirements, such as those for lighting and expression (Steve Parsons/PA)

“We know that it (automated systems) has problems with gender as well, it has a real problem with women too generally, and if you’re a black woman you’re screwed, it’s really bad, it’s not fit for purpose and I think it’s time that people started recognising that,” he said.

“People have been struggling for a solution for this in all sorts of algorithmic bias, not just face recognition, but algorithmic bias in decisions for mortgages, loans, and everything else and it’s still happening.”

Dr Saurabh Johri, an AI and data science specialist at Babylon, said AI is only as good as the data it uses, and that these systems have long been known to exacerbate biases in that data.

“In this instance, it’s possible that there is bias in the training data but equally it could simply be a wider limitation of the technology,” he added.

The Race Equality Foundation said it believes the system was not tested properly to see if it would actually work for black or ethnic minority people, calling it “technological or digital racism”.

“Presumably there was a process behind developing this type of technology which did not address issues of race and ethnicity, and as a result it disadvantages black and minority ethnic people,” Samir Jeraj, the charity’s policy and practice officer, commented.