Age Estimation Technologies

16 November, 2023

We’re entering an exciting time in the UK, in which technology is set to transform the retail landscape and change our approach to age verification, both in-store and online. With Luciditi rolling out the first PASS Level 5 accredited Digital ID (see how we’re supporting their launch here: https://luciditi.co.uk/uks-first-pass-approved-digital-proof-of-age-card-set-to-reduce-fraud-and-retailer-prosecution), age estimation (AE) technology already in active use by some online retailers, and upcoming changes to the law around the sale of alcohol, we expect this market to grow very quickly. As always, Serve Legal is dedicated to supporting our clients with all things compliance and risk-related, so here are our thoughts on the upcoming tech, with a focus on age estimation for retailers:

 

Benefits of Age Estimation Technology

  1. Improved customer journeys

We’ve all been in a situation where we’ve hoped for a quick and smooth purchase via the self-scan till, only for the red light to flash, leaving us frustratingly stuck in limbo whilst waiting for the transaction to be approved. This technology should cut out a significant proportion of the authorisations required for age-restricted products, leaving just those lucky enough to look under 25 and those unfortunate souls with the dreaded ‘unexpected item in bagging area’ for colleagues to address – resulting in a significantly improved experience for most customers.

  2. Reducing points of friction/confrontation

It can be understandably challenging for a fresh-faced 18-year-old taking their first steps into the workforce to challenge a customer who they deem to be potentially older than them but under the company’s Think 25 policy age. This can escalate if the customer doesn’t have ID and the transaction has to be refused, creating a point of conflict. AE provides a handy ‘computer says no’ get-out clause for the staff member and ensures they don’t shy away from company policy. Below are example comments from thestudentroom.co.uk which reflect the anxiety felt by some young people when having to ID customers:

“Maybe this is just first day nerves and I hope It gets better. But I genuinely hate serving customers, mainly because I fear I'm going to have to ID them”

"I currently work at a supermarket and we use the challenge/think 25 policy, but it is giving me so much anxiety as I struggle to know if someone looks under 25

  3. Removing human interpretations and bias

As with anything, our individual interpretations differ: somebody one person thinks may be 25+, another might see as a potentially underage customer. We also know, through hundreds of thousands of Serve Legal audits, that the age and gender of staff members can influence compliance performance, and that experience or personal bias will impact this further. AE will provide a level of consistency across the business. However, this doesn’t mean the technology is perfect and free of bias – this will need to be a consideration in your choice of AE tool.

 

Points for consideration

  1. Privacy concerns

There are some misconceptions within the world of AE that unfairly link it with facial recognition (FR), and this leads to concerns amongst end users, deployers and even governments that facial images are being captured and stored. The Age Verification Providers Association (AVPA), of which we are a member, is doing a great job of debunking this myth and explaining how the technology recognises patterns without storing images. However, businesses choosing AE technology need to ensure that their chosen providers can demonstrate this, and will need to reassure customers that this isn’t a case of ‘Big Brother’ extending its grasp.
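
To make the distinction concrete, here is a minimal Python sketch of the privacy model the AVPA describes: the camera frame is analysed in memory, only the over/under-threshold decision is retained, and the image itself is discarded. The function names and values are our own illustration, not any vendor’s actual API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AgeCheckResult:
    over_threshold: bool  # the only value retained after the check
    confidence: float     # model confidence in the estimate

def estimate_age(frame: bytes) -> tuple[float, float]:
    """Stand-in for a vendor's on-device age-estimation model (hypothetical)."""
    return 31.0, 0.92  # dummy (estimated_age, confidence) for illustration

def check_age(frame: bytes, threshold: int = 25) -> AgeCheckResult:
    estimated_age, confidence = estimate_age(frame)
    # Only the boolean decision and a confidence score survive this call;
    # the frame is never written to disk, logged, or transmitted.
    return AgeCheckResult(estimated_age >= threshold, confidence)

print(check_age(b"\x00" * 1024))  # AgeCheckResult(over_threshold=True, confidence=0.92)
```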

  2. Bias within technology

Bias is often seen as just a human trait, based on our experiences and influencing factors; however, it can and does exist within technology as well – whether algorithmic bias, whereby the technology itself perpetuates biases or discrimination, or data bias, whereby the algorithms have been trained on skewed data sets. There have been examples of gender and ethnicity biases being perpetuated within AI, and this has led to damaging results for the deployers of the technology, including reputational damage. Retailers need to ensure that AE providers can demonstrate robust training, including the following (an illustrative check is sketched after the list):

  • Separate training, validation (QA) and testing data sets – you can’t test your tool using the same data you trained it with.
  • Diverse data sets
  • The validation and testing data sets especially must be highly accurate, free of duplication, validated and verified – in the case of AE, this means each image has been correctly labelled with the individual’s age and confirmed through ID.
  • Independent verification of accuracy and bias (no marking your own homework!)
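
As promised above, here is a rough Python illustration of those data-set hygiene checks. The record fields (person_id, image, age, id_verified) are assumptions made for the example, not any provider’s real schema.

```python
import hashlib

def audit_splits(train, val, test):
    """Illustrative hygiene checks; each record is assumed to look like
    {'person_id': str, 'image': bytes, 'age': int, 'id_verified': bool}."""
    splits = (train, val, test)

    # 1. No person appears in more than one split - testing on someone the
    #    model was trained on would inflate the reported accuracy.
    ids = [{r["person_id"] for r in split} for split in splits]
    assert not (ids[0] & ids[1] or ids[0] & ids[2] or ids[1] & ids[2]), \
        "person-level leakage between splits"

    # 2. No duplicate images anywhere (compared by content hash).
    hashes = [hashlib.sha256(r["image"]).hexdigest()
              for split in splits for r in split]
    assert len(hashes) == len(set(hashes)), "duplicate images found"

    # 3. Every age label has been confirmed against an ID document.
    assert all(r["id_verified"] for split in splits for r in split), \
        "unverified age labels present"
```
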
  3. Risk

Remember that the licence risk remains with the retailer/deployer of the technology, not the provider of the technology. Retailers will need to be able to demonstrate to the key stakeholders in the business that:

  • The technology has been implemented correctly and lab results are replicated in the real world.
  • It works in situ, with all the external factors that may influence the results (lighting, positioning, customer presentation, etc.).
  • Performance exceeds that of like-for-like human transactions.
  • Customers are not being discriminated against.
  • The provider can demonstrate protection against spoofing attempts.
  • The data sets have been gathered in a GDPR-compliant manner – this is especially important for tools that are trained to estimate the age of minors.
  4. Regulation

We don’t want regulation to stifle innovation, and at the moment regulation is severely behind the innovation – the EU AI Act is still being debated and the UK is working on a white paper. However, regulation is coming, and depending on the outcomes it may impact the tools being used, so consideration must be given to the longevity of AE solutions. Our key finding from assessing the various pieces of legislation relating to AI (which AE technologies use to train their models) is that any AI tool with the potential to discriminate against a protected characteristic (age, gender, ethnicity, etc.) would be categorised as ‘high-risk’ and would need to demonstrate accuracy, bias mitigation measures and continuous monitoring – the latter to prevent inaccuracies or biases from being introduced through continuous development.
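
What might that continuous monitoring look like in practice? The short Python sketch below is our own illustration (not a regulatory requirement or any provider’s method): it computes per-group error rates on verified test data and flags any group that drifts from the overall rate. The group labels, the two-year error band and the tolerance are all assumptions made for the example.

```python
from collections import defaultdict

def group_error_rates(records, tolerance=0.02):
    """records: iterable of {'group': str, 'true_age': int, 'estimated_age': float}.
    Returns per-group rates of materially wrong estimates (more than 2 years
    off) and flags groups drifting from the overall rate by > tolerance."""
    errors, counts = defaultdict(int), defaultdict(int)
    for r in records:
        counts[r["group"]] += 1
        if abs(r["estimated_age"] - r["true_age"]) > 2:
            errors[r["group"]] += 1
    rates = {g: errors[g] / counts[g] for g in counts}
    overall = sum(errors.values()) / sum(counts.values())
    flagged = {g: rate for g, rate in rates.items()
               if abs(rate - overall) > tolerance}
    return rates, flagged  # flagged groups warrant investigation
```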

It should be noted that many AE providers demonstrate diversity through the Fitzpatrick scale. However, as explored in a Durham University study assessing hidden bias within facial recognition, skin tone alone is not sufficient for analysing racial bias, so these metrics may not be enough to satisfy bias mitigation for the protected characteristic of ethnicity.

Yucer, S., Tektas, F., Al Moubayed, N., & Breckon, T. (2022). Measuring Hidden Bias Within Face Recognition via Racial Phenotypes. Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV). https://openaccess.thecvf.com/content/WACV2022/papers/Yucer_Measuring_Hidden_Bias_Within_Face_Recognition_via_Racial_Phenotypes_WACV_2022_paper.pdf

 

Best practice guidance for retailers

  1. Transparency with customers

Make sure that customers are fully aware of the technology in place and of how their data will be handled and stored (or not). Customers will need time to adjust to the changing customer journey, and some end users will need reassurance about the technology’s intent.

  2. Regular auditing and testing

We recommend continuous monitoring of the technology, so that retailers and businesses can retain evidence of due diligence for any Trading Standards audits and reassure stakeholders that quality is never compromised.

Retailers should be able to demonstrate that the new technology is outperforming a like-for-like human transaction.
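
As a toy example of what that evidence could look like, the Python snippet below compares the compliance rate of AE-gated transactions against a like-for-like human baseline using a standard two-proportion z-test. The audit numbers are invented for illustration.

```python
from math import sqrt

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """z-statistic for the difference between two compliance rates."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical audit results: 940/1,000 compliant AE-gated transactions
# versus 820/1,000 compliant like-for-like human transactions.
ae_rate, human_rate = 940 / 1000, 820 / 1000
z = two_proportion_z(ae_rate, 1000, human_rate, 1000)
print(f"AE {ae_rate:.1%} vs human {human_rate:.1%}, z = {z:.1f}")
# AE 94.0% vs human 82.0%, z = 8.3 - a difference far too large to be noise
```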

  3. Gather evidence of bias mitigations

As explored above, the technology implemented needs to be watertight from a regulatory, GDPR and bias mitigation point of view. Ensure that providers run a continuous programme of testing and monitoring for their tool, so that all users are treated equally, bias is not inadvertently introduced through continuous development, and they stay abreast of all regulatory requirements.

 

If you’d like to discuss ways in which we can support you with your choice of age estimation technology, please contact info@servelegal.co.uk or visit our website at www.servelegal.co.uk/services/facial-biometric-accuracy-and-fairness-auditing/.

 

Matthew Houliston
Matthew Houliston is Serve Legal’s Director of Data and Systems, having contributed over 14 years of service to the company. His expertise in systems management has been key to driving the company’s technological evolution. Matthew leads research and development initiatives, with a focus on AI and innovative technologies.
