How IDnow’s latest collaborative research project, MAMMOth, will make the connected world fairer for all – regardless of skin tone.
While the ability of artificial intelligence (AI) to optimize certain processes is well documented, genuine concerns remain about the link between unfair or unequal data processing, discriminatory practices, and social inequality.
In November 2022, IDnow, alongside 12 European partners, including academic institutions, associations and private companies, began the MAMMOth project, which set out to explore ways of addressing bias in face verification systems.
Funded by the European Research Executive Agency, the goal of the three-year project, which wrapped on October 30, 2025, was to study existing biases and create a toolkit for AI engineers, developers and data scientists to help them identify and mitigate biases in datasets and algorithm outputs.
Three use cases were identified:
- Face verification in identity verification processes.
- Evaluation of academic work. In the academic world, the reputation of a researcher is often tied to the visibility of their scientific papers and how frequently they are cited. Studies have shown that on certain search engines, women and authors from less prestigious countries or universities tend to be less represented.
- Assessment of loan applications.
IDnow predominantly focused on the face verification use case, with the aim of implementing methods to mitigate biases found in algorithms.
Data diversity and face verification bias.
Even state-of-the-art face verification models are typically trained on conventional public datasets, in which minority demographics are underrepresented. This lack of diversity in the data makes it difficult for models to perform well on underrepresented groups, leading to higher error rates for people with darker skin tones.
To address this issue, IDnow proposed a ‘style transfer’ method to generate new identity card photos that mimic the natural variation and inconsistencies found in real-world data. Augmenting the training dataset with these synthetic images improves model robustness by exposing the model to a wider range of variations, reduces bias against darker-skinned faces, and ultimately provides a better user experience for all.
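The augmentation idea can be sketched in code. The snippet below is a minimal, hypothetical illustration (the function and label names are our own, not IDnow's actual pipeline): it tops up each demographic group in a training set to a target size using synthetic, style-transferred samples, so underrepresented groups get proportionally more synthetic data.

```python
import random

def augment_with_synthetic(real_samples, synthetic_samples, target_per_group):
    """Top up each demographic group to `target_per_group` samples using
    synthetic (e.g. style-transferred) images.

    Each sample is an (image_id, group) tuple, where `group` is a
    demographic label such as a skin-tone category (hypothetical schema).
    """
    # Index the real samples by demographic group.
    by_group = {}
    for sample in real_samples:
        by_group.setdefault(sample[1], []).append(sample)

    augmented = list(real_samples)
    for group, samples in by_group.items():
        deficit = target_per_group - len(samples)
        if deficit > 0:
            # Draw synthetic samples for this group to fill the gap.
            pool = [s for s in synthetic_samples if s[1] == group]
            augmented.extend(random.sample(pool, min(deficit, len(pool))))
    return augmented
```

In practice the synthetic pool would come from a style-transfer model; here it is simply a pre-built list, which keeps the balancing logic visible.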

The MAMMOth project has equipped us with the tools to retrain our face verification systems to ensure fairness and accuracy – regardless of a user’s skin tone or gender. Here’s how IDnow Face Verification works.
When a user registers for a service or onboards, IDnow runs the Capture and Liveness step, which detects the face and assesses image quality. We also run a liveness/anti-spoofing check to ensure that photos, screen replays, or paper masks are not being used.
The image is then cross-checked against a reference source, such as a passport or ID card. During this stage, faces from the capture step and the reference face are converted into compact facial templates, capturing distinctive features for matching.
Finally, the two templates are compared to determine a “match” vs. “non‑match”, i.e. do the two faces belong to the same person or not?
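The final comparison step is commonly implemented as a similarity score between the two facial templates (embedding vectors) with a decision threshold. The sketch below illustrates that general pattern with cosine similarity; the threshold value and function names are illustrative assumptions, not IDnow's actual parameters.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(template_a, template_b, threshold=0.6):
    """Decide 'match' vs. 'non-match': do the two templates belong
    to the same person? The threshold is a hypothetical value; real
    systems tune it to balance false matches and false non-matches."""
    return cosine_similarity(template_a, template_b) >= threshold
```

Raising the threshold makes false matches rarer but false non-matches more common; bias shows up when that trade-off lands differently for different demographic groups.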
Through this collaborative effort, IDnow and its partners developed the MAI-BIAS Toolkit, which enables developers and researchers to detect, understand, and mitigate bias in datasets and AI models.
We are proud to have been a part of such an important collaborative research project. We have long recognized the need for trustworthy, unbiased facial verification algorithms. This is the challenge that IDnow and MAMMOth partners set out to overcome, and we are delighted to have succeeded.
Lara Younes, Engineering Team Lead and Biometrics Expert at IDnow.
What’s good for the user is good for the business.
While the MAI-BIAS Toolkit has demonstrated clear technical improvements in model fairness and performance, the ultimate validation, as is often the case, will lie in the ability to deliver tangible business benefits.
IDnow has already begun to retrain its systems with learnings from the project, ensuring our solutions are enhanced not only in technical performance but also in ethical and social responsibility.
Top 5 business benefits of IDnow’s unbiased face verification.
- Fairer decisions: The MAI-BIAS Toolkit ensures all users, regardless of skin color or gender, are given equal opportunities to pass face verification checks, ensuring that no group is unfairly disadvantaged.
- Reduced fraud risks: By addressing biases that may create security gaps for darker-skinned users, the MAI-BIAS Toolkit strengthens overall fraud prevention by offering a more harmonized fraud detection rate across all demographics.
- Explainable AI: Knowledge is power, and the Toolkit provides actionable insights into the decision-making processes of AI-based identity verification systems. This enhances transparency and accountability by clarifying the reasons behind specific algorithmic determinations.
- Bias monitoring: Continuous assessment and mitigation of biases are supported throughout all stages of AI development, ensuring that databases and models remain fair with each update to our solutions.
- Reducing biases: By following the recommendations provided in the Toolkit, organizations across industries can apply the research methods developed within the MAMMOth project and deliver more trustworthy AI solutions.
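The bias-monitoring idea above can be made concrete with a standard fairness metric: the false non-match rate (FNMR), i.e. how often genuine users fail verification, computed per demographic group. The sketch below (our own illustrative names, not part of the MAI-BIAS Toolkit) compares FNMR across groups and reports the largest gap.

```python
def false_non_match_rate(results):
    """FNMR over a list of (is_genuine_pair, predicted_match) booleans:
    the fraction of genuine pairs wrongly rejected."""
    genuine = [r for r in results if r[0]]
    if not genuine:
        return 0.0
    return sum(1 for r in genuine if not r[1]) / len(genuine)

def fnmr_by_group(records):
    """Per-group FNMR. `records` is a list of
    (group, is_genuine_pair, predicted_match) tuples."""
    by_group = {}
    for group, genuine, predicted in records:
        by_group.setdefault(group, []).append((genuine, predicted))
    return {g: false_non_match_rate(rs) for g, rs in by_group.items()}

def max_fnmr_gap(records):
    """Largest FNMR difference between any two groups; a fairness
    monitor could alert when this gap exceeds a chosen tolerance."""
    rates = fnmr_by_group(records)
    return max(rates.values()) - min(rates.values())
```

Tracking a gap metric like this on every model update is one simple way to make "continuous assessment and mitigation of biases" operational.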
As the global adoption of biometric face verification systems continues to increase across industries, it’s crucial that any new technology remains accurate and fair for all individuals, regardless of skin tone, gender or age.
Montaser Awal, Director of AI & ML at IDnow.
“The legacy of the MAMMOth project will continue through its open-source tools, academic resources, and policy frameworks,” added Montaser.
For a more technical deep dive into the project from one of our research scientists, read our blog ‘A synthetic solution? Facing up to identity verification bias.’
By Jody Houton
Senior Content Manager at IDnow