Pushing fintech regulation forward – the challenges of AI for KYC and ID Verification

This blog article is part of our blog series “Pushing fintech regulation forward”.

Fintechs have to respond to a rapidly changing environment. They need to adopt new technologies to offer innovative products and services while also staying compliant with evolving regulations.

For many years, regulation developed around the operations of larger, traditional financial institutions. However, even a smaller fintech still needs to manage similar risks and protect against criminal or fraudulent behavior. This is now reflected in fintechs being subject to the same regulatory oversight.

There have been calls for an overhaul of fintech regulation to better respond to this need to grow and innovate quickly while also keeping markets safe.

Whatever changes may happen in the future, complying with KYC and AML regulations will remain key. These regulations are well defined in most countries and have evolved to meet new criminal methods over the past decades. In Europe, this includes the AMLD (now in its sixth iteration) and eIDAS regulations.

Artificial Intelligence (AI) and KYC

Artificial Intelligence and machine learning techniques now form an important part of KYC and AML implementation.

AI offers improved accuracy and speed of customer onboarding. It can be used in many areas, including automated identity verification, biometric matching, and ongoing transaction monitoring. Read more: Leveraging AI as a fintech for ID verification

The acceptance of such technologies is rapidly increasing. The general AML regulations in most countries (such as AMLD in Europe) do not include technology specifics, but many regulators have confirmed the acceptability of AI and machine learning techniques.

The trend is set for this acceptance to expand, giving many fintechs the confidence to adopt such techniques.

AI and machine learning offer many benefits. Such techniques are lower cost and much faster than traditional manual processes. They are also safer, producing more accurate predictions and matches with a lower chance of human error.

Accurate and near real-time verification also has advantages for customers. The overall user experience is improved – and, with it, customer conversion rates.


Challenges of AI and machine learning technology

As with any major change, or adoption of new technology, there are also challenges and difficulties with AI and machine learning. Staying aware of these is key to minimizing the impact. Working with a reliable technology provider will also help.

The first set of challenges relates to the use of AI techniques and machine learning algorithms in general.

Poor data quality or preparation

AI and machine learning algorithms rely on data for training. Access to historical or training data is needed, and results must be validated against it. Both the quality and the quantity of that data matter. This is more of a concern when first adopting a new system or algorithm, since data is limited, but the situation improves over time as more data becomes available.

Avoiding AI bias

Training needs to be carefully managed and checked to avoid AI bias leading to poor results. AI bias refers to the situation where the AI is wrongly trained to reflect human biases. This could be due to human trainers or poor data. Facial recognition as part of KYC is one area that is particularly susceptible to this.

Such biases are not intentional, of course, and they can be hard to pick up. Methods to avoid bias include preparing truly representative data sets, considering outputs rationally, and reviewing AI results against real data.
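One simple way to review AI results against real data, as suggested above, is to compare verification pass rates across demographic groups in an evaluation set. The sketch below is purely illustrative – the data format and function names are hypothetical, not part of any real KYC product:

```python
from collections import defaultdict

def match_rates_by_group(results):
    """Compute the verification pass rate per demographic group.

    `results` is a list of (group, passed) pairs from a labelled
    evaluation set -- a hypothetical format for illustration only.
    """
    totals = defaultdict(int)
    passes = defaultdict(int)
    for group, passed in results:
        totals[group] += 1
        if passed:
            passes[group] += 1
    return {g: passes[g] / totals[g] for g in totals}

def max_rate_gap(rates):
    """Largest pass-rate difference between any two groups --
    a crude red flag that the model may be biased."""
    values = list(rates.values())
    return max(values) - min(values)
```

A large gap between groups does not prove bias on its own, but it tells reviewers where to look more closely at the training data.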

Human involvement is still needed

AI can only do so much. Human involvement is needed in both the training phase and in real-time usage. During training, human input guides machine learning in identifying failures such as non-matches or fraud cases. These could be real failures or false positives, and algorithms need to learn from this. Also, remember that KYC and AML constantly evolve to meet new challenges and fraud methods. This means that ongoing updates and training for machine learning algorithms are common.

AI is largely automated when implemented, but human oversight is still needed.

To stay safe and compliant with regulations, adopters of AI need to realize when it does not work. In identity verification, this includes cases where the AI cannot be sure of an identity. Such cases must be identified and flagged for human review.
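Such flagging is often implemented as a confidence-threshold rule: high-confidence results pass automatically, clear failures are rejected, and everything in between goes to a human. The thresholds and labels below are hypothetical placeholders, not the logic of any actual verification system:

```python
def route_verification(confidence, accept_threshold=0.98, reject_threshold=0.60):
    """Route an AI identity-verification result by its confidence score.

    Threshold values here are illustrative -- in practice they would be
    tuned against risk appetite and regulatory requirements.
    """
    if confidence >= accept_threshold:
        return "auto-accept"
    if confidence < reject_threshold:
        return "auto-reject"
    # Uncertain cases are flagged for manual (e.g. video-based) review.
    return "human-review"
```

The middle band is where human oversight earns its keep: it catches the cases the AI cannot decide safely on its own.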

IDnow’s AutoIdent handles this using specialist staff. Cases can be sent for video-based manual review, completed in real time with minimal disruption to the user. Read on: Consumer choice for Identity Verification

Managing additional risks introduced by AI and machine learning

Any new method or process introduces risks, and AI and machine learning in financial services are no exception. These risks could arise in any area, and the unknown element is part of the challenge.

The UK FCA carried out a study of this in 2019, involving close to 300 financial institutions. This identified several risk areas, but also highlighted that organizations were well aware of how to address them.

Rather than introducing entirely new risks, the FCA found that machine learning instead reinforced already existing risks. These can be managed through appropriate staff training and data validation frameworks.

The risk areas identified included:

  • Insufficient training of staff to use systems.
  • Risks introduced by the complexity of AI systems, and challenges in their validation and governance.
  • Issues with data quality leading to inaccurate results.

To find out more about the regulatory challenges the fintech industry faces, you can download a guide here.

Challenges of integrating AI into fintech and KYC solutions

Integrating AI into the onboarding process is not necessarily straightforward. There are further challenges that are specific to KYC, verification, and onboarding that adopters of AI should be aware of.

Managing customer expectations and experience

People are used to human interaction. The switch to automation needs to be managed against differing experiences and expectations across markets. Customer frustration is more likely in other AI applications, such as chatbots, but fintechs should still be aware of this possibility with AML and KYC.

Integrating KYC into the onboarding process

AI and machine learning algorithms may be invisible to the end-user, but their effect certainly is not. This includes detecting security features in identity documents and performing live biometric facial comparisons.

Done right, this should be seamless to the end customer. Errors or problems could lead to a lack of confidence in security. Failures or slow processes as part of onboarding could lead to customers abandoning sign up. These situations are clearly not good for brand, reputation, or for customer conversion rates. Problems here could worsen as AI becomes more widespread and customer expectations increase.

Staying up to date with regulations

This is a major consideration for RegTech companies. KYC and AML regulations are well defined by the FATF and national regulators. However, these regulations are largely technology neutral. Fintech companies, just like banks, need to stay aware of this and be prepared to justify the technologies used.

More and more national regulators are permitting the use of AI techniques. Countries that allow full AI usage currently include the United Kingdom, France, Spain, Belgium, and Finland. Other European countries permit video-based KYC. Germany has long been the most regulated market, and it has now accepted an element of automation combined with a manual review.

The acceptance of full AI is increasing and will eventually pave the way for much easier expansion for fintechs. Until it is fully accepted, though, fintechs will need to stay aware of differences and be able to adapt methods and offerings for different markets. The use of manual live video reviews is a major part of this for KYC, allowing hybrid verification to be used in place of full automation.
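Adapting offerings per market often comes down to configuration: map each market to the verification method its regulator currently accepts. The mapping below only reflects the examples named in this article and is illustrative, not an authoritative regulatory reference:

```python
# Illustrative market-to-method mapping based on the examples in the text.
# Not a complete or authoritative summary of national regulations.
MARKET_METHODS = {
    "UK": "full-ai",
    "FR": "full-ai",
    "ES": "full-ai",
    "BE": "full-ai",
    "FI": "full-ai",
    "DE": "hybrid",  # automation combined with manual review
}

def verification_method(country, default="video-ident"):
    """Pick the onboarding flow for a market, falling back to
    video-based KYC where full automation is not yet accepted."""
    return MARKET_METHODS.get(country, default)
```

Keeping this logic in configuration rather than code makes it easier to update flows as regulators expand their acceptance of AI.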


By Francisco Martins
Senior Identity Consultant, Financial Sector UK/I at IDnow
Connect with Francisco on LinkedIn
