Cyber Security

Behavioral Biometrics: A Safe Middle Ground for the Fight Against Financial Fraud?

An advanced identity verification tool is reinforcing but not replacing legacy methods

Friday, September 22, 2023

By Jim Romeo

The search for stronger security and identity safeguards has made multi-factor authentication commonplace. The additional layers of verification beyond passwords can include biometrics based on fingerprints or other physical characteristics, which raise the anti-fraud bar in customer onboarding and anti-money laundering (AML) compliance. But they draw complaints about excessive surveillance, and public acceptance remains uncertain. The Worldcoin cryptocurrency has been controversial in part because of its reliance on iris scans.

Might behavioral biometrics be a better answer? The technology is not physically intrusive; it centers instead on metrics such as keystroke patterns, other device-interaction data and risk analytics.
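
To make the signal concrete, here is a minimal sketch – in Python, and not drawn from any vendor’s product – of the kind of feature a keystroke-dynamics system derives from raw key events: dwell times (how long each key is held) and flight times (the gaps between keys). The event format and feature names are illustrative assumptions.

```python
# Hypothetical event format: (key, key_down_time_ms, key_up_time_ms).
from statistics import mean

def keystroke_features(events):
    """Compute dwell times (how long each key is held) and flight times
    (gaps between releasing one key and pressing the next)."""
    dwell = [up - down for _, down, up in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    total_seconds = (events[-1][2] - events[0][1]) / 1000
    return {
        "mean_dwell_ms": mean(dwell),
        "mean_flight_ms": mean(flight) if flight else 0.0,
        "keys_per_second": len(events) / total_seconds,
    }

# A short burst of typing captured client-side (hypothetical millisecond timestamps).
session = [("p", 0, 95), ("a", 180, 260), ("s", 340, 430), ("s", 520, 600)]
print(keystroke_features(session))
```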

BioCatch, which emerged from Israel’s high-tech community, holds numerous patents, has more than a decade of data analysis behind it, and serves more than 25 of the top global banks and 100 of the top 500. As of June, through an enhanced relationship with Microsoft, BioCatch solutions became available in Microsoft’s Cloud for Financial Services.

BioCatch Connect, announced in August, “reimagines the traditional technology approach to fraud fighting and money-laundering investigation,” the company said, as it “puts behavioral biometric intelligence at the center of its artificial intelligence and machine learning models, rather than as a secondary signal.”

Moonsense, a newer Silicon Valley AI/ML entry, raised $4.2 million in seed funding this year but subsequently closed its doors. Co-founder and CEO Andrei Savu, a former Cloudera and Twitter software engineer and principal at Sweat Equity Ventures, explained that capturing “digital body language” – such as the speed of entering passwords or the pressure applied to a touch screen – can aid in detecting fraudulent behavior “without creating additional user friction.”
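
As a rough illustration of how such “digital body language” might be summarized per session – an assumption-laden sketch, not Moonsense’s implementation – the snippet below aggregates hypothetical touch-pressure and typing-speed telemetry into a compact feature vector. The field names and scales are assumptions.

```python
import statistics

def summarize_session(touch_events, chars_typed, typing_seconds):
    """Aggregate raw telemetry into a compact per-session feature vector
    that can later be compared against the account holder's profile."""
    pressures = [e["pressure"] for e in touch_events]   # assumed 0.0-1.0 scale
    return {
        "chars_per_second": chars_typed / typing_seconds,
        "mean_pressure": statistics.mean(pressures),
        "pressure_stdev": statistics.pstdev(pressures),
        "tap_count": len(touch_events),
    }

# Hypothetical telemetry from one login attempt.
events = [{"pressure": 0.42}, {"pressure": 0.47}, {"pressure": 0.39}]
print(summarize_session(events, chars_typed=18, typing_seconds=6.2))
```

A profile like this only becomes useful when compared against the account holder’s historical behavior, which is where the modeling described below comes in.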

“Data Doesn’t Lie”

Jason Stiehl, a partner at Crowell & Moring in Chicago, sees movement in the direction of biometrics, but he cautions that legal risks are a “large caveat” and that the everyday user experience must be frictionless.

Jason Stiehl, Partner, Crowell & Moring

“The most important benefit is that, generally speaking, data doesn’t lie,” the attorney states. “Behavioral biometrics can differentiate users – historical use by an account holder can be overlaid to compare to possible fraudulent [activity].

“Secondly,” Stiehl goes on, “sophisticated bad actors have studied how to manipulate” and work around rules programmed into traditional methods of authentication. “Behavioral biometrics, coupled with AI, allows you to build machine learning models that can combine thousands of pieces of information, essentially creating a digital fingerprint that is not something you can likely duplicate.”
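
As a simplified stand-in for the approach Stiehl describes – many behavioral features compared against an account holder’s history – the sketch below trains an off-the-shelf anomaly detector (scikit-learn’s IsolationForest) on hypothetical past sessions and scores a new login attempt. Production systems combine far more signals and proprietary models; the features, values and choice of detector here are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Simulated historical sessions for one account holder (hypothetical data):
# columns = [mean_dwell_ms, mean_flight_ms, chars_per_second, mean_touch_pressure]
rng = np.random.default_rng(0)
history = np.column_stack([
    rng.normal(92, 4, 200),       # key dwell time
    rng.normal(112, 6, 200),      # flight time between keys
    rng.normal(5.9, 0.3, 200),    # typing speed
    rng.normal(0.45, 0.03, 200),  # touch pressure
])

# Fit an anomaly detector on the account's normal behavior.
model = IsolationForest(random_state=0).fit(history)

# A new login attempt whose behavior differs sharply from that history.
new_session = np.array([[40.0, 25.0, 12.5, 0.10]])
print(model.decision_function(new_session))  # lower scores = more anomalous
print(model.predict(new_session))            # -1 marks a suspected outlier
```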

A Tool in the Toolbox

While acknowledging the value of behavioral biometrics, Fred Curry, anti-money laundering and sanctions compliance leader at Deloitte Risk & Financial Advisory, maintains that it should be one of multiple strategic and tactical tools.

“When coupled with sound KYC [know your customer] information and well-trained investigators, technology has the potential to support financial crime convictions and serve as a powerful tool in maintaining the overall integrity of our financial system,” Curry says. “But technology alone cannot convict criminals involved in money laundering or other financial crimes because it’s just one of many tools that aid in its detection and prevention.

“Nonetheless, technology’s role in processing and analyzing vast amounts of data has enabled financial institutions and law enforcement to quickly identify illicit transactions, schemes and patterns associated with criminal and unlawful activity. Moreover, transactional and digital banking data – such as from cash, checks, and digital funds transfers – create clear paper trails that investigators and prosecutors can leverage.”

Michael Weil, Managing Director, Deloitte

Michael Weil, managing director and digital forensics leader in Deloitte Financial Advisory Services’ Discovery practice, says, “Leveraging biometrics for customer identity and access management [CIAM] is a good step. However, solely relying on biometrics and CIAM tools can lead to a false sense of security. For example, authenticated end users are often manipulated through social engineering ... Threat actors, who usually see end users as the weakest links, continuously adapt their schemes as evolving technologies present new or different attack surfaces.”

Physical and behavioral biometrics together “can add overlapping lines of defense; they can work together to catch opportunistic hoaxers who would have fallen through the cracks of traditional security checks,” says a report in Deloitte’s FSI Predictions 2023 series. They can step up “‘liveness detection’ checks that distinguish human consumers from synthetic identities who use stolen or AI-generated content to act as the face of their alter egos.”

Weil suggests that “a multi-sensor strategy that looks at not only physical biometrics, but also indicators like cyber, transaction, communication and open-source data, can provide a broader understanding of threat-actor intentions and enhance customer account and asset safety.”
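
One simple way such a multi-sensor strategy could be wired together is sketched below: several pre-normalized indicators are blended into a single risk score that drives a step-up decision. The signal names, weights and threshold are illustrative assumptions, not Deloitte’s methodology.

```python
# Illustrative weights over pre-normalized risk signals (all assumed in [0, 1]).
WEIGHTS = {
    "behavioral_anomaly": 0.35,          # e.g., keystroke/touch model output
    "physical_biometric_mismatch": 0.25, # face/fingerprint match shortfall
    "transaction_risk": 0.20,            # amount, velocity, beneficiary history
    "cyber_signal": 0.10,                # device reputation, IP intelligence
    "open_source_signal": 0.10,          # external / open-source indicators
}

def fused_risk(signals):
    """Blend independent indicators into one score; higher means riskier."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

# A hypothetical session: behavior looks off and the transaction is unusual.
session = {
    "behavioral_anomaly": 0.82,
    "physical_biometric_mismatch": 0.10,
    "transaction_risk": 0.90,
    "cyber_signal": 0.40,
    "open_source_signal": 0.05,
}

score = fused_risk(session)
print(round(score, 3), "-> step-up authentication" if score > 0.5 else "-> allow")
```

In a real deployment the weighting itself would typically be learned and continuously recalibrated rather than fixed by hand, but the fusion principle is the same.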

“Cognitive Intent”

Anne Eberhardt, senior director in the New York office of advisory firm Gavin/Solmonese, notes that applying technologies to root out illicit banking activity is nothing new. But data mining and other advances enable transaction monitoring in ways that were not possible a generation ago, and they are now routine instruments in compliance and anti-crime efforts.

Anne Eberhardt, Senior Director, Gavin/Solmonese

However, "the idea of any kind of machine searching for, let alone finding, ‘cognitive intent’ shivers my timbers," Eberhardt says. “For years, technology has provided government with potential access to every transaction processed through a U.S. financial institution. When the Bank Secrecy Act was passed more than 50 years ago, the ACLU [American Civil Liberties Union] joined other interested parties in challenging the law, arguing that it was an invasion of its members’ right of association.”

That argument was rejected by the Supreme Court majority. In a dissent, Justice Thurgood Marshall wrote that government agencies’ ability to access the ACLU’s bank records, for example, could “chill the exercise of First Amendment rights of association on the part of those who wish to have their contributions [to the ACLU] remain anonymous.”

Today, criticism of the Bank Secrecy Act comes from right-leaning and libertarian-leaning groups like the Cato Institute, on the grounds that it “has proven a minor inconvenience for criminals but a major burden on law-abiding citizens.” As Cato’s Norbert Michel and Jennifer Schulp wrote in July, the AML transaction reporting framework “is heavily biased toward collecting as much information as possible with little regard for whether the information is useful for investigations and even less regard for the burdens imposed on financial institutions and those who seek their services.”

Suitability of Technology

While behavioral biometrics may have to clear some marketplace hurdles, there is also the question of the technology’s readiness and effectiveness for meeting AML and anti-fraud objectives. Experts are generally upbeat.

“Insights extracted from the devices being used to commit fraud can provide a treasure trove of evidence for law enforcement and prosecutors,” says Ian Holmes, global lead for enterprise fraud solutions at SAS. “For example, the selfies used to validate physical ID [driver’s licenses and passports] in the application process can serve as references to the perpetrator. Equally, geolocation information collected from devices can help pinpoint fraudsters’ locations.”

“We are at the start of the transition from detection to action, much like DNA transitioned from first being accepted for identification and then criminal prosecution,” says Hal Lonas, chief technology officer of digital ID innovator Trulioo. “The full transition will take a combination of factors, starting with broad acceptance by the public, legal updates to the criminal justice system, and then perhaps even more egregious crimes that could have been prevented with earlier detection.”

Consumer Attitudes

Indicating public receptivity to enhanced security with biometrics, a SAS “Faces of Fraud” survey of 13,500 consumers in 16 countries found that three-fourths would accept more delays and checks in transactions for better fraud protection, and eight out of 10 were willing to use such methods as facial recognition, hand geometry and voice recognition for payments and transactions. Fifty-seven percent preferred unique identifiers like biometrics over fixed passwords at the time of transactions, and seven out of 10 expressed willingness to share more personal data with service providers in the interest of boosting anti-fraud measures.

As generative AI tools make it easier for fraudsters and criminal rings to get around traditional fraud controls, “employing layered fraud detection capabilities that use the same advanced analytics technologies can help organizations beat the criminals at their own game,” Stu Bradley, SAS senior vice president of risk, fraud and compliance, commented along with the release of the survey results on September 12. “Those who rise to their customers’ expectations can turn fraud prevention into a loyalty builder and, ultimately, a competitive edge that helps them automate and grow their business, while cutting fraud losses.”

Fred Rica, a longtime cybersecurity and risk management adviser who is a partner with BPM in New York, contends, “We can no longer secure data and prevent fraud effectively with our current measures. ‘We’ve always done it this way’ will be the first thing to overcome.”

Gadi Mazor, CEO, BioCatch

“The legacy technologies deployed within the banking community simply cannot match the innovation and speed of fraudsters,” BioCatch CEO Gadi Mazor commented with the release of BioCatch Connect. He said it represents “a new generation of intelligence and visualization tools, data science models and machine learning engines to interdict the exploitation [by criminals] of human kindness, emotion, greed, naivety and trust.

“As a fraud-fighting community, we have the opportunity, capability and duty to fight fraud by sharing, learning and going on offense together. We’ve designed, built and tested BioCatch Connect to be the solution to accelerate this collaborative industrial journey.”

Assurance and Prevention

"Advancements with technology allow for layers of control that support customer adoption of products and services, and allowing consumers to feel safe to use the services at any time, from anywhere, and on any device,” says Robert Rendell, global head of fraud market strategy and fraud prevention at AML and anti-crime analytics leader NICE Actimize. What’s more, financial organizations can “manage the total cost of fraud – customer experience, fraud losses, operational expenses, compliance – more effectively and efficiently.”

Cláudio Rodrigues, chief product officer of conversational AI company Omilia, points to the adaptability of behavioral biometrics in view of the moving and morphing targets: “Being able to find the right patterns early enough gives institutions the edge on closing a gap before it is exploited, and makes fraud techniques increasingly expensive for the fraudsters. Thus, the key is in prevention – being able to isolate behaviors and close the loop before it can become significant.”

"The prognosis is good,” provided that the behavior-based tools “are used correctly and in conjunction with complementary tools and processes,” says Richard Tsai, head of markets for Trans Union’s TruValidate Solutions. “The consumer benefits from greater convenience along with better security for their accounts. Online businesses benefit by providing protection without sacrificing good customer experiences. However, this type of data does need to be treated like other sensitive PII [personally identifiable information].”

 



