A deep dive with Sam Farao

Just a decade ago, artificial intelligence and fintech were unfamiliar terms to anyone outside tech. Over the past ten years, however, both have had a profound effect on the way business, financial transactions, and investments are conducted.

AI has had such a revolutionary effect on financial service providers (FSPs) and fintechs that they now rely on it for algorithmic trading, fraud detection, customer acquisition, and portfolio optimization. There are few industries outside financial services where AI is as relevant and beneficial.

A recent NVIDIA survey of FSPs showed that 83% of respondents agreed that AI is important to their business and the future of financial services. It’s hard to dispute this survey considering how much AI has already changed the industry.

As is normal with any developing technology, the spread of AI into the financial space has not been without problems. The main complaint has been the surprising level of bias these algorithms often express. In an age where calls for social justice have grown louder than ever, the last thing fintechs need is "programmed bias": left unchecked, it is even more harmful and more rigid than human bias.

Artificial intelligence is built on data: consumer data, spending data, and other data describing people's behavior. The insights drawn from this data, supplemented with natural language processing and advances in computer vision, can be of great help to fintechs in making sound financial decisions about loans, investments, and portfolios.

With the wave of data enabled by Big Data, APIs, and the IoT, it is clear that data is the lifeblood of fintech going forward. It is also why fintechs should become more vigilant about spotting biases in their data, in the way their AI is trained, and in the programmers themselves.

In this article, Sam Farao, a first-generation Iranian immigrant to Norway who has become one of Norway's leading entrepreneurs, shares his thoughts on AI bias with a particular focus on the fintech space. Farao is the CEO of Banqr, a global fintech powerhouse with a strong focus on payment processing and revenue-sharing partnerships.

Farao's entrepreneurial history and the prejudice he faced as an Iranian immigrant in Europe have made him extremely attentive to all forms of bias in the industry, and have helped him develop systems to identify and eradicate it in order to bring more equity into the system.

Farao describes the attitude fintechs should have toward AI bias in a powerful way: "We cannot hide behind data to justify a lack of fairness in the way our fintechs perform their functions. We made these technologies; we have to lead them, and when they go off the rails because of our lack of care, like a good father we have to reorient them and take responsibility."

AI, Fintech, and the Risk of Bias

Fintechs and FSPs increasingly rely on AI and its machine learning capabilities to process and understand data and to make decisions on issues such as creditworthiness, fraud detection and prevention, and customer support.

"The data is neutral, but that doesn't mean it's innocent. Bias is a purely human phenomenon that can be introduced into or inferred from data, consciously or unconsciously, tainting it," explains Farao.

Artificial intelligence and machine learning solutions react to the data presented to them in real time, finding patterns and relationships and making decisions about who is creditworthy and who is not, what is fraud and what is not. However, the risk of bias often lies in the master data set on which these systems are trained.
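
As a purely illustrative sketch of how this happens, the following Python snippet (synthetic data only, not any real lender's pipeline) trains a model on historical approval decisions that held one group to a stricter bar; the model simply learns and repeats that disparity.

```python
# Toy illustration: a model trained on biased historical decisions
# reproduces the bias. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic applicants: income (in $ thousands) and a protected group flag.
group = rng.integers(0, 2, n)
income = rng.normal(50, 15, n)

# Historical approvals: group 1 was held to a stricter (biased) income bar.
threshold = np.where(group == 1, 60, 45)
approved = (income > threshold).astype(int)

# Train a model on those historical decisions, with group as a feature.
X = np.column_stack([income, group])
model = LogisticRegression().fit(X, approved)

# The model faithfully learns the historical disparity.
pred = model.predict(X)
for g in (0, 1):
    rate = pred[group == g].mean()
    print(f"Predicted approval rate, group {g}: {rate:.2%}")
```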

For example, in 2015, Amazon had to scrap its AI recruiting tool after it showed a bias against women when hiring for software developer jobs. Investigations revealed that the bias arose from the source data fed into the system: résumés submitted to the company over the previous 10 years, which came mainly from men. This was indicative of male dominance in the tech space, but it also resulted in the machine downgrading résumés that mentioned "women" and ultimately treating men as preferable for the role.

According to Farao, master data is just one of the ways biases can infiltrate AI and the fintech business.

"I have been a socio-cultural outcast all my life, so I know all forms of prejudice. My goal with Banqr is not only to provide excellent financial and payment processing services, but to do so in the fairest way possible. I have found that bias is not often introduced intentionally. Most biases in financial services are either historical or unconscious."

An illustration of Farao's point can be seen in a Haas School of Business review of mortgage lending in the United States, which found that both online and face-to-face lenders charge African-American and Latino borrowers higher interest rates, earning 11-17% higher profits on these loans. On that basis alone, feeding that same historical data into an AI system only perpetuates the bias. And when a programmer looks at this data and applies the same indexes and high bars that created those unbalanced results, the bias is compounded.

How Fintechs Can Fight Bias

Farao's call for more data investigation has been echoed by other fintech leaders as they begin to recognize the potential impact of bias crystallized by AI.

Tackling these issues is so essential that Farao credits it, in part, for Banqr's massive success over the past few years. In Farao's words, "It's not always easy to have zero bias, but we must maintain the will to reverse it wherever it can be shown to exist."

He continues, "We need to see AI as a tool, not just to solve our problems as financial service providers, but to solve the larger problems of society. The potential of AI to impact millions of people at once makes it a very sensitive tool. Fintechs and financial service providers around the world should make investigating their data a priority. We should intentionally look for all kinds of prejudice in this data, whether racial, gender-based, or otherwise."

Fintech companies need to keep skewed data in mind while working to build effective systems. For example, the larger US FSP data set suggests that 20% of the adult population is underserved for credit. This percentage is made up mostly of minorities, and the data also suggests that women-owned businesses, despite making up one-third of businesses in the United States, earn a disproportionately low share of available credit, attract smaller loans, and face higher default penalties.

If fintechs are aware of these skews, they can actively work to prevent this broader imbalance from being reflected in their own systems. Fintechs can also rely on bias-conscious technologies to handle this.

Open-source toolkits like IBM's AI Fairness 360, Aequitas, and Google's What-If Tool are available to help fintech companies measure discrimination in their models, suggest mitigation strategies, and test how their models behave under different data scenarios.
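
As a rough illustration of what such a check might look like, here is a minimal sketch using the open-source aif360 package (IBM's AI Fairness 360). The tiny dataset, column names, and group encoding are assumptions made for illustration, not a real lending portfolio.

```python
# Hedged sketch: measuring disparate impact with aif360.
# Column names ("approved", "group") and the data are illustrative only.
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric

df = pd.DataFrame({
    "income":   [30_000, 80_000, 55_000, 42_000, 95_000, 38_000],
    "group":    [0, 0, 1, 1, 0, 1],      # protected attribute (e.g. minority = 1)
    "approved": [0, 1, 1, 0, 1, 0],      # historical lending decision
})

dataset = BinaryLabelDataset(
    df=df,
    label_names=["approved"],
    protected_attribute_names=["group"],
    favorable_label=1,
    unfavorable_label=0,
)

metric = BinaryLabelDatasetMetric(
    dataset,
    privileged_groups=[{"group": 0}],
    unprivileged_groups=[{"group": 1}],
)

# Disparate impact is the ratio of favorable-outcome rates between groups;
# values well below 1 flag potential bias against the unprivileged group.
print("Disparate impact:", metric.disparate_impact())
print("Statistical parity difference:", metric.statistical_parity_difference())
```

A common rule of thumb (the "four-fifths rule") treats a disparate impact below 0.8 as a red flag worth investigating before a model is put into production.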

Farao remains very optimistic about the potential of AI in investing, banking, commerce, and fraud prevention, provided fairness can be properly built into the system.

In conclusion, Farao states, "The amount of ease and precision that AI can bring to our services is extremely desirable, but the only desire we should have that exceeds our desire to be fast and correct is our desire to be fair and just."
