Can fintech eliminate credit discrimination?
Fintech companies, which provide financial services on technological infrastructure, use proprietary statistical models that leverage AI and machine-learning methodologies to assess borrowers’ credit risk, significantly accelerating decision-making while improving precision.
Banking has traditionally relied on face-to-face interactions with clients at a physical branch or office. The personal encounter offered advantages to both parties: the client had an opportunity to get to know the service provider, while the latter could use that first-hand impression to mitigate the bank’s exposure to a range of risks, in particular fraud, forgery, impersonation, and money laundering.
While using personal impressions to mitigate risk is a legitimate aid, the assessment of a client’s financial soundness and business potential must rely on objective parameters that are independent of religion, gender, ethnicity, and other irrelevant attributes. This may seem like the obvious way to do business today, but it has not always been so.
Recent history shows that discrimination in the extension of credit became illegal in the world’s biggest economy only 45 years ago. It was only in 1974 that the USA enacted the Equal Credit Opportunity Act, stipulating that no applicant may be denied credit because of gender, race, ethnicity, family status, or age. In other words, until 1974, women’s access to credit was not equal to men’s. Worse still, only in 1988 did the USA pass the Women’s Business Ownership Act, which revoked the requirement that female business owners have their husbands co-sign their credit applications.
Legislation is a critical instrument of society, but even the strictest laws cannot eliminate the intuition-based impression a credit manager forms in the first meeting with a prospective customer. Can a financial service provider set aside every irrelevant prejudice when considering the credit application of a client of a different background, appearance, or gender? The many lawsuits filed against service providers for denying credit on no solid business grounds suggest that discrimination remains deeply rooted in society.
Small businesses owned by women are especially exposed to the consequences of gender discrimination. Research conducted by Fundbox shows that credit applications filed by women are comparable to those of male business owners in scope and quality. Nevertheless, financial institutions approve only 39% of the applications filed by women, compared with 52% of those filed by men.
The information revolution, the evolution of databases, and the increased availability of credit ratings are among the building blocks of fintech companies. Using advanced technologies and models, they often bypass the need for a physical encounter with a prospective customer. Fintech lenders have built statistical models that incorporate state-of-the-art AI and machine-learning methodologies to assess credit risk. These models enable faster, often more accurate, service decisions, made solely on objective parameters.
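To make this concrete, here is a minimal sketch of how a credit-decision model can be structured so that protected attributes cannot influence the outcome: the score is computed from a fixed list of objective financial features, and anything outside that list is simply never read. The feature names, weights, and approval threshold below are illustrative assumptions for this sketch, not any actual production model.

```python
import math

# Illustrative objective features and weights -- assumptions for this
# sketch only, not a real lender's model.
FEATURES = ["monthly_revenue_k", "debt_to_income", "late_payments_12m"]
WEIGHTS = {
    "monthly_revenue_k": 0.04,   # higher revenue -> higher score
    "debt_to_income": -2.5,      # higher leverage -> lower score
    "late_payments_12m": -0.8,   # payment history -> lower score
}
BIAS = 0.5

def repayment_score(applicant: dict) -> float:
    """Logistic score computed only from the objective FEATURES list.

    Protected attributes (gender, ethnicity, age, ...) may be present
    in the applicant record, but they are never read, so they cannot
    influence the score.
    """
    z = BIAS + sum(WEIGHTS[f] * applicant[f] for f in FEATURES)
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid -> value in (0, 1)

def approve(applicant: dict, threshold: float = 0.6) -> bool:
    """Approve credit when the objective score clears the threshold."""
    return repayment_score(applicant) >= threshold

# Two applicants with identical financials but different gender
# receive identical scores and identical decisions.
alice = {"monthly_revenue_k": 30.0, "debt_to_income": 0.3,
         "late_payments_12m": 0, "gender": "F"}
bob = dict(alice, gender="M")
assert repayment_score(alice) == repayment_score(bob)
assert approve(alice) == approve(bob)
```

The design choice that matters here is the explicit feature whitelist: equality of treatment is enforced structurally, by what the model is allowed to see, rather than by asking a human to ignore what they have already seen.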
Intentionally or not, fintech companies, founded primarily to solve market inefficiencies with a business model that generates revenue for them, have also succeeded in eliminating, at least partially, the prejudice inherent to face-to-face meetings over appearance, gender, or ethnicity. The legal violations, the social humiliation, the lost revenues, and the guilt remain with the traditional bankers; fintech companies are free of these faults.
The emergence of fintech companies as providers of credit and financial services across the globe has inadvertently contributed to social equality. Today, two people with the same economic parameters but of different origin or gender, who need funds for a new business venture, are assessed by credit-decision models based exclusively on the real figures of their businesses, not on irrelevant prejudice and discrimination.
By Noam Grabinsky, VP Risk, Fundbox