
Zoey Wang

Zihui (Zoey) Wang is a Digital Communications Intern at SME Finance Forum. She is a second-year graduate student in Communication, Culture, and Technology at Georgetown University, specializing in cross-cultural studies and video production. Before her graduate studies, she earned dual bachelor's degrees, a B.S. in Finance and a B.S. in Public Affairs Management, from Zhejiang University, China. Before joining SME Finance Forum, Zihui worked as a marketing intern at Lenovo Group Ltd. and an assistant editor at Harper's Bazaar Magazine in Beijing. She is proficient in Adobe photo and video editing software and in data analysis tools.

How to Ensure AI-based Finance Works for Women?

Jun 08, 2021
Artificial intelligence (AI) and machine learning are changing the face of finance, expanding what is possible for financial services providers, lowering the cost of understanding customers, and increasing precision in targeting financial services to the unique needs of women-owned SMEs. However, we know from other industries that, without careful attention, these technologies can replicate existing biases or introduce new ones even as they seek to enable inclusion. As the financial services industry sits on the cusp of a new era, asking how we might ensure AI-based finance works for women is critical to the effective deployment of these technologies.
 
A few weeks ago, the SME Finance Forum held an Open Webinar entitled “Making AI-based Finance Work for Women”. Sonja Kelly (Women’s World Banking), Rachel Freeman (Tyme), Dietmar Bohmer (Tyme), Puneet Gupta (Kaleidofin), and Daniel Awe (Africa Fintech Foundry) participated in the panel. This open webinar was moderated by Matthew Gamser, the CEO of SME Finance Forum.   
 
Sonja Kelly opened the webinar by framing the topic, warning that, if left unchecked, financial services providers will deploy biased machine learning and AI algorithms. Women's World Banking, which recently published a paper on this topic, invests in and designs solutions that increase women's financial inclusion and economic empowerment. Kelly's prediction kicked off a conversation among the panelists, which we summarize here by topic.
 
Where does algorithmic bias emerge?
The panel agreed that bias might emerge from three separate areas. First, the data itself is vulnerable to bias, which comes not from the technology but from the human beings telling it what to do. Technology is de facto neutral; human input is not. (A simple data-level check for this kind of bias is sketched after this list.)
  • One example often used to explain this phenomenon is a recruiting algorithm that systematically screens out women candidates because the human input it learned from penalizes career breaks.
  • A second issue is that lenders do not devote sufficient time and attention to concerns about bias, and thus do not learn enough about what factors drive their credit decisions.
  • A third issue is coder bias. Digital credit companies are not necessarily physically close to their customers, and the coders may be not only far away but often not even in the same country. As such, coders may bring preexisting unconscious biases into the development of the model.
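As a concrete illustration of data-level bias, the short Python sketch below compares historical approval rates by gender, one of the simplest checks a lender can run before training a model on its own records. The data and column names here are hypothetical; none of the panelists described their firms' exact checks.

```python
import pandas as pd

# Hypothetical historical lending decisions; column names are illustrative.
df = pd.DataFrame({
    "gender":   ["F", "F", "F", "F", "M", "M", "M", "M", "M", "M"],
    "approved": [0,   1,   0,   0,   1,   1,   0,   1,   1,   1],
})

# Approval rate per group: a large gap in historical decisions is a
# warning sign that a model trained on this data will inherit the bias.
rates = df.groupby("gender")["approved"].mean()
print(rates)

# The "four-fifths rule" heuristic, borrowed from fair-lending and
# employment practice: flag the data if the lower approval rate is
# less than 80% of the higher one.
ratio = rates.min() / rates.max()
print(f"disparate impact ratio: {ratio:.2f}  (flag if < 0.80)")
```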
How might algorithmic bias be mitigated?
Fortunately, providers Tyme, Kaleidofin, and Africa Fintech Foundry offered a number of examples of how firms can work to mitigate bias at all levels as they deploy new technologies, especially for credit scoring. These included:
  • Name the bias in pre-processing: Using AI and ML to re-weight the data or detect labeling biases is critical before the model is created. A gender lens is important to guard against bias against women in lending decisions (see the re-weighting sketch after this list).
  • Double-check the bias: During processing, data scientists and developers can audit the model's decisions to guard against bias.
  • Learn from mistakes: In post-processing, data scientists and developers can integrate checks on, and changes to, the credit-scoring outputs, consistently rebuilding the model to ward off bias.
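To make the pre-processing step concrete, here is a minimal sketch of the classic "reweighing" technique (Kamiran & Calders, 2012), which computes per-row sample weights so that group membership and the outcome label are statistically independent in the re-weighted training data. The toy data, column names, and model call are hypothetical assumptions for illustration, not a description of any panelist's implementation.

```python
import pandas as pd

def reweighing_weights(df, group_col, label_col):
    """Per-row sample weights making group and label independent:
    weight(g, y) = P(group=g) * P(label=y) / P(group=g, label=y)."""
    n = len(df)
    p_group = df[group_col].value_counts(normalize=True)
    p_label = df[label_col].value_counts(normalize=True)
    p_joint = df.groupby([group_col, label_col]).size() / n

    def weight(row):
        g, y = row[group_col], row[label_col]
        return p_group[g] * p_label[y] / p_joint[(g, y)]

    return df.apply(weight, axis=1)

# Hypothetical historical lending data: 'gender' and 'repaid' columns.
df = pd.DataFrame({
    "gender": ["F", "F", "F", "M", "M", "M", "M", "M"],
    "repaid": [1,   0,   0,   1,   1,   1,   0,   1],
})
weights = reweighing_weights(df, "gender", "repaid")
# Under-represented (group, label) combinations get weights above 1,
# over-represented ones below 1. Pass `weights` as sample_weight when
# fitting a credit model, e.g.: model.fit(X, y, sample_weight=weights)
```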
How do institutions in the finance industry use technology and algorithms in practice?
  • Kaleidofin. Kaleidofin has chosen to work with informal-sector customers in India, with a specific focus on women. Informal-sector customers make up over 80% of India's population, while most other fintechs largely serve the top 3-10% of the population, who have higher incomes and are already digitally active. To reach its segment, Kaleidofin uses a "tech and touch" model, in which an assisted channel helps onboard customers who are not yet comfortable with technology, supplemented by continuous efforts to make its self-serve platforms friendlier. Kaleidofin deploys technology to meet the needs of women customers in a number of ways:

    • Help meet goals rather than sell financial products: Kaleidofin does not sell standalone financial products. It starts from the customer's goals and sources of financial vulnerability, then uses a wealth management framework to assemble the portfolio of financial products the customer needs to meet those goals. That portfolio is offered as a single solution, labeled in a way that connects to the customer's goals.
    • Create personas: Creating many solution bundles tailored to individuals' goals takes significant effort and can be scaled only with technology. Kaleidofin manages this with algorithms that create customer personas, allowing solutions to be tagged easily to many different personas. The goal is for these personas to become progressively more refined.
    • Credit models: Kaleidofin has built specific credit models to offer loans to this customer segment. The company uses algorithms to predict whether clients will be able to repay their loans; the model's outputs drive risk-based pricing and help identify customers who will need more attention during collections (a minimal illustration of this kind of model follows this list).
    • Managing Biases: Kaleidofin works on managing biases in multiple ways:

      • Identify data sources that allow underrepresented groups to be considered for loans.
      • Examine patterns among customers rejected by the models, continuously checking for bias.
      • Remove variables that could be sources of potential bias. 
    • Product flexibility: Kaleidofin's solutions are designed with customers' specific contexts in mind. Many customers have unpredictable cash flows, so the solutions let them opt in to flexible payments that mirror those cash flows, while savings-based solutions help customers cover fixed expenses and commitments in months of low cash flow.
  • TymeBank. One of the fastest-growing digital banks in the world, TymeBank focuses on emerging markets. It has reached 3.3 million customers in South Africa in just over two years since launch and is growing by more than 120,000 customers per month. TymeBank uses technology to acquire and retain customers, including efforts to:

    • Set up digital kiosks at almost every Pick-n-Pay and Boxer store to onboard customers in under five minutes. 85% of TymeBank’s customers onboard through the kiosk, and over 52% are women.
    • Track and monitor customers every month to understand their behavior, using gender-disaggregated data.
    • Launch MoreTyme, Tyme's Buy Now Pay Later product, which uses gender-tested algorithms to prevent bias.
  • Africa Fintech Foundry aims to nurture, fund, and accelerate the growth of FinTech startups in Africa through its mentorship and accelerator programs. It does this by:

    • Providing incubation support to startups. 
    • Offering a testbed to showcase best practices and the successes of African-led FinTech solutions. 
    • Enabling customers to digitally transform and innovate to keep up with the industry’s pace.  
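As an illustration of the credit-model pattern Kaleidofin describes (repayment prediction driving risk-based pricing and collections focus), here is a minimal, self-contained sketch. The synthetic data, features, and pricing rule are invented for illustration; excluding the protected attribute from the features echoes the "remove variables" tactic above, and this is in no way Kaleidofin's actual model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic training data. Protected attributes (e.g., gender) are
# deliberately excluded from X, per the mitigation tactic above; note
# that proxies can still leak, which is why output audits matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))           # e.g., income stability, tenure, savings
y = (X @ np.array([1.0, 0.5, 0.8]) + rng.normal(size=500) > 0).astype(int)

# Fit a simple repayment model and get predicted repayment probabilities.
model = LogisticRegression().fit(X, y)
p_repay = model.predict_proba(X)[:, 1]

# Risk-based pricing: an illustrative rule mapping predicted risk to an
# interest rate between a floor (12%) and a cap (30%).
rate = 0.12 + (1 - p_repay) * 0.18

# Flag the riskiest decile for extra collections attention.
needs_focus = p_repay < np.quantile(p_repay, 0.10)
```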
Is this just an issue of technology, or is there more to it from an operations perspective?
Combatting bias requires understanding and action from all levels of an organization, not just from the data scientists. Some ways to do this are: build a team of people responsible for fairness; make strategic decisions at the management level; and increase gender diversity at all levels.
 
Conclusion 
The risk that algorithm-based underwriting poses to women customers, fueled by conscious and unconscious bias, is a complex topic. Finding bias is not as simple as finding a decision to be "unfair." There are dozens of definitions of gender fairness, from keeping gendered data out of credit decisions to ensuring an equal likelihood of granting credit to men and women (two such definitions are sketched below). It is important to start by defining fairness and then to recognize where biases emerge. Finally, there are many implementable bias mitigation strategies relevant to financial institutions. Mitigating bias requires intentionality at all levels, and these strategies are relevant to algorithm developers and decision makers alike.
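To illustrate why the choice of definition matters, the sketch below computes two common and sometimes conflicting fairness metrics: demographic parity (equal approval rates across groups) and equal opportunity (equal approval rates among customers who actually repay). The toy data and function names are illustrative, not drawn from the webinar.

```python
import numpy as np

# Toy example: model decisions (1 = approve), true repayment outcomes,
# and a protected attribute. All values are illustrative.
y_pred = np.array([1, 0, 1, 1, 0, 1, 1, 1])
y_true = np.array([1, 1, 1, 0, 0, 1, 1, 1])
group  = np.array(["F", "F", "F", "F", "M", "M", "M", "M"])

def demographic_parity_gap(y_pred, group):
    """Gap in approval rates between groups: one common fairness definition."""
    rates = [y_pred[group == g].mean() for g in np.unique(group)]
    return max(rates) - min(rates)

def equal_opportunity_gap(y_true, y_pred, group):
    """Gap in approval rates among customers who actually repay:
    a different, sometimes conflicting, definition."""
    tprs = [y_pred[(group == g) & (y_true == 1)].mean()
            for g in np.unique(group)]
    return max(tprs) - min(tprs)

print(demographic_parity_gap(y_pred, group))         # 0.00: parity holds
print(equal_opportunity_gap(y_true, y_pred, group))  # 0.33: opportunity gap
```

On this toy data the two metrics disagree: approval rates are identical across groups, yet creditworthy women are approved less often than creditworthy men, which is exactly why defining fairness must come first.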