21 Dec, 2024

AI Set to Hate Nigerians with Disabilities

By the time Nigeria fully adopts AI, its bots may go rogue against PWDs. Only then will the government have second thoughts and try to debug the system.

By Elijah Olusegun

Olufemi Bayode lost his SIM card and needed to register a new one. That meant going through the process afresh, with all the hassles of data capture at an Airtel outlet near him in Lagos. It could have been easier: pieces of his data had already been captured for different kinds of registration up till 2019. As a data scientist himself, he knows what a well-deployed artificial intelligence app can do with such a repetitive task.

He navigated his way to the telco office, and somebody led him to the registration point. The officer listened to him, then switched over to the man who had brought him. Bayode could only hear them.

“After a while, they told me to go—that they have registered me,” he told ER in October. Rather than capture his fingerprints, face, and other biometric details, the registration officer captured the other person’s—for Bayode. 

He made a scene there, and they asked him to start the process again.

“I had a hunch these people might not be forthright still. So I called the NCC office to report them.” The marching order from the government agency resolved the matter, and Airtel captured his biometrics at a later appointment.

He was lucky he could call the NCC. “How many PWDs have the exposure, technical know-how, and connections to know and address this wrongful data collection Airtel must have been doing for them in Nigeria?”

It’s discrimination at a different level: the data level. And it casts a question mark on the integrity of the data so collected from Nigerians with disabilities.

Beyond that, the discrimination foreshadows how AI technology, as its adoption progresses in Nigeria, will perpetuate the entrenched bias and inequality PWDs face.

Nigeria ranked 25th in Africa, and 138th in the world, in the Oxford Insights 2021 Government AI Readiness Index. Many in the AI industry even believe Africa’s biggest ICT market has yet to start. They ignore the adoption of chatbots on bank and fintech apps and other online interfaces; they reckon little with adoption in health (J Blood Match, a blood-donation app by JDI) or deployment in education (Windows Electronic Readers, by the blind developer Oghenetejiri Peace). The experts dismiss all these apps as fringe operations.

These observers have a justification for that position. The federal government doesn’t have a policy on AI yet; Kenya does. Apart from the Cybercrimes (Prohibition, Prevention, etc.) Act 2015 and data-protection instruments like the Nigeria Data Protection Regulation, nothing specifically guides or regulates the use of AI in Nigeria. The National Information Technology Development Agency (NITDA) has, though, submitted the first draft of an AI policy framework for approval. More than 30 experts and technocrats contributed to the draft. (The National Commission for Persons With Disabilities (NCPWD) has not indicated it participated in the drafting.) Its launch and implementation have to wait indefinitely.

In the meantime, safety, ethics, and human rights stand at risk of violation and discrimination. Some of it is already happening. And the disability communities, long afflicted with typecasting, are among the early victims.

“AI is powered by data. And data relating to PWDs in Nigeria lacks quality; it’s faulty,” Bayode said.

He cited what he called the UN framework for determining disability figures: for every population, 11 percent is disabled. Bayode argued this is not applicable in Africa, where accidents, disasters, insurgencies, malnutrition, and medical errors increase the risk of disability.

“What is the authenticity and consistency of the 1.9 million figure Nigerian agencies like to quote using the framework?” According to him, there are no census figures in Nigeria that pinpoint disabilities and their clusters.
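The arithmetic alone invites scrutiny. Here is a back-of-envelope sketch in Python, assuming a Nigerian population of roughly 200 million; the population figure is an assumption for illustration, not a reported number:

```python
# Illustrative arithmetic only: the flat 11 percent framework Bayode describes,
# applied to an assumed Nigerian population of about 200 million.
population = 200_000_000   # assumed for illustration
flat_rate = 0.11           # "for every population, 11 percent is disabled"

implied_pwds = population * flat_rate
print(f"The framework implies ~{implied_pwds / 1e6:.0f} million PWDs")
# -> ~22 million, nowhere near the 1.9 million figure agencies quote,
# and that is before the Africa-specific risk factors Bayode lists.
```

Whichever way the framework is applied, the quoted figure and the formula do not line up, which is exactly the inconsistency he flags.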

“Data officers who purportedly collect data just assign random figures that indicate no education, social status, occupation, and other data points for the PWDs,” he said.

So data scientists and app developers can hardly help training their AI technologies with biased data: data weighted with limitation, dependence, minority, and exclusion.

The consequences are many, even now, when repetitive decision-making still falls to humans in most tech-driven organisations in Nigeria.

A number of women with disabilities running small businesses, whom ER surveyed recently, shared their experience of that kind of data. They have been applying for government- and intervention-guaranteed micro-loan schemes in Lagos.

Among the over a dozen and a half ER interviewed was Arinola Ogunsanya. She had been blind for 15 years when she applied to a multinational microfinance bank (MFB) in Lagos for a business loan. She completed the application with the required data, which looked, to some of the women, more like profiling: a savings account with a certain balance, and entrepreneurship knowledge. Her loan officer, however, disqualified her.

Her data on risk guarantee didn’t convince the officer. “I was declined the service because I couldn’t get a guarantor that meets the microfinance requirements,” she told ER.

Esther Salami’s story followed a similar arc. But her loan officer’s denial showed how squarely his decision rested on the profiling of women entrepreneurs with disabilities. “They said they are afraid of me being able to repay,” she said.

That fear grew out of the data the loan officers’ intelligence was trained with. There were cases of women with disabilities who borrowed from the MFBs but could not pay back. Reviewing new applications, the loan officers kept their eyes peeled for such data, primary or secondary: business owner, female, blind, short of well-moneyed guarantors, single, and so on. They paid little attention to other data, including education, experience, and skills.

After weeks of paperwork, they arrived at the biased decision to dismiss Salami, Ogunsanya, and two other blind women in the survey as potential loan defaulters. The rub is that AI apps or bots trained with the same data will accomplish the discrimination within seconds, and at a much larger scale. Take the 22 tiered commercial banks in Nigeria: no fewer than 15 of them currently deploy bots to perform repetitive tasks, especially in the customer experience (CX) department. Bayode, Peace, and an official of one of the disability clusters said they have never used the bots on their banks’ websites or elsewhere. One can only imagine the level of complication, considering the volumes of biased data, extracted from persons across the spectrum of disabilities, feeding into AI machine learning.
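How fast that transfer happens is easy to demonstrate. Below is a minimal, hypothetical sketch in Python: a loan-screening model trained on invented records that mirror the officers’ profiling. Every record and feature is synthetic; it illustrates the mechanism only, not any Nigerian bank’s actual system.

```python
# Hypothetical sketch: a loan-screening model trained on synthetic records
# that mirror the profiling described above. All data is invented.
from sklearn.linear_model import LogisticRegression

# One row per past applicant: [is_female, is_blind, has_wealthy_guarantor, years_in_business]
X_past = [
    [1, 1, 0, 8],   # blind woman, no guarantor   -> denied
    [1, 1, 0, 5],   # blind woman, no guarantor   -> denied
    [1, 0, 1, 7],   # sighted woman, guarantor    -> approved
    [0, 0, 1, 4],   # man, guarantor              -> approved
    [1, 0, 0, 9],   # sighted woman, no guarantor -> approved
    [1, 1, 0, 10],  # blind woman, no guarantor   -> denied
]
y_past = [0, 0, 1, 1, 1, 0]   # 1 = approved, 0 = denied (the officers' past decisions)

model = LogisticRegression().fit(X_past, y_past)

# A new applicant resembling the surveyed women: blind, female,
# no well-moneyed guarantor, a solid 12 years in business.
print("approve?", model.predict([[1, 1, 0, 12]])[0])   # -> 0: denied, in milliseconds
for name, w in zip(["female", "blind", "guarantor", "experience"], model.coef_[0]):
    print(f"{name:>10}: {w:+.3f}")   # the blind flag dominates the learned weights
```

Because the disability flag perfectly separates approvals from denials in the history it learned from, the model leans on that flag, and experience barely moves the outcome. That is the weeks-long human bias, replayed per applicant in a fraction of a second.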

This stereotyping, according to the online data platform Inside Big Data, becomes inevitable when data points are limited.

“Bias can be avoided by providing as many data points as possible for a subject,” it said.
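A continuation of the hypothetical sketch illustrates the platform’s point. Assume, for illustration, that the lender records one extra data point per subject (prior loans repaid), trains on actual repayment outcomes rather than past officers’ decisions, and that its history includes blind borrowers who did repay:

```python
# Continuing the hypothetical sketch: one extra data point per subject,
# and a history that includes blind borrowers who repaid. All data invented.
from sklearn.linear_model import LogisticRegression

# [is_female, is_blind, has_wealthy_guarantor, years_in_business, prior_loans_repaid]
X_rich = [
    [1, 1, 0, 8,  0],
    [1, 1, 0, 5,  0],
    [1, 0, 1, 7,  2],
    [0, 0, 1, 4,  1],
    [1, 0, 0, 9,  3],
    [1, 1, 0, 10, 4],  # blind, no guarantor, strong repayment record -> repaid
    [1, 1, 0, 6,  3],  # likewise
]
y_rich = [0, 0, 1, 1, 1, 1, 1]   # 1 = repaid, 0 = defaulted (outcomes, not officers' calls)

model = LogisticRegression().fit(X_rich, y_rich)
print("approve?", model.predict([[1, 1, 0, 12, 3]])[0])   # -> 1: approved
# With a genuinely informative data point (repayment history) in the mix, the
# blind flag no longer separates the classes, so it stops deciding the outcome.
```

The fix is not more of the same faulty records Bayode describes, but richer, more individual data points per subject, which is exactly what Nigeria’s PWD data currently lacks.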

The biased data is probably also to blame for the attitude of the data scientists and developers themselves.

“The tech people and data scientists in Nigeria are not aware of PWDs, their needs, and their abilities to use technology competitively, and even develop apps,” Peace told ER. Most PWDs in the ICT industry, including Bayode, hold a similar opinion of their tech counterparts without disabilities.

They agree that is why not many apps are developed with PWDs in mind. And it won’t be different when AI takes full effect in Nigeria’s digital economy and technocracy.

It’s not clear yet whether the NITDA AI policy draft factors in the ethical and human-rights implications of the technology for the disability community. Most likely it doesn’t. And the NCPWD, even in its policy work on assistive technology, has not ventured into the realm of AI. The alternative instrument available is the Discrimination Against Persons with Disabilities (Prohibition) Act 2018, Nigeria’s cure-all for all things disability. And if the act happens not to envisage AI either, policymakers will find a way to tuck disability in elsewhere. As usual.
