
An algorithm intended to reduce poverty in Jordan disqualifies people in need


An algorithm funded by the World Bank to determine which families should get financial assistance in Jordan likely excludes people who should qualify, according to an investigation published this morning by Human Rights Watch. 

The algorithmic system, called Takaful, ranks families applying for aid from least poor to poorest using a secret calculus that assigns weights to 57 socioeconomic indicators. Applicants say, however, that the calculus does not reflect reality and oversimplifies people's economic situations in ways that can be inaccurate or unfair. Takaful has cost over $1 billion, and the World Bank is funding similar projects in eight other countries in the Middle East and Africa. 
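The list of 57 indicators and their weights has not been disclosed, but scoring systems of this kind generally reduce to a weighted sum over each applicant's answers, with families then sorted by their scores. The sketch below is purely illustrative, with invented indicator names and weights (it is not the Takaful formula); it shows how a single indicator, such as car ownership, can shift a family's ranking regardless of their actual means.

```python
# Illustrative sketch only: the real Takaful indicators and weights are undisclosed.
# Indicator names and weights here are invented for demonstration.

HYPOTHETICAL_WEIGHTS = {
    "monthly_electricity_kwh": -0.02,  # higher consumption -> scored as less poor
    "monthly_water_m3": -0.05,
    "owns_car": -1.5,                  # any car ownership lowers apparent need
    "household_size": 0.4,             # larger households scored as needier
}

def poverty_score(applicant: dict) -> float:
    """Weighted sum over indicators; a higher score means ranked as poorer."""
    return sum(weight * applicant.get(indicator, 0)
               for indicator, weight in HYPOTHETICAL_WEIGHTS.items())

def rank_applicants(applicants: list[dict]) -> list[dict]:
    """Order applicants from poorest (first) to least poor (last)."""
    return sorted(applicants, key=poverty_score, reverse=True)

# Two otherwise identical families: the one with an old car needed for work
# is pushed down the ranking, even though the car says little about their means.
families = [
    {"id": "A", "monthly_electricity_kwh": 150, "monthly_water_m3": 8, "owns_car": 1, "household_size": 5},
    {"id": "B", "monthly_electricity_kwh": 150, "monthly_water_m3": 8, "owns_car": 0, "household_size": 5},
]
print([f["id"] for f in rank_applicants(families)])  # family B ranks as poorer
```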

Human Rights Watch identified several fundamental problems with the algorithmic system that resulted in bias and inaccuracies. For example, applicants are asked how much water and electricity they consume as two of the indicators that feed into the ranking system. The report's authors conclude that these are not necessarily reliable indicators of poverty. Some families interviewed believed that owning a car affected their ranking, even if the car was old and needed to get to work. 

The report reads, “This veneer of statistical objectivity masks a more complicated reality: the economic pressures that people endure and the ways they struggle to get by are frequently invisible to the algorithm.”

“The questions asked don’t reflect the reality we exist in,” says Abdelhamad, a father of two who makes 250 dinars ($353) a month and struggles to make ends meet, as quoted in the report.

Takaful also reinforces existing gender-based discrimination by relying on sexist legal codes. The cash assistance is provided to Jordanian citizens only, and one indicator the algorithm takes into account is household size. Although Jordanian men who marry a noncitizen can pass citizenship on to their spouse, Jordanian women who do so cannot. Because a noncitizen spouse is not counted, these women report smaller households, making them less likely to receive assistance.

The report is based on 70 interviews conducted by Human Rights Watch over the last two years, not a quantitative assessment, because the World Bank and the government of Jordan have not publicly disclosed the list of 57 indicators, a breakdown of how the indicators are weighted, or comprehensive data about the algorithm’s decisions. The World Bank has not yet replied to our request for comment. 

Amos Toh, an AI and human rights researcher for Human Rights Watch and an author of the report, says the findings point to the necessity of greater transparency into government programs that use algorithmic decision-making. Many of the families interviewed expressed distrust and confusion about the ranking methodology. “The onus is on the government of Jordan to provide that transparency,” Toh says. 

Researchers who work on AI ethics and fairness are calling for more scrutiny of the increasing use of algorithms in welfare systems. "When you start building algorithms for this particular purpose, for overseeing access, what always happens is that people who need help get excluded," says Meredith Broussard, a professor at NYU and author of More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech.

“It seems like this is yet another example of a bad design that actually ends up restricting access to funds for people who need them the most,” she says. 

The World Bank funded the program, which is managed by Jordan's National Aid Fund, a social protection agency of the government. In response to the report, the World Bank said that it plans to release additional information about the Takaful program in July 2023 and reiterated its "commitment to advancing the implementation of universal social protection [and] ensuring access to social protection for all persons."

The organization has encouraged the use of data technology in cash transfer programs such as Takaful, saying it promotes cost-effectiveness and increased fairness in distribution. Governments have also used AI-enabled systems to guard against welfare fraud. An investigation last month into an algorithm the Dutch government uses to flag the benefit applications most likely to be fraudulent revealed systematic discrimination on the basis of race and gender.
