Another variable, “presumed partner,” is used to determine whether someone has a concealed relationship, since single people receive more benefits. This involves searching data for connections between welfare recipients and other Danish residents, such as whether they have lived at the same address or raised children together.
“The ideology that underlies these algorithmic systems, and [the] very intrusive surveillance and monitoring of people who receive welfare, is a deep suspicion of the poor,” says Victoria Adelmant, director of the Digital Welfare and Human Rights Project.
For all the complexity of machine learning models, and all the data amassed and processed, there is still a person with a decision to make at the hard end of fraud controls. This is the fail-safe, Jacobsen argues, but it is also the first place where these systems collide with reality.
Morten Bruun Jonassen is one of these fail-safes. A former police officer, he leads Copenhagen’s control team, a group of officers tasked with ensuring that the city’s residents are registered at the correct address and receive the correct benefits payments. He has been working for the city’s social services department for 14 years, long enough to remember a time before algorithms assumed such importance, and long enough to have noticed the change of tone in the national conversation on welfare.
While the fight against welfare fraud remains politically popular in Denmark, Jonassen says only a “very small” number of the cases he encounters involve actual fraud. For all the investment in it, the data mining unit is not his best source of leads, and cases flagged by Jacobsen’s system make up just 13 percent of the cases his team investigates, half the national average. Since 2018, Jonassen and his unit have softened their approach compared to other units in Denmark, which tend to be tougher on fraud, he says. In a case documented in 2019 by DR, Denmark’s public broadcaster, a welfare recipient said that investigators had trawled her social media to see whether she was in a relationship before wrongfully accusing her of welfare fraud.
While he gives Jacobsen’s data mining unit credit for trying to improve its algorithms, Jonassen has yet to see significant improvement in the cases he handles. “Basically, it’s not been better,” he says. In a 2022 survey of Denmark’s towns and cities conducted by the unit, officials scored their satisfaction with it, on average, between 4 and 5 out of 7.
Jonassen says people claiming benefits should get what they are due, no more, no less. Despite the scale of Jacobsen’s automated bureaucracy, he opens more investigations based on tips from schools and social workers than on machine-flagged cases. And, crucially, he says, he works hard to understand the people claiming benefits and the difficult situations they find themselves in. “If you look at statistics and just look at the screen,” he says, “you don’t see that there are people behind it.”
Additional reporting by Daniel Howden, Soizic Penicaud, Pablo Jiménez Arandia, and Htet Aung. Reporting was supported by the Pulitzer Center’s AI Accountability Fellowship and the Center for Artistic Inquiry and Reporting.