Van der Vliet and other welfare advocates I met on my trip, like representatives from the Amsterdam Welfare Union, described what they see as a number of challenges faced by the city's some 35,000 benefits recipients: the indignities of having to constantly re-prove the need for benefits, the increases in cost of living that benefits payments do not reflect, and the general feeling of distrust between recipients and the government.
City welfare officials themselves recognize the flaws of the system, which "is held together by rubber bands and staples," as Harry Bodaar, a senior policy advisor to the city who focuses on welfare fraud enforcement, told us. "And if you're at the bottom of that system, you're the first to fall through the cracks."
So the Participation Council didn't want Smart Check at all, even as Bodaar and others working in the department hoped that it could fix the system. It's a classic example of a "wicked problem," a social or cultural issue with no one clear answer and many potential consequences.
After the story was published, I heard from Suresh Venkatasubramanian, a former tech advisor to the White House Office of Science and Technology Policy who co-wrote Biden's AI Bill of Rights (now rescinded by Trump). "We need participation early on from communities," he said, but he added that it also matters what officials do with the feedback, and whether there is "a willingness to reframe the intervention based on what people actually want."
Had the city started with a different question, asking what people actually want, it might have developed a different algorithm entirely. As the Dutch digital rights advocate Hans De Zwart put it to us, "We are being seduced by technological solutions for the wrong problems … why doesn't the municipality build an algorithm that searches for people who do not apply for social assistance but are entitled to it?"
These are the kinds of fundamental questions AI developers will need to consider, or they run the risk of repeating (or ignoring) the same mistakes over and over again.
Venkatasubramanian told me he found the story to be "affirming" in highlighting the need for "those in charge of governing these systems" to "ask hard questions … starting with whether they should be used at all."
But he also called the story "humbling": "Even with good intentions, and a desire to benefit from all the research on responsible AI, it's still possible to build systems that are fundamentally flawed, for reasons that go well beyond the details of the system constructions."
To better understand this debate, read our full story here. And if you want more detail on how we ran our own bias tests after the city gave us unprecedented access to the Smart Check algorithm, check out the methodology over at Lighthouse. (For any Dutch speakers out there, here's the companion story in Trouw.) Thanks to the Pulitzer Center for supporting our reporting.
This story originally appeared in The Algorithm, our weekly newsletter on AI. To get stories like this in your inbox first, sign up here.