Algorithm that screens for child neglect raises concerns

Attorney Robin Frank poses for a photo outside the Family Law Center in Pittsburgh, Thursday, March 17, 2022. A longtime family law attorney, Frank is fighting for parents at one of their lowest points – when they risk losing their children. (AP Photo/Matt Rourke)

For family law attorney Robin Frank, defending parents at one of their lowest points – when they face losing their children – has never been easy.

In the past, she knew what she was up against when she squared off against child protective services in family court. Now she fears fighting something she can’t see: an opaque algorithm whose statistical calculations help social workers decide which families will have to endure the rigors of the child welfare system and which won’t.

“A lot of people don’t even know it’s being used,” Frank said. “Families should have the right to have all the information in their file.”

From Los Angeles to Colorado and across Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to reinforce racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably abandoned by the state of Illinois.

According to new research from a Carnegie Mellon University team obtained exclusively by AP, Allegheny’s algorithm in its first years of operation showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.

County officials said social workers can always override the tool and called the research “hypothetical.”

Child welfare officials in Allegheny County, home of Mister Rogers’ TV neighborhood and the icon’s child-centered innovations, say the cutting-edge tool – which is attracting attention around the country – uses data to support agency workers as they try to protect children from neglect. That nuanced term can include everything from inadequate housing to poor sanitation, but it is a separate category from physical or sexual abuse, which is investigated separately in Pennsylvania and is not subject to the algorithm.

“Workers of any kind should not be asked to make 14, 15, 16,000 of these types of decisions in any given year with incredibly imperfect information,” said Erin Dalton, director of the county’s Department of Human Services and a pioneer in implementing the predictive child welfare algorithm.

____

This story, backed by the Pulitzer Center on Crisis Reporting, is part of an ongoing Associated Press series, “Tracked,” that investigates the power and consequences of decisions made by algorithms on people’s daily lives.

____

Critics say it gives a program fueled by data collected mostly about poor people an outsized role in deciding families’ fates, and they warn against local authorities’ growing reliance on artificial intelligence tools.

If the tool had acted on its own to screen in a comparable rate of calls, it would have recommended that two-thirds of Black children be investigated, compared with about half of all other children reported, according to another study published last month and co-authored by a researcher who audited the county’s algorithm.

Advocates worry that if similar tools are used in other child protection systems with minimal or no human intervention — similar to how algorithms have been used to make decisions in the criminal justice system — they could reinforce existing racial disparities in the child welfare system.

“It doesn’t lessen the impact on Black families,” said Carnegie Mellon University researcher Logan Stapleton. “With respect to accuracy and disparity, (the county) makes strong statements which, in my view, are misleading.”

Because family court hearings are closed to the public and the records are sealed, AP was unable to identify first-hand any families whom the algorithm recommended be subjected to a mandatory investigation for child neglect, or any case that resulted in a child being placed in foster care. Families and their lawyers can never be sure of the algorithm’s role in their lives either, because they are not allowed to know the scores.

Child welfare agencies in at least 26 states and Washington, DC, have considered using algorithmic tools, and at least 11 have deployed them, according to the American Civil Liberties Union.

Larimer County, Colorado, home to Fort Collins, is currently testing a tool modeled after Allegheny’s and plans to share the scores with families if it continues the program.

“It’s their life and their story,” said Thad Paul, director of the county’s Child, Youth and Family Services. “We want to minimize the power differential that comes with being involved in child protection…we really think it’s unethical not to share the score with families.”

Oregon does not share risk score numbers from its statewide screening tool, which was first implemented in 2018 and modeled after Allegheny’s algorithm. The Oregon Department of Human Services – currently preparing to hire its eighth new child welfare director in six years – explored at least four other algorithms while the agency was under scrutiny from a crisis oversight board ordered by the governor.

It recently suspended a pilot algorithm designed to help decide when foster children can be reunited with their families. Oregon also explored three other tools — predictive models to assess a child’s risk of death and serious injury, whether children should be placed in foster care, and if so, where.

For years, California explored data-driven approaches for the statewide child welfare system before dropping a proposal to use a predictive risk modeling tool in 2019.

“During the project, the state also explored concerns about the tool’s potential impact on racial equity. Those findings resulted in the state halting its exploration,” department spokesman Scott Murray said in an email.

The Los Angeles County Department of Children and Family Services is being audited following high-profile child deaths and is searching for a new director after its previous one resigned late last year. It is piloting a “complex risk algorithm” that helps isolate the highest-risk cases under investigation, the county said.

In the first few months that social workers in the Mojave Desert city of Lancaster began using the tool, however, county data shows that Black children were the subject of nearly half of all investigations flagged for additional scrutiny, despite making up 22% of the city’s child population, according to the U.S. Census.

The county did not immediately explain why, but said it would decide whether to expand the tool later this year.

___

Associated Press reporter Camille Fassett contributed to this report.

___

Follow Sally Ho and Garance Burke on Twitter at @_sallyho and @garanceburke.

___

Contact the AP Global Investigation Team at [email protected] or https://www.ap.org/tips/
