For family law attorney Robin Frank, defending parents at one of their lowest points — when they’re at risk of losing their kids — has never been easy.
The job is never easy, but in the past she knew what she was up against when squaring off against child protective services in family court. Now she’s worried she’s fighting something she can’t see: an opaque algorithm whose statistical calculations help social workers decide which families should be investigated in the first place.
“A lot of people don’t know it’s even used,” Frank said. “Families should have the right to have all the information in their records.”
From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review identified a number of concerns about the technology, including questions about its reliability and its potential to reinforce racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, notably the tool dropped by the state of Illinois.
According to new research from a Carnegie Mellon University team obtained exclusively by AP, in its first years of use Allegheny’s algorithm showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.
County officials said social workers can always override the tool, and called the research “hypothetical.”
Child welfare officials in Allegheny County, the cradle of Mister Rogers’ TV neighborhood and the icon’s child-centric innovations, say the cutting-edge tool, which is drawing attention across the country, uses data to support agency workers as they try to protect children from neglect. That nuanced term can cover everything from inadequate housing to poor hygiene, but it is a different category from physical or sexual abuse, which is investigated separately in Pennsylvania and is not subject to the algorithm.
“Employees, whoever they are, should not be asked to make 14, 15, 16,000 of these kinds of decisions in any given year with incredibly imperfect information,” said Erin Dalton, director of the county’s Department of Human Services and a pioneer in implementing the predictive child welfare algorithm.
Critics say it gives a program powered by data mostly collected on poor people an inordinate role in determining the fate of families, and they warn of local officials’ growing reliance on artificial intelligence tools.
If the tool had acted on its own to screen in a comparable rate of calls, it would have recommended that two-thirds of Black children be investigated, compared with about half of all other children reported, according to another study published last month and co-authored by a researcher who audited the county’s algorithm.
Critics worry that if similar tools are used in other child welfare systems with minimal or no human intervention, much as algorithms have been used to make decisions in the criminal justice system, they could amplify existing racial disparities in the child welfare system.
“It doesn’t lessen the impact among Black families,” said Logan Stapleton, a researcher at Carnegie Mellon University. “In terms of accuracy and disparity, [the county] makes strong statements that I believe are misleading.”
Because the family court hearings are closed to the public and the records are sealed, the AP was unable to identify first-hand any families whom the algorithm recommended be mandatorily investigated for child neglect, nor any cases that resulted in a child being sent to foster care. Families and their lawyers can never be sure of the algorithm’s role in their lives either, because they are not allowed to know the scores.
According to the American Civil Liberties Union, child welfare agencies in at least 26 states and Washington, DC have considered using algorithmic tools, and at least 11 have deployed them.
Larimer County, Colorado, home of Fort Collins, is now testing a tool modeled after Allegheny’s and plans to share scores with families as the program moves forward.
“It’s their life and their history,” said Thad Paul, a manager with the county’s Children Youth & Family Services. “We want to minimize the power difference that comes with involvement in child welfare… we just think it’s really unethical not to share the score with families.”
Oregon’s screening tool inspired by Allegheny’s algorithm
Oregon does not share risk score numbers from its statewide screening tool, which was first implemented in 2018 and was inspired by Allegheny’s algorithm. The Oregon Department of Human Services, currently preparing to hire its eighth new child welfare director in six years, explored at least four other algorithms while the agency itself was under scrutiny by a crisis oversight board ordered by the governor.
It recently paused a pilot algorithm built to help decide when foster children can be reunified with their families. Oregon also explored three other tools: predictive models to assess a child’s risk of death and serious injury, whether children should be placed in foster care, and if so, where.
California spent years exploring data-driven approaches to its statewide child welfare system before dropping a proposal in 2019 to use a predictive risk modeling tool.
“During the project, the state also explored concerns about how the tool could affect racial equity. These findings resulted in the state deciding not to proceed with exploration,” department spokesman Scott Murray said in an email.
The Los Angeles County Department of Children and Family Services is under scrutiny over high-profile infant deaths and is searching for a new director after the previous one stepped down late last year. It runs a “complex-risk algorithm” that helps isolate the highest-risk cases under investigation, the county said.
However, in the first few months that social workers in the Mojave Desert city of Lancaster began using the tool, county data shows that Black children were the subject of nearly half of all investigations flagged for extra scrutiny, despite making up 22% of the city’s child population, according to the U.S. Census.
The county did not immediately say why, but said it will decide whether to expand the tool later this year.