One of those evenings, after hesitating for a long time between going out to a restaurant and going to the movies, I settled on a documentary. Although the trailer warned me of the harshness of what I was about to see, I wasn't prepared for it.
In May 2013, in Los Angeles County, an 8-year-old boy, Gabriel Fernandez, died at the hands of his mother and stepfather.
A healthy child ceased to exist.
A child was in excruciating pain during the eight months of torture he endured.
Suddenly, global warming, COVID, the race for a vaccine, the President of the United States and the world economy were reduced to insignificant anecdotes.
Justice has done its work and sentenced the perpetrators of this abject crime with the utmost severity, but you will agree with me that this brings no relief, because Gabriel, 8 years old, is no longer alive.
Why don't we see that protecting abused children and preventing violence must be the top priority? I can already hear you telling me that it is; that since 2016, in Allegheny County, Pennsylvania, social workers have been assisted in their screening decisions by a predictive algorithm. Apparently, the city of Bristol in England has also begun using a similar system.
Did you know that in 2003, in Switzerland, there were 0.8 deaths of battered children per 100'000 inhabitants? Think about what that figure means: for a city like Lausanne and its immediate suburbs, between 2 and 3 children under the age of 15 are killed by their own parents every year. In the United States, the toll rises to 7 children for a population equivalent to that of the capital of Vaud.
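As a quick sanity check of that arithmetic (the population figure below is my own assumption for illustration, not an official statistic):

```python
# Back-of-the-envelope check of the rates quoted above.
# The population figure is an assumption for illustration only.
rate_ch = 0.8          # deaths per 100'000 inhabitants (Switzerland, 2003)
population = 330_000   # assumed size of Lausanne and its immediate suburbs

deaths_ch = rate_ch * population / 100_000
print(f"Switzerland: about {deaths_ch:.1f} such deaths per year")     # about 2.6

# The US figure quoted above, 7 deaths for the same population,
# implies a rate of roughly:
rate_us = 7 * 100_000 / population
print(f"United States: about {rate_us:.1f} per 100'000 inhabitants")  # about 2.1
```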
The truth is that financial and human resources will never be enough to effectively ensure the protection to which these children are entitled.
And then, I am told, it's more complicated than that: each situation must be examined with tact, because removing a child from his or her home is itself a trauma, blah blah blah...
Doing right by the child no doubt requires a great deal of finesse and nuance. How should we handle the cases of kids plunged into dysfunctional but pseudo-loving homes? Jana, 7 years old, goes to bars with her father almost every night while her mother, a prostitute, works. When she gets sleepy, she asks to go home, but her father calls her a wimp and loses interest in her until she collapses under a table. Oscar's father shaves the boy's head at every visit, because he hates long hair, defying the child's mother in the process. Arthur, 12 years old, has no bed of his own and sleeps with his sister, brother or mother, depending on the available space.
How does a social worker from here or elsewhere carry out these assessments?
Let's take an example: a child tells his teacher that he saw his mother's boyfriend shove her and call her a "whore". The alarmed teacher reports the family to the local child welfare office. A social worker enters the child's name into the database, or checks whether a file has already been opened.
Let's imagine that this family has already been reported for a similar incident, by the older sister, more than five years earlier. Based on precedent and local guidelines, the social worker will likely decide to visit the family. On site, imagine that the mother confirms the argument, insisting that it happens only very rarely. The home is clean and well kept; there is sufficient, adequate food in the fridge. The social worker writes up the visit and must then decide, based on experience and intuition, whether to keep an eye on this family or close the case.
Let's see how the predictive algorithm from Allegheny County, Pennsylvania, helps social workers make that decision. Returning to our example: once the incident is entered, the algorithm assigns a score on a scale of 1 to 20, 1 being the lowest level of risk. The evaluation rests on a statistical analysis of the previous four years of reports, weighing more than a hundred criteria.
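The county's tool is not reproduced here, and its exact model is not public in full detail. As a purely illustrative sketch of how a 1-to-20 score can be derived from historical reports, here is a toy version (all data and features are invented; the real tool's criteria and method surely differ):

```python
import numpy as np

# Toy illustration of a 1-20 risk score, in the spirit of predictive
# screening tools: fit a simple model on past referrals, then map each
# new referral's predicted probability to a ventile (1 = lowest risk).
# All data below is random and purely illustrative.
rng = np.random.default_rng(0)

n, n_features = 1000, 5                 # e.g. prior referrals, household size...
X = rng.normal(size=(n, n_features))
true_w = np.array([1.2, -0.4, 0.8, 0.0, 0.5])
p = 1 / (1 + np.exp(-(X @ true_w - 1.0)))
y = (rng.random(n) < p).astype(float)   # 1 = a later substantiated report

# Minimal logistic regression fit by gradient descent.
w = np.zeros(n_features)
b = 0.0
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (pred - y) / n)
    b -= 0.5 * np.mean(pred - y)

def ventile_score(x_new):
    """Rank a new referral against the historical population and
    bucket its predicted probability into 20 equal-sized bins."""
    probs = 1 / (1 + np.exp(-(X @ w + b)))
    p_new = 1 / (1 + np.exp(-(x_new @ w + b)))
    rank = np.mean(probs <= p_new)       # empirical percentile in [0, 1]
    return min(int(rank * 20) + 1, 20)   # 1 (lowest) .. 20 (highest)

print(ventile_score(rng.normal(size=n_features)))
```

The point of the ventile construction is that the score is relative: a 20 means "riskier than roughly 95% of past referrals", not an absolute probability.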
The virtuous idea behind these predictive algorithms is obviously to prevent, before they occur, tragedies like the one that led to the death of Gabriel Fernandez, and of so many other children before and after him. However, voices have been raised against methods worthy of the film "Minority Report". In an open letter published in The Guardian in 2018, Patrick Brown, Ruth Gilbert, Rachel Pearson, Gene Feder, Charmaine Fletcher, Mike Stein, Tina Shaw and John Simmonds warned of the risks involved in using these algorithms. Depending on the data the algorithm draws on, the risk of "false positives" in a heavily indexed population is high, as is the risk of "false negatives" in populations for which little statistical data exists. Moreover, if these databases are meant to make up for an endemic lack of qualified personnel, governments must also give the services concerned the means to intervene, and that is far from being the case.
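The "false positive" risk the letter describes is, at bottom, a base-rate effect: when serious abuse is rare among screened referrals, even a fairly accurate model mostly flags families who would never have harmed their child. A small illustrative calculation, with every rate invented for the example:

```python
# Illustrative base-rate arithmetic behind the "false positive" worry.
# All numbers are invented for the example, not taken from any real tool.
prevalence = 0.02    # assumed: 2% of screened referrals involve real, serious risk
sensitivity = 0.90   # assumed: the model flags 90% of truly high-risk cases
specificity = 0.90   # assumed: it correctly clears 90% of low-risk cases

flagged_true = prevalence * sensitivity            # correctly flagged
flagged_false = (1 - prevalence) * (1 - specificity)  # wrongly flagged
ppv = flagged_true / (flagged_true + flagged_false)
print(f"Share of flagged families actually at high risk: {ppv:.0%}")  # roughly 16%
```

Under these invented rates, about five out of six flagged families would be false alarms, which is exactly why the letter's authors insist on human judgment and on resourcing the services that must follow up.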
I don't know what is effective or what nuances need to be added. What I do know is that somewhere in this world, where celebrity trails us like a shadow we are not sufficiently afraid of, a little boy became famous for having been atrociously beaten by those who should have given their lives to protect him. As long as there are thousands of Gabriels, let's choose an imperfect algorithm that we strive to perfect, and let's learn to say sorry when we are wrong.
Photo by Kat Jayne