What Does Apple’s Decision To Pause Mean For Children At Risk Of Sexual Abuse
By: Glen Pounder, COO of Child Rescue Coalition
On September 3, 2021, we learned that Apple had decided to press the pause button on being proactive in child protection.
It is worth remembering that what Apple has chosen to do is just that: a choice. There is NO LAW that compels any company to look for child sexual abuse material (CSAM) on its platforms.
Google, arguably a similar company with a comparable number of users and similar technologies such as email and cloud storage, made 546,704 reports in 2020 to the National Center for Missing & Exploited Children (NCMEC), the clearinghouse for such reports.
We also know that, in 2020, Apple made 265 reports.
If Apple were to use detection similar to Google's, and achieved a similar success rate, that would be almost 1,500 reports a DAY (546,704 reports spread over 365 days is roughly 1,498 per day) that Apple will now not be making.
We have no idea when, or even if, Apple will come back to this critical issue.
In the meantime, these are the harsh realities:
- Gone is the option for a parent of a 10-year-old to be notified if their child is about to open an image that clever Apple technology strongly believes is an erect penis.
- Gone is the possibility of Apple reporting a suspect who has 10, 20, or 2,000 images of KNOWN CSAM stored in Apple's iCloud.
- Gone is the possibility of that suspect, potentially your child's teacher, gym coach, or pediatrician, being investigated by law enforcement.
- Gone is the possibility of a survivor of child sexual abuse knowing that one more person who possesses digital recordings of the worst moments of their life has been reported to NCMEC.
- Gone is the information that could have been made available from the very second Apple decided to check how many child predators enjoy using its services.
Why is it so crucial to report suspects for possessing and distributing CSAM?
These are the disturbing and unsettling FACTS from a 2019 report on non-production offenders. These are actual cases in which the offender was found to be in possession of CSAM but is not known to have produced it.
- 52.2% of offenses included images or videos of infants or toddlers.
- 48.0% of offenders engaged in aggravating sexual conduct prior to, or concurrently with, the CSAM charge (up from 35.1% in 2010).
- 84.0% of cases required sentencing enhancements for images depicting sadistic or masochistic conduct or abuse of an infant or toddler.
- 77.2% of cases required sentencing enhancements for possession of 600 or more images.
- 43.7% of offenders participated in an online community (if we can call it that). This means they were actively in contact with others who shared an illegal sexual interest in children.
Apple MUST provide clarity on its revised timetable. Moving forward and leaning into this difficult issue is the absolute minimum that survivors of this heinous crime deserve.