Child Rescue Coalition Calls Out Apple Over U-Turn
- September 3, 2021
Our COO, Glen Pounder, was asked by NBC reporters about Apple’s decision to reverse its stated intention to proactively scan for known child sexual abuse material (CSAM).
The decision follows intense pressure from privacy campaigners, but it means that Apple will no longer look for the material or report suspects to law enforcement for investigation.
Hundreds, and potentially even thousands, of criminals will never be reported. Victims of child sexual abuse will know that the worst moments of their lives will continue to be traded, with no hope of the perpetrators being reported or their devices being seized.
“We absolutely value privacy and want to avoid mass surveillance in any form from government, but to fail children and fail the survivors of child sexual abuse by saying we’re not going to look for known rape videos and images of children because of some extreme future that may never happen just seems wholly wrong to me,” said Pounder.