Apple recently announced plans to introduce software that would allow it to scan user photos for child exploitation images. Today, following criticism from privacy and other advocacy groups, Apple decided to delay the rollout of this software. Many people are responding to this announcement, whether praising or condemning it, without recognizing and speaking to the real complexities that Apple is trying to contend with.
At Love146, we know that the volume of child exploitation images online is abhorrent, and that tech companies have failed to adequately address this issue on their platforms and devices. In fact, we have previously run campaigns encouraging technology companies, such as Facebook, to change their default settings to better protect children on their platforms.
We call on more tech companies to devote greater resources to proactively reducing the vast number of child exploitation images on their platforms and devices.
We also recognize, however, that efforts to address child exploitation images cannot happen in a vacuum; they must be balanced against very real potential impacts on privacy and on other vulnerable populations. We applaud Apple's willingness to proactively work on this issue, and appreciate that they want to do this "right," even if it means a delay.
If you want to learn more about some of the complexities facing Apple and other technology companies trying to address child exploitation images, we suggest starting with the podcast below.