Tactics by tech companies to help reduce child abuse imagery are usually met with great appreciation. However, Apple's announcement on August 5th that it was introducing new image-detection software, which would alert the company if any known illegal images were uploaded to an individual's iCloud storage, along with a parental control that users can activate on their children's accounts, was met with mixed reactions.

The measures have had many supporters as well as detractors. According to the BBC, detractors have criticised the initiative as a potential tool for authoritarian governments "to spy on its own citizens", while supporters have said that the move, quite unprecedented for a tech company, is a good step in the right direction.

However, as with many things, the deeper you dig the shallower this move appears: paying lip service to a topic this important to society, and to parents in particular, is deceitful at best. Identifying child sex abuse material (CSAM) is very important, but Apple does not seem to have taken on the gravity of it with their proposed algorithms. When one finds out that alongside this measure they have also introduced encryption on content when using the Safari browser (potentially concealing when an individual is accessing this abuse material), one can only conclude that they are trying to pacify privacy-obsessed supporters and those fighting against child abuse images at the same time.

If Apple wants to take a stance, it should go about it the right way: be inclusive and work with the other players in this field, so it can offer a robust solution, not a stab in the dark. Being first to do this is brave, but do more now, and recognise your limits.
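For context, the "known images" check Apple describes is, in outline, a fingerprint lookup: each upload is hashed and compared against a database of hashes of already-identified material. The sketch below is a deliberately simplified illustration (all names and data are hypothetical); Apple's actual system uses a perceptual hash (NeuralHash) with on-device blinded matching, which tolerates resizing and re-encoding, whereas a plain cryptographic digest like this only catches byte-identical copies.

```python
import hashlib

# Hypothetical database of digests of known, already-identified images.
# A real system would hold perceptual hashes supplied by child-safety
# organisations, not plain SHA-256 digests of the raw bytes.
KNOWN_BAD_DIGESTS = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if this upload's digest is in the known-image list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_DIGESTS

# Only uploads matching the known-image list are flagged;
# a novel image passes through unflagged.
print(matches_known_image(b"known-bad-image-bytes"))  # True
print(matches_known_image(b"holiday-photo-bytes"))    # False
```

The key design point this illustrates is that such a system can only flag material that has already been catalogued; it detects re-circulation of known images, not new abuse material.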
#applecsam #safety #parents