Apple is facing a major lawsuit over its handling of child sexual abuse material (CSAM) on its iCloud service.
The lawsuit was filed on Sunday in the U.S. District Court for the Northern District of California.
A 27-year-old woman, suing under a pseudonym, alleged that Apple’s failure to detect and remove images of her abuse from iCloud has caused her ongoing trauma.
Details of the lawsuit
The lawsuit states that Apple did not follow through on a system it announced in 2021. That system was designed to scan iCloud photos for known CSAM by matching them against digital signatures provided by organisations such as the National Center for Missing & Exploited Children (NCMEC).
However, Apple abandoned this plan in 2022 due to privacy concerns and backlash from advocacy groups worried about potential government surveillance.
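For context, matching of this kind works by comparing a fingerprint of each uploaded file against a database of fingerprints of known abusive images. The sketch below is a simplified, hypothetical illustration in Python: it uses plain SHA-256 digests and a placeholder KNOWN_HASHES set, whereas Apple’s shelved proposal relied on a perceptual-hashing scheme (NeuralHash) that can match visually similar images, which exact cryptographic hashes cannot.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-image digests. In a real system these would be
# perceptual hashes supplied by organisations such as NCMEC, not SHA-256
# values; the digest below is a placeholder, not a real entry.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def find_flagged(photo_dir: Path) -> list[Path]:
    """Return files whose digest appears in the known-hash set."""
    return [
        p for p in photo_dir.iterdir()
        if p.is_file() and file_digest(p) in KNOWN_HASHES
    ]


if __name__ == "__main__":
    # Scan a hypothetical local "photos" directory for matches.
    for match in find_flagged(Path("photos")):
        print(f"flagged: {match}")
```

The privacy debate centres on where this comparison happens: Apple’s proposal would have run the matching on users’ own devices before upload, which critics argued could be repurposed for broader surveillance.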
The woman’s attorney, James Marsh, argues that Apple’s negligence has allowed images of her childhood abuse to be shared online.
He estimates that 2,680 victims could be eligible for compensation if the lawsuit succeeds. Marsh pointed out that survivors are forced to relive their trauma every time they see news about individuals charged with possessing images of their abuse.
Many child safety advocates are frustrated with Apple, saying the company hasn’t done enough compared to other tech giants like Google and Meta, which have implemented stricter measures against CSAM.
Apple’s response
Apple said it was looking into the claims and considering how best to address them without compromising users’ privacy.
A spokesperson for the company stated, “We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”
This lawsuit follows another case in which a nine-year-old girl accused Apple of allowing strangers to send her CSAM through iCloud. These incidents raise serious questions about Apple’s practices and whether they adequately protect vulnerable users.
As it develops, this case highlights a broader tension across the tech industry: balancing online child protection against consumer privacy. Its outcome may change how internet platforms filter content and protect users.