Apple has announced that it is abandoning its planned child sexual abuse material (CSAM) detection system for its iCloud service, a move that has sparked a lawsuit from abuse survivors who claim the tech giant has broken its promises to protect victims of online child abuse.

In 2021, Apple announced plans to scan photos uploaded to iCloud for CSAM, but the plan met with backlash from privacy groups and lawmakers, who argued that it would allow the company to snoop on users' content and infringe on their rights.

The case against Apple is being brought by a 27-year-old woman who was abused as a child and claims that the tech company has failed to honor its promises to protect victims of online abuse.

'I was repeatedly subjected to grooming and exploitation on the internet, and now I'm being asked to trust Apple to protect me and my daughter from the very same predators they allowed to exist in the first place,' she said in a statement.

Apple said the CSAM detection system would have allowed it to identify and report child abuse material to the authorities; critics countered that it would also have let the company scan users' private photos against a database of known child abuse images without their knowledge.
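
The mechanism at the center of the dispute is, in essence, fingerprint matching: an image is reduced to a compact hash and compared against a database of hashes of known abuse material. The sketch below is a deliberately simplified, hypothetical illustration of that general idea only; it is not Apple's actual NeuralHash or private-set-intersection protocol, and the hash function, example hash value, and function names are invented for illustration.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images.
# Real systems use perceptual hashes (such as Apple's NeuralHash) so that
# resized or re-encoded copies still match; a cryptographic hash is used
# here only to keep the illustration short.
KNOWN_HASHES = {
    "3f786850e387550fdab836ed7e6dc881de23001b",  # invented example value
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-size fingerprint."""
    return hashlib.sha1(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Check a photo's fingerprint against the known-material database."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

Much of the controversy concerned where such a check would run: on Apple's servers after upload, or on the user's own device before photos ever reached iCloud.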

The case highlights the contentious tension at the heart of fighting online child abuse: protecting vulnerable people while preserving users' right to privacy.

As the case makes its way through the courts, it remains to be seen how Apple's abandoned CSAM detection plan will affect the tech industry's efforts to combat online child abuse.