
UK Watchdog Alleges Apple Failed to Report Child Sexual Images

Apple fails to monitor and report child sexual images effectively, alleges a UK watchdog. Credit: Jorge Láscar / Flickr / CC BY 2.0

Experts in child safety claim Apple is not doing enough to check its platforms for images and videos of child sexual abuse, raising concerns about how the company will handle a growing volume of such harmful content as artificial intelligence spreads.

The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) says Apple reports far fewer cases of child sexual abuse material (CSAM) than actually occur.

Based on police data obtained by the NSPCC, child predators used Apple’s iCloud, iMessage, and FaceTime to share CSAM in England and Wales alone more often in a single year than Apple reported across all other countries combined.

Data from freedom of information requests shared with The Guardian reveal concerning numbers for Apple. The children’s charity found that, between April 2022 and March 2023, Apple was linked to 337 recorded cases of child abuse images in England and Wales.

In 2023, Apple reported only 267 cases of suspected CSAM worldwide to the National Center for Missing & Exploited Children (NCMEC). This is far lower than the figures from other tech giants: Google reported over 1.47 million cases and Meta more than 30.6 million, according to NCMEC’s annual report.

All US-based tech companies obliged to report cases of child sexual abuse material

All US-based tech companies must report any detected cases of CSAM on their platforms to NCMEC. This organization, based in Virginia, collects reports of child abuse from around the world and forwards them to the appropriate law enforcement agencies.

Apple’s iMessage service uses encryption, which means Apple cannot view the contents of users’ messages. However, Meta’s WhatsApp, also an encrypted service, made about 1.4 million reports of suspected CSAM to NCMEC in 2023.

“There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abuse content they make to authorities,” said Richard Collard, head of child safety online policy at the NSPCC.

“Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the roll out of the Online Safety Act in the UK,” Collard said.

Apple chose not to comment directly on The Guardian’s article. Instead, the company pointed to statements from August in which it explained its decision to cancel a program that would scan iCloud photos for CSAM. Apple said it wanted to focus on “the security and privacy of [its] users.”

In late 2022, Apple shelved its iCloud photo-scanning tool, known as neuralMatch. The tool was designed to check photos before they were uploaded to iCloud, comparing them against a database of known child abuse images using digital fingerprints called hash values, as reported by The Guardian.
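Hash matching of this kind can be sketched in a few lines of code. The example below is only an illustration, not Apple’s system: the real tool reportedly relied on perceptual hashes that survive resizing and re-encoding, whereas this minimal Python sketch performs an exact SHA-256 match against a hypothetical set of known hashes (the `known_hashes` entry and the `example.jpg` path are placeholders) purely to show the comparison step.

```python
# Illustrative sketch of hash-based image matching, NOT Apple's actual system.
# Production CSAM-detection tools use perceptual hashes that tolerate resizing
# and re-encoding; this example uses a simple SHA-256 exact match to show the idea.
import hashlib
from pathlib import Path

# Hypothetical database of hash values for known abuse images,
# as would be supplied by a child-safety organization.
known_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_before_upload(path: Path) -> bool:
    """Flag a photo if its hash matches a known entry; otherwise allow upload."""
    return file_hash(path) in known_hashes

if __name__ == "__main__":
    photo = Path("example.jpg")  # hypothetical local photo awaiting upload
    if photo.exists():
        print("match" if check_before_upload(photo) else "no match")
```

In a real deployment a match would trigger a report to an organization such as NCMEC rather than a simple printout; the sketch only demonstrates the comparison against a database of known hashes.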
