Early last month, Apple announced it would introduce a new set of tools to help detect known child sexual abuse material (CSAM) in photos stored on iPhones. The feature was criticized by security experts as a violation of user privacy, and what followed was a public relations nightmare for Apple. Now, in a rare move, Apple said today that it's going to take a step back to further refine the feature before public release.
In a statement sent to Gizmodo, Apple said:
Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Photo: Mladen Antonov/AFP (Getty Images)
Initially, the CSAM features were set to roll out with iOS 15 later this fall. However, the backlash from security experts and privacy groups was fierce, with thousands signing an open letter to Apple asking it to rethink the feature. Internally, Apple employees were also reported to have raised concerns.
While critics agreed that child pornography is a serious problem, the fear was that Apple had essentially built a "backdoor" into users' iPhones that could be easily abused to scan for other material. Doing so could lead foreign governments to potentially use a tool intended for noble purposes as a means of surveillance and censorship. There were also concerns that innocuous photos of children in bathtubs might be flagged as child pornography. Yet another fear was that the tools could be used as a workaround for encrypted communications.
Apple initially doubled down, releasing lengthy FAQs and hosting several briefings with reporters to clarify how the feature worked and the company's intent. The company also tried to allay fears by promising that it wouldn't allow governments to abuse its CSAM tools as a surveillance weapon. However, despite its best efforts (the company even trotted out software chief Craig Federighi in a Wall Street Journal interview), most remained confused as to how the CSAM feature worked and the risks it posed to individual privacy.
As of right now, Apple has offered few clues as to when it now plans to roll out the feature, or what its revision process will look like.
This story is developing…