Apple has revealed that it may look into users' iPhones and check for child abuse photos when they are being uploaded onto iCloud, in an attempt to wipe out this horrific online crime.

However, while addressing this issue has been on the agenda of all the tech majors, Apple's move has been blasted by others, as it is likely to cause more problems for them. The first to react was WhatsApp head Will Cathcart, followed by Fortnite gamemaker Epic Games CEO Tim Sweeney.

What has Apple done? On Thursday, Apple launched various tools to reduce the spread of child sexual abuse material (CSAM). These introduce changes to iMessage, Siri and Search, and enable the scanning of iCloud Photos for known CSAM imagery, thereby helping protect children online.

First to launch an attack on Apple was WhatsApp head Will Cathcart. Mincing no words, he slammed Apple over the move, and even rolled out Apple chief Tim Cook's letter to customers, in which the iPhone maker said it would never allow any outside agency to look into what is on users' devices. Now, he indicated, Apple was doing the opposite.

WhatsApp has been facing governmental pressure to decrypt its fully encrypted messaging system so that governments can see what users are sending, in an effort to end extremist activities and hold people causing social conflict to account. WhatsApp has said no to such efforts, but Apple is clearly looking at being more flexible, even though it says it would never allow anything except child abuse photos to be analysed. Apple's move is likely to make governments bring more pressure to bear on WhatsApp to follow suit.

After WhatsApp, it was the turn of Epic Games CEO Tim Sweeney, maker of the popular online game Fortnite, to step forward and fire a full broadside at Apple. Sweeney said Apple's move on child safety initiatives would open the way for governments to conduct surveillance. He took to Twitter and said, "I've tried hard to see this from Apple's point of view. But inescapably, this is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to the government." "But this is people's private data," Sweeney protested in his Twitter thread.
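For readers wondering what "scanning for known imagery" means in principle, here is a minimal, purely hypothetical sketch. It matches an upload's digest against a set of known-bad hashes; Apple's announced system actually uses a perceptual hash ("NeuralHash") with cryptographic blinding and on-device matching, so this plain SHA-256 example is only a conceptual analogy, and all names and sample values in it are invented.

```python
import hashlib

# Hypothetical stand-in for a database of hashes of known illegal images.
# Real systems use perceptual hashes so near-duplicates also match;
# plain SHA-256 only catches exact byte-for-byte copies.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-image-1").hexdigest(),
    hashlib.sha256(b"known-bad-image-2").hexdigest(),
}

def flag_if_known(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the known-bad list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

print(flag_if_known(b"family-holiday-photo"))  # False: not in the list
print(flag_if_known(b"known-bad-image-1"))     # True: exact match
```

The key property critics and defenders are arguing over is visible even in this toy version: the check says nothing about images that are not on the list, but whoever controls the list controls what gets flagged.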