Apple is about to launch a new technology that will scan iCloud photo libraries for known Child Sexual Abuse Material (CSAM), covering both images and videos related to child abuse. The feature is expected to be announced officially soon and to be integrated with Apple's servers. Once development is complete, it would run on the user's device: if it detects content matching known CSAM, it notifies Apple's servers. Apple's stated goal is to do all of this without violating the user's privacy.
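Apple has not published technical details, but reports describe the tool as comparing fingerprints of a user's photos against a database of hashes of known CSAM. A minimal sketch of that general hash-matching idea follows; all names and data here are hypothetical, and a cryptographic hash stands in for the perceptual hash a real system would use.

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint. Real scanning systems use perceptual hashes
    # that survive resizing and re-encoding; SHA-256 is used here only to
    # keep the sketch self-contained and deterministic.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(images: dict, known_hashes: set) -> list:
    """Return IDs of images whose fingerprints appear in the known-hash set."""
    return [img_id for img_id, data in images.items()
            if image_fingerprint(data) in known_hashes]

# Hypothetical usage: two local photos, one matching a flagged hash.
photos = {
    "IMG_0001": b"example-photo-bytes-1",
    "IMG_0002": b"example-photo-bytes-2",
}
flagged = {image_fingerprint(b"example-photo-bytes-2")}
print(scan_library(photos, flagged))  # -> ['IMG_0002']
```

Matching against a fixed set of hashes, rather than analyzing image content directly, is what lets such a system claim it only flags already-known material rather than inspecting every photo's contents.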
Matthew Daniel Green, a U.S. cybersecurity expert and Associate Professor at the Johns Hopkins Information Security Institute, tweeted about Apple's plan to detect Child Sexual Abuse Material (CSAM) on users' iPhones. He also noted that the tool is currently under development and could eventually play a major role in encrypted messaging and surveillance systems.
Matthew Green on Apple's plan to scan your iCloud:
In a detailed Twitter thread, Green said: “The way Apple is doing this launch, they’re going to start with non-E2E [non-end-to-end] photos that people have already shared with the cloud. So it doesn’t ‘hurt anyone’s privacy.’ But you have to ask why anyone would develop a system like this if scanning E2E photos weren’t the goal.”
Many other cloud providers, including Microsoft, Dropbox, and Google, already use tools of this kind to identify content that is illegal or violates their terms of service, such as CSAM. Apple, by contrast, has given users options to encrypt their media before it reaches iCloud servers. Its growing concern about CSAM is the reason it is now working on this new technology.