Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.

Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly scoped backdoor is still a backdoor.
Apple on Friday delayed a set of features designed to protect children from sexual predators on some of its iPhones, iPads, and Mac computers. The move follows criticism from privacy advocates and security researchers who worried the company's technology could be twisted into a tool for surveillance.

In a statement, Apple said it would delay its new tools to identify images of child abuse on its devices, as well as features to warn children about sexualized messages sent by SMS or iMessage. Apple had announced the tools last month.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," a company spokesman said. The company didn't respond to a request for further comment about when it plans to reintroduce these technologies.