Apple officials have revealed plans to scan U.S. iPhones for images that depict child sexual abuse. Using a tool called “NeuralMatch,” the company says it can flag known illegal images on customers’ iPhones without decrypting the users’ personal messages: photos are checked against a database of known abuse imagery, and a phone’s contents will be looked at only if the tool reports a match, according to Apple execs.

Needless to say, the plan has already drawn plenty of backlash. Cryptography researcher Matthew Green warns that the program will make it easy for people to frame others by sending them seemingly harmless photos that have been engineered to trigger a “NeuralMatch” hit. “Researchers have been able to do this pretty easily,” he says.

Protecting our children from these evil monsters, it seems, will require consumers to give up a measure of privacy.
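Systems like this typically rely on perceptual hashing: similar-looking images produce similar hash values, so a slightly edited copy of a known image still matches. Below is a minimal sketch using a simplified “average hash” over an 8x8 grayscale grid; this is an illustrative stand-in, not Apple’s actual NeuralMatch algorithm, and the image data is a made-up example.

```python
# Minimal perceptual "average hash" sketch (NOT Apple's actual algorithm).
# A real system first downscales a photo to a tiny grayscale grid;
# here we start from an 8x8 grid of 0-255 values directly.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale values 0-255. Returns a 64-bit int:
    each bit is 1 if that pixel is at or above the grid's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means 'perceptually similar'."""
    return bin(h1 ^ h2).count("1")

# A tiny perturbation barely changes the hash -- which is why small edits
# don't defeat matching, and also why researchers warn that innocuous-looking
# images can be deliberately crafted to collide with a target hash.
img_a = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
img_b = [row[:] for row in img_a]
img_b[3][7] += 10  # nudge one pixel's brightness

d = hamming_distance(average_hash(img_a), average_hash(img_b))
print(d)  # → 1 (only one bit differs, so the images still "match")
```

In a deployment, a match would be declared when the distance falls below some threshold rather than requiring exact equality; the attack Green describes exploits exactly that tolerance.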
Apple To Scan iPhone For Images That Depict Child Abuse