Apple’s iPhone X shares face scan data with third-party app makers, prompting privacy concerns.
“Beyond a photo, the iPhone X’s front sensors scan 30,000 points to make a 3D model of your face. That’s how the iPhone X unlocks and makes animations that might have once required a Hollywood studio,” reported the Washington Post. “Now that a phone can scan your mug, what else might apps want to do with it? They could track your expressions to judge if you’re depressed. They could guess your gender, race and even sexuality. They might combine your face with other data to observe you in stores—or walking down the street.”
Though the Washington Post praised Apple “for storing the face data it uses to unlock the iPhone X securely on the phone, instead of sending it to its servers over the Internet,” they criticized “how the iPhone lets other apps now tap into two eerie views from the so-called TrueDepth camera.”
“There’s a wireframe representation of your face and a live read-out of 52 unique micro-movements in your eyelids, mouth and other features,” the Washington Post explained. “Apps can store that data on their own computers.”
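For context, the "wireframe" and "52 micro-movements" the Post describes correspond to Apple's public ARKit face-tracking API, which any app with camera permission can use. A minimal sketch of how an app might read that data (the `ARSessionDelegate` and `ARFaceAnchor` APIs are real; the class name and logging here are illustrative):

```swift
import ARKit

// Illustrative sketch: reading TrueDepth face data via ARKit.
// Requires an iPhone X-class device; will not run in a simulator.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // ARFaceTrackingConfiguration drives the TrueDepth camera.
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // The "wireframe": a 3D mesh of the user's face geometry.
            let vertexCount = faceAnchor.geometry.vertices.count

            // The "52 micro-movements": blend-shape coefficients (0.0-1.0)
            // tracking eyelids, mouth, brows, and other features. Nothing
            // in the API prevents an app from storing these values.
            let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("mesh vertices: \(vertexCount), smileLeft: \(smile)")
        }
    }
}
```

This is the mechanism the Post's critique targets: the per-frame mesh and blend-shape values are delivered directly to the app, which may transmit or retain them on its own servers.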
In a statement, an Apple spokesman attempted to quell concerns, claiming, “We take privacy and security very seriously.”
“This commitment is reflected in the strong protections we have built around Face ID data—protecting it with the Secure Enclave in iPhone X—as well as many other technical safeguards we have built into iOS,” the spokesman said.
However, Jay Stanley, a senior policy analyst for the American Civil Liberties Union, disagreed.
“I think we should be quite worried,” Stanley said. “The chances we are going to see mischief around facial data is pretty high,” he added. “If not today, then soon—if not on Apple then on Android.”
Last year, it was reported that half of American adults appear in a facial recognition database accessible to law enforcement, and that the FBI’s facial recognition database is ten times larger than originally disclosed, with 90 percent of it made up of non-offending citizens.
In September, LGBT groups expressed concern after artificial intelligence was able to “accurately” determine whether people were gay or straight based on photos of their faces.
Charlie Nash is a reporter for Breitbart Tech. You can follow him on Twitter @MrNashington and Gab @Nash, or like his page at Facebook.