FaceApp, which edits a person’s photo to imagine how they might look at an older age, quickly went viral this week before becoming the subject of yet another data privacy concern.
When users sign up for the app, they agree, among other things, to grant FaceApp “a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you.”
“It’s part of the same trend of tech companies writing these overly broad terms of service that gives them way more power than they actually need to perform the service,” Gillula said.
Nonetheless, concerns persisted that FaceApp could aid in training facial recognition software, or that it was somehow providing information to the Russian government. U.S. Sen. Chuck Schumer, D-N.Y., has even asked the FBI and the FTC to investigate the app to learn how it collects user data and what that data might be used for.
“In an age of facial recognition technology as both a surveillance and security use, it is essential that users have the information they need to ensure their personal and biometric data remains secure,” Schumer wrote in a letter about his concerns. “Including from hostile foreign nations.”
Many of those concerns have been described as both overblown and rooted in xenophobia. In a statement to TechCrunch, FaceApp said photos are altered in the cloud and most data isn’t transferred to Russia. The company also said that “most images” were deleted from its servers within 48 hours of being uploaded.
Gillula said that while any image of a person could theoretically be used to train a facial recognition algorithm, it’s not as simple as uploading a single image. It often requires time-intensive legwork to label and tag a trove of images in a way that could be used to teach an automated system whether it is properly identifying faces, he said.
“If the Russian government wanted photos of a ton of people’s faces, I suppose this is one way they could possibly get it, but there are lots of other ways of possibly getting it,” Gillula said, pointing to existing databases that contain labeled images.
Some argued the brouhaha over FaceApp was a test of whether Americans cared about their privacy enough to sit out a viral online moment. And as consumers and Congress alike begin to catch up with the existing realities of “surveillance capitalism,” there’s a question of whether people give much thought to ramifications of uploading photos and other information to platforms like Facebook, Instagram and Snapchat.
The issue, in some ways, parallels the fallout from the Facebook app “This Is Your Digital Life.” Many thought the app was a harmless personality quiz, but personal data about users and their friends was later used by the political firm Cambridge Analytica for ad targeting. While FaceApp hasn’t suffered a breach or a known misuse of data, there is renewed focus on how much users might risk when they use free apps and services online.
There are always risks to uploading images to the internet, so Gillula suggested not uploading any photo that could potentially be compromising, just in case hackers ever find their way into FaceApp’s image trove.
“What we really need to level the playing field is to work as a team of all consumers,” he said. “To say, actually, companies, we’re going to set some higher bars for you that you have to hold yourselves to.”