Some technologies are just a bit too creepy. And when they are, it doesn’t matter that they are technically achievable, or even that they serve some practical purpose – if a technology is too creepy, it is too creepy and should not be used.
Facebook’s Moments is an example of something that is technically possible, and serves a convenient function, but is nevertheless too creepy.
Moments is a standalone application aimed at helping users organise and share their photos more efficiently. In the words of Facebook:
“Moments groups the photos on your phone based on when they were taken and, using facial recognition technology, which friends are in them. You can then privately sync those photos quickly and easily with specific friends, and they can choose to sync their photos with you as well. Moments also keeps all of your synced photos organized and even lets you search them to find the ones that you or specific friends are in.”
This is no doubt a useful app, but one with severe privacy implications. As has been pointed out elsewhere, it means that Facebook, which already holds a wealth of information about its users and non-users alike, will use facial recognition technology to learn where we are and who we are with. This knowledge is perhaps harmless in many contexts, but not all (think of places such as a strip club, a religious or political gathering or a gay bar).
Another step in the face-to-data trend
Facebook Moments is by no means the first or only privacy threat in the facial recognition arena. In 2012, I wrote of the privacy risks associated with the trend of face-to-data.
Face-to-data refers to at least partially automated processes for accessing personal information about a person based on an image of that person’s face. This has become an accessible option for many people due to the availability of (1) cheap facial recognition software, (2) abundant processing power via cloud computing, and (3) a wealth of online data (including photos) about people, for example, via social media.
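To see why this has become so accessible, consider what the core of a face-to-data pipeline actually involves. Once facial recognition software has reduced a photo to a numerical "embedding", identification is little more than a nearest-neighbour lookup against a database of known faces, and the matched name becomes the key to whatever personal data is available online. The following is a deliberately simplified, hypothetical sketch (the names, the tiny 3-dimensional vectors and the threshold value are all illustrative; real systems use neural-network embeddings of 128 or more dimensions):

```python
import math

# Hypothetical toy "database" mapping known people to pre-computed face
# embeddings. Real systems derive these vectors from photos scraped from
# social media; the values here are made up for illustration.
KNOWN_FACES = {
    "alice": [0.1, 0.9, 0.3],
    "bob": [0.8, 0.2, 0.5],
}

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, threshold=0.5):
    """Return the closest known identity, or None if nothing is close enough.

    This is the essence of face-to-data: match a face embedding against a
    database, then use the matched name to look up personal information.
    """
    best_name, best_dist = None, float("inf")
    for name, embedding in KNOWN_FACES.items():
        dist = euclidean(probe, embedding)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```

A probe embedding close to a stored one is identified (`identify([0.12, 0.88, 0.31])` matches "alice"), while an unfamiliar face falls outside the threshold and returns `None`. The point of the sketch is how little machinery is needed: the expensive parts – generating embeddings and amassing the photo database – are exactly what cheap recognition software, cloud computing and social media now provide.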
The fears I expressed then – ranging from the risk of stalking to discriminatory pricing in shops – have not been realised on any large scale (yet). But the risks remain as we gradually become more accustomed to these technologies. The concerns I raised about the inadequacy of our data privacy laws in addressing these technologies also remain true today.
While face-to-data for business purposes may be regulated in some countries, personal use would typically be unregulated. This is a serious gap. Furthermore, even in those countries, such as Australia, where commercial use may fall under applicable data privacy schemes, it may be possible to circumvent the regulatory impact by gaining user consent. Consent is often easily obtained, given that virtually no one reads the lengthy, and typically complex, agreements we routinely click “I agree” to.
Once again it’s up to the users, not the regulators
Unless they overstep their users’ sensibilities so far as to scare them off, social media organisations like Facebook will continue to push the envelope as far as they can. The more we are prepared to disclose, the better it is for their business.
Governments have an interest both in protecting data privacy and in encouraging data treasure-troves like the one Facebook is building. Lawmakers, even where they genuinely wish to ensure adequate data privacy protection, will always be a step (or two) behind the technology.
In light of this, the group best suited to look out for privacy interests is users, who need to “vote with their feet”. If a technology is too creepy, do not use it even if it is convenient. But this perhaps overly paternalistic advice takes us to a core problem with technologies such as Facebook Moments. You may not like Facebook, and you may not even use it, but chances are you’ll be subject to facial recognition apps with or without your consent.
Dan Jerker B. Svantesson is an ARC Future Fellow (project number FT120100583) and receives funding from the Australian Research Council. He is a member of the Australian Privacy Foundation. The views expressed herein are those of the author and are not necessarily those of the Australian Research Council or of the Australian Privacy Foundation.
This article was originally published on The Conversation.