Having worked in the tech industry for decades, I have gotten used to the idea that some tech companies do things a little bit differently from everyone else. One of the companies that I once did a project for had installed a huge sliding board for employees who preferred not to take the stairs. Another company had its own video arcade for employees. Still another company put bean-bag chairs in the conference room. The point is that the Silicon Valley crowd has always marched to the beat of a different drummer (which I love). Even though certain eccentricities are to be expected from tech companies, every once in a while something happens that leaves me scratching my head in confusion. A case in point was Facebook’s recent announcement, which I initially assumed was a joke. So what was this big announcement? Facebook recently requested that Australian users upload nude selfies.
You just can’t make this stuff up.
Facebook’s rather unorthodox request is not a prank, or an attempt to create an adults-only social network, but rather an effort to prevent revenge porn. When I first read the reason why Facebook wanted these photos, I assumed that the story had to be a hoax. I mean, think of how the request sounds. On the surface, it sounds as if Facebook is saying that the best way to prevent someone from embarrassing you with compromising photos later on is to beat them to the punch and upload those photos to Facebook yourself. As you have probably guessed, however, Facebook isn’t actually asking people to post nude selfies.
Apparently, revenge porn has become a big problem for social media sites such as Facebook. For those who might not be familiar with the concept, revenge porn refers to the public posting of intimate photos as a form of revenge following a breakup or similar situation. Facebook’s idea is to compile a database of photographs that people do not want appearing on social media, so that attempted uploads of those photos can be blocked.
But what about personal privacy? According to at least some sources, Facebook is not planning to store the actual nude photos. Instead, it will retain each photo just long enough to create a mathematical hash. When Facebook users upload photos in the future, those photos will also be hashed, and each hash will be compared against the hashes in the database to determine whether the photo was previously submitted. The process works much like antivirus software that uses a signature database to identify viruses, only in this case the “signatures” in the database are used to identify specific photographs.
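To make the signature analogy concrete, here is a minimal sketch in Python of how that kind of matching could work. Facebook has not published the details of its hashing scheme, so the choice of SHA-256, the in-memory set standing in for the database, and the function names are all my own illustration.

```python
import hashlib

# A set standing in for the database of hashes of blocked images.
# In a real system this would be a persistent, indexed store.
blocked_hashes = set()

def image_hash(image_bytes):
    """Reduce an image to a fixed-size fingerprint (hex digest)."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_blocked_image(image_bytes):
    """Store only the hash; the image itself can then be discarded."""
    blocked_hashes.add(image_hash(image_bytes))

def is_upload_allowed(image_bytes):
    """Compare the upload's hash against the database, antivirus-style."""
    return image_hash(image_bytes) not in blocked_hashes

# Example: a previously registered photo is blocked on re-upload.
photo = b"\x89PNG...stand-in image bytes for illustration"
register_blocked_image(photo)
print(is_upload_allowed(photo))    # False: an exact copy is caught
print(is_upload_allowed(b"cat"))   # True: an unrelated photo passes
```

Note that once the hash is stored, the original bytes are never needed again, which is the basis of Facebook’s claim that the photos themselves do not have to be kept.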
Facebook’s intentions seem noble enough, but there are at least a couple of potential problems with the plan. The most obvious is the potential for unwanted exposure. The moment that you share a picture with Facebook (or with anyone else, for that matter), you no longer have sole control over the image. Even if Facebook does not store a copy of uploaded nude images, what is to stop a hacker from intercepting the images in transit? Once a hacker has possession of the images, they can use them as they see fit. A hacker might, for example, sell the images to a porn site. A hacker could conceivably even go so far as to develop a new type of ransomware, in which a victim’s photos are distributed to everyone in their address book unless the victim pays up.
Is it hacker-proof?
I have no doubt that Facebook will use SSL or some other form of encryption to prevent sensitive images from being intercepted. Even so, a hacker would not necessarily have to break the encryption to intercept the image. If a hacker were able to plant malware on the victim’s computer, then the malware could be designed to identify images being sent to Facebook, and make a copy of those images (from the local computer where the original copy of the image resides), without ever having to worry about any of the security that has been put into place by Facebook.
For the casual Facebook user, recent high-profile security breaches targeting companies such as Uber, Verizon, and Equifax have not exactly inspired consumer confidence in the way that companies handle sensitive data. As such, users may be understandably reluctant to upload their most private photos to Facebook. Even so, there is a way that Facebook could accomplish its objective without requiring anyone to upload any potentially embarrassing photos.
Rather than asking subscribers to submit their nude photos, Facebook could create a small app for users to download to their own PCs. Such an app could create the necessary mathematical hash locally on the user’s device, and then upload only the hash to Facebook, without the actual photo ever leaving the machine. This would not only help protect subscribers’ privacy, it would also reduce Facebook’s bandwidth costs.
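A hedged sketch of what such a client-side app might do, again using SHA-256 purely for illustration: the photo is read and hashed in chunks on the user’s own machine, and only the short hex digest would ever be transmitted. The upload step itself is hypothetical, so the demo below just shows the digest that would be sent.

```python
import hashlib
import os
import tempfile

def local_fingerprint(path, chunk_size=65536):
    """Hash a photo in chunks on the user's own device.
    Only the resulting digest would ever be sent to Facebook."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo with a stand-in file; a real app would point at the photo itself.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"stand-in image bytes")
    path = f.name
fingerprint = local_fingerprint(path)
os.unlink(path)
print(len(fingerprint))  # 64 hex characters -- the only data uploaded
```

Chunked reading also means the app never needs to hold a large photo in memory, which matters on low-end devices.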
Another problem with Facebook’s approach is that using a hash to identify a photo isn’t completely reliable. Altering a photograph in almost any way changes its hash. As such, someone who wanted to upload a blocked image could probably circumvent the filter by cropping the picture, adding some text, or making some other small modification.
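This brittleness is easy to demonstrate with an ordinary cryptographic hash (SHA-256 here, as a stand-in, since Facebook hasn’t said exactly which hashing technique it uses): changing a single bit of the image data produces a completely different digest.

```python
import hashlib

original = bytearray(b"\x89PNG stand-in image data for illustration")
edited = bytearray(original)
edited[10] ^= 0x01   # flip one bit, as even a tiny edit would

h1 = hashlib.sha256(bytes(original)).hexdigest()
h2 = hashlib.sha256(bytes(edited)).hexdigest()
print(h1 == h2)  # False: the fingerprints no longer match
```

This is exactly why image-matching systems generally favor perceptual hashes (PhotoDNA is the best-known example), which are designed to survive cropping and re-encoding; a plain cryptographic hash matches only exact copies.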
Unfortunately, there isn’t an easy solution to this challenge, but I do have one idea. Facebook has enormous computing resources at its disposal. In addition to the systems that are already online, Facebook recently announced plans to build a $1 billion, 1 million square foot datacenter in Virginia. Facebook could leverage some of its vast computing power, and use artificial intelligence to spot compromising photos before they can be exposed to the world.
I will be the first to admit that the sheer size of Facebook’s subscriber base would make real-time detection of such photos difficult. However, Facebook wouldn’t have to completely reinvent the wheel. For example, Facebook could base the detection algorithms on existing facial recognition technology. An AI engine might, for example, determine who is in a particular photo, and also determine whether any of the people in the photo are nude. If the software is able to successfully determine the identity of someone who appears in a nude photo, then the upload could be blocked, and the person who appears in the photo could be automatically alerted to the attempted upload.
Of course, Facebook would have to make sure that the algorithm works perfectly before using it in the real world. Back when fingerprint recognition technology first became available, the company that I worked for at the time decided to experiment with it. The software’s fingerprint recognition algorithm was so inaccurate that it could not differentiate between my fingerprints and those of half a dozen other people in my office. Now imagine what would happen if the facial recognition engine had a similar level of accuracy. People would get notifications that total strangers were uploading nude photos of them. Never mind how much fun the lawyers would have with the situation. Fortunately, facial recognition technology works pretty well. Besides, Facebook could also leverage its massive database of personal connections as a tool for weeding out potential false positives.
While Facebook has an interesting idea, I think that there are better ways to achieve the desired result. In any case, it is interesting to see technology evolve to the point that compromising photos can be blocked before they are made public.