How Facebook’s Anti-Revenge Porn Project Really Works


In an announcement that raised plenty of eyebrows, Facebook said this week that it wants to help potential revenge porn victims keep intimate images of themselves off the social network by having them upload those images via Facebook-owned Messenger. The effort, which is being rolled out in an initial test in Australia, is an extension of Facebook’s existing tools for identifying and removing revenge porn imagery. The new tool is meant to let Facebook create digital fingerprints of images and use those fingerprints to block photos subsequently posted for revenge porn purposes.

The company promises it isn’t storing the images and says the goal is simply to keep vengeful people from sharing intimate imagery on Facebook “in the first place,” as its head of global safety, Antigone Davis, wrote in a blog post.

Unfortunately, the public’s trust in Facebook is low at the moment, thanks to Russia’s use of the platform to interfere in last year’s U.S. election and the company’s evolving explanations of how that happened, and some doubt whether Facebook is able, or even willing, to truly erase the imagery.

But advocates for domestic violence victims think the new tool is a “bold move” and say that privacy concerns might be exaggerated.

“The timing is unfortunate because of scrutiny on other issues,” says Cindy Southworth, the executive vice president and founder of the Safety Net Technology Project, and someone who’s been advising Facebook on the development of the new tool for over a year. “I think it’s getting overblown because of much bigger frustration over Russia…. I don’t want victims of domestic violence to lose out because of this timing.”

For some time, Facebook, like other tech giants such as Twitter and Google, has had systems in place that let revenge porn victims report offending imagery and have it taken down. The new tool goes a step further: it lets people send in images they fear might be used for revenge porn so that Facebook can proactively create a digital fingerprint and automatically reject matching images should they be posted later.

The pilot program works like this: Australians concerned about potentially becoming victims of revenge porn first complete a form on that country’s eSafety Commissioner’s website and then send the image or images they’re worried about to themselves via Messenger. The eSafety Commissioner’s office then notifies Facebook that the images have been sent, after which a member of Facebook’s Community Operations team reviews the images in order to “hash” them, which “creates a human-unreadable, numerical fingerprint,” Davis wrote.

Then, Davis continued, “We store the photo hash–not the photo–to prevent someone from uploading the photo in the future. If someone tries to upload the image to our platform, like all photos on Facebook, it is run through a database of these hashes and if it matches, we do not allow it to be posted or shared.”
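
Taken at face value, Davis’s description maps onto a simple store-the-hash pattern. The sketch below is an illustrative assumption, not Facebook’s published code: it uses the open-source Python imagehash library, an in-memory set standing in for Facebook’s hash database, and hypothetical function names.

    from PIL import Image
    import imagehash

    BLOCKED_HASHES = set()   # stands in for Facebook's hash database

    def register_reported_image(path: str) -> None:
        """Fingerprint a reported image; keep only the hash, never the photo."""
        BLOCKED_HASHES.add(str(imagehash.phash(Image.open(path))))

    def is_blocked(path: str) -> bool:
        """Run an uploaded photo against the stored fingerprints."""
        # Exact lookup keeps this simple; a production system would more
        # likely tolerate a small Hamming distance between fingerprints.
        return str(imagehash.phash(Image.open(path))) in BLOCKED_HASHES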

Perceptual image hashing is a common way of “creating a fingerprint of an image based on its visual appearance,” according to a Safari Books blog post. “This fingerprint makes comparing similar images much easier. Algorithms like the one below are used for many purposes, most notably Google’s Image Match where you can provide an image and it will return a listing of images that are visually similar.”
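
As a concrete illustration of that fingerprinting idea, here is a minimal example using the open-source Python imagehash library, one common pHash implementation; Facebook hasn’t said which algorithm it uses, and “photo.jpg” is a placeholder.

    from PIL import Image
    import imagehash

    # pHash boils an image down to a 64-bit fingerprint derived from its
    # low-frequency content, so visually similar images get similar hashes.
    original = imagehash.phash(Image.open("photo.jpg"))
    resized = imagehash.phash(Image.open("photo.jpg").resize((256, 256)))

    # Subtracting two ImageHash objects gives the Hamming distance between
    # the fingerprints: 0 means identical, small values mean visually similar.
    print(original - resized)   # typically 0 for a simple resize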

The technique does not employ artificial intelligence that could recognize different photos of the same person. That means, according to Southworth, that someone worried about becoming a victim of revenge porn would need to submit the exact images they’re concerned might be “shared with their 93-year-old grandmother” on Facebook.

But she says revenge porn victims frequently have those exact images, because couples often send them to each other using their phones or computers. It’s only later, when a relationship erodes, that the images are used as revenge porn, perhaps as a threat not to end the romance, or as punishment for having done so.

Southworth, who has advised Facebook on safety issues since 2010 but hasn’t received any payment for her consultations, says the company has a dedicated team that deals with highly sensitive images–things like beheadings, child pornography, and revenge pornography–and works to verify such imagery is non-consensual, hash it, delete it, and then report it. “They know how to do this,” she says. “These are people who are highly trained,” and they often require therapy because of the intense nature of what they see every day.

Although Facebook removes offending photographs after they’re reported, there’s still some risk that victims’ friends, family, or employers will see the images before they disappear–and that fallout is exactly the point of revenge porn.

According to the research institute Data & Society, 10.4 million Americans, or 4% of the population, “have been a victim of revenge porn through threats or actual posting of sensitive images.” Among members of the LGBTQ community, 15% have been threatened with revenge porn, and 7% have actually had non-consensual imagery posted.

From Southworth’s perspective, Facebook’s hashing process is sufficient both to block reported revenge porn imagery from being posted and to keep it from being stored on the company’s servers, and she’s hopeful the pilot will be successful enough in Australia that Facebook will roll the system out broadly. But she’s worried about the initial resistance to the idea of asking people to send in their own imagery in the hope that it will protect them.

“I applaud Facebook’s boldness in wanting to go the extra mile,” she says. “Personally, I really hope that this overblown backlash doesn’t cause the company to back away from a plan to help victims globally. That would be devastating to me. There’s no other company saying [they’re] going to help proactively.”

Some experts in digital forensics doubt that, best intentions aside, Facebook can truly remove all artifacts of the imagery that gets uploaded via Messenger, and they note that the human inspection element only complicates the matter.

“Totally deleting all trace of images from a computer system is not trivial to do,” says Lesley Carhart, a digital forensics specialist. “Simply deleting a file on most operating systems doesn’t remove the file from the system’s hard drive or memory–the files are still retrievable with forensic tools until they’ve been completely overwritten with something new. Add in the element of human reviewers validating every single photo, and there are a number of potential data leakage points.”
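
Her point about deletion is easy to see in code. The sketch below, a hypothetical routine rather than anything Facebook has described, shows a naive overwrite-before-delete approach, which is already stronger than an ordinary delete yet still makes no guarantees on SSDs, journaling filesystems, or machines holding cached or swapped copies.

    import os

    def overwrite_and_delete(path: str) -> None:
        """Overwrite a file's bytes before unlinking it.

        Stronger than a plain delete, but still no guarantee: SSD wear
        leveling, filesystem journals, and cached or swapped copies can
        all preserve the original bytes elsewhere.
        """
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            f.write(os.urandom(size))   # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())        # force the overwrite to physical storage
        os.remove(path)                 # only then remove the directory entry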

Further, Carhart expects that although Facebook “has a clever security team” that keeps its hashing operations “private and server side,” there’s likely to be an ongoing cat-and-mouse game between the company and the perpetrators of revenge porn who might find ways to modify small portions of images in a bid to defeat the algorithms.

Those efforts could include things like drastically altering image colors, adding new objects into a photo after the fact, flipping images around, and so on. But, Carhart says, “I’m quite sure [Facebook is] aware of this, and they seem to be taking every possible precaution available to prepare.”
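
Those evasion tactics are straightforward to demonstrate. Under the same assumption of an imagehash-style pHash (again, a stand-in for whatever Facebook actually runs), a simple mirror flip rearranges the image’s frequency content and typically pushes the fingerprint far from the original’s:

    from PIL import Image, ImageOps
    import imagehash

    img = Image.open("photo.jpg")   # placeholder filename
    original = imagehash.phash(img)
    flipped = imagehash.phash(ImageOps.mirror(img))   # horizontal flip

    # A flip usually pushes the Hamming distance well past a typical match
    # threshold (e.g. a few bits out of 64), so a naive filter would miss
    # it -- one reason matchers also index flipped and rotated variants.
    print(original - flipped)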

And while she acknowledges the imperfections of Facebook’s algorithmic approach and “firmly” advocates against uploading sensitive imagery, Carhart lauds the company’s attempt to get in front of the revenge porn problem. “Action has to be taken,” she says. “To my knowledge, nobody has presented a more viable technical solution–this is the least awful of a number of unpleasant solutions.”


Source: Fast Company
