
Are deepfakes legal? Here’s what the law says about the creepy video mashups


The latest online hysteria revolves around what’s known as “deepfakes”: pornographic videos that use machine-learning algorithms to graft celebrities’ faces onto porn stars’ bodies. Naturally, they first gained notoriety on Reddit and have steadily grown more popular. Reddit has now announced it’s banning them.

Deepfakes present a slew of problems, chief among them that they appear to be XXX videos involving people who actually had nothing to do with them. Wired recently wrote that in many cases “the law can’t help you,” because the videos could be construed as parody or otherwise protected works.

The Electronic Frontier Foundation, however, has a different opinion. In a new blog post written by its civil liberties director, David Greene, the organization argues that the law does offer some protections for deepfake victims. He points to a series of existing safeguards, such as those against extortion and harassment, that could readily be used as legal remedies. Greene also references a tort known as “false light,” which addresses “photo manipulation, embellishment, and distortion, as well as deceptive uses of non-manipulated photos for illustrative purposes.” Greene goes on: “Deepfakes fit into those areas quite easily.”

There are a few other legal claims that could be brought against the makers of these fake pornographic videos, including copyright infringement as well as what’s known as “intentional infliction of emotional distress.” In short, while there is reason to be alarmed by the rise of these deeply problematic videos, laws already exist that would protect people who find themselves unwittingly starring in a deepfake.

You can read the full EFF post on the organization’s website.


Source: Fast Company
