Virginia has updated its revenge porn law to make it an offense to share deepfake photos and videos of people without their consent.
Deepfakes use machine learning to produce fake videos that can look highly realistic, making it appear as if someone did something they never did. The software used to create them is growing increasingly sophisticated, so it's becoming ever harder to tell whether the material is genuine.
Aiming to keep pace with advances in deepfake technology, lawmakers in the Virginia General Assembly have now incorporated deepfakes into the state’s existing law regarding revenge porn, which has been in force since 2014.
The offense is a Class 1 misdemeanor, meaning anyone who shares nude or sexual content of a person in an effort to “coerce, harass, or intimidate” faces a fine of up to $2,500 or up to a year in jail, regardless of whether the material is real or doctored.
Specifically, the amendment states that the content can comprise “a falsely created videographic or still image,” meaning it covers not only realistic content generated with complex machine learning techniques but also images and videos crudely altered with basic software.
Delegate Marcus B. Simon, who put forward the amendment, said recently that offenders “put [the material] on a website with the intent to coerce, harass, or maliciously hurt those folks.” He added, “These days you don’t even need to actually have photos like that of the person in your possession. All you have to have are pictures of their face. You can use artificial intelligence to wrap that on the body.”
Deepfakes first gained attention a couple of years ago with the arrival of fake celebrity porn videos, but the term has since broadened to cover bogus images and videos of any kind.
In the realm of news and politics, deepfake technology in the wrong hands has the potential to sow widespread confusion as people try to work out what’s real and what isn’t, or simply assume that what they see is real because it fits their existing beliefs.
The challenge now is to build software capable of detecting deepfakes, though even effective detection may come too late to stop a fake from spreading. As the saying goes, “A lie gets halfway around the world before the truth has a chance to get its pants on.”