Sex is productivity.

Deepfake technology is now familiar to everyone, and its pace of development has exceeded all expectations. The bottom line is that Deepfake has found fertile soil: pornography.

According to CNN, about 96% of the Deepfake videos that appeared last year were pornographic. The subjects are overwhelmingly women, mostly popular stars. This is where Deepfake's "charm" lies: it lets stars "customize" pornographic content for you, not just a one-off fantasy but a highly realistic sex video.

▲ Deepfake video made with the face of actress Gal Gadot

If Paris Hilton's tape had surfaced in 2019, it would have had a hard time making her famous. Sex videos are now so easy to come by that even a real person starring in an explicit video would struggle to attract the attention it once commanded.

Deepfake's threat is expanding: it endangers not only the women who are its direct victims, but everyone who watches videos.

▲ Image from: Life2Coding

Everything keeps getting upgraded. At first, Deepfake was the preserve of technicians; the complex code made generating a video very expensive.

As the threshold lowered, stars such as Scarlett Johansson and Emma Watson were all forced "into the industry", each appearing in horrifying fake sex videos. Now a Deepfake video can easily threaten anyone: politicians, your neighbors, your girlfriend. All it takes is a front-facing photo grabbed from someone's social network to generate a dedicated Deepfake video, whether a sex video or a fake political speech.

Deepfake threatens everyone. When you can no longer tell what is real, you start to doubt everything, and it becomes easy to be fooled by fake videos.

▲ Image from: Giphy

To this end, Adobe is working with the US Department of Defense to develop tools that help determine the authenticity of images; researchers at the University of Oregon hope animals can help us identify fake video and audio; and startups are moving into the image-verification business, automatically adding time and location information to photos and providing confirmation of whether content is genuine.

Technology is searching for better solutions, but any real restriction also requires laws and regulations.

This time, the law is finally no longer absent.

Last week, California Governor Gavin Newsom signed two new laws. One prohibits anyone from distributing a Deepfake video of a candidate within 60 days of an election. The other allows residents of the state to sue anyone who uses Deepfake technology to insert their image into pornographic videos.

▲ Gavin Newsom

They target the two most controversial uses of Deepfake video: pornography and political elections.

Marc Berman, a member of the California State Assembly, believes Deepfake videos can be a dangerous tool for misleading voters' choices.

Voters have the right to know when the videos, audio, and images they see ahead of an election have been manipulated and do not represent reality, because such content can affect how they vote.

▲ Image from: medium.com

But is this really enough? Is restricting Deepfake videos only within a limited window around elections enough to minimize their interference? Is allowing victims to sue those who put them in pornographic videos without permission enough?

This may not be enough.

Innocent women who have been "cast" in a Deepfake porn video can defend their rights only through a lengthy litigation process, and only California has laws that support them in doing so.

This is not enough.

So far, cybersecurity firm Deeptrace has found 14,678 Deepfake videos, an 84% increase over its first count in 2018, most of them artificially produced sex videos. More than 13,000 Deepfake videos are hosted on porn sites, and their subjects include female stars, internet celebrities, and ordinary people.

▲ Image from: Wired

The development of this technology has brought them trouble they would otherwise never have faced.

Justus Thies, developer of the real-time face-reenactment software Face2Face, has also recognized the dangers of this technology. In an interview, he said:

If this type of video software were widely available, it would have a dramatic impact on society. That is one of the reasons we have not opened up the software's code. If immature people got hold of such software, cyberbullying would escalate sharply.

▲ Face2Face

The cyberbullying enabled by Deepfake has already begun to take shape, while the relevant laws and regulations are not yet ready. California's cautious attempt as a "first mover" has been accompanied by plenty of controversy.

Legislative restrictions on new technology in specific areas are only the beginning. How should the use of the technology be regulated? How should users' rights be protected? These questions deserve more discussion and attention.

The technology behind Deepfake remains largely a legal blank, and the introduction of California's two new laws is a positive sign.

It tells everyone that the novelty of Deepfake technology should not serve as a legal shield for those who produce such videos.

The header image is from The Conversation.