Deepfake videos have been around for some time, but the artificial intelligence (AI) platforms that create them are now so sophisticated that it’s nearly impossible to distinguish them from real footage.
What is a deepfake?
A deepfake is a type of synthetic media created using advanced AI. These videos superimpose someone’s face and voice onto another person’s body.
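At a very high level, the classic face-swap approach trains a shared encoder that captures expression and pose, plus one decoder per identity; swapping decoders at inference time is what puts one person’s face on another’s performance. The sketch below is a toy linear illustration of that idea only, with made-up dimensions and untrained random weights, not a real deepfake model.

```python
import numpy as np

rng = np.random.default_rng(0)

DIM_IMG, DIM_CODE = 64, 8  # illustrative sizes, not real model dimensions

class LinearCoder:
    """A stand-in for a neural encoder/decoder: a single linear map."""
    def __init__(self, d_in, d_out):
        self.W = rng.standard_normal((d_out, d_in)) * 0.1
    def __call__(self, x):
        return self.W @ x

encoder   = LinearCoder(DIM_IMG, DIM_CODE)   # shared across identities
decoder_a = LinearCoder(DIM_CODE, DIM_IMG)   # would be trained on person A's faces
decoder_b = LinearCoder(DIM_CODE, DIM_IMG)   # would be trained on person B's faces

face_a = rng.standard_normal(DIM_IMG)        # a flattened "image" of person A

code    = encoder(face_a)    # identity-agnostic expression/pose code
swapped = decoder_b(code)    # the same expression rendered as person B

print(swapped.shape)         # (64,)
```

In a real system the encoder and decoders are deep convolutional networks trained on thousands of images of each person; the swap trick itself, though, is exactly this decoder substitution.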
These videos can look shockingly similar to the person being targeted. The technology can be used to create videos of people saying or doing things they never actually did, which can be great fun in the entertainment industry.
But of course, the tech can also be used in serious situations that have life-changing repercussions for individuals and whole countries, such as fabricated sex tape or drug scandals involving politicians.
The rise of porn
Then there’s the other issue — deepfake porn. While there’s nothing illegal about porn where consenting adults are involved, deepfake porn is a whole other issue. And the AI behind these videos is getting more sophisticated.
Synthetic non-consensual exploitative imagery (NCEI) is now everywhere. Beyoncé, Emma Watson, Scarlett Johansson, Margot Robbie and Taylor Swift are among the many celebrities who have had fake porn created in their likeness.
Social media platforms have been grappling with deepfakes for a while, and Twitch is one of them. The platform is primarily known as a livestreaming service for video game content, where gamers can broadcast themselves playing games and chat with their audiences. But sometimes streamers have things on their computers that their followers see accidentally — like open tabs running porn.
Twitch took the time this week to write a blog post about the issue, making it clear that NCEI, or deepfake porn, will not be tolerated on the platform. Sharing this kind of porn will lead to an instant ban on the first offense.
Are these videos legal in Australia?
The legality of this kind of porn remains a complex issue. White Knight Lawyers in Sydney shared that in Australia, we’re yet to see formal regulation targeting deepfakes, and the only legal protection is around defamation.
“The tort of defamation may provide some recourse for a victim of a deepfake. Deepfake creators can maliciously create deepfake content falsely depicting victims in compromising situations, defaming the reputation of the victim by making various defamatory imputations,” the firm shared.
“For instance, a vengeful former partner can create videos portraying their victim engaging in sex acts; political parties can create videos of their opponents consuming drugs. If defamatory deepfakes are created and published, the victim of the deepfake may have a claim against anyone involved in the publication of the deepfake to compensate for the damage to the victim’s reputation.”
Defamation law also applies to digitally altered images, and it “appears to be well-suited to managing some of the issues which will be caused by deepfakes.”
The lack of regulation around deepfakes is a complaint that Australian consent activist Chanel Contos recently made public in an attempt to push the country toward stronger rules.
How to spot a deepfake
Deepfakes are becoming increasingly difficult to distinguish from real videos or images, but there are some signs you can look out for: unnatural movements, a lack of shadows or reflections, and inconsistencies in facial expressions.
1. Uncanny valley effect: The deepfake may look almost realistic, but something seems slightly off, giving the video a strange or uncanny quality.
2. Visual artifacts: If you look closely at the fake, you may notice blurry edges, inconsistent lighting, or distortions around the person’s face.
3. Unnatural movements: If the movements don’t look natural, or if the person’s facial expressions don’t match the tone of their voice, it could be fake.
4. Inconsistencies: Sudden changes in the background or lighting could be a sign that the video has been manipulated.
5. Audio anomalies: If the audio seems off, such as if the person’s voice doesn’t sound quite right, it could be a fake.
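Some of these cues can even be checked automatically. The toy sketch below turns one of them — inconsistent lighting between frames — into a crude heuristic: it flags a clip whose average brightness jumps sharply from one frame to the next. The function name, threshold, and synthetic frame data are all illustrative assumptions; real deepfake detectors are far more sophisticated.

```python
import numpy as np

def lighting_jump_detected(frames, threshold=0.2):
    """Crude 'inconsistent lighting' check.

    frames: array of shape (n_frames, height, width) with pixel
    values in [0, 1]. Returns True if average brightness changes
    by more than `threshold` between any two consecutive frames.
    """
    brightness = frames.mean(axis=(1, 2))       # one value per frame
    jumps = np.abs(np.diff(brightness))         # frame-to-frame change
    return bool((jumps > threshold).any())

# Synthetic example data (stand-ins for real video frames):
rng = np.random.default_rng(1)
steady = rng.uniform(0.40, 0.45, size=(10, 8, 8))  # stable lighting
glitchy = steady.copy()
glitchy[5] += 0.5                                  # one suddenly bright frame

print(lighting_jump_detected(steady))    # False
print(lighting_jump_detected(glitchy))   # True
```

A heuristic like this would produce many false positives on real footage (camera flashes, scene cuts), which is why human judgment across several of the cues above remains the practical advice.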