AFL Insiders Say Leaked Explicit Photos of Players Were ‘Fake’ AI-Generated Images


AI-generated images are theorised to be behind a troubling invasion of privacy in Australian Football League (AFL) circles. In a disturbing turn of events, the AFL found itself deep in a scandal involving a leak claiming to show explicit photographs of 45 current and former players. The images were placed in a Google Drive folder that was publicly available to anyone with the link before it was taken down.

More than one commentator has suggested that the images were “fakes”; however, the AFL has not confirmed whether this was an example of “deepfake” porn (AI-generated pornography made in someone’s likeness).

Brownlow Medallist Jimmy Bartel said on 3AW Breakfast: “It’s gross – a majority of images are fake, made up, staged.”

Paul Marsh, chief executive of the AFL Players’ Association, said: “The AFL Players’ Association is aware of the AFL investigating a collection of explicit images that have been distributed of past and present players without their consent. While it is important to note that some of the images may not be legitimate, this is an appalling and disgusting act and a likely unlawful breach of privacy that is unacceptable. We ask the public to treat this matter seriously by not seeking out or sharing any of these photos and respecting the rights and privacy of those impacted.”

The leaked AFL images were explicit, but they appear to have been generated by AI. Both images used in this article feature models.

Laws around Deepfakes

Existing laws are thought to be robust enough to protect victims from this type of AI-generated pornographic imagery. While no deepfake cases have yet been prosecuted in Australia, this situation could be the first to set a precedent.

Because this is a new area for legal teams, it is a complex issue. White Knight Lawyers in Sydney noted that Australia has not yet drawn up formal regulation targeting deepfakes. But such regulation might not be needed, as some protection is already available under existing defamation law.

“The tort of defamation may provide some recourse for a victim of a deepfake. Deepfake creators can maliciously create deepfake content falsely depicting victims in compromising situations, defaming the reputation of the victim by making various defamatory imputations,” the firm stated.

“For instance, a vengeful former partner can create videos portraying their victim engaging in sex acts; political parties can create videos of their opponents consuming drugs. If defamatory deepfakes are created and published, the victim of the deepfake may have a claim against anyone involved in the publication of the deepfake to compensate for the damage to the victim’s reputation.”

While abuse material of the AFL players may still be circulating online, sharing these images remains illegal: distributing images in a person’s likeness without their consent is unlawful in Australia.

The leak of the images, whether AI-generated or not, raises new questions about the risks associated with emerging technologies.