
What can you do if someone defames you or uses a deepfake of you?
Deepfakes are becoming a serious issue, and so is their potential for defamation.
Imagine logging onto social media and discovering a video of yourself, or someone you love, performing explicit acts—except it isn’t real.
The face is yours, but the body, the actions, and the entire scenario have been digitally manipulated.
The damage, however, is very real. Jobs are lost, reputations are shattered, and in some cases, victims suffer severe psychological distress. This is not a hypothetical problem—it is already happening, and under Australian law, the legal framework to deal with it is limited.
As the saying often attributed to Mark Twain goes, “A lie can travel halfway around the world while the truth is still putting on its shoes.”
In the age of deepfakes, this statement has never been more relevant.
Once a deepfake video is online, it can spread at an uncontrollable pace, viewed by thousands—if not millions—before the victim is even aware of its existence. By the time they realise the damage, the falsehood has embedded itself in the public consciousness, and removing it becomes nearly impossible. The law struggles to keep up with the speed and scale of digital defamation, leaving victims in a difficult position.
At present, defamation law in Australia provides one of the primary legal avenues for victims of deepfake abuse.
A person may sue for defamation if the false content has caused serious harm to their reputation.
However, deepfakes pose a unique challenge because, unlike traditional defamatory statements, they are not written words or direct accusations—they are highly realistic false imagery and video. This can complicate arguments around whether the content is “published” in a way that meets defamation law requirements, particularly when the identity of the creator is unknown.
Another potential legal avenue is privacy law, though Australian privacy protections are still evolving. The Privacy Act 1988 (Cth) primarily applies to businesses handling personal data, rather than individuals engaging in harmful conduct online.
There is no explicit “right to privacy” in Australian law, meaning victims of deepfakes struggle to argue their likeness has been used without consent. The use of someone’s face in a digitally manipulated video does not neatly fit within existing privacy protections, leaving victims with limited legal options.
There is also criminal law, which has seen some recent progress in dealing with deepfakes. Recent amendments to the Criminal Code Act 1995 (Cth) have made it a crime to create or distribute non-consensual deepfake pornography. This law recognises the serious harm caused by fake sexually explicit content, and offenders can face significant penalties. However, this legislation only applies to cases involving explicit content, meaning deepfakes that damage a person’s reputation in other ways—such as making them appear intoxicated, engaging in illegal activity, or saying something offensive—may not be covered.
Then there is the question of platform liability. Social media sites and video-hosting platforms often act as the breeding ground for deepfake content, allowing harmful material to spread virally before a victim even becomes aware of it. Under Australian law, platforms have some protections against liability, but there are growing calls for stricter regulations that force tech companies to take proactive steps to remove harmful deepfake content and prevent it from being uploaded in the first place. Some platforms already have deepfake detection policies, but enforcement is inconsistent, and victims often struggle to have damaging content removed swiftly.
As the technology improves, the challenge becomes even greater. AI-generated content is reaching a level where it is almost indistinguishable from reality. Jordan Peele, in his viral deepfake PSA featuring Barack Obama, warned: “We are entering an era in which our enemies can make anyone say anything at any point in time.” This statement highlights the broader implications of deepfakes—not just for individuals, but for trust in media, politics, and society as a whole. If video evidence—once considered irrefutable proof—can no longer be trusted, what does that mean for justice, truth, and reputation?
One of the biggest challenges in taking legal action against deepfake defamation may well be identifying the perpetrator.
Many deepfake creators operate anonymously, using encrypted networks and fake profiles to distribute content.
Even when they are known, enforcing legal consequences can be costly and time-consuming. This means that even if defamation, privacy, or criminal laws apply, holding someone accountable is often an uphill battle.
So, what can be done?
This is where legal strategy becomes critical.
At Sharon Givoni Consulting, we assist victims of deepfake defamation by taking a multi-pronged legal approach. This includes:
- Legal Advice & Case Assessment – Determining the best legal avenue, whether defamation, privacy or another cause of action
- Strategic Cease and Desist Letters – Putting perpetrators on notice and alerting platforms hosting the content
- Litigation & Suing for Defamation – Taking court action if necessary
For more about defamation law, read: https://www.artslaw.com.au/information-sheet/defamation-law/ and https://sharongivoni.com.au/when-does-online-defamation-cross-the-line/
For more about how to spot deepfakes read: https://www.theguardian.com/technology/2020/jan/13/what-are-deepfakes-and-how-can-you-spot-them.
We believe deepfake defamation will become an increasingly serious issue in Australia as AI-generated content becomes more sophisticated and accessible. At this stage, we can only work with the laws we have, but we know how to navigate those laws to provide the strongest possible protection for our clients. If you or your business has been affected by deepfake defamation or online reputation damage, seeking legal advice early is key—the faster we act, the better the chance of controlling the fallout.
Conclusion
We have successfully resolved numerous defamation disputes, achieving strong outcomes for our clients. For expert legal guidance in defamation law, internet law, and deepfake reputation protection, contact Sharon Givoni Consulting today. For more about our defamation services click here: https://sharongivoni.com.au/services/defamation-law/online-defamation/.
Please note the above article is general in nature and does not constitute legal advice.
Please email us at info@iplegal.com.au if you need legal advice about your brand or another legal matter in this area generally.