ChatGPT Can Chat. We Give Legal Advice.
At Sharon Givoni Consulting, many clients now arrive with something written by an AI system like ChatGPT or another online “legal” tool. These tools can be helpful for ideas, but they often get the law wrong, use the wrong country’s rules, or miss key details about your situation.
Your intellectual property, contracts and reputation are too important to hand over to a robot, so this blog explains, in plain language, how we work with AI and what our rules are.
Rule 1: AI is not legal advice
AI gives general information, not legal advice tailored to you. It does not know all the facts of your matter, it does not ask follow‑up questions like a good lawyer, and it does not carry any professional responsibility if it is wrong. That means anything AI tells you is only a rough starting point, not something you should rely on for an important decision.
At our firm, only a qualified lawyer gives you legal advice. We may look at what you bring us from AI as background, but we will not treat it as “the law” or follow it blindly. If there is a conflict between what AI suggests and what the law actually says for your situation, our advice wins every time.
Rule 2: We must check anything AI writes
Anything produced by AI, whether you bring it to us or we generate it ourselves, is only ever a draft. A qualified lawyer checks it against the actual law and your specific facts before it is sent, signed or filed.
Rule 3: Your instructions must still be your own
Sometimes clients arrive and say, “Just do exactly what ChatGPT wrote.” That is not how good legal practice works. Our duty is to act in your best interests, using our own independent judgment. That means we need to understand what you actually want, what your risk tolerance is, what your commercial goals are and what your legal rights and duties are.
You are always in charge of big decisions, such as whether to settle or litigate, sign or walk away, change your branding or fight over it. But you cannot delegate your decision‑making to an AI, and neither can we. We will help you think through options that make sense for you, not for some hypothetical “average” person in an AI model.
Rule 4: We will tell you when AI is wrong
There will be times when you bring us AI‑generated content that looks polished but is legally wrong, over‑the‑top, or not in your interests. In those cases, we will say so clearly. We may advise you that a claim ChatGPT has drafted overstates your rights, makes threats you cannot back up, or asks for remedies the law does not allow.
Part of our job is to push back when a path is risky, unrealistic or unethical, even if an AI tool suggested it and even if it sounds attractive. If we think something will hurt your position in the long run – for example by making you look unreasonable, breaching a contract, or annoying a judge – we will recommend a better approach.
Rule 5: We protect your confidentiality
Putting your confidential information into a public AI system is not the same as emailing your lawyer. Many online tools store prompts, and some use them to improve their models. That can create real risks for trade secrets, new brands, creative concepts, and sensitive disputes.
Our policy is simple: you should not paste confidential or identifying details about your legal matter into public AI tools. Instead, tell us first. Once you are our client, you are protected by legal professional privilege (subject to some limits), and we are bound to keep your information confidential. We will never put your confidential material into a public AI system without your informed consent, and, in most cases, we will avoid doing so altogether.
Rule 6: How we may use AI behind the scenes
Used carefully, AI can help with simple, low‑risk tasks. For example, it can sometimes help brainstorm structures, spot obvious drafting gaps, or generate alternative wordings that we then refine. If we use AI at all on your matter, it will be under close human supervision, and always behind a lawyer “firewall”.
Any AI output is treated as a draft only. We check every point of substance against proper legal sources and our experience before it forms part of your advice or documents. You will not be charged for “AI time” as a separate thing; you are charged for the value of the professional work, which always remains our responsibility.
Rule 7: When we may refuse to act
Occasionally, a client may insist that we send or file an AI‑generated document exactly as written, even when we have clearly explained that it is wrong, misleading or inappropriate. In those rare cases, we may have to decline to act. Our professional duties require us to be honest with courts and regulators and not to advance claims that are hopeless, abusive or misleading.
We may also step back if a client repeatedly refuses to follow our advice about serious confidentiality risks (for example, continuing to upload entire briefs, trade secrets or personal data into public AI tools despite clear warnings). We do this not to be difficult, but because your matter, our reputation and the integrity of the legal system are all on the line.
What this means for you in practice
For most clients, this policy is reassuring rather than restrictive. In practice, it means:
You are free to use AI for early thinking, lists of questions to ask us, or rough first drafts.
When you bring that material to us, we will check it, correct it and adapt it so it actually works for your situation.
We will be honest with you if AI has led you down the wrong track and will help you back onto the right one.
Your confidential information is safest when it stays between you and your lawyer, not between you and a robot.
AI is a powerful new tool, but it is not a lawyer and it is not your advocate. Our job is to do the thinking, questioning and careful checking that a machine cannot do for you, and to stand beside you when it really counts.
FAQs: AI and your legal rights in Australia
1. Can I use ChatGPT to draft my contract and just get you to “check” it?
Yes, but we will still need to review it carefully and correct errors, and we may suggest starting again where the structure is unsound. This review is chargeable work.
2. Are AI-drafted contracts legally binding in Australia?
A contract can be binding if it meets the normal legal requirements, regardless of how it was drafted, but AI tools often miss key terms or use the wrong law, which can make a contract risky or ineffective.
3. Is it safe to paste my trade mark or business idea into an AI prompt?
From a legal‑risk and confidentiality perspective, it is better not to. Talk to us before sharing sensitive information with any third‑party platform.
4. Will you tell a court if we used AI?
If a court or tribunal requires disclosure of AI use for certain documents, we will comply with that requirement and explain the context to you.
5. Does using AI make legal work cheaper?
Sometimes AI can speed up simple, background tasks, but checking and correcting AI output takes time. The real value lies in getting the right advice the first time, not in cutting corners.
At Sharon Givoni Consulting, we fix any AI‑written legal documents you bring us so they are accurate, lawful in Australia and right for your business.
We also give real‑world IP and brand advice that AI can’t, and we design legal strategies for you – AI should not make decisions for our clients.
We also have clear policies about how we work with AI and our clients – click here to read our AI and Client Instructions Policy.
Further Reading
How Does Copyright Law Work with AI? (Sharon Givoni Consulting)
https://sharongivoni.com.au/protecting-your-creativity-why-copyright-matters-for-australian-creators/
Is AI Stealing Your Style? Navigating Copyright in the Age of AI (Sharon Givoni Consulting)
https://sharongivoni.com.au/is-ai-stealing-your-style-navigating-copyright-in-the-age-of-ai/
AI & Copyright in Australia: What Artists and Businesses Need to Know (Sharon Givoni Consulting)
https://sharongivoni.com.au/ai-and-copyright-in-australia-what-artists-and-businesses-need-to-know/
AI and Copyright in Photography (Sharon Givoni Consulting)
https://sharongivoni.com.au/photographers-vs-algorithms-who-wins-the-copyright-battle/
business.gov.au – Intellectual property for small business
https://business.gov.au/planning/protect-your-brand-ip/intellectual-property
OAIC – Australian privacy principles and guidance on handling personal information
https://www.oaic.gov.au/privacy/the-privacy-act
Federal Court of Australia – Information on the use of AI in federal courts
https://www.fedcourt.gov.au/digital-law-library/judges-speeches/justice-needham/needham-j-20250627
Please note the above article is general in nature and does not constitute legal advice.
Please email us at info@iplegal.com.au if you need legal advice about your brand or another legal matter in this area generally.

