Artificial intelligence is quickly reshaping marketing, with everything from generating content to optimising campaigns receiving the robot touch. But with these powerful capabilities come complex legal questions - especially for startups and solo marketeers.
So, we sat down with Kristy Coleman, a legal expert specialising in the creative and digital sectors, to unpack exactly what we need to know before incorporating AI into our marketing contracts. Keep scrolling for the intel.
Artificial intelligence (AI) is being dubbed the fourth industrial revolution, and it is rapidly transforming marketing. Its applications range from generating copy and images to creating pitch documents and even automating campaign optimisation. The UK currently takes a non-legislative, sector-led approach to AI regulation, but that doesn't mean AI use comes without legal considerations and complexities. For startup marketing agencies and solo marketeers, it's easy to focus on the benefits—such as speed and creativity—and overlook the risks contained in contracts. Whether you're signing with a client, sub-contractor or freelancer, understanding how AI fits into your legal agreements is essential.
Before you sign your next contract involving AI-powered tools or deliverables, here’s what you need to know:
Q: Who owns AI-generated content?
A: One of the most common misconceptions is that you or your client automatically own the content produced using tools like ChatGPT or other generative AI platforms. The reality is more nuanced.
For example, under UK law, if a work (e.g. copy) is computer-generated and has no human author, ownership is assigned to the person who made the arrangements necessary for the work's creation—in most cases, the person or agency that instructed the platform to create it. However, some generative AI tools are trained on other people's content, which may give rise to intellectual property infringement. There's also an open question as to whether works produced by generative AI qualify as copyright works at all, and would therefore be protected.
Tip: Ensure contracts clearly state who owns any content produced, regardless of whether AI tools were used. In client contracts, if you want to retain portfolio rights or reuse materials, specify those terms. If you’ve got a team, make sure you have an AI use policy.
Q: Who is liable when AI gets it wrong?
A: AI tools are powerful and can be incredibly useful to businesses, but they are not perfect. For example, AI tools might write copy that includes inaccurate or even illegal claims, or generate an image using protected trade marks or copyrighted content. If AI gets it wrong and you’ve used it for a client, liability will fall on the agency—or you—if the contract doesn’t shift the risk.
Tip: Look for indemnity clauses that assign responsibility for errors, copyright violations, or misrepresentations. Does the client have final sign-off on campaigns?
Q: Do I need to disclose AI use to clients and customers?
A: Regulators are beginning to crack down on deceptive marketing practices involving AI, such as undisclosed synthetic content (imagery, for instance), fake testimonials (which are illegal), AI influencers posing as real people, and even deepfakes. You may also find that clients expect full transparency about your use of AI to create work for them—or prohibit it altogether—especially where public-facing material is concerned.
Tip: Your client contracts should clarify when and how AI will be used, and whether the final content is reviewed or edited by a human. Remember, it’s a tool, not a perfect solution.
Q: What should I look for in vendor agreements for AI tools?
A: If you or your agency uses third-party AI tools, the terms and conditions of use matter (yes, you need to read the small print). Many AI platforms include clauses that limit their liability, deny you ownership of the content, or allow them to use your inputs and outputs to train their models. This raises bigger issues around data protection, confidentiality, and unwittingly sharing your creative ideas with third parties.
Tip: Before you incorporate an AI tool into your client deliverables, review its terms of service. If they include broad data rights or risky disclaimers, you may need to negotiate exceptions, find a more compliant alternative, or avoid using the tool with any data or ideas you wouldn't want reproduced elsewhere.
This is a growing risk in AI marketing, and it’s why your contracts must be proactive, not reactive.
Final thoughts: build legal literacy into your growth plan
Don't be put off by the risks. Instead, build awareness into your processes and get support when you're unsure. Conduct a regular AI audit of your business to assess how and where you are using AI, what your contracts say, and what your internal use policy is. Use your audit to fill in the gaps and strengthen any weaknesses, such as implementing an internal policy on AI and amending client contracts.