    The Age of AI Must Respect the Age of IP

    Studio Ghibli images shook up the social media world over the weekend. This, in turn, raised a complex question in the IP space: did it amount to copyright infringement? Whether the answer is yes or no, the message is clear – the age of AI must respect the age of IP.

    Let’s take a closer look at Synthesia, a company that recently made headlines for taking a different approach to copyright in the world of AI. Synthesia is a London-based AI start-up that specialises in generating digital avatars for use in video content. Just two months ago, the company raised $180 million in a Series D funding round. Now, it’s gaining attention not just for what it builds, but for how it engages with the people whose data goes into its technology.

    In a landmark move, Synthesia is offering equity to actors whose likenesses were used to train its hyper-realistic AI avatars. This approach is a sign that IP rights are becoming a central part of responsible AI development. And it’s a move other players in the space should be paying attention to.

    The Risk Behind the Data

    AI models rely on datasets: images, text, voice recordings, video clips. When that training material includes copyrighted content or elements protected under publicity and personality rights (for example, a person’s likeness or voice), unauthorised use can result in infringement.

    While some companies rely on doctrines such as “fair use” to justify their training practices, the legal territory remains uncertain. Courts haven’t yet spoken decisively (and we are looking forward to that), but the reputational and financial risks are already here.

    Companies like OpenAI, Google, and Synthesia are under increasing pressure to operate on solid legal and ethical ground. Synthesia’s decision to offer equity to data contributors (the actors behind their avatars) recognises the role humans play in shaping AI and acknowledges that those contributions deserve fair treatment and compensation.

    In today’s world, compliance with IP law is no longer an afterthought. It’s a strategic decision. The most successful companies in this space will not only have advanced technology. They will also have transparent, lawful, and collaborative data practices.

    How to Stay Out of Trouble…

    … or at least try to. If you’re working on AI models that involve creative content or public-facing data, here are three key risks to keep in mind, along with practical steps to minimise them:

    ❌ 1. Using Copyrighted Content Without Permission

    Training on books, films, music, or artwork without a proper licence puts your company at legal risk.

    ✅ Secure explicit licences and prioritise datasets that are either in the public domain or clearly licensed for training purposes.

    ❌ 2. Breaching Publicity and Personality Rights

    AI models that imitate someone’s face, voice, or mannerisms can cross legal boundaries – even unintentionally.

    ✅ Always obtain consent. If you’re using a real person’s data, they should be fully informed and fairly compensated.

    ❌ 3. Generating Derivative Works Without Rights

    AI-generated outputs that closely mimic existing works may be viewed as unauthorised derivatives under copyright law.

    ✅ Avoid close mimicry unless you have the rights to the source material. Include clear disclaimers and track content provenance where possible.

    In the UK, copyright infringement can lead to large fines and even imprisonment. Civil damages vary but can reach substantial sums in compensation, especially where loss of earnings or reputational harm is proven. For AI developers, using unlicensed content isn’t just risky – it can be extremely costly.

    Synthesia’s approach points toward a future where the relationship between AI developers and content contributors is more collaborative than extractive. It’s a thoughtful step forward from a legal, ethical, and commercial perspective. For any developer working with content, the time to build IP-aware practices is now.
