
AI Governance and the Copyright Challenge

12 June 2024

On 28 May 2024, the House of Commons Science, Innovation and Technology Committee (“the Committee”) published its final report of the 2019 to 2024 Parliament on the governance of AI, the impact of AI on different areas of society and the economy, and whether and how AI and its various uses should be regulated. The report highlighted (amongst other things) what it describes as “The Twelve Challenges of AI Governance”, which include the bias challenge; the privacy challenge; the access to data challenge; the open-source challenge; the liability challenge; and the copyright challenge.

With particular reference to the latter, the report concludes that the growing volume of litigation relating to the alleged use of copyright-protected works to train AI models and tools, and the value of the high-quality data needed to train future models, have underlined the need for a sustainable framework that “acknowledges the inevitable trade-offs and establishes clear, enforceable rules”.

Lauren McFarlane
Associate

In the UK, that litigation includes the ongoing Getty Images case, in which Getty Images has sued Stability AI (a London-based AI developer behind a range of generative AI systems) for copyright infringement. Stability AI’s defence includes i) the contention that no specific images from the training dataset are memorised or otherwise reproduced in response to text prompts and ii) in the alternative, reliance on the pastiche exception (per Section 30A of the Copyright, Designs and Patents Act 1988) on the basis that the output images are artistic works “in a style that may imitate that of another work, artist or period, or consist of a medley of material imitating a multitude of elements from a large number of varied sources of training material” (see our previous discussion concerning the pastiche defence here).

Similar litigation is ongoing in the US. For example, in December 2023 the New York Times brought suit against OpenAI and Microsoft, alleging that they seek to “free-ride” on the work of New York Times journalists by using their content to “build substantive products without permission or payment”. Similarly, a trade group for US authors sued OpenAI in a Manhattan federal court on behalf of well-known authors including John Grisham, George Saunders and Jodi Picoult. The allegations in that action relate to OpenAI’s use of text from the authors’ books, which, it is alleged, may have been taken from illegal online book repositories.

To address this issue in the UK, the Committee’s report suggests that it is “inevitable” that the discussions between the AI sector and the creative industries will:

“involve the agreement of a financial settlement for past infringements by AI developers, the negotiation of a licensing framework to govern future uses, and in all likelihood the establishment of a new authority to operationalise the agreement. If this cannot be achieved through a voluntary approach, it should be enforced by the Government, or its successor administration, in co-operation with its international partners”.

The suggestion of a new authority being “enforced” by the government (in the absence of a successful voluntary approach) presumably follows the unsuccessful attempt to agree a code of practice on copyright and AI, which the UK government sought to develop earlier this year through discussions with AI users and rights holders. Those discussions failed to yield a workable solution (see our previous discussion on that here).

The Committee’s proposal, whilst on the face of it sensible, may be difficult to execute in practice. In particular, a financial settlement for past infringements requires i) at the very least an acceptance by AI companies that there was an infringement (which in effect would constitute an admission of liability) and ii) a value being placed on those infringements, which may not be straightforward. The negotiation of a licensing framework may, however, gain more traction. For example, in December 2023 Axel Springer partnered with OpenAI to “strengthen independent journalism in the age of AI”: Axel Springer brokered a licensing deal enabling OpenAI to train its systems using content from the Axel Springer media brands in exchange for a fee. It seems likely that other publishers will ultimately follow suit.

The Committee’s report also considered “the liability challenge”, noting that determining liability for AI-related harms is not just a matter for the courts but one that can be influenced by the government and regulators. The report concluded that “Nobody who uses AI to inflict harm should be exempted from the consequences, whether they are a developer, deployer or intermediary” and that the UK government, together with relevant regulators across the sectors, should publish guidance on where liability for “harmful uses of AI” falls under existing law, and should consider establishing liability via statute rather than “simply relying on jurisprudence”.

As with the proposed solution to “the copyright challenge”, such regulation is a good idea in theory but likely difficult to execute in practice, not least given the difficulty of defining “harm” in an AI context, which would itself require very careful drafting and, presumably, input from stakeholders with competing interests.

It will be interesting to see whether and to what extent the incoming government seeks to deal with these issues.

Lauren McFarlane, Associate: lmf@bto.co.uk / 0131 222 2939

