Introduction

We have been working with a leading data analytics company for a number of years. The client first approached us in 2022, asking for help with its existing contracts. As it grew and attracted more business, it encountered more and more in-house legal teams who heavily scrutinised its old legal documentation. This scrutiny slowed the client's onboarding of new business and left most customers on different legal terms with the client, which in itself presented a legal and commercial risk. 

Context and Challenge

We were asked to provide new drafts of a number of documents, the core of which was the client's template "data licence".

The client distributed its data to customers via an in-house SaaS product, which allowed them to view and manipulate that data in a number of different ways. 

The contract needed to incorporate standard SaaS provisions – making it clear that the provision of the data was a service in and of itself and setting out the normal terms one would expect to see in relation to that service (payment terms, limitation of liability, access rights and so on). 

From a legal perspective, the terms on which the client's data could be used by its customers were what complicated matters.

The client's data was the core of its business: getting this right was essential. The client could not simply keep generating data and selling it by way of an intellectual property rights assignment (that would give the buyer the right to restrict future sales). Nor could the client distribute its data without meaningful restrictions, as unrestricted use could undermine the commercial basis of its business. 

After a number of calls, we settled on a position that permitted the client's customers to access and manipulate the data in practically any way they might wish, while ensuring that all manipulated data remained the property of the client (and not the customer). The customer would be able to develop and own its own analysis, but the client's data would remain tightly controlled. Retaining ownership of data manipulated by the customer also prevented onward resale in competition with the client, another key point that needed to be addressed properly. 

This approach achieved both of the client's objectives in this respect: being commercial and protective at the same time.

Process and Insight

Once we had got to grips with how best to achieve the client's objectives, we turned to drafting the data licence and the related documentation.

The client wanted to be able to use the documentation with its customers without constant back and forth with us. Contractual negotiations can drive up legal spend, so it was important to the client that it knew enough about how the documentation worked to negotiate without constantly checking with lawyers. To that end, we provided the client with a comprehensive set of guidance notes explaining the meaning and implications of the most important clauses in clear, non-legal terms. We also indicated which clauses the client's customers were likely to try to remove, whether that should be resisted, and what to do or rely on instead. 

Result

The guidance notes proved to be a particularly useful tool: in over two years of use, the client never needed to come back to us for further advice on contract negotiation. 

By the end of that period, however, the client did want to check in with us about a contractual request it was receiving with increasing frequency: permission to use the client's data to train an AI model.

AI models were not as commercially widespread in 2022, and at that time it was not necessary to treat AI any differently from any other use of the data by the client's customers. 

However, given the relative ease with which AI systems can now be deployed and used by businesses, it was important (and is important for any data-heavy business) to decide how its data may be used by an AI and to limit that use contractually. 

The key dividing line is between "ingestion" and "analysis". The former is a technical term for the process by which information is absorbed into the machine learning components of an AI model: once data has been ingested, it becomes part of the model itself. Mere "analysis", however, does not require ingestion; AI systems can be constructed so that they use data without learning from it. For our client, making sure its data was not ingested by its customers' AI models was key: otherwise the client's data could end up being shared with any user of the AI (including third parties, if an AI model is shared amongst businesses).
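For readers who want a concrete picture of the distinction, the following minimal sketch (in Python, using an entirely hypothetical toy model rather than any real AI system or library) contrasts the two: "analysis" consults the licensed data only while answering a query, whereas "ingestion" folds the data into the model itself, where it persists for every future user.

```python
# A deliberately simplified, purely illustrative sketch of the distinction
# described above. The ToyModel class and its method names are hypothetical;
# they do not reflect any real AI library or any particular customer's system.

class ToyModel:
    def __init__(self):
        self.learned = []  # stands in for the model's trained parameters

    def ingest(self, records):
        """Ingestion: the licensed data is absorbed into the model itself,
        so it may later resurface for any user of that model."""
        self.learned.extend(records)

    def analyse(self, records, query):
        """Analysis: the licensed data is consulted only at query time and
        discarded afterwards; the model itself is left unchanged."""
        return [r for r in records if query.lower() in r.lower()]


licensed_data = ["regional revenue figures", "customer churn analysis"]
model = ToyModel()

# Analysis answers the query without retaining anything...
print(model.analyse(licensed_data, "revenue"))  # ['regional revenue figures']
print(model.learned)                            # [] - nothing retained

# ...whereas ingestion bakes the data into the model for all future users.
model.ingest(licensed_data)
print(model.learned)                            # the licensed data now lives inside the model
```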

We amended the client's documentation to accommodate this point. Implementing the necessary contractual restrictions, now and in the future, will be a key part of protecting data-heavy organisations from the (mis)use of AI systems. If you are collecting or receiving a significant amount of data and have questions about how best to protect yourself, please do not hesitate to contact us.