Who Owns AI-Generated Content? IP Ownership When You Build on a Licensed Model
Most founders assume paying for API access means owning the output. The legal reality is more complicated — and the gap surfaces at exactly the wrong moment: due diligence, a licensing dispute, or an acquirer's review.
The ownership assumption founders get wrong
The assumption runs like this: the model provider built the model, I am paying to use it, I provide the prompts, and therefore I own the output. Each part of that reasoning sounds plausible. Each part has a flaw. Three distinct legal questions — not one — determine who owns AI-generated output, and conflating them is where the exposure lives. See our AI IP ownership services for a full analysis of where your business stands.
API pricing covers compute, throughput, and access to the model's capabilities. It does not function as an IP assignment. The model provider retains ownership of the model, its weights, and its training methodology regardless of what you spend. What you receive is a contractual right to generate outputs using their infrastructure — the scope of that right is defined entirely by the licence terms, not by the fee.
Some providers grant broad output rights in their terms of service; others impose restrictions on commercial use, redistribution, or derivative works. The model licence is the document that matters, not the invoice.
Copyright law in most jurisdictions requires human authorship as a precondition for protection. The US Copyright Office has consistently declined to register works produced autonomously by AI. UK law provides limited protection for computer-generated works under section 9(3) of the Copyright, Designs and Patents Act 1988, which treats the person who made the arrangements necessary for the work's creation as the author. The EU position leans toward no protection without genuine human creative input.
The practical consequence: even if you have contractual rights to the output, the output itself may not be copyrightable, which means you cannot stop a competitor from reproducing it. Understanding your AI IP ownership position requires engaging with both the contract and the law — separately.
Even where the model provider grants you output rights and copyright law extends some protection, those rights do not automatically flow to your clients or investors. If your terms of service or client agreements do not contain clear IP assignment provisions covering AI-generated content specifically, you may hold rights that your clients cannot rely on.
This becomes acute in regulated industries — financial services, legal tech, medical — where clients need to demonstrate they own the output they are acting on. The gap between what the model provider gives you and what your contracts pass downstream is where IP and IT legal advice is most often needed.
What the model licence actually says
Model licences vary significantly — and the differences matter more than most founders realise until they are in a dispute or a deal. The three categories below cover the majority of AI-enabled product builds: commercial API providers, open-source models, and enterprise or fine-tuning arrangements. Read the actual licence terms for your model. Do not assume they match the category that sounds most like yours.
Commercial API providers:
- Output ownership typically granted to the user — the provider generally assigns or waives any claim to content you generate
- Commercial use of outputs is permitted under standard terms for most major providers
- Prohibited use categories, content policies, and usage limits are binding — violation can trigger licence termination
- Provider may retain rights to use your inputs and outputs to improve the model — check data processing terms if confidentiality matters
- Enterprise agreements can modify defaults — negotiate data handling, output rights, and liability caps for high-volume or sensitive use cases

Open-source models:
- MIT and Apache 2.0 licences are permissive — commercial use, modification, and distribution of outputs generally permitted with attribution
- AGPL and GPL carry copyleft obligations — modifying and deploying as a service may require releasing your modifications as open source
- Custom community licences (Meta Llama, Mistral) impose restrictions on large-scale commercial use — check user count thresholds and prohibited categories
- Self-hosting does not remove licence obligations — the terms follow the model, not the deployment method

Enterprise and fine-tuning arrangements:
- Output rights, data handling, and model exclusivity are all negotiable — enterprise agreements can achieve positions standard API terms do not
- Fine-tuning arrangements should explicitly address who owns the fine-tuned weights, training data pipeline, and outputs from the adapted model
- Vague language on IP ownership is the most common source of disputes — "we shall own the work product" requires careful definition of what "work product" includes
- Get specialist AI model licensing advice before signing enterprise AI agreements — standard contract review often misses the AI-specific terms
The licence terms are the starting point, not the end of the analysis. A commercial API licence that grants you output rights does not resolve whether those outputs are protectable under copyright law — or whether your downstream contracts correctly pass those rights to clients. See our AI IP ownership advisory for how these layers interact.
Copyright law versus contractual assignment
These are two separate questions that most discussions conflate. Copyright law determines whether AI-generated output is protectable at all — that is a question of statute and judicial interpretation, and the answer varies by jurisdiction. Contractual assignment determines who holds whatever rights the law recognises — that is a question of your agreements. You need both to work together.
Fine-tuned models: what you own and what you don't
Fine-tuning a model on your proprietary data is one of the most commercially valuable things an AI-enabled business can do. It can also create false confidence about IP ownership. Fine-tuning adapts the model's behaviour using your data — it does not transfer ownership of the model's underlying architecture, pre-training data, or base weights to you. Understanding the boundary precisely matters for commercial positioning, investor representations, and the warranties you make to clients.
IP ownership framework: four questions to ask before you ship
The IP questions for an AI-enabled product are not complicated to ask — they are complicated to answer without looking at the right documents. Most businesses get into difficulty not because the questions are unanswerable but because they are never asked until a deal, dispute, or due diligence process forces them to the surface. Work through these four questions before you launch, not after.
What do the current licence terms actually permit?
Pull the current licence terms for your model — not what you remember from when you signed up, but the terms as they stand today. Model providers update their terms, and the terms that applied when you started building may have changed. Specifically check: output ownership, commercial use rights, prohibited use categories, data use by the provider, and termination provisions.
If you are using an open-source model, identify the specific licence version and whether your use case falls within its permitted scope. The model licence is the first document to read — everything else depends on what it says.
Is the output protectable where your clients rely on it?
Identify the jurisdictions where your clients use and rely on the output. For each jurisdiction, determine whether AI-generated output of your type — text, code, images, analysis — is likely to receive copyright protection, and under what conditions.
If your product involves significant human creative input in the prompting, selection, curation, or arrangement of outputs, document that input: it is the factual basis for any copyright claim you make. If the outputs are not independently protectable, assess what other protections — trade secret, contractual exclusivity, speed — you can rely on instead. See our AI IP ownership advisory for a jurisdiction-specific analysis.
Do your contracts pass the rights downstream?
Review your terms of service, client agreements, and enterprise contracts for three things. First, does your business receive a clear assignment or licence of output rights from the model provider — and does that assignment cover the commercial use your product makes? Second, do your client-facing terms correctly pass those rights downstream, covering AI-generated content specifically? Third, are the restrictions in the upstream model licence reflected in your downstream terms?
Most businesses built before 2022 have contracts that pre-date generative AI and need updating. This is a standard IP and IT legal review — but it needs to be done.
How separable is your product from the base model?
An acquirer or investor will ask: what happens to the business if the model provider changes its terms, raises its prices, or is acquired? If the answer is "the product requires a complete rebuild," that is a risk that affects valuation. The structural question is whether your proprietary value — your training data, fine-tuning methodology, prompt engineering, evaluation framework — is separable from the base model infrastructure.
Build your IP in a way that is model-agnostic where possible. If your business is in a regulated sector, also consider how holding structure affects whether IP sits in the right entity for investment, licensing, or exit purposes.