Who Owns AI-Generated Content? IP Ownership When You Build on a Licensed Model

⚖️ IP & Technology Law · 2026 Guide

Most founders assume paying for API access means owning the output. The legal reality is more complicated — and the gap surfaces at exactly the wrong moment: due diligence, a licensing dispute, or an acquirer's review.

5 sections · ~6 min read
Model Licensing · Copyright Law · Fine-Tuning IP
📋 In This Guide
1. The ownership assumption founders get wrong
   Why "I paid for it" does not mean "I own it" — three reasons the assumption fails
2. What the model licence actually says
   Commercial, open-source, and enterprise model terms compared side by side
3. Copyright law versus contractual assignment
   What the law provides, what contracts must fill in, and where the gap sits
4. Fine-tuned models: what you own and what you don't
   Fine-tuning changes the model's behaviour — it does not change the IP picture automatically
5. IP ownership framework: four questions to ask before you ship
   A practical checklist for AI-enabled products and the structural decisions that follow
Section 1

The ownership assumption founders get wrong

The assumption runs like this: the model provider built the model, I am paying to use it, I provide the prompts, and therefore I own the output. Each part of that reasoning sounds plausible. Each part has a flaw. Three distinct legal questions — not one — determine who owns AI-generated output, and conflating them is where the exposure lives. See our AI IP ownership services for a full analysis of where your business stands.

Why "I paid for it" does not establish IP ownership
Three reasons the assumption fails
1. Paying for API access purchases a service licence, not IP rights

API pricing covers compute, throughput, and access to the model's capabilities. It does not function as an IP assignment. The model provider retains ownership of the model, its weights, and its training methodology regardless of what you spend. What you receive is a contractual right to generate outputs using their infrastructure — the scope of that right is defined entirely by the licence terms, not by the fee.

Some providers grant broad output rights in their terms of service; others impose restrictions on commercial use, redistribution, or derivative works. The model licence is the document that matters, not the invoice.

2. Copyright protection for AI-generated output is unsettled — and often absent

Copyright law in most jurisdictions requires human authorship as a precondition for protection. The US Copyright Office has consistently declined to register works produced autonomously by AI. UK law provides limited protection for computer-generated works under a narrow doctrine. The EU position leans toward no protection without genuine human creative input.

The practical consequence: even if you have contractual rights to the output, the output itself may not be copyrightable, which means you cannot stop a competitor from reproducing it. Understanding your AI IP ownership position requires engaging with both the contract and the law — separately.

3. Your downstream contracts may not correctly capture whatever rights you do have

Even where the model provider grants you output rights and copyright law extends some protection, those rights do not automatically flow to your clients or investors. If your terms of service or client agreements do not contain clear IP assignment provisions covering AI-generated content specifically, you may hold rights that your clients cannot rely on.

This becomes acute in regulated industries — financial services, legal tech, medical — where clients need to demonstrate they own the output they are acting on. The gap between what the model provider gives you and what your contracts pass downstream is where IP and IT legal advice is most often needed.

⚠️
The assumption creates real exposure at acquisition
Acquirers of AI-enabled businesses routinely ask: does the target own its output, or is it dependent on a licence that can be terminated? If the answer is "we rely on the provider's terms of service," the acquirer cannot value the IP independently of the provider relationship. Structure the IP correctly before the question is asked under deal pressure — not during due diligence.
Section 2

What the model licence actually says

Model licences vary significantly — and the differences matter more than most founders realise until they are in a dispute or a deal. The three categories below cover the majority of AI-enabled product builds: commercial API providers, open-source models, and enterprise or fine-tuning arrangements. Read the actual licence terms for your model. Do not assume they match the category that sounds most like yours.

🖥️
OpenAI · Anthropic · Google
Commercial API providers
The most commonly used category for product builds
  • Output ownership typically granted to the user — the provider generally assigns or waives any claim to content you generate
  • Commercial use of outputs is permitted under standard terms for most major providers
  • Prohibited use categories, content policies, and usage limits are binding — violation can trigger licence termination
  • Provider may retain rights to use your inputs and outputs to improve the model — check data processing terms if confidentiality matters
  • Enterprise agreements can modify defaults — negotiate data handling, output rights, and liability caps for high-volume or sensitive use cases
⚙️
Llama · Mistral · Falcon
Open-source models
Licences vary — "open-source" is not a single standard
  • MIT and Apache 2.0 licences are permissive — commercial use, modification, and distribution of outputs generally permitted with attribution
  • AGPL and GPL carry copyleft obligations — modifying and deploying as a service may require releasing your modifications as open source
  • Custom community licences (Meta Llama, Mistral) impose restrictions on large-scale commercial use — check user count thresholds and prohibited categories
  • Self-hosting does not remove licence obligations — the terms follow the model, not the deployment method
📝
Enterprise · Fine-tuned
Enterprise & bespoke agreements
Negotiated terms — define everything in writing
  • Output rights, data handling, and model exclusivity are all negotiable — enterprise agreements can achieve positions standard API terms do not
  • Fine-tuning arrangements should explicitly address who owns the fine-tuned weights, training data pipeline, and outputs from the adapted model
  • Vague language on IP ownership is the most common source of disputes — "we shall own the work product" requires careful definition of what "work product" includes
  • Get specialist AI model licensing advice before signing enterprise AI agreements — standard contract review often misses the AI-specific terms

The licence terms are the starting point, not the end of the analysis. A commercial API licence that grants you output rights does not resolve whether those outputs are protectable under copyright law — or whether your downstream contracts correctly pass those rights to clients. See our AI IP ownership advisory for how these layers interact.

Section 3

Copyright law versus contractual assignment

These are two separate questions that most discussions conflate. Copyright law determines whether AI-generated output is protectable at all — that is a question of statute and judicial interpretation, and the answer varies by jurisdiction. Contractual assignment determines who holds whatever rights the law recognises — that is a question of your agreements. You need both to work together.

Copyright law position
What protection AI output receives — by jurisdiction
United States: human authorship required
The US Copyright Office has confirmed that copyright protection requires human authorship. Works generated entirely by AI without human creative control are not registrable. Where a human makes sufficient creative choices in prompting, selection, and arrangement of AI outputs, those choices may be protectable — but the AI-generated elements themselves are not.
United Kingdom: narrow computer-generated works doctrine
UK law provides limited protection for "computer-generated" works where there is no human author — protection vests in the person who "made the arrangements necessary" for the work. Courts have not yet applied this doctrine to generative AI specifically, and whether current AI use meets the threshold is genuinely uncertain.
European Union: human creative expression required
EU copyright law requires that works reflect the author's own intellectual creation — a standard that presupposes human creative input. The CJEU has not ruled directly on generative AI output, but the doctrinal framework strongly suggests purely AI-generated content receives no protection.
What this means for your product
If your product generates content that users rely on — reports, code, creative assets, summaries — and that content is not independently copyrightable, competitors can reproduce it without liability. Document the human creative input in your AI workflows; it is a component of your IP ownership position.
Contractual assignment
What your agreements must cover — and what they cannot fix
Assignment clauses transfer whatever rights exist
A well-drafted assignment clause in your terms of service or client agreement will transfer to your client whatever IP rights you hold in AI-generated output. The important qualifier: it transfers what exists. If copyright law provides no protection for a particular output, there is nothing to assign.
Work-for-hire requires careful drafting in the AI context
Work-for-hire provisions in client agreements typically cause IP to vest directly in the client. In the AI context, the drafter must address the three-party structure: model provider → your business → client. Standard work-for-hire language was not written with this chain in mind and may leave gaps.
Your terms of service must address AI-generated content specifically
Generic IP ownership clauses drafted before generative AI often do not cover AI outputs adequately. Your terms should address what rights the user receives, whether those rights are exclusive, what restrictions apply, and what warranties you make about IP ownership.
Contracts cannot override the model licence
You cannot grant clients rights that are greater than what the model provider has granted you. If the provider's licence restricts certain uses, those restrictions bind you and your clients. IP and IT legal review of the full licence chain is the only way to verify alignment.
⚠️
The combination of uncertain copyright and contract gaps
Founders focus on one layer — usually the model provider's terms — and miss the other. Copyright law may not protect the output. Your terms of service may not assign what it does protect. Your enterprise agreements may not flow rights down correctly. All three layers need to be reviewed together, not in isolation.
Section 4

Fine-tuned models: what you own and what you don't

Fine-tuning a model on your proprietary data is one of the most commercially valuable things an AI-enabled business can do. It can also create false confidence about IP ownership. Fine-tuning adapts the model's behaviour using your data — it does not transfer ownership of the model's underlying architecture, pre-training data, or base weights to you. Understanding the boundary precisely matters for commercial positioning, investor representations, and the warranties you make to clients.

What you own after fine-tuning
IP that genuinely sits with your business
Your proprietary training data
If you own the data used to fine-tune the model — customer interactions, domain-specific content, labelled datasets you created or licensed exclusively — that data is your IP and the primary source of your competitive advantage. Protect it through confidentiality agreements and data processing terms that prohibit the provider from using it for any other purpose.
The fine-tuning methodology and pipeline
The process by which you fine-tune the model — your data curation approach, labelling methodology, evaluation framework, and training pipeline — is protectable as trade secret or know-how if properly documented and kept confidential. This methodology is often more valuable than the fine-tuned model itself.
The fine-tuned weights — subject to provider terms
Some providers grant ownership or exclusive access to the fine-tuned weights produced from your training run. Others retain ownership. This is a contractual question — the answer depends entirely on what your fine-tuning agreement says. If the agreement is silent, assume the provider retains ownership until you obtain written confirmation otherwise.
Outputs generated by your fine-tuned model
Most commercial providers grant you rights to outputs from fine-tuned models under the same terms as standard API outputs. Verify this is explicitly stated in your fine-tuning agreement rather than implied. See our AI model licensing services for how to document this correctly.
What you don't own regardless of fine-tuning
IP that remains with the model provider
The base model weights and architecture
Fine-tuning modifies the parameters of a model that the provider created. The underlying architecture, pre-training data, and base weights remain the provider's IP in all standard arrangements. An acquirer does not automatically acquire the right to use the base model — they step into your licence, which can be terminated or repriced by the provider.
The pre-training dataset
The capabilities of the model derive from the provider's pre-training dataset, which may contain licensed third-party content or proprietary data. You have no visibility into and no claim over this dataset. Ongoing litigation concerning AI training data may affect what providers can legally offer.
Provider infrastructure and serving stack
If you are using a hosted fine-tuning service, your product depends on the provider's compute, serving, and API infrastructure, and disruption to the provider affects your product directly. This is an AI governance risk that depresses the valuation of AI-enabled businesses that have not addressed it.
Rights that exceed the provider's own licence grant
You cannot grant clients rights in fine-tuned model outputs that exceed what the provider has granted you. If the provider's fine-tuning licence restricts redistribution, competitive use, or industry-specific applications, those restrictions cascade to your clients — a gap that standard commercial agreements often do not address.
⚠️
Fine-tuning creates an adapted version of someone else's model
Fine-tuning does not create a new model. The distinction matters for investor representations, client warranties, and competitive moat analysis. Before describing a fine-tuned model as proprietary IP, verify what your agreements actually grant, what your training data provenance looks like, and whether the fine-tuned weights are yours to move or use independently of the provider's infrastructure.
Section 5

IP ownership framework: four questions to ask before you ship

The IP questions for an AI-enabled product are not complicated to ask — they are complicated to answer without looking at the right documents. Most businesses get into difficulty not because the questions are unanswerable but because they are never asked until a deal, dispute, or due diligence process forces them to the surface. Work through these four questions before you launch, not after.

Four questions that determine your IP ownership position
IP due diligence checklist
1. What model are you using, and what does its licence say?

Pull the current licence terms for your model — not what you remember from when you signed up, but the terms as they stand today. Model providers update their terms, and the terms that applied when you started building may have changed. Specifically check: output ownership, commercial use rights, prohibited use categories, data use by the provider, and termination provisions.

If you are using an open-source model, identify the specific licence version and whether your use case falls within its permitted scope. The model licence is the first document to read — everything else depends on what it says.

2. Is the output subject to copyright protection in your key markets?

Identify the jurisdictions where your clients use and rely on the output. For each jurisdiction, determine whether AI-generated output of your type — text, code, images, analysis — is likely to receive copyright protection, and under what conditions.

If your product involves significant human creative input in the prompting, selection, curation, or arrangement of outputs, document that input: it is the factual basis for any copyright claim you make. If the outputs are not independently protectable, assess what other protections — trade secret, contractual exclusivity, speed — you can rely on instead. See our AI IP ownership advisory for a jurisdiction-specific analysis.

3. Do your contracts correctly assign whatever rights exist to your business and your clients?

Review your terms of service, client agreements, and enterprise contracts for three things. First, does your business receive a clear assignment or licence of output rights from the model provider — and does that assignment cover the commercial use your product makes? Second, do your client-facing terms correctly pass those rights downstream, covering AI-generated content specifically? Third, are the restrictions in the upstream model licence reflected in your downstream terms?

Most businesses built before 2022 have contracts that pre-date generative AI and need updating. This is a standard IP and IT legal review — but it needs to be done.

4. Can your IP be cleanly separated if you change model provider or exit?

An acquirer or investor will ask: what happens to the business if the model provider changes its terms, raises its prices, or is acquired? If the answer is "the product requires a complete rebuild," that is a risk that affects valuation. The structural question is whether your proprietary value — your training data, fine-tuning methodology, prompt engineering, evaluation framework — is separable from the base model infrastructure.

Build your IP in a way that is model-agnostic where possible. If your business is in a regulated sector, also consider how holding structure affects whether IP sits in the right entity for investment, licensing, or exit purposes.

📌
The four questions are a starting point
Each answer leads to a follow-on: the model licence question leads to a contract review; the copyright question leads to a market-by-market analysis; the assignment question leads to a terms update; the portability question leads to an architecture and structure decision. The goal is not to answer each question once but to build an IP position that holds up as the business grows and counterparties ask harder questions under pressure.
Get your AI IP position right before it matters
We advise AI-enabled businesses on model licensing, output ownership, contract structure, and IP positioning — before the due diligence, not during it.

Oleg Prosin is the Managing Partner at WCR Legal, focusing on international business structuring, regulatory frameworks for FinTech companies, digital assets, and licensing regimes across various jurisdictions. He works with founders and investment firms on compliance, operating models, and cross-border expansion strategies.