Google Gemma: The Hidden Risks of an “Almost Open” License
Google markets Gemma as an open model — and for most use cases, commercial deployment is genuinely permitted. But the Gemma Terms of Use are not Apache-2.0. They carry a Prohibited Use Policy, a flow-down obligation to every downstream user, and a unilateral termination right that no truly open licence contains. This guide maps every commercial risk that the "open" label obscures.
What Is Actually Permitted — Commercial Use, Self-Hosting, and Fine-Tuning
The genuine freedoms in the Gemma Terms of Use: what you can build, deploy, and charge for
The Prohibited Use Policy — What Is Banned and the Flow-Down Obligation
Google's use restrictions, the categories that catch founders by surprise, and why you must pass them to every user of your product
Google's Right to Restrict, Modify, and Terminate
The unilateral rights Google reserved — what triggers them, what they mean for your product, and how to reduce the risk
Derivative Models and Fine-Tunes — How the License Carries Through
What the Gemma ToU says about models you build on top of Gemma, and the obligations that follow your fine-tuned product downstream
Introduction — What "Almost Open" Actually Means for Your Business
Google's Gemma model family is described across developer documentation as "open" — and in a practical sense, much of it genuinely is. You can download Gemma weights, deploy them on your own infrastructure, fine-tune them on proprietary data, and charge users for a product built on top of them. None of that requires a commercial licence negotiation with Google.
What is not true is that Gemma is governed by a permissive open-source licence like Apache-2.0 or MIT. Gemma is released under the Gemma Terms of Use — a custom, Google-authored document that grants the permissions above while reserving rights that no OSI-approved licence would permit: a detailed Prohibited Use Policy, a contractual obligation to pass restrictions downstream to every user of your product, and the right for Google to modify terms or terminate access unilaterally. These are the hidden risks the "open" label does not advertise.
Common misconception
"Gemma is open source, so it's like Apache-2.0 — no strings attached."
Gemma is released under a custom Terms of Use, not Apache-2.0 or any OSI-approved licence. The Terms include a Prohibited Use Policy, flow-down obligations, and Google's unilateral right to update or terminate. These provisions have no equivalent in standard open-source licences.
Common misconception
"My application can do whatever it wants as long as I don't redistribute the weights."
The Prohibited Use Policy applies to all use of Gemma — including inference, API-based products, and embedded deployments. You cannot circumvent the restrictions by keeping the weights private or calling the model through a third-party endpoint.
Common misconception
"Once I fine-tune Gemma, the resulting model is fully mine and free of Google's terms."
The Gemma Terms of Use explicitly apply to derivative models — any model that is fine-tuned on, distilled from, or built on top of Gemma weights remains subject to the same Terms of Use. The restrictions follow the model, not the checkpoint name.
Permissions — genuine and broad
Commercial deployment, self-hosting, and fine-tuning are all permitted. Section 1 of this guide covers what you can actually build and sell.
Prohibited Use Policy — detailed and flow-down
A specific list of prohibited applications with a contractual obligation to pass those restrictions to every downstream user of your product.
Termination and modification rights
Google retains the right to update the Terms unilaterally and to terminate licences for breach — a provision absent from all OSI-approved open-source licences.
Derivative model obligations
Fine-tuned and derived models carry the same Terms of Use as the base model. The licence does not end when you modify the weights.
IP ownership context: The Gemma Terms of Use govern how you may use Google's model — they do not address who owns the fine-tuning data, the prompts, or the outputs your product generates. For analysis of AI IP ownership in commercial deployments and investment structures, see AI IP Ownership — wcr.legal.
The four sections that follow examine each dimension of the Gemma Terms of Use in detail — starting with the genuine freedoms the Terms grant, before turning to the restrictions, Google's unilateral rights, and the obligations that follow your product downstream.
Section 1 — What Is Actually Permitted: Commercial Use, Self-Hosting, and Fine-Tuning
The Gemma Terms of Use are more permissive than most commercial model licences and more restrictive than any OSI-approved open-source licence. The starting point is genuinely favourable: Google grants a broad non-exclusive right to use, reproduce, distribute, and create derivative works from the Gemma model weights. What follows is a precise account of what that permission covers in practice.
Building and selling products with Gemma
The Terms explicitly permit commercial use — you can build products on Gemma, charge users, and generate revenue without a separate commercial agreement with Google.
Running Gemma on your own infrastructure
You can download, host, and serve Gemma on your own servers, cloud instances, or on-premise hardware — no API dependency on Google's infrastructure is required.
Adapting Gemma to your domain and data
The Terms permit fine-tuning — you can train Gemma weights on proprietary data and domain-specific corpora to create specialised derivative models for your product.
Conditions attached to all three permissions
Permission Summary — Gemma vs Common Open-Source Licences
The permissions in Section 1 are genuine and commercially valuable. The restrictions in Section 2 — the Prohibited Use Policy and the flow-down obligation — are the conditions under which those permissions operate.
Section 2 — The Prohibited Use Policy: What Is Banned and the Flow-Down Obligation
The Gemma Prohibited Use Policy is the operative restriction in the Gemma Terms of Use — a specific list of application categories and activities that the Terms prohibit regardless of commercial intent, jurisdiction, or the technical architecture of your product. Understanding the Policy has two components: knowing what it prohibits, and understanding the flow-down obligation that makes compliance your responsibility — not just your users'.
The Policy is incorporated by reference into the Gemma Terms of Use. Accepting the Terms means accepting the Prohibited Use Policy in full. Because Google updates its policies and the Terms provide that continued use constitutes acceptance of any updated version, the scope of the Policy is not static — it is whatever Google publishes at the time of your use.
Weapons Development and Mass-Harm Technologies
Applies to all Gemma models and all derivative models — no military or dual-use exception
The Prohibited Use Policy prohibits using Gemma to facilitate the development, production, or deployment of weapons capable of mass harm — including biological, chemical, nuclear, and radiological weapons. This prohibition is broadly drafted and encompasses research assistance, information synthesis, and workflow automation in these domains.
Surveillance, Tracking, and Profiling Without Consent
Includes automated decision-making systems that profile individuals without consent
The Policy prohibits using Gemma for surveillance activities, tracking individuals without their knowledge and consent, and automated profiling for purposes that the Policy characterises as harmful or deceptive. This category has the broadest relevance for enterprise and HR-technology products.
Child Safety and Illegal Content
Absolute prohibition — no exception by use case, architecture, or intended safeguards
The Policy contains an absolute prohibition on any use of Gemma to generate, facilitate, or distribute content that sexually exploits minors or constitutes child sexual abuse material. This prohibition is unconditional. There is no research exception, content-moderation exception, or technical-architecture exception that permits a Gemma-based product to process, generate, or classify this category of content, even for detection or filtering purposes.
Election Interference and Political Manipulation
Includes disinformation, automated political messaging, and voter manipulation at scale
The Policy prohibits use of Gemma in activities designed to interfere with elections, generate disinformation at scale, or automate political messaging intended to mislead voters or manipulate electoral outcomes. This category has growing relevance as AI is used in political campaign technology.
Violations of Applicable Law and Other Harmful Activities
A catch-all category that incorporates jurisdiction-specific legal standards and general harm-prevention principles
The Policy includes a provision prohibiting use of Gemma "in ways that violate applicable laws" and for activities designed to cause significant harm. This category is jurisdiction-dependent — an application that is lawful in one market may breach this provision in another — and its scope is not limited by specific enumeration.
The Flow-Down Obligation — Why Your Users Are Your Responsibility
The Gemma Terms of Use impose a contractual obligation on everyone who makes Gemma — or a Gemma derivative — available to other users. The Terms require that you ensure your users are also prohibited from the same activities that are prohibited to you under the Prohibited Use Policy. This flow-down obligation is not optional — it is a condition of your licence.
In practice, this means your product's Terms of Service or acceptable use policy must explicitly incorporate the Gemma Prohibited Use Policy (or equivalent restrictions) as binding obligations on your users. If a user of your product violates the Prohibited Use Policy and you have not contractually bound them to those restrictions, you may be in breach of the Gemma Terms of Use — not the user.
Practical implication for product teams
Your Terms of Service or Acceptable Use Policy must include Gemma-compliant restrictions. A generic "don't do illegal things" clause is not sufficient — the Prohibited Use Policy contains categories (surveillance without consent, election interference, weapons development) that go beyond general illegality. Your ToS must either reproduce the substance of the Policy or incorporate it by reference.
Review your ToS every time Google updates the Prohibited Use Policy. Since continued use of Gemma after a Policy update constitutes acceptance of the new restrictions, you must ensure your downstream users are bound by whatever the current version of the Policy says — not just the version that existed when you launched.
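One lightweight way to operationalise this review duty is to monitor the published Policy text for changes and alert your legal reviewer when it differs from the version your ToS was drafted against. The sketch below shows the core comparison logic only; the URL is an assumption for illustration, not the authoritative location of the Policy, and the normalisation step simply ignores whitespace drift so that cosmetic page changes do not trigger false alerts.

```python
import hashlib

# Assumed URL for illustration only — confirm the current location of the
# Gemma Prohibited Use Policy before wiring this into a real pipeline.
POLICY_URL = "https://ai.google.dev/gemma/prohibited_use_policy"


def policy_fingerprint(policy_text: str) -> str:
    """Return a SHA-256 fingerprint of the policy text, ignoring whitespace drift."""
    normalised = " ".join(policy_text.split())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()


def policy_changed(current_text: str, stored_fingerprint: str) -> bool:
    """Compare the live policy text against the fingerprint recorded at launch."""
    return policy_fingerprint(current_text) != stored_fingerprint


# Usage sketch: fetch the page on a schedule and flag any change for review.
# import urllib.request
# with urllib.request.urlopen(POLICY_URL) as resp:
#     live_text = resp.read().decode("utf-8")
# if policy_changed(live_text, STORED_FINGERPRINT):
#     notify_legal_team()  # hypothetical hook into your alerting system
```

A fingerprint comparison cannot tell you *what* changed — it only tells you that a human needs to re-read the Policy and re-check the downstream ToS against it.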
Section 3 — Google's Right to Restrict, Modify, and Terminate
No OSI-approved open-source licence gives the licensor the right to unilaterally modify the terms after release, or to terminate the licence of a downstream user. The Gemma Terms of Use reserve both rights for Google. These provisions are the clearest structural difference between Gemma and a truly open model — and the most significant source of dependency risk for commercial products built on Gemma.
The Right to Modify the Terms of Use
Google may update the Gemma Terms of Use — continued use constitutes acceptance
The Gemma Terms of Use contain a standard update clause: Google may revise the Terms at any time and will make the revised Terms available through the same channel as the original. Continued use of Gemma after the revised Terms are posted constitutes acceptance. You are not required to affirmatively agree to the changes — continued operation of your product is treated as consent.
In practice, this means the terms under which you built your product may not be the terms under which you operate it at Series A or at acquisition. The Prohibited Use Policy that applied when you started development is not guaranteed to be identical to the one that applies eighteen months later — and the restrictions that flow down to your users are those in the current version of the Policy, not the version you agreed to at launch.
The Right to Terminate Your Licence
Google can revoke your right to use Gemma on breach of the Terms — the licence is not irrevocable
The Gemma Terms include a termination provision: Google may terminate your licence if you breach the Terms of Use. Unlike irrevocable open-source licences, this means that a breach — including inadvertent non-compliance with the Prohibited Use Policy — creates a legal basis for Google to end your right to use, distribute, or build on Gemma weights.
A termination right does not mean Google will act on every breach — in practice, enforcement depends on Google's priorities and the severity of the violation. But the legal exposure is real: a product built entirely on Gemma, and whose licence Google has terminated, is distributing an AI model without a valid licence to do so. Investors and acquirers will identify this risk in due diligence.
Trademark and Naming Restrictions
Using "Gemma" in product names, marketing, or branding is not included in the model licence
The Gemma Terms do not grant any rights to Google's trademarks, including the "Gemma" name. Using "Gemma" in a product name, company name, or marketing materials can imply endorsement or official association with Google — rights the model licence does not provide. A startup named "GemmaAssist", or one marketing its product as "Powered by Gemma" without separate authorisation from Google, is using the trademark without a licence.
The naming restriction creates a practical product-naming constraint that is separate from the Prohibited Use Policy. It is one of the most commonly non-compliant aspects of Gemma deployments in practice — development teams choose product names without trademark clearance and launch before anyone has checked whether the brand use is authorised.
Impact Assessment — What Termination Risk Means for Your Product Stage
Risk mitigation measures for commercial products
Section 4 — Derivative Models and Fine-Tunes: How the Licence Carries Through
The Gemma Terms of Use do not expire when you modify the model. Every fine-tuned checkpoint, distilled variant, or derivative model built on Gemma weights is subject to the same Terms of Use as the original — including the Prohibited Use Policy and the flow-down obligation. Understanding what survives the fine-tuning process, and what the obligations are for derivative models you distribute, is essential before using Gemma as the base for a commercial AI product.
Derivative models are explicitly covered
The Gemma Terms of Use define "Gemma" to include derivative models — the Terms apply to any model that incorporates, is derived from, or is built upon Gemma weights. Fine-tuning does not change your licence status. A Gemma model fine-tuned on your proprietary data remains "Gemma" for the purposes of the Terms.
Prohibited Use Policy obligations survive fine-tuning
The Prohibited Use Policy applies to derivative models. You cannot fine-tune Gemma and then use the resulting model for a purpose that would have been prohibited with the original Gemma weights. The restrictions follow the weights, not the checkpoint name or the fine-tuning dataset.
Your fine-tuning data and incremental changes
You own the data you use for fine-tuning and the specific adaptations you make. The Terms do not require you to disclose or open-source your fine-tuning data or your proprietary additions. What you do not acquire is the right to relicense the resulting model under Apache-2.0 or MIT — the Gemma Terms of Use remain the governing document regardless of the size of your contribution to the derivative.
Distributing fine-tuned models requires ToU compliance
If you distribute a Gemma-derived model to third parties — whether as open weights, via an API, or as part of a product — you must include or reference the Gemma Terms of Use and ensure downstream users are bound by the Prohibited Use Policy. Distributing without this is a breach of the Terms.
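A release pipeline can enforce this mechanically by refusing to publish a checkpoint directory that lacks the required notices. The filenames below are illustrative assumptions — the Terms require that the Terms of Use accompany the distribution, but they do not mandate specific file names.

```python
from pathlib import Path

# Illustrative filenames, not names mandated by the Gemma Terms of Use.
REQUIRED_NOTICES = {"GEMMA_TERMS_OF_USE.txt", "NOTICE"}


def missing_notices(model_dir: str) -> list[str]:
    """Return the required notice files absent from a model release directory."""
    present = {p.name for p in Path(model_dir).iterdir() if p.is_file()}
    return sorted(REQUIRED_NOTICES - present)


def distribution_ready(model_dir: str) -> bool:
    """True when every required notice file accompanies the checkpoint."""
    return not missing_notices(model_dir)
```

A check like this catches only the mechanical half of the obligation — the files travel with the weights. Binding downstream users to the Prohibited Use Policy still has to happen in your distribution terms or model card.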
Compliance Checklist for Gemma-Based Products
Gemma vs Comparable Open-Weight Models — Derivative Obligations Summary
Conclusion — Building on Gemma with Open Eyes
Gemma is a genuinely capable and useful model for commercial AI products. The "almost open" label is not a criticism — the permissions it grants are broad, and for the majority of use cases the Prohibited Use Policy is not a binding constraint. But the risks are real, and they follow a predictable pattern. Founders who do not read the Prohibited Use Policy before building, who do not flow its restrictions into their downstream ToS, and who do not plan for Google updating the Terms during the product's life are the founders who surface a compliance problem at Series A or acquisition, not at launch.


