Microsoft’s AI Indemnification Promise: A Potential Game-Changer?
On September 7, 2023, Microsoft announced a new “Copilot Copyright Commitment” for users of its suite of AI-powered tools, following its earlier “AI Customer Commitments” policy statement in June. The general purpose of this program is to indemnify Microsoft’s customers from legal liability in the event their use of Microsoft’s Copilot is accused of infringing a third party’s copyrights. To a studio or publisher exploring the possibilities of incorporating AI into their workflows, this promise might sound appealing, if almost too good to be true. It’s therefore worth taking a closer look at what it means, and what the future might hold for this and similar pledges from AI toolmakers.
There remain a lot of unanswered questions regarding intellectual property and emerging AI technology, and it is likelier than not that many of these questions will be resolved in court. In the meantime, developers of AI technology want users to adopt and utilize it today. However, customers are justifiably worried that an AI tool might generate content (code, an image, audio, etc.) that infringes on somebody else’s intellectual property rights, leading to a costly lawsuit and potential public relations backlash.
Who should ultimately be responsible when an AI tool generates infringing material? The tool’s creator who arguably “trained” it to infringe, or its user who arguably “asked” it to infringe? Before Microsoft changed the game, AI developers squarely placed the risk of IP infringement solely on their users. For example, Midjourney’s Terms of Service currently state the following:
We provide the service as is, and we make no promises or guarantees about it.
You understand and agree that we will not be liable to You or any third party for any loss of profits, use, goodwill, or data, or for any incidental, indirect, special, consequential or exemplary damages, however they arise.
You are responsible for Your use of the service. If You harm someone else or get into a dispute with someone else, we will not be involved.
If You knowingly infringe someone else’s intellectual property, and that costs us money, we’re going to come find You and collect that money from You. We might also do other stuff, like try to get a court to make You pay our attorney’s fees. Don’t do it.
Unsurprisingly, language like this is alarming to many in-house legal teams!
While we have heard stories of larger publishers being able to negotiate specific IP indemnification deals with AI toolmakers, Microsoft is the first major company we’ve seen make indemnification its standard policy for all customers. In their announcement, Microsoft claims that their purpose for offering this indemnification is threefold: first, because they want to stand behind their customers when they use Microsoft’s products; second, because they believe Microsoft should assume the responsibility of addressing third-party IP complaints involving their tools; and third, because they have built and incorporated guardrails into their Copilot tools to help reduce the likelihood of infringement.
That third factor is perhaps the most telling here, since Microsoft appears to be banking at least somewhat on the strength of their content filters to prevent the generation of infringing material. Microsoft even conditions their indemnity on this factor, pledging to assume responsibility only if the customer adheres to Copilot’s internal guardrails. It’s also noteworthy that Microsoft’s pledge only applies to copyright claims, and not to any other forms of alleged intellectual property infringement arising from the use of their tools, such as trade secret or patent claims.
While the exact metes and bounds of Microsoft’s commitment remain to be tested, at minimum this promise of legal coverage is likely to drive wider adoption of Copilot. More use of the tools means more field testing, and therefore more useful data for Microsoft to further refine and improve its tools. On the other hand, it also positions Microsoft front and center as the ideal deep-pocketed defendant, one that will no doubt attract an aggressive plaintiff’s bar.
Assuming Microsoft’s strategy ends up paying off, it’s reasonable to assume that others in the AI tool development space will follow suit with similar promises. It’s possible this will result in an overall reduction in the number of companies offering AI tools, as smaller startups won’t have the resources—in a legal war chest or otherwise—to meaningfully back up their promises of indemnification.
Practically speaking, then, if you are a developer wanting to incorporate AI technology into your development process, you need to choose your vendor carefully and review the specific indemnification provision it offers. Even if a smaller AI provider offers indemnification, that’s no guarantee it can provide meaningful support when an actual dispute hits. Just like with any other vendor, choosing someone inexpensive but unreliable or untested could create more problems than it solves.