
Microsoft Says You Should Take Copilot Seriously but Not Literally

More than entertainment, less than legally liable.

Microsoft does not want Copilot to become the next Clippy. After a number of users noted that the terms of use for the company’s flagship AI assistant stated that it is “for entertainment purposes only,” the company has clarified that it would actually like you to take Copilot seriously.

According to the blog Windows Latest, Microsoft has clarified that Copilot has come a long way since the company slapped the extremely cautious “for entertainment purposes only” label on the AI assistant, and that you should go ahead and start treating it like more than just a novelty.

“The ‘entertainment purposes’ phrasing is legacy language from when Copilot originally launched as a search companion service in Bing,” the company said in a statement to the publication. “As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update.”

That explanation passes an initial sniff test. Companies like Microsoft take extra steps to protect themselves, especially with an emerging technology. The company claims the language of the terms of use first appeared while Copilot and Bing Chat were gaining popularity. (Remember when people swore that Bing was going to eat into Google’s market share because Microsoft was first to market with AI-powered search? Good times.) Out of an abundance of caution, and because it really had no idea what kind of outputs its AI assistant was going to give people, the company threw down some boilerplate “Please do not sue us” language to protect itself.

It is true that a version of that language has been present since the company’s first published terms of use for Copilot, dated February 2023. But the text has changed over time. In the initial terms of use, the company stated, “The Online Services are for entertainment purposes; the Online Services are not error-free, may not work as expected and may generate incorrect information.”

That phrasing remains through every archived iteration of the terms up until the update published on October 24, 2025, which is the most recent and still in effect. At that point, the language changes to what users spotted: “Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.”

Now that smells a little fishier. That October 2025 update came after CEO Satya Nadella insisted that Copilot was part of his daily workflow running a multi-trillion-dollar multinational corporation. It came after the company started inserting Copilot branding and functionality into every nook and cranny of its software suite. It came well after any plausible suggestion that the company views the technology as a plaything.

It’s still likely that this is just some excessive ass-covering that Microsoft will eventually remove, as the company claims that it will. But it’s also worth noting that for the moment, the company is insisting that you should treat Copilot as something substantially more than just an entertainment tool while also getting cover from the language in its terms of use. It’s having its cake and eating it, too.
