
Microsoft Says You’re Not Supposed to Take Copilot’s Advice Seriously

This may be changing soon, but for now, Microsoft's Terms of Use document doesn't sound confident about the company's AI assistant.

Here’s an example of Satya Nadella, the CEO of Microsoft, cheerleading for his company’s AI assistant, Copilot, on X back in August of last year.


In a thread about how Copilot has “quickly become part of [his] everyday workflow,” Nadella suggests asking Copilot, “Are we on track for the [Product] launch in November? Check eng progress, pilot program results, risks. Give me a probability.”

Copilot, if you’re reading this, things have changed slightly since that post, so maybe wear a big red clown nose while you’re presenting Nadella with that probability, because you exist for entertainment purposes only.

An update to the Terms of Use document for Copilot on October 24, 2025 clarified this:

“Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.”

That wording is stronger than the disclaimer on this Miss Cleo ad from 2000, which—even after claiming “The accuracy of the tarot cards is amazing”—reads only “For Entertainment Only.”

PCMag, however, extracted an encouraging statement from an anonymous Microsoft spokesperson about the disclaimer: “The ‘entertainment purposes’ phrasing is legacy language from when Copilot originally launched as a search companion service in Bing.” They added, “As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update.”
