Illustration: Alex Ivanovs // Stack Diary

Zoom’s updated Terms of Service permit training AI on user content without an opt-out

The video conferencing platform quietly planted some interesting conditions in its terms of service.

Well, well, well… It looks like Brave isn’t the only company out there that is willing to bet all its chips on reusing other people’s content for AI training.

Zoom Video Communications, Inc. recently updated its Terms of Service to include provisions that some critics are calling a significant invasion of user privacy.


📢 An update.

Hey everyone – Zoom users and curious readers alike. This update was written on 8/12/2023 (August 12th) to recap the timeline of events and to make a final statement on the situation regarding this article.

For starters, let’s look at the entire timeline:

  • August 6th: This article is published.
  • August 7th: A few hours after this article goes live, Gabriella “Biella” Coleman (who happens to have enormous editorial reach) tweets this article out and gets everyone from Zoom users to privacy experts to journalists involved in the discussion. Publications such as The Verge, NBC News, Mashable, and countless others pick this story up and get involved.
  • August 7th: Zoom is now in full damage control mode (and I must give them credit for being proactive and involved; believe it or not – there are brands that don’t do this at all) – they publish a blog post intended to clarify the 10.4 clause. The response was immediately shot down, as it still didn’t address the specific wording of the terms.
  • August 8th: The CEO of Zoom, Eric Yuan, makes a LinkedIn post describing the updated terms as a “process failure” and something that Zoom “will fix immediately”.
  • August 11th: Zoom updates its terms for the second time, with a clause that now says, “Zoom does not use any of your audio, video, chat, screen sharing, attachments or other communications-like Customer Content (such as poll results, whiteboard and reactions) to train Zoom or third-party artificial intelligence models.”

Now, you might say that “we won” and that “this is a victory”, but for what it’s worth – something like this should have never happened in the first place. It’s a stark reminder for any brand or organization out there that plans to advance its product with AI through the use of its own customer content.

My own take on the situation is this: if you’re going to use my data/content to train an AI model, then at the very least tell me (the customer) and let me opt out if I so choose. There is absolutely no reason for you not to do this, unless you want to find out the hard way how people really feel about having their privacy invaded.

Please note that if you continue to read the whole article below – you are reading a version of the article that was published on August 6th with an update added on August 8th. This statement you’re reading right now is the most up-to-date clarification of the situation as it unfolded.


A close reading of the newly updated terms reveals two sections – 10.2 and 10.4 – that stand out for their broad-ranging implications for how Zoom is permitted to use user data. Section 10.2 establishes Zoom’s rights to compile and use “Service Generated Data”: any telemetry data, product usage data, diagnostic data, and similar content or data that Zoom collects in connection with users’ use of its services or software.

Zoom’s updated policy states that all rights to Service Generated Data are retained solely by Zoom. This extends to Zoom’s rights to modify, distribute, process, share, maintain, and store such data “for any purpose, to the extent and in the manner permitted under applicable law.”

What raises alarm is the explicit mention of the company’s right to use this data for machine learning and artificial intelligence, including training and tuning of algorithms and models. This effectively allows Zoom to train its AI on customer content without providing an opt-out option, a decision that is likely to spark significant debate about user privacy and consent.

Additionally, under section 10.4 of the updated terms, Zoom has secured a “perpetual, worldwide, non-exclusive, royalty-free, sublicensable, and transferable license” to redistribute, publish, access, use, store, transmit, review, disclose, preserve, extract, modify, reproduce, share, use, display, copy, distribute, translate, transcribe, create derivative works, and process Customer Content.

Zoom justifies these actions as necessary for providing services to customers, supporting the services, and improving its services, software, or other products. However, the implications of such terms are far-reaching, particularly as they appear to permit Zoom to use customer data for any purpose relating to the uses or acts described in section 10.3.

Privacy advocates and legal experts are expected to scrutinize these updated terms closely. Many argue that they push the boundaries of what is acceptable in terms of consent, data privacy, and individual rights. While Zoom’s intentions may be focused on improving their platform and delivering better service, the breadth and depth of these changes may leave many users uncomfortable and seeking assurances about how their data is being used.

At the time of publication, Zoom had yet to comment on the updates and the concerns raised by these changes (its responses have since been added below). As this unfolds, the debate around privacy in the digital age and the responsibility of companies in respecting user privacy continues to intensify.


An update

Biella Coleman picked this story up and tweeted it out, which got it trending and prompted a response from Zoom.

Aparna Bawa, COO at Zoom, left a comment on Hacker News,

To clarify, Zoom customers decide whether to enable generative AI features (recently launched on a free trial basis) and separately whether to share customer content with Zoom for product improvement purposes.

Also, Zoom participants receive an in-meeting notice or a Chat Compose pop-up when these features are enabled through our UI, and they will definitely know their data may be used for product improvement purposes.

I also received an email from a spokesperson, who said that “Zoom customers decide whether to enable generative AI features, and separately whether to share customer content with Zoom for product improvement purposes.”

While these are all lovely responses, and at least Zoom is engaging in the discussion, they don’t actually address the 10.4 clause in the Terms of Service, which states,

You agree to grant and hereby grant Zoom a perpetual, worldwide, non-exclusive, royalty-free, sublicensable, and transferable license and all other rights required or necessary to redistribute, publish, import, access, use, store, transmit, review, disclose, preserve, extract, modify, reproduce, share, use, display, copy, distribute, translate, transcribe, create derivative works, and process Customer Content and to perform all acts with respect to the Customer Content, including AI and ML training and testing.

Unless these terms are rectified and clarified, Zoom can do exactly what they say they can do, because you are agreeing to grant all of the above-mentioned permissions. I cannot stress enough that these aren’t just “words at a wall” – in the 10.2 clause, you literally consent to Zoom using your data for AI/ML; you do not have the choice to opt out, because opting out is not part of the terms.

Zoom has published a blog post on the matter; you can see the tweet here.

I’m linking to the tweet (and you can also read this Hacker News comment) because there should be discourse around this – unfortunately for Zoom, it’s not in their favor – rather than blanket statements.

UPDATED (8/8/2023):

Zoom has gone ahead and made some adjustments to the 10.4 clause, which now states: “For AI, we do not use audio, video, or chat content for training our models without customer consent.”

I recommend you read this comment from Sean Hogle (an attorney specializing in tech and intellectual property) on what those adjustments actually mean, because it looks like they don’t do much in terms of privacy.