YouTube Community Guidelines Enforcement - Google Transparency Report
Channels removed: 3,260,974
A YouTube channel is terminated if it accrues three Community Guidelines strikes in 90 days, has a single case of severe abuse (such as predatory behavior), or is determined to be wholly dedicated to violating our guidelines (as is often the case with spam accounts). When a channel is terminated, all of its videos are removed. Number of videos removed during this time period due to a channel-level termination: 56,578,636.
This exhibit shows the number of channels removed by YouTube for violating its Community Guidelines per quarter.
Channels removed, by removal reason
[Pie chart: the largest share of channel removals, 80.4%, was for spam, misleading content and scams; three smaller visible slices of 5.0%, 4.8%, and 4.6% fall among the remaining categories, which include Child safety, Nudity or sexual content, Misinformation, and Harmful or dangerous content.]
This chart shows the volume of channels removed by YouTube, by the reason a channel was removed. The majority of channel terminations are a
result of accounts being dedicated to spam or adult sexual content in violation of our guidelines.
When we terminate a channel for receiving three Community Guidelines strikes for violating several different policies within a three-month period, we categorize it under a separate label, "Multiple policy violations", because these accounts were not wholly dedicated to one policy violation.
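To make the interaction of these rules concrete, here is a minimal sketch in Python of how the termination triggers and the "Multiple policy violations" label compose. The Strike class, field names, and flags are hypothetical and only mirror the policy wording above; they do not reflect YouTube's actual data model or systems.

from datetime import datetime, timedelta

# Hypothetical illustration of the channel-termination rules described above.
class Strike:
    def __init__(self, issued_at: datetime, policy: str):
        self.issued_at = issued_at   # when the strike was issued
        self.policy = policy         # which Community Guidelines policy was violated

def termination_decision(strikes, severe_abuse, dedicated_to_violation, now):
    """Return (terminate, removal_reason_label) for a channel under the stated rules."""
    # A single case of severe abuse (such as predatory behavior) terminates the channel.
    if severe_abuse:
        return True, "Severe abuse"

    # Channels determined to be wholly dedicated to violating the guidelines
    # (as is often the case with spam accounts) are terminated.
    if dedicated_to_violation:
        return True, "Dedicated to violating guidelines"

    # Three Community Guidelines strikes within a 90-day window terminate the channel.
    recent = [s for s in strikes if now - s.issued_at <= timedelta(days=90)]
    if len(recent) >= 3:
        policies = {s.policy for s in recent}
        # Strikes under several different policies are reported under the
        # separate "Multiple policy violations" label.
        if len(policies) > 1:
            return True, "Multiple policy violations"
        return True, policies.pop()

    return False, None

# Example: three strikes under two different policies within 90 days.
now = datetime(2024, 6, 30)
strikes = [Strike(datetime(2024, 5, 1), "Spam"),
           Strike(datetime(2024, 5, 20), "Harassment"),
           Strike(datetime(2024, 6, 15), "Spam")]
print(termination_decision(strikes, False, False, now))  # (True, 'Multiple policy violations')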
Videos removed: 8,497,876
YouTube relies on teams around the world to review flagged videos and remove content that violates our Community Guidelines; restrict videos (e.g.,
age-restrict content that may not be appropriate for all audiences); or leave the content live when it doesn’t violate our guidelines.
This exhibit shows the number of videos removed by YouTube for violating its Community Guidelines per quarter.
Videos removed, by source of first detection
Automated flagging: 8,198,119
User: 238,204
Organization: 61,553
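As a quick worked check on the figures above (not part of the report itself), the per-source counts sum to the headline total of 8,497,876 videos removed, with roughly 96.5% first detected by automated flagging:

# Per-source removal counts taken from the chart above (Apr 2024 - Jun 2024).
removals_by_source = {
    "Automated flagging": 8_198_119,
    "User": 238_204,
    "Organization": 61_553,
}

total = sum(removals_by_source.values())
print(total)  # 8497876 -- matches the headline figure for videos removed
print(f"{removals_by_source['Automated flagging'] / total:.1%}")  # 96.5%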
This chart shows the volume of videos removed by YouTube, by source of first detection (automated flagging or human detection). Flags from
human detection can come from a user or a member of YouTube’s Priority Flagger program. Priority Flagger program members include NGOs and
government agencies that are particularly effective at notifying YouTube of content that violates our Community Guidelines.
Videos removed, by views before removal
0 views: 59.91%
1-10 views: 24.34%
11-100 views: 7.23%
101-1,000 views: 4.80%
1,001-10,000 views: 2.42%
>10,000 views: 1.30%
YouTube strives to prevent content that breaks our rules from being widely viewed—or viewed at all—before it's removed. Automated flagging
enables us to act more quickly and accurately to enforce our policies. This chart shows the percentage of video removals that occurred before they
received any views versus those that occurred after receiving some views.
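As a worked calculation from the chart above (assuming the shares pair with the view buckets in the descending order reconstructed there), the buckets cover all removals, and roughly 84% of removed videos had been viewed 10 times or fewer:

# View-count shares of removed videos, as reconstructed from the chart above.
shares = {
    "0 views": 59.91,
    "1-10 views": 24.34,
    "11-100 views": 7.23,
    "101-1,000 views": 4.80,
    "1,001-10,000 views": 2.42,
    ">10,000 views": 1.30,
}

print(round(sum(shares.values()), 2))          # 100.0 -- the buckets cover all removals
caught_early = shares["0 views"] + shares["1-10 views"]
print(f"{caught_early:.2f}% removed at 10 or fewer views")  # 84.25%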
Videos removed, by removal reason
[Pie chart: the largest share of video removals, 59.4%, was for Child safety; the remaining visible slices (12.2%, 7.6%, 5.9%, 4.9%, 3.0%, and 2.6%) are distributed among categories including Harmful or dangerous content, Violent or graphic content, Harassment and cyberbullying, and Nudity or sexual content.]
This chart shows the volume of videos removed by YouTube, by the reason a video was removed. These removal reasons correspond to YouTube’s
Community Guidelines. Reviewers evaluate flagged videos against all of our Community Guidelines and policies, regardless of the reason the video
was originally flagged.
YouTube’s Community Guidelines are enforced consistently across the globe, regardless of where the content is uploaded. When content is removed
for violating our guidelines, it is removed globally. For information about content removals or restrictions based on local laws, see Google’s
Government requests to remove content transparency report.
Videos removed, by country/region (Apr 2024 – Jun 2024, all countries/regions)
1. India: 2,311,929
2. Russia: 968,019
3. Brazil: 836,330
5. Indonesia: 525,627
6. Bangladesh: 244,391
7. Pakistan: 226,281
8. Türkiye: 136,159
9. Philippines: 122,561
10. Mexico: 118,482
Comments removed: 1,372,493,981
YouTube is a vibrant community in which millions of people post billions of comments each quarter. Using a combination of people and technology,
we remove comments that violate our Community Guidelines. We also filter comments that we have high confidence are spam, holding them for creators to review and approve if they choose.
This exhibit shows the volume of comments removed by YouTube for violating our Community Guidelines, together with comments filtered as likely spam that creators did not approve.
The data does not include comments removed when YouTube disables the comment section on a video. It also does not include comments taken
down when a video itself is removed (individually or through a channel-level suspension), when a commenter’s account is terminated, or when a user
chooses to remove certain comments or hold them for review.
Comments removed, by source of first detection
Automated flagging: 99.6%
Most removed comments are detected by our automated flagging systems, but they can also be flagged by human flaggers. We rely on teams
around the world to review flagged comments and remove content that violates our Terms of Service, or leave the content live when it doesn’t
violate our guidelines.
This chart shows the volume of comments removed by YouTube for violating our Community Guidelines, by source of first detection (automated flagging or human detection). The majority of actions we take on comments are for violations of our guidelines against spam.
Comments removed, by removal reason
[Pie chart: the largest share of comment removals, 80.8%, was for spam, misleading content and scams; the remaining visible slices (6.9%, 5.0%, and 4.3%) are distributed among categories including Harassment and cyberbullying, Child safety, Violent or graphic content, and Hateful or abusive content.]
This chart shows the number of comments removed by YouTube, by the reason a comment was removed. These removal reasons correspond to YouTube’s Community Guidelines. The majority of actions that we take on comments are for violations of our guidelines against spam.