The short-form video app said 99.7% of the removed content was proactively identified, and 96.2% was taken down within 24 hours of being posted.
Globally, TikTok deleted 189.5 million videos during the quarter — representing around 0.7% of all uploads. Of these, 163.9 million were removed by automated detection tools, and 7.4 million were later reinstated after further review.
The company also removed 76.9 million fake accounts and 25.9 million accounts suspected of belonging to users under the age of 13.
According to the report, 30.6% of the removed videos contained sensitive or mature themes, 14% violated safety and civility standards, and 6.1% breached privacy and security policies. Additionally, 45% of the content was flagged for misinformation, while 23.8% included AI-generated or edited media.
TikTok said the quarterly report underscores its continued efforts to ensure a safe digital environment and maintain transparency. “The regular publication of enforcement reports reflects our commitment to transparency and community safety,” the company said.
In Pakistan, TikTok removed nearly 25 million videos during the first quarter of 2025, according to its Q1 2025 Community Guidelines Enforcement Report, which covers activity from January to March.
According to the report, a total of 24,954,128 videos were taken down in Pakistan for violating the platform’s community guidelines. The proactive removal rate in the country remained exceptionally high at 99.4%, with 95.8% of the flagged videos removed within 24 hours of being posted.
The report further revealed that 30.1% of all removed videos globally contained sensitive or mature themes, making it the most common reason for enforcement.
Other removal reasons included breaches of privacy and security guidelines (15.6%), safety and civility standards (11.5%), misinformation (45.5%), and the use of edited media or AI-generated content (13.8%).
TikTok said that its quarterly enforcement reports are part of its ongoing commitment to transparency and accountability. The company noted that the reports are designed to help users, regulators, and the general public better understand how content moderation is carried out at scale and what types of violations are being addressed most frequently.