TikTok removes 3.78m Nigerian videos in three months
New data from the West Africa Safety Summit reveals the scale of automated moderation and rising scrutiny over online safety.

TikTok removed 3.78 million videos in Nigeria between April and June 2025 for violating its Community Guidelines, according to figures released at the West Africa Safety Summit in Dakar. It is the first time the company has published a quarterly breakdown for an individual African country. Nigeria’s total represents roughly two percent of global removals during the period and highlights the sheer scale of the platform’s enforcement in the region.
The new numbers place Nigeria within a wider global picture. Across the world, TikTok removed 189 million videos in the second quarter of 2025. In Nigeria, 98.7 percent of the removed videos were detected before anyone viewed them, while 91.9 percent were taken down within 24 hours. These results sit slightly below TikTok’s global averages of 99.1 percent proactive detection and 94.4 percent removal within a day. Even so, the figures show how heavily the platform relies on automated systems designed to stop harmful content before it spreads.
TikTok said artificial intelligence carried out most of the removals. Globally, its AI tools accounted for 163.9 million of the 189 million deletions. The company also removed 76.99 million fake accounts and took action against 25.9 million accounts suspected to belong to users under 13. In Nigeria, 49,512 LIVE sessions were banned for breaking monetisation rules, part of TikTok’s push to offer more transparency around creator earnings and livestream behaviour.
TikTok positioned both the Dakar summit and the enforcement report as part of an effort to balance global policies with local realities. “While global, we remain hyper-local in our everyday efforts,” said Duduzile Mkhize, TikTok’s Outreach and Partnerships Manager for Sub-Saharan Africa. She told delegates that sustained collaboration with policymakers, regulators and civil society groups across West Africa was essential to prevent a fragmented and unsafe digital environment.
The summit brought together government officials, regulators and civil society groups from countries including Senegal, Mali, Côte d’Ivoire, Burkina Faso, Chad and Ethiopia. For many participants, the detailed data was a welcome development. “The convening of stakeholders in Dakar and the sharing of concrete enforcement figures prove that the work we do alongside TikTok is not in vain,” said Akinola Olojo, a Nigerian expert on preventing violent extremism and a member of TikTok’s Sub-Saharan Africa Safety Advisory Council. He described the discussions as an important step toward building systems that help communities resist radicalisation and use online spaces safely.
The Nigerian numbers raise several difficult questions. High proactive detection rates show the reach of automated moderation, but they also prompt concerns about accuracy, appeals processes and whether AI tools can fairly judge content that depends on cultural or contextual nuance. Civil society groups have repeatedly asked how accuracy is measured, how free expression is protected and how platforms can ensure that enforcement does not silence marginalised voices.
The new transparency around livestream monetisation is also likely to spark debate. TikTok issued warnings or demonetised 2.32 million LIVE sessions globally in the quarter and took action against 1.04 million creators for monetisation breaches. For creators in West Africa, these numbers will intensify questions about who earns money on the platform and how clearly enforcement decisions are communicated.
TikTok presented the Dakar meetings as the beginning of deeper regional engagement. The company said the discussions would shape future safety policies tailored to West African languages, cultures and emerging threats. It is a significant commitment, but one that will require sustained cooperation with governments, civil society and local communities to build trust and workable safeguards.
For now, the headline figures remain striking. Globally, harmful or rule-breaking content continues to account for less than one percent of uploads, standing at around 0.7 percent in the second quarter of 2025. TikTok cites this to argue that its proactive systems are effective. Yet the Nigerian total, millions of removals in just three months, shows how quickly harmful content can scale and how consequential moderation decisions are for safety, speech and livelihoods across the region.
The Dakar summit ended with a shared list of priorities: greater transparency, stronger local partnerships, improved language-sensitive moderation, better child safety protections and clearer appeals processes. If these commitments lead to concrete action, West Africa could see a more accountable model of platform governance. If not, the data released in Dakar risks becoming a one-off disclosure that raises more questions than it answers.