In a major content purge, TikTok removed more than two million videos in Nigeria between July and September 2024, citing violations of its Community Guidelines.
The social media giant revealed this in its Q3 Community Guidelines Enforcement Report, stressing that the move aligns with its global commitment to online safety and content integrity.
According to the report, 99.1% of these videos were removed within 24 hours, while a staggering 92.1% were taken down before they could even be reported by users.
A Global Effort with Local Impact
TikTok, which boasts over a billion users worldwide, has intensified its content moderation strategies to swiftly detect and eliminate harmful material. The platform stated:
“With over a billion people around the world using the platform and millions of pieces of content posted every day, TikTok continues to invest in technologies that improve content understanding and assess potential risks, allowing the platform to remove harmful content before it reaches viewers.”
Between July and September 2024, TikTok deleted over 147 million videos globally, with 118 million of them removed through automated processes. Nigeria was one of the countries significantly affected by the crackdown, as the company ramped up efforts to enforce its policies.
What Kind of Content Was Removed?
The report highlights that Nigerian videos were mainly removed under three key categories:
- Sensitive and Mature Themes:
  - 99.4% of flagged videos in this category were deleted before any user reported them.
  - This category covers content deemed inappropriate due to explicit themes or harmful discussions.
- Regulated Goods and Commercial Activities:
  - 99.1% of such videos were removed before being reported.
  - These include fraudulent schemes, scams, and illegal trading of goods such as firearms and explosives.
- Mental and Behavioral Health Concerns:
  - 99.9% of videos promoting potentially harmful mental health content were removed before user complaints.
  - The company emphasizes its commitment to shielding younger audiences from material that could harm their mental well-being.
A Firm Stand on Online Safety
TikTok has repeatedly stated that user safety and content integrity remain at the core of its mission. The platform claims to be working on creating a space where users can freely express creativity without being exposed to harmful material.
“TikTok’s mission to inspire creativity and bring joy is underpinned by a strong commitment to user safety, well-being, and content integrity. The platform prioritizes a positive environment where users feel free to create, connect, and be entertained,” the company stated.
As part of these efforts, TikTok says it is investing heavily in Trust and Safety professionals who work alongside automated detection systems to enforce its Community Guidelines.
A Growing Pattern of Content Regulation
TikTok’s massive content purge in Nigeria is part of a broader trend of stricter content moderation policies by major social media platforms. In recent years, companies such as Meta (formerly Facebook) and YouTube have also intensified their enforcement strategies to curb misinformation, violent content, and other violations.
The move comes at a time when governments worldwide are pressuring tech companies to do more to combat harmful content, particularly regarding issues like misinformation, cybercrime, and online harassment.
For Nigeria, a country where social media plays a significant role in politics, activism, and entertainment, such large-scale video removals may spark debates on censorship versus content regulation.
Concerns Over Algorithmic Bias?
While TikTok’s proactive approach to content moderation is being applauded in some circles, others worry about potential algorithmic biases in the process. Critics argue that automated content moderation could unfairly target certain creators or lead to the wrongful removal of legitimate content.
“There needs to be greater transparency in how these decisions are made,” said one Nigerian digital rights advocate. “When millions of videos are being removed, we need to ask whether all of them genuinely violate the guidelines or whether the system is biased against certain types of content.”
As TikTok continues refining its policies, content creators in Nigeria may need to stay updated on evolving platform regulations to avoid having their videos flagged or removed.
Looking Ahead: Stricter Regulations or More Freedom?
With TikTok’s crackdown intensifying, Nigerian content creators face a rapidly changing digital landscape. The company’s continued investment in AI-powered moderation tools means that even more videos may be automatically flagged in the future.
However, the debate over content moderation versus digital freedom is far from over. While some argue that platforms like TikTok must ensure a safe environment, others caution against overregulation that could stifle free expression.
For now, TikTok has reaffirmed its dedication to collaborating with global experts to improve safety measures and protect its community. Whether this results in fairer moderation or stricter policies remains to be seen.
