By Sonny Aragba-Akpore
Government’s Crackdown on Offensive Content Sparks Debate
The recent deactivation of over 13 million social media accounts belonging to Nigerians by the government was received with knocks and cheers by the public. Was it a wrong move? Some said yes; others said no. For a better understanding of the scenario that led to the delisting, let us situate it properly.
Was the government empowered by law? The answer is yes. Was there sufficient ground for the delisting? Again, yes. Were the people notified of the government’s intentions? Also yes.
So what went wrong? In 2024, Facebook, Instagram and other platforms took similar action, deleting millions of social media accounts on the ground that the users had violated the platforms’ ground rules.
In the same year, the Federal Competition and Consumer Protection Commission (FCCPC) imposed a fine of $220m on Meta Group for infractions and violations of competition rules. Meta went to court and lost.
Legal Backing and Precedents in Tech Regulation
So, when the government in its wisdom decided to deactivate the social media accounts of those violating the ground rules, it was believed to have acted in good faith. The delisted accounts allegedly violated the code of practice on offensive content.
The government’s action is contained in a ‘Code of Practice 2024 Compliance Report’ submitted by promoters of interactive computer service platforms such as Google, Microsoft and TikTok, among others. The accounts shut down were on Facebook, Instagram, TikTok and X (Twitter) for violating the code.
Content Removal at Unprecedented Scale
Hadiza Umar, the National Information Technology Development Agency (NITDA) Director of Corporate Communications and Media Relations, said in a statement last week that 58,909,112 pieces of offensive content were taken down from various platforms for violating the Code of Practice for Interactive Computer Service Platforms. She commended Google, Microsoft, and TikTok for complying with the Code of Practice.
This Code of Practice was issued jointly by the Nigerian Communications Commission (NCC), the National Information Technology Development Agency (NITDA), and the National Broadcasting Commission (NBC).
“The compliance reports provide valuable insights into the platforms’ efforts to address user safety concerns in line with the code of practice and the platforms’ community guidelines,” she said.
There were 754,629 complaints registered across the platforms, while 420,439 pieces of content were taken down and re-uploaded following user appeals.
“The submission of these reports marks a significant step towards fostering a safer and responsible digital environment for Nigerian users.
“It also demonstrates the platforms’ commitment to ensuring a secure and trustworthy online environment for all.
Collaboration Between Regulators and Tech Platforms
“This achievement reflects the provisions of the code of practice, which mandates that large service platforms are registered in Nigeria and comply with relevant laws, including the fulfilment of their tax obligation, while reinforcing the commitment to online safety for Nigerians.
“While NITDA acknowledges these commendable efforts, we emphasise that building a safer digital space requires sustained collaboration and engagement among all stakeholders.
“We remain committed to working with industry players, civil society, and regulatory partners to further strengthen user safety measures, enhance digital literacy, and promote trust and transparency in Nigeria’s digital ecosystem,” Umar emphasised.
In July 2024, the Federal Competition and Consumer Protection Commission (FCCPC), in collaboration with the Nigeria Data Protection Commission (NDPC), imposed a whopping $220m fine on Meta Group, owners of Facebook, Instagram and WhatsApp.
Its offence was violation of the data privacy of individual and corporate customers. Analysts at the time saw this as killing a fly with a sledgehammer.
The Bigger Question: Safety vs. Digital Freedom
Earlier, Meta Platforms had justified the encroachment on privacy when it delisted and deactivated 63,000 Facebook and Instagram accounts allegedly being used by a certain category of subscribers for scam activities, including sextortion and what is commonly referred to as “yahoo” in Nigeria, thus starting a battle that would linger and consume beleaguered consumers.
In imposing the $220m fine, FCCPC in a statement signed by its then acting Executive Chairman, Adamu Abdullahi, said that Meta had denied Nigerian users control over their data, shared data without consent, and abused its market dominance.
It said, “The final order also imposed a monetary penalty of Two Hundred and Twenty Million U.S. Dollars only ($220,000,000.00) (at prevailing exchange rate where applicable) which penalty was in accordance with the FCCPA 2018, and the Federal Competition and Consumer Protection (Administrative Penalties) Regulations 2020.”
The FCCPC noted that this decision was reached after a joint investigation by it and the Nigeria Data Protection Commission (NDPC), which lasted for 38 months (May 2021 to December 2023). The investigation examined Meta’s conduct, privacy policies, and operations.
But a WhatsApp spokesperson said the decision would be appealed. “We disagree with both this decision and the fine and will appeal,” the spokesperson said. The group appealed the government’s decision and lost.
FCCPC vs Meta
On April 25, 2025, the Competition and Consumer Protection Tribunal (CCPT) upheld the $220 million fine imposed on Meta Platforms Inc., the parent company of Facebook, and WhatsApp LLC by the Federal Competition and Consumer Protection Commission (FCCPC) for engaging in discriminatory and exploitative practices against Nigerian consumers.
Delivering judgment in Abuja, the three-member tribunal panel led by Hon. Thomas Okosun ruled that the FCCPC acted lawfully and within its constitutional powers. The panel also awarded the Commission $35,000 to cover the cost of its 38-month-long investigation, which began in 2021 in partnership with the Nigeria Data Protection Commission (NDPC).
The case was based on alleged breaches in Meta and WhatsApp’s privacy practices, data handling policies, and consumer engagement standards, which the FCCPC considered non-compliant with Nigerian law.
The tech giants had appealed the FCCPC’s Final Order issued in July 2024, which found them liable for anti-competitive conduct and unfair business practices.
NITDA’s Deactivation Based on Rules of Engagement
But NITDA’s deactivation of over 13 million social media accounts is predicated on the rules of engagement. Part II, Section 10 of the Code mandates that Large Service Platforms (LSPs) submit a compliance report to NITDA. The rationale behind the yearly compliance report is significant, as it plays a crucial role in cultivating a safer and more accountable digital environment in Nigeria.
By requiring LSPs to submit compliance reports, the Code aims to ensure transparency, increase accountability, and enforce adherence to regulatory standards, thereby bolstering user safety and fostering a reliable cyberspace.
The Code sets various compliance requirements for Platforms to meet, aligning with the broader objectives of safeguarding user interests and combating online harms. These requirements include, among others, Account Deactivations.
“Platforms must promptly close and deactivate accounts of bad actors to protect users from potential risks.”
Platforms Are Obligated to Remove Harmful Content
On Content Removals, Platforms are obligated to remove harmful content, whether flagged by users or authorities or identified through internal reviews, to maintain a safe online environment.
In terms of Content Reinstatement, Platforms must provide mechanisms for users to appeal for the reinstatement of removed content, promoting fairness and user empowerment.
And in terms of User Complaint Handling, Platforms must establish effective complaint resolution mechanisms to address user concerns promptly and improve the overall user experience.
“Compliance with these requirements is crucial for prioritising the well-being of Nigerians online by reducing exposure to harmful content, enhancing user trust, and fostering a positive digital experience.
Each Platform’s compliance effort reflects a commitment to meeting these standards and ensuring a safer online space for each user.”
Platforms’ Compliance Reveals Commitment to User Protection
Analysing how each Platform complied with the Code revealed their tailored approaches and commitment to user protection. For instance, Google’s proactive content moderation measures, responsive feedback handling, and reinstatement procedures showcased a user-centric approach to compliance.
LinkedIn’s enforcement of community guidelines and professional policies, swift content takedowns, and resolution of user complaints underline its dedication to maintaining a trusted platform for professionals.
TikTok’s collaborative efforts with stakeholders, strict moderation policies, and user empowerment initiatives highlight its commitment to combating harmful content and facilitating a safe online environment.
“By complying with the Code’s requirements, these Platforms not only fulfil regulatory obligations but also contribute to cultivating a safer, healthier, and more responsible digital landscape that prioritises user well-being and fosters trust among Nigerian users,” the NITDA document explains.
Delisted Users Violated Ground Rules for Safer Online Space
In reality, the over 13 million users delisted violated some or all of the ground rules for a safer online space, NITDA insists.
NITDA’s analysis of the 2024 compliance reports submitted by social media platforms operating in Nigeria, now in the second year of reporting under the Code, indicates modest progress toward meeting the Code’s requirements.
While platforms such as Google, TikTok, and LinkedIn have taken visible steps to improve compliance, including account deactivations, harmful content removals, and enhanced user reporting mechanisms, overall adherence remains inconsistent and uneven across key obligations.
Notable efforts have been observed in content moderation and user complaint resolution by some platforms. Google, TikTok, and LinkedIn have exhibited compliance efforts, resulting in the deactivation of over 28 million accounts and the removal of more than 58 million pieces of harmful content. Their commitment to transparency is evident in efficiently handling appeals for content re-uploads and resolving millions of user complaints.
Some Platforms Yet to Demonstrate Code’s Level of Accountability
However, some platforms are yet to demonstrate the level of accountability and proactive engagement required to meet the spirit and letter of the Code. Of particular concern is Meta’s failure to submit its content moderation report using the template prescribed by NITDA, which undermines comparability and limits the ability to assess compliance uniformly.
“Most concerning is the complete lack of compliance by X (formerly Twitter), which has failed to submit its 2024 compliance report and to meet other requirements of the Code, including incorporating in Nigeria, maintaining a physical contact address, and designating a local compliance officer.”
NITDA said “moving forward, it is imperative that platforms strengthen their internal compliance processes, scale up investment in trust and safety operations tailored to Nigeria, and engage more constructively with regulators. Greater collaboration, transparency, and consistency will be essential to build meaningful trust with Nigerian users and regulators alike.”
The 2024 reporting cycle reflects a growing awareness of regulatory expectations but also underscores the need for stronger enforcement and sustained commitment from platforms.
“NITDA remains committed to its role in promoting a safe and secure digital environment through active oversight, multi-stakeholder engagement, and continued public awareness,” the document says.
“As Nigeria advances toward building a resilient digital future, the full and genuine implementation of the Code of Practice by platforms is not only a regulatory expectation but a shared responsibility necessary for safeguarding users and upholding the integrity of the country’s digital space,” the document added.