Every click, every scroll, and every online interaction leaves behind a digital footprint — proof that in the age of algorithms, the internet never forgets. In today’s Nigeria, data has become the lifeblood of the booming entertainment, fashion, and technology sectors.
But as artificial intelligence (AI) deepens its influence on data collection and use, a critical question arises: What kind of legal framework do we need — not just in theory, but in practice — to protect our digital privacy?
From Analogue to AI: A Shifting Privacy Landscape
Historically, privacy in Nigeria was a matter of physical boundaries and silent spaces. Under Section 37 of the 1999 Constitution, citizens were guaranteed the privacy of their homes, correspondence, and telephone communications.
In that pre-digital era, data was physical, and breaches were easier to trace. But with the advent of the internet — and now AI — data has become ambient, borderless, and invisible. An early legislative effort, the Nigeria Data Protection Regulation (NDPR) 2019, laid a basic foundation, but as a subsidiary regulation rather than an Act of the National Assembly, it lacked robust statutory authority.

The Nigeria Data Protection Act (NDPA) 2023 aimed to address this gap more comprehensively. Yet, questions persist: Is the NDPA constitutionally grounded? Does it effectively tackle the subtle and sophisticated ways data is now being harvested and used?
The Constitutional Quandary
For any data protection regime to be sustainable, it must rest on solid constitutional ground. But is privacy — and by extension, data protection — a federal or state issue?
Because “privacy” isn’t explicitly mentioned in either the Exclusive or Concurrent Legislative Lists, one could argue it falls under residual matters — the domain of Nigeria’s 36 states. If this interpretation holds, the NDPA’s federal legitimacy could be contested, leading to a fragmented system with varying state-level laws and enforcement approaches.
That legal ambiguity risks undermining both trust and compliance — a danger not just for lawyers and lawmakers, but also for businesses, startups, creators, and consumers. Nigeria needs either to establish data protection unambiguously as a federal concern or to delegate it through an explicit legislative mandate.
Regulation Must Empower, Not Exploit
Our regulatory philosophy must evolve. If the Nigeria Data Protection Commission (NDPC) is viewed as a revenue-generating agency rather than a rights guardian, its legitimacy is threatened.

Public statements about revenue goals — like the NDPC’s target to “recoup” funds — signal a troubling direction. They undermine the Commission’s mission and deter the very groups data protection laws aim to support, especially startups and small businesses. Burdensome annual fees, audits, and compliance mandates can throttle innovation without meaningfully improving data safety.
This raises a bigger question: Do we even need a standalone Data Protection Commission? In a data-driven world, wouldn’t it be more effective to embed privacy oversight across existing institutions — like the Consumer Protection Council, sectoral regulators, or the judiciary — with harmonized rules and clear accountability?
AI and the New Face of Exploitation
AI is redefining privacy in ways we never imagined. It captures not just explicit data, but inferred preferences — what we linger on, scroll past, or emotionally respond to.
Imagine a fashion app that adjusts prices or suggests products based on your mood or hesitation. This isn’t theoretical. It’s happening. And today’s “terms and conditions” aren’t enough protection.
During a session hosted by the Intellectual Property Lawyers Association Nigeria (IPLAN), a creator’s case revealed how exploitative terms buried in digital platform agreements enabled the use of personal data without consent. This underscores the need for radical transparency in AI systems — explainable algorithms, clear opt-out choices, and enforceable rights against profiling.
Without that, we risk enabling a system where users are manipulated by unseen forces — nudged, priced, or judged by machines they can’t question.
Privacy Is Power
Privacy isn’t a privilege — it’s a human right. But rights only matter if they’re usable. Artists, traders, designers, and students should be able to access and correct their data without costly legal action. Solutions should be intuitive — think online portals, local data ombudsman offices, and fast, fair dispute resolution channels.
Conclusion: Toward a People-Centered Future
The NDPA 2023 is an important milestone. But true digital safety goes beyond legislative text. We must prioritise vision, accessibility, and enforcement that protects people, not profits.
Let’s not build bureaucratic castles of compliance. Let’s design a digital ecosystem where rights are preserved, innovation is encouraged, and AI serves the dignity of every Nigerian — not the secrecy of a few corporations.
What we create today will not just shape data governance, but democracy itself.
Folarinwa Aluko
Legal Practitioner and Partner, Trumann Rockwood Solicitors
✉️ fmaluko@trumann-rockwood.com