Scroll through your feed, and you get an ad for an item you looked at earlier today.
It feels handy at first, a sign of how innovative the platforms have become. But then it crosses a line. When ads know too much, like which street you walked down or what you whispered about last night, they start to feel invasive. A day that begins with gentle product suggestions can end with alarm about your data being sold.
This essay looks at how Africans can find both comfort and alarm in the same personalised ad, and at the point where convenience slides into creepiness.
Personalised ads feel useful at first
Personalised ads offer an easy win. They aim to match your interests with products or services you want. For many Africans, who navigate social media for news, connections, and inspiration, ads that focus on their needs are as welcome as filtered drinking water. Platforms have learned to serve the right content at the right moment.
Meta’s global ad reach shows how massive the stage has become. In January 2025, Facebook ads reached 2.28 billion people, covering nearly 28 per cent of the world’s population, according to DataReportal (Global Digital Insights). Personalised content appears in timelines across continents, from Nigeria to Kenya.
Concern sets in when ads hit too close to home
Personalisation thrives until it begins to feel like surveillance. A global study found that 68 per cent of consumers worry about how much data is collected, and 60 per cent believe their personal data is routinely misused (privacyengine.io). Those numbers reflect a growing unease.
In Africa, worries over privacy go beyond the abstract. In Mozambique, a GeoPoll study found that 31 per cent of users cite privacy concerns as a key barrier to social media use. These concerns matter because many people still depend on digital spaces for livelihood, learning, and social engagement.
Laws racing behind reality
Governments are responding, but unevenly. Yellow Card’s report shows that, as of 2025, 39 of 55 African countries have data protection laws and 34 have established data protection authorities (CIO Africa). That is progress.
Still, a significant gap remains. A 2023 analysis of East African websites found that only 75 per cent had privacy policies, and a mere 16 per cent disclosed which third parties collected user data (arXiv). Citizens may click “accept” without understanding what they are giving away. Language and literacy barriers make this worse.
Real stories, real stakes
Imagine a student in Abuja who starts seeing exam help ads right after searching for past papers. It feels helpful. Then, ads target the student’s neighbourhood shops, classes, and preferences in unsettling detail. That shift can trigger the thought: Who is watching me?
Entrepreneurs in Lagos report that buyers often ask for reassurance that their data won’t be misused before they commit to online payments. That hesitation slows commerce, not because the products lack quality, but because trust feels fragile.
What needs to happen next
Platforms must stop treating privacy as an afterthought. They need simple, straightforward controls, not buried menus. African governments must enforce data laws, not just pass them. Users need more than warning banners; they need education that helps them manage their digital trail.
Regulations should require transparent disclosures in plain language. Platforms should limit ad tracking by default, not after users opt out. Brands should use data responsibly and with respect, not just to convert.
We live in a connected era that makes personalised ads feel like convenience. However, convenience becomes intrusion when platforms collect more than we consent to. Africa today stands at a crossroads. We can choose digital growth that respects privacy or let innovation trample trust. Personalised ads will stay useful only if they stop haunting us.