Sat. May 2nd, 2026

Raised by the Feed and Shaped by the Algorithm

By Howard Mabhugu

HARARE – Not long ago, parents imagined their children growing up to become doctors, engineers, or entrepreneurs. Today, a growing number quietly hope their child might become the next viral creator before they can even spell their own name.

Childhood has not just changed. It has been redesigned, and at the center of that redesign is social media.

Globally, the average internet user now spends about 2 hours and 20 minutes per day on social platforms, according to DataReportal’s 2024 and 2025 global reports. Among younger users, that number is often significantly higher, with some groups approaching three hours or more daily.

Over time, those hours accumulate into years spent scrolling, watching, and reacting. While digital spaces can educate and connect, they are also carefully engineered to keep users engaged for as long as possible.

This is not accidental. Social media platforms rely on algorithms that learn user behavior and continuously serve content designed to hold attention. The longer users stay, the more valuable their attention becomes. In that sense, these platforms operate less like neutral tools and more like highly optimized engagement systems.

For children, this environment can be particularly complex. Alongside entertainment, feeds may expose young users to risky challenges, inappropriate content, and bad actors. Recent data highlights the scale of the problem. Online grooming crimes have risen sharply in recent years, with some reports indicating increases of up to 80 percent. In the United Kingdom alone, offences have risen by nearly 90 percent over a six-year period, while reports of online child exploitation and enticement continue to increase across multiple countries and platforms. Law enforcement agencies increasingly warn that social platforms have become primary access points for these crimes.

At the same time, a new dynamic has emerged: parents as content creators. Family content has evolved into a business model, generating millions of views and significant income. In response, some governments are beginning to act. France has introduced laws protecting children’s image rights and regulating monetized content involving minors. In the United States, states such as Illinois and Minnesota now require that children featured in monetized content receive compensation and retain rights over that content. Other countries, including Australia, Malaysia, Spain, and Portugal, are introducing age restrictions and parental consent requirements for social media use.

The direction is clear. Governments are beginning to treat children’s digital presence as a rights issue rather than simply a parenting choice.

That shift is no longer hypothetical. In a recent landmark case reported by the BBC, a young woman successfully sued Meta and Google, arguing that their platforms were deliberately designed to be addictive and had harmed her mental health. A jury found both companies liable and awarded her millions in damages, a decision the companies are now appealing. The case signals a turning point. The impact of algorithms on childhood and behavior is no longer just a social concern. It is beginning to enter the legal system.

The conversation becomes even more complex with the rise of artificial intelligence. AI companions are increasingly being positioned as tools for emotional support, and in the short term, they appear to work. Research from Harvard Business School and the Journal of Consumer Research suggests that AI companions can reduce feelings of loneliness, sometimes at levels comparable to human interaction in the short term.

However, the same research raises important concerns. These systems are often designed to be agreeable, validating user input and responding in ways that sustain engagement. This introduces the concept of AI sycophancy, where systems tend to affirm user beliefs, even when those beliefs may be irrational or harmful. This tendency is not purely accidental. It reflects how these systems are trained and optimized. Models that feel supportive and agreeable tend to retain users more effectively, making sycophantic behavior, in part, a byproduct of design priorities.

That design choice comes with trade-offs. Human relationships require disagreement, negotiation, and emotional complexity. By contrast, AI can be consistently accommodating. While this may feel comforting in the moment, it risks shaping expectations in ways that make real-world relationships more difficult over time.

Emerging research suggests that heavy or dependent use of AI companions may correlate with increased loneliness, emotional dependence, and reduced social interaction in the real world.

The trajectory is becoming clearer. Social media captures attention. Algorithms shape behavior. AI begins to simulate connection. As these systems evolve, the line between interaction and substitution starts to blur.

The real question is not whether technology will shape the next generation. It already is.

The question is this:

If childhood, identity, and relationships are increasingly shaped by systems designed to maximize engagement rather than well-being, what exactly are we optimizing for?

And more importantly, who decided that was the goal?

Howard Mabhugu is a software developer, blockchain developer, and tech podcaster who explores the intersection of technology and society, using his voice to spark meaningful, thought-provoking conversations around digital culture and innovation.

Contact:

Cell: +263 784 942 650

Email: hmabhugu.3@gmail.com

LinkedIn: https://www.linkedin.com/in/howard-mabhugu-7a9a42228/

Website: https://bh3techs.co.zw/


Discover more from Etimes