Bugattis, Brotherhood, and the Business of Belonging: How the Manosphere Finds Boys Before Parents Do

What color is your Bugatti?

If you have been around Gen Z boys in the last few years, chances are you have heard this curious question tossed around.

The phrase “What color is your Bugatti?” has echoed everywhere from classrooms to gaming chats. You might be surprised to learn the line was never really about cars. It’s a catchphrase made famous by Andrew Tate, a British-American-Romanian influencer and self-styled “success coach” who has been detained on serious criminal charges and remains under ongoing investigation. For boys, repeating the line can feel like an inside joke. For parents, it’s a subtle clue that a child may be brushing up against a much larger online ecosystem: the manosphere.

The manosphere is a loosely connected yet powerful network of influencers, forums, and video channels that mask misogyny, conspiracy theories, scapegoating, and alt-right rhetoric under the guise of motivational advice targeted toward young men. 

As Stephen Balkam, FOSI’s CEO, noted in his blog on the Netflix drama Adolescence, we’re watching an entire generation of boys wrestle with toxic digital role models who disguise harmful messages as empowerment.

Figures like Tate, Adin Ross, Fresh and Fit, and dozens of other “manfluencers” pose as mentors, offering young men advice on confidence, fitness, financial success, and dating. But unlike most self-improvement trends, the manosphere carries a darker subtext: you don’t feel powerless because you’re struggling—you feel powerless because feminism, women, or social progress have taken what’s rightfully yours.

For caregivers and educators, this isn’t background noise. It’s an ecosystem that can exploit boys’ real struggles with identity, mental health, and belonging, often isolating them further and profiting off of this isolation.

Why Boys Click in the First Place

Most boys don’t seek out misogyny; they seek answers: How do I build confidence? Make money? Get stronger? They find a creator who resonates with them, and the algorithm does the rest.

Exposure to this content is often a matter of algorithmic pathways and lack of oversight rather than active searching. Research conducted by Ofcom shows that children are far more likely to come across harmful or misogynistic content when parents are less engaged in what their kids are doing online. 

Vodafone’s 2024 AI Aggro-rithms report underscores just how quickly young boys can be led down dark rabbit holes: 69% of boys aged 11–14 were exposed to misogynistic content within 30 minutes of being online, and one in ten saw it in as little as 60 seconds. Strikingly, 59% of boys were led there through completely unrelated searches, showing how aggressively algorithms push this material. 

Boys click out of curiosity, but they stay because the content names their deepest insecurities and claims to solve them: fears about confidence, belonging, and masculinity that they feel no one else is addressing.

But the “solutions” presented to these problems are never neutral; they are pitched as quick fixes that tie self-worth to dominance, wealth, or control over women. What starts as gym or money advice quickly pivots to blaming feminism or “weak men,” and once a young person engages even lightly, recommendation systems tend to feed them more of the same, making fringe voices and ideologies feel mainstream.

When Self-Improvement Turns Into Exploitation

Researchers at Dublin City University created test accounts designed to mimic teenage boys. These accounts only lightly engaged with typical, non-inflammatory content. Within just a few hours, up to 78% of those accounts were shown mostly toxic “alpha male” and anti-feminist material, turning casual searches for fitness tips or motivation into a near-constant stream of misogyny. 

The exposure alone is deeply concerning, but so is how readily many boys embrace these messages once they encounter them.

A 2024 poll by Hope Not Hate found that 79% of 16- and 17-year-old boys in the UK had consumed content from Andrew Tate. By comparison, only 58% of the same boys said they had heard of then UK Prime Minister Rishi Sunak. Over half of those boys held a positive view of Tate, often saying that he “wants men to be real men” or “gives good advice.” Just 1% of 16- and 17-year-old girls shared a positive view of Tate.

Teachers and parents worldwide are already witnessing the fallout.  Research shows that the more male pupils engage with online misogyny, the more girls and female staff report sexist discrimination at school. This remains true even after accounting for school size and socioeconomic factors. 

While young people suffer, manfluencers profit. A 2025 Center for Countering Digital Hate (CCDH) report found that Tate-branded misogynistic videos on YouTube amassed 54 million views in a single year, with 31 of the 100 videos studied carrying ads from major brands despite violating platform policies. Beyond Tate earning advertising revenue through his own channels, a large share of these videos comes from “fan accounts” that double as marketing pipelines. Many include affiliate links to Tate’s paid course, meaning attention on YouTube can be converted into sign-ups and revenue for Tate’s business even while he’s officially banned. 

In other words, harmful content isn’t just slipping through the cracks. It’s actively being amplified in ways that make it harder for adults to keep young people safe online. 

What We Can Do

The current online climate for young men and boys can feel overwhelming. But there are many powerful counter-influences, including parents. 

The manosphere thrives on silence, shame, and boys who feel unseen. Its influence doesn’t come from the quality of its advice, but from its ability to prey on vulnerability and package it as community. That’s why parents and educators need to first understand the playbook: the pivot from “work harder” to “women are the problem,” the algorithmic rabbit holes that speed it up, and the profit streams that keep it alive.

As Stephen Balkam outlined in his piece, the practical steps are already clear: boundaries, consistency, and conversations. But putting these into practice requires more than just setting screen limits—it’s about building trust and making digital life a regular part of family life. 

Start small: sit down with your child and ask what content they enjoy and what makes them uncomfortable. Explain, in simple terms, how recommendation algorithms work, and help them spot when their feed starts pushing more extreme material. Make these check-ins routine, not reactive: just a few minutes over dinner or on the way to school can normalize open talk about online habits. Just as important, point them toward healthier voices online (such as athletes, educators, or creators who model confidence without tearing others down), so the algorithm has positive signals to amplify. These measures matter urgently in creating a world where boys feel seen and supported, and are less likely to turn to harmful digital role models.

If we don’t recognize the manosphere for the trap that it is, a business model built on exploiting boys’ pain, then the next “Bugatti” joke won’t just be a meme. It will be another warning we ignored. But if we choose to see it, it becomes the key to unlocking the conversation we can no longer afford to ignore. 

Arpitha Sistla