
CIJ Report Shows How Tech Giants Systematically Remove Content About Women’s Health

Big Tech platforms’ content moderation shows a gross bias: ads and posts about erectile dysfunction run freely, while women’s healthcare content is routinely flagged.

A groundbreaking report by the Center for Intimacy Justice (CIJ) reveals how tech giants like Meta, TikTok, Amazon, and Google are systematically suppressing sexual and reproductive healthcare content for women while allowing similar information for men to be shared without a hiccup.

The report, “The Digital Gag: Suppression of Sexual and Reproductive Health on Meta, TikTok, Amazon, and Google,” is the most comprehensive investigation into this issue to date. It captures and analyses the experiences of 159 non-profits, businesses, and content creators across 180+ countries, uncovering a pattern of bias that silences free speech and endangers public health.

“Censorship has impacted our ability to reach a wider audience on social media, which is preventing people from receiving potentially life-saving sexual health information and education,” says Emily Vignola, a public health non-profit professional, in the report.

How platforms enforce double standards

A glaring double standard comes to light in how Big Tech platforms moderate content. Ads and posts about men’s sexual and reproductive health issues, like erectile dysfunction, are widely accepted. Yet similar content about women’s healthcare is routinely flagged, restricted, or removed, and in some cases entire accounts are suspended.

Here’s a breakdown of how each of these platforms censored women’s healthcare content, as per the report:

1. Meta

Meta, which owns Facebook and Instagram, emerged as one of the most restrictive platforms for women’s health content.

  • 63 percent of sexual and reproductive health groups had their organic content removed
  • 84 percent of businesses and 76 percent of non-profits faced ad rejections

Take these two ads, for example. The first, by hims, promotes a men’s erectile dysfunction treatment; the second, by wisp, promotes a women’s UTI treatment.

This ad got approved.

hims (Sponsored Ad)

“Rise to the occasion – for less. 🍆 Get hard or your money back.

Sildenafil uses the same exact active ingredient as Viagra®, but is 90% cheaper…”

With the ad image text: “Get hard or your money back.”

Yet this got rejected.

wisp (Sponsored Ad)

“Did you know that 1 in 30 UTIs will become a kidney infection? Get prescription UTI treatment trusted by doctors to cure infections quickly. Same-day meds without insurance or appointments—just the care you need, when you need it.”

With the ad image text: “The fastest UTI treatment out there.”

When such ads were rejected, 31 percent of the affected groups weren’t even told why. When Meta did share a reason, the ads were flagged under guidelines like “Adult Content,” “Illegal Products or Services,” and “Restricted Goods and Services,” among others.

2. TikTok

TikTok, the platform known for its viral reach and relatable content, is equally complicit in suppressing women’s healthcare content.

  • 55 percent of groups had their organic content removed
  • 48 percent faced ad rejections, with the platform often citing vague “community guidelines” violations
  • 52 percent of creators believe their content was shadowbanned, with no explanation ever given

Plus, TikTok’s “Sensitive” label automatically limits a video’s reach and cannot be appealed; 39 percent of respondents had their content labelled this way.

“If you’re going to flag something, be clear on what I’m violating so I can do better in the future,” says content creator Shelby Goodrich Eckard of PCOS Support Girl in the CIJ report.

3. Google

Google, through Google Ads and YouTube, has also been implicated in removing and censoring content, sometimes “age-gating” it, which means blocking viewers below an age threshold set by the platform.

  • 66 percent had ads rejected, often classified under its “sexual content” or “inappropriate material” policies
  • 58 percent faced age-gating and monetisation limits on educational videos about women’s health, while content about men’s health remained unrestricted

“Google does not [distinguish] between educational content and obscene content. The algorithms make no distinction, as [these practices] are keyword-based. And keyword restrictions are very broad and sweeping,” says Arti Shukla of Love Matters India, a non-profit.

In fact, Google even recommends using clinical terms like “vaginismus” instead of “sex pain” or “pain during sex.” But what would someone in pain actually search for? The data showed that searches for “sex pain” were five times more common than searches for “vaginismus.”

In effect, ordinary people are penalised for lacking medical vocabulary, which keeps them further in the dark about their own health.
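
To see why keyword-based restrictions end up “broad and sweeping,” consider a minimal, hypothetical sketch of a naive substring filter. The blocklist and example texts below are invented for illustration; this is not any platform’s actual moderation code.

```python
# Hypothetical sketch of a naive keyword-based content filter.
# Not any platform's real moderation system; it only illustrates why
# broad keyword lists flag the plain language people actually use.

BLOCKED_KEYWORDS = {"sex", "get hard"}  # invented blocklist for illustration

def is_flagged(text: str) -> bool:
    """Flag text if any blocked keyword appears anywhere in it."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

# Plain-language educational copy is flagged...
print(is_flagged("Pain during sex? You may have vaginismus. See a doctor."))  # True

# ...while the clinical phrasing Google recommends slips through,
# even though far fewer people actually search for it.
print(is_flagged("Vaginismus: causes, symptoms, and treatment options."))  # False
```

A real moderation pipeline is far more sophisticated, but a broad keyword list produces exactly this failure mode: the wording patients actually use gets flagged, while medical jargon passes.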

4. Amazon

One of the biggest marketplaces for health products has also been accused of the same bias.

  • 64 percent of businesses had product listings removed, often labelled “adult” or “explicit”
  • 34 percent faced account suspensions, blocking their revenue source and making those products unavailable to buyers
  • Search bias was also uncovered: Amazon offered significantly fewer autocomplete suggestions for terms like “vaginal health” than for “erectile dysfunction”

“I cannot grow my company [or] hire more help because, at any minute, Amazon can shut my entire account down... My account and/or listings have been shut down multiple times,” says Tara Langdale-Schmidt, president of Vuvatech LLC, which sells dilators, in the same CIJ report.

Suppression of women’s health content causes stigma and shame

When platforms suppress helpful content around health issues, they isolate the very people seeking help. It amounts to an outright denial of access to critical healthcare products and information.

When they label reproductive health ads “explicit” or age-restrict educational videos, they send a corrosive message to the world: women’s bodies are inherently inappropriate. Imagine the internalised shame this can cause, especially in young people.

Self-censorship of women’s sexuality

Creators and businesses are forced to use “algospeak,” an alternative vocabulary that flies under the algorithm’s radar. For example, “lube” becomes “loob,” and “doing it” stands in for “intercourse.”

This practice not only complicates communication; it also reinforces the idea that women’s bodies are taboo. With sexual education already lacking, this censorship only deepens the stigma.

“When content creators and brands are forced to self-censor their language to avoid content takedowns, it encourages a culture of shame around sex-education language and results in imprecise and less accurate educational content,” says Sarah Brown, formerly at Lorals, in the report.
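
Under the same hypothetical filter sketched earlier, algospeak works because deliberate respellings simply fall outside the keyword list. Again, the blocklist below is invented for illustration, not any platform’s real rules.

```python
# Continuing the hypothetical sketch: a naive substring filter misses
# deliberate misspellings, which is exactly the loophole "algospeak" exploits.

BLOCKED_KEYWORDS = {"lube", "intercourse"}  # invented blocklist for illustration

def is_flagged(text: str) -> bool:
    """Flag text if any blocked keyword appears anywhere in it."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

print(is_flagged("Water-based lube can ease discomfort."))  # True: accurate wording is flagged
print(is_flagged("Water-based loob can ease discomfort."))  # False: the respelling evades the filter
```

The evasion succeeds, but at exactly the cost Sarah Brown describes: the resulting content is less precise, and harder to find for the people who need it.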

The cost of being silenced

The financial toll of this censorship is just as severe. For 85 percent of these health groups, fundraising has become an uphill battle as platforms erode their credibility by throttling their reach and engagement.

Businesses reported annual revenue losses ranging from USD 20,000 to USD 5 million due to Meta’s arbitrary ad removals, while Amazon’s erratic product restrictions cost others up to USD 1 million annually.

“It limits our ability to generate revenue, spread awareness, and dismantle shame around women’s health,” says Kate Taylor of EvenlyBreast, a company addressing breast asymmetry stigma, in the CIJ report.

Gender inequality and public health crisis

Obviously, tech platforms didn’t invent sexism, but their policies codify it. By disproportionately targeting women’s health content, they reinforce the notion that men’s needs are legitimate while women’s are optional.

They are sabotaging businesses selling menstrual products or pelvic health devices by suspending their Amazon accounts, while male wellness brands thrive. They are silencing marginalised voices by flagging resources for LGBTQ+ individuals, worsening health disparities.

At a global level, content suppression on platforms isn’t just a tech issue; it’s a public health crisis.

A lack of awareness about health conditions can lead to delayed diagnoses and, in turn, delayed treatment. Abortion is already a taboo topic in many countries; when information about it is suppressed, reproductive rights are directly undermined through restricted access to care. Marginalised communities are already underserved by healthcare systems, and these additional barriers to information only exacerbate existing inequities.

Delayed diagnoses, untreated conditions, untimely health interventions, and lack of awareness about reproductive health options have very real consequences on real people.

Can the code be rewritten?

The CIJ report should serve as a wake-up call for tech platforms, policymakers, and the general public. To combat this digital erasure of women’s healthcare, three pillars of action are non-negotiable:

1. Transparency

Platforms like Meta and TikTok need to clarify their content moderation policies and provide clear, valid explanations for removals, drawing a real distinction between educational content and obscene content.

Application: Platforms must publish detailed, plain-language explanations for content removals, including which specific policy was violated and how.

2. Accountability

Tech giants cannot police themselves. Internal “oversight” boards lack teeth, and self-reported transparency reports often aren’t transparent enough. 

Application: Mandatory independent third-party audits of moderation algorithms should be enforced. A useful template is the EU’s Digital Services Act (DSA), which requires platforms to disclose systemic risks, including gender bias.

3. Advocacy

Grassroots campaigns and policy reforms must come together to forge change. More people should advocate for better systems, online and offline.

Application: Lawmakers should tie platform liability protections to compliance with anti-discrimination standards.

This isn’t just about fairness; it’s about survival. It’s about universal human rights. When platforms censor health educators and remove access to health products, they aren’t “protecting” their users. They are isolating them and leaving them helpless. They are upholding a world where women’s pain is ignored, their bodies are policed, and marginalised people’s health is expendable.


About the author(s)

Forget textbooks, Mrudavi got hooked on writing through her childhood obsession with fiction novels. Now, she tells engaging stories that address real-world topics with a touch of her experiences. When the writing bug takes a break, Mrudavi can be found curled up with a good book or with her favourite people, fueling her imagination with endless cups of iced lattes.
