Following pressure from payment processors, Steam and Itch.io have removed thousands of NSFW adult games.
Two of the biggest game hosting platforms, Steam and Itch.io, have recently removed large amounts of sexually explicit content after coming under pressure from their payment processors. This decision followed public backlash and advocacy efforts by anti-violence organisations, after a game involving rape and incest was briefly hosted on both sites.
But while the removal of that particular game sparked the conversation, the recent action goes much further. It marks a significant shift in how platforms respond to violent pornographic content, and it highlights urgent questions about digital regulation, platform accountability, and survivor safety in online spaces.
In April 2025, a game titled No Mercy, which depicted non-consensual sexual violence and incest, was briefly listed on Steam and Itch.io. After widespread condemnation, both platforms removed the game. Soon after, Itch.io issued a statement announcing it had “deindexed” all adult NSFW (Not Safe For Work) content from its site.
“To ensure that we can continue to operate and provide a marketplace for all developers,” Itch.io explained, “we must prioritize our relationship with our payment partners and take immediate steps towards compliance.”
Steam followed suit, delisting hundreds of adult games, many of which focused on sexual violence. These changes were not just about one game. They were about compliance with payment processors like Mastercard, Visa, and PayPal, which appear increasingly unwilling to support platforms that profit from violent, exploitative, or illegal content.
Itch.io is currently conducting a full audit of all NSFW content on its platform. Developers now face new restrictions and must ensure their games comply with updated guidelines. While some adult content may return following review, games that contain or glamorise sexual violence, incest, or abuse are expected to be permanently removed.
Steam, owned by Valve, has also revised its storefront policies, banning content that violates the standards set by payment processors. This includes games that promote non-consensual sexual acts, even if labelled as “fiction” or “fantasy.”
For years, survivors and campaigners have raised alarm over rape simulator games, calling them a form of digital sexual violence. These games often mirror real-life abuse, reinforcing harmful myths about consent and normalising coercive or predatory behaviour. They contribute to an ecosystem where violence against women and children becomes entertainment.
Safeline believes that digital spaces must be held to the same standards of safety, ethics, and accountability as offline ones. When violent content is normalised in games, it doesn’t stay on screen. It affects attitudes, beliefs, and behaviours.
Some users have criticised the move as “financial censorship,” arguing that companies like PayPal and Visa shouldn’t dictate what content is allowed online. But others, including anti-violence advocates, say this is a long-overdue step toward protecting users from harm.
There’s a difference between adult content created responsibly and exploitative media that glorifies rape, incest, and abuse. Just because something is labelled as “fiction” doesn’t mean it’s free from real-world impact.
Safeline supports the move to regulate digital sexual violence and create safer online environments. Platforms that profit from abusive content should be held accountable. Not just by payment providers, but by their users, communities, and governments.
This is about more than content moderation. It’s about ensuring survivors are not retraumatised by what they encounter online, and about protecting young people from harmful content that distorts their understanding of consent and healthy relationships.