Bois Locker Room is only a small peephole into the rape culture that thrives on the internet

Ketaki Desai & Chandrima Banerjee | TNN

The Bois Locker Room, an Instagram group of Delhi schoolboys that shared sexual images of their classmates and spoke of gangrape, horrified many. But the truth is that rape culture is clearly visible on the internet.

Last year, a Mumbai school suspended eight 13-14-year-olds for making similar sexually explicit and violent remarks about their classmates. Students of a Chennai college, meanwhile, ran a Facebook group, which later moved to Reddit, where they shared nudes and personal information about their classmates. In a screenshot shared by a Twitter user, one student writes: “She didn’t share her nudes herself. It was her bastard ex. PS: I have them too.” The community was later banned by Reddit.

Reddit has many pornographic subreddits — but TOI has found that many of these are not only used to share porn but also photos pulled from social media accounts — sometimes of influencers, other times of women they know. One subreddit, ‘Indians who make you fap (masturbate)’, which has now been banned, was created to share pictures of one’s friends, classmates, neighbours, to discuss “what they would do (to her)” and give out their names.

Most of these subreddits have thousands of subscribers and include pictures of girls with links to “leaked videos”. Some are dedicated to girls who are, or look like, teenagers. Some posts are clearly acts of what is colloquially called revenge porn, where an ex-partner leaks intimate pictures. In a post seen by TOI, a man had pulled down the pants of a sleeping young woman and photographed her.

It’s easy enough to trace these virtual locker rooms. You could start with YouTube videos that aggregate sexualised clips of female Bollywood actors. The comments section — where words like “whore” and “rape” are thrown around casually — is only a gateway. Amid graphic descriptions of violent sexual fantasies are invitations for “sex chats” — which are essentially asking if others with similar rape fantasies would like to “do it together” on WhatsApp or Hangouts chats. Most use email addresses created just for this purpose, with names like lustman000 or spicyactresslover, but a surprising number seem to use their primary emails and phone numbers.

On Instagram, a basic search for terms like “teen” and “girls” leads to accounts offering images of underage girls, with disclaimers saying they are “sourced” from various places. Some accounts mention they are run by “boys” — which is code for ‘girls’ photos only’. Some strike a bargaining note — “pls dont report if u have problem tell in dm”. And while many are virtual circle-jerks, where the only point is arousal, others go beyond to solicit sex or offer “modelling services” of teenaged Indian girls.

Advocate and cybercrime victim counsellor Dr Debarati Halder says: “Since 2018, we have noticed a rise in such group chats where photos of girls are taken from profile pictures on WhatsApp, Facebook or Instagram and shared without their consent.” Many victims may not even know their images have been shared. Only about 10% succeed in getting their pictures taken down, says Halder.

“In rural areas too, there are a lot of sexual comments, texts, and morphed photos being shared on a number of digital communication platforms,” says Poonam Muttreja, executive director of Population Foundation of India which works on adolescent reproductive and sexual health.

These are not merely explicit or raunchy comments, as the recently circulated ‘Girls Locker Room’ chats were. They carry rape threats and direct harassment at the victims. “There’s a sort of targeted violence when it’s about someone you know,” says Richa Kaul Padte, author of Cybersexy: Rethinking Pornography. Morphing, she adds, is a “bodily violation, even if you don’t feel it physically. People say, ‘oh it’s not you’, but it is you, your face, your image.”

Non-consensual intimate images are not pornography, they are hate speech, says Nishant Shah, professor of Aesthetics and Culture of Technology at ArtEZ University of the Arts, the Netherlands, who has studied these spaces for a decade. “It’s not about the content, but the intention — which is anchored in expressions of hatred, naturalisation of violence and diminishing the dignity of the person.”

Such groups are “incredibly common”, says Shah. “They’re not even all male, they’re often mixed groups where women who try to resist are bullied and asked to learn to take a joke.” Locker room groups exist in workplaces as well. “As someone who has been working on employee conduct for a long time, I know they exist in every workplace, at least at entry and middle-management levels,” says Pallavi Pareek, founder of Ungender legal advisory.

Toxic masculinity is everywhere, just look at the graffiti in a boys’ bathroom, says Amitabh Kumar, founder of feminist non-profit Social Media Matters. In one Delhi government school, his team found “comments about gang-raping teachers, and explicit cartoons”. The internet creates a disinhibition effect, says Bishakha Datta, executive director of Point of View, a non-profit working at the intersection of gender, technology and disability. “A lot of people behave very differently online than they do in person. A man won’t threaten you with rape on the street, but would do it on Twitter even if his name and photo are on his account,” she says. The other factor, she adds, is the herd mentality: “When you are part of a pack, it’s easy to behave like there are no consequences.”

Sharing an image of someone’s private area without consent is illegal in India under Section 66E of the IT Act. Reddit prohibits images or videos of nudity or sexual conduct posted without permission, including fake depictions. A spokesperson from Facebook, which owns Instagram, said they use sophisticated technology to proactively find and remove such content. “We have absolutely zero tolerance for any behaviour or material that exploits young people online.”

But internet platforms need to take even more responsibility, says Datta. People are often discouraged from reporting because of the templated responses they receive.

With inputs from Sonam Joshi

HOW A SLEAZE GROUP WORKS

On one WhatsApp group, which TOI joined, nudes and seminudes are shared — of actors, female acquaintances, former girlfriends. Conversations quickly go from how each member would have sex with or rape the woman to who would do it better. Several people joined and left repeatedly over the course of a day. Within minutes of joining, one member — clearly an adolescent, going by his profile photo — asked if there were any other “fap group” recommendations. Suggestions were quickly shared and then deleted, leaving no trail.

AI THAT CAN STRIP AWAY CLOTHING

As technology evolves, new forms of harassment are emerging, such as ‘deepnudes’, in which artificial intelligence tools create highly realistic fake nudes from ordinary clothed images.