Online Gaming Is Pretend – But the Harassment In-Game Isn’t
As online gaming gets more advanced, the definition of harm changes.
In 2016, Jordan Belamire wrote about her harrowing experience of being chased, pinched, and groped by a male avatar in a VR game. The avatar followed her around relentlessly, trying to grope her at every chance. She writes about how violating it felt: “I went from the god who couldn’t fall off a ledge to a powerless woman being chased by an avatar named BigBro442.”
The expansive worldbuilding and escapist storylines of video games are universally appealing, and with 420 million active online gamers in 2022, the industry is growing swiftly in India. It’s easy to see why an immersive, embodied experience can be thrilling, but for women and trans gamers, this can also be a space that traps them in real-world hierarchies. As in the “real world,” verbal abuse, virtual stalking, and sexual harassment continue to restrict their freedom and mobility in video games. More importantly, sexual violence is a deeply traumatising experience even if it takes place in a virtual world – making the question of how we define the world of video games, and its extension into VR, a critical one. It also challenges our conventional ideas of what harm and violence can look like.
Arya*, a trans gamer, found himself quickly consumed by the video game Valorant. But it wasn’t long before he realised that having a “female voice” led to visceral abuse from fellow gamers. Once, a man on his team told Arya that if he missed a shot he was most definitely a “ch*akka” [a transphobic slur]. As a result, he stopped using the voice chat feature entirely. “[But] the thing is,” Arya tells me, “in FPS (First Person Shooter) games, communication is very important; telling people where the enemy is or where you got killed last is crucial.” Every day, he grapples with the same question: stay silent and be abused for not communicating, or speak and be abused for his gender identity. “I take this call depending on my mood for the day and how brave I’m feeling,” Arya says.
S., another gamer, was stalked by a masculine avatar as a pre-teen. In a multiplayer game room on Gaia Online, this avatar would persistently follow them and other feminine avatars around the space. It seemed innocuous at first, but when asked to stop, the avatar simply didn’t listen. He continued following them around, trying to chat, making occasional sexual comments.
Stalking feminine avatars is very common: “In any game where you can walk around, people have been stalked,” S. says. Sometimes characters follow you around while making sexually inappropriate comments, or insist on chatting. Other times, they follow you in silence. “Both are uncomfortable situations to be in,” S. adds. On the platform, there was no way for users to verify any information about each other beyond their avatars. This lack of regulation makes it easy for predators to use multiplayer spaces to enter the lives of young people, particularly young girls. While S.’s feminine avatar made them vulnerable in the game, the predator’s avatar protected him. His manufactured identity meant that he could not be held culpable for virtually stalking a pre-teen – irrespective of his gender or real-world age.
For S., the discomfort of being stalked grew to the point where they would run and hide behind racing cars whenever this particular avatar was online. Many others in the space did the same to cope. The only way for S. to escape the stalking was to leave the chat room permanently.
Padmini Murray, who founded Design Beku, an organisation that explores how technology can be feminist and decolonial, does not think that this is a new problem: “Ever since we have had digital spaces, however limited it is in terms of what it offers, there has been violence ... These new technologies have created new ways of making women feel unsafe,” she says.
As far back as 1993, journalist Julian Dibbell wrote about a sexual assault that took place in LambdaMOO, a text-based chat room and virtual world populated by early internet users. A user named Mr. Bungle used a virtual voodoo doll to force simulated sexual acts on others in the room. After the incident, the inhabitants of LambdaMOO resorted to a form of self-regulation: they confronted the perpetrator about his actions and articulated the act as “virtual rape.”
Whether these instances count as “sexual harassment” is still furiously contested. After all, avatars are not physical bodies and therefore cannot be hurt in the specific ways that physical bodies can. Only, violence isn’t that simple. Historically, feminists have understood sexual violence as extending far beyond a physical crime – it is an exercise in power and control.
When players participate in a virtual world, their avatar is an extension of themselves. They go through a range of experiences, good and bad, through these avatars as they explore virtual landscapes. In these spaces, the physical body might be untouched and absent, but the self is very much present. Dibbell’s account reminds us that digital spaces don’t exist in isolation – they are occupied by real humans who carry their identities, privilege and marginalisation with them. These spaces are shaped as much by the skewed power dynamics of human interaction as by their algorithms.
Things like walking into an abandoned castle, falling off a tower, or even shooting someone might feel real only momentarily. But sexual harassment – perpetrated by actual humans hiding behind the avatars – feels and is a lot more real. It is a violating experience that extends beyond the moment it is inflicted.
These lived realities of women and marginalised genders urge us to consider how we might redefine harm in the context of virtual, disembodied selves. “[W]ith VR, where the claim is of a simulation of an embodied experience, it is a kind of digital violence that we haven’t seen before,” explains Murray. Parents today therefore need credible information about the specific realities of virtual spaces in order to guide their children. S., for example, only understood as an adult how susceptible multiplayer spaces are to violence. “I should have been more supervised.”
VR is still in its early stages, its far-ranging possibilities not yet clear to us. We’re already seeing the damaging implications of 3D avatars from across the globe interacting with each other in a world that simulates reality but lacks clear ideas of justice or shared social and behavioural norms. The potential for violence arguably only grows as gaming technologies evolve. This raises two questions: What happens when harm of any kind is inflicted? Who, if anyone, is accountable for safety in a virtual world?
From a micro perspective, fixing reporting systems and reprimanding perpetrators is not rocket science. A proactive strike policy, in which each strike results in a warning or suspension – depending on the nature of the offence – is a fairly straightforward solution to implement. “Even while a claim [of misconduct] is being investigated, it is possible to suspend an account,” says Bishakha Datta from Point of View. She adds: “The other thing, which has never been considered, is a penalty – an actual monetary fine. There are a range of offences that occur online, and I don’t see why for some of the lower level ones a penalty system can’t be put in place.” An economic deterrent also ensures that even if perpetrators go ahead and make new accounts after being suspended, they will be conscious of their behaviour.
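To make the mechanics concrete, here is a minimal sketch in Python of how a graduated strike-and-penalty system along these lines might be modelled. The offence categories, thresholds, and fine amounts are hypothetical placeholders, not any platform’s actual policy:

```python
from dataclasses import dataclass, field
from enum import Enum

class Severity(Enum):
    LOW = 1     # e.g. spam, mild taunting
    MEDIUM = 2  # e.g. slurs, targeted verbal abuse
    HIGH = 3    # e.g. stalking, sexual harassment

@dataclass
class Account:
    user_id: str
    strikes: list = field(default_factory=list)
    suspended: bool = False
    fines_owed: float = 0.0

# Hypothetical policy values -- setting them is a governance decision.
SUSPENSION_THRESHOLD = 3
FINE_SCHEDULE = {Severity.LOW: 5.0, Severity.MEDIUM: 20.0}

def record_strike(account: Account, severity: Severity) -> str:
    """Apply one strike and return the action taken."""
    account.strikes.append(severity)
    if severity is Severity.HIGH or len(account.strikes) >= SUSPENSION_THRESHOLD:
        # Serious offences and repeat offenders are suspended outright --
        # as Datta notes, this can happen even while a claim is investigated.
        account.suspended = True
        return "suspended"
    # Lower-level offences draw the monetary penalty Datta proposes,
    # instead of an immediate suspension.
    account.fines_owed += FINE_SCHEDULE[severity]
    return "warned and fined"
```

The point of the sketch is how little machinery such a policy actually needs; the hard part, as the next paragraphs suggest, is everything around it.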
However, Yadu Rajiv, a game designer from Bangalore, believes that avatar-based violence is not just a design problem. “You can't design your way around it [violence] – at best, you can put in safeguards,” he says. According to him, if you open up communication channels, they are bound to be misused. For instance, while the voice chat feature can be a medium of violence, removing it makes the game inaccessible to players with disabilities. The problem, Rajiv suggests, runs deeper and requires better moderation and redressal mechanisms.
For these mechanisms to be implemented effectively, structural issues with how the medium is viewed and treated need to be addressed first.
In August this year, Microsoft launched a new eight-strike policy for Xbox, claiming that it was serious about curbing toxicity on the platform. On the face of it, it seems fairly strict, but a policy like this can work only if it is implemented meticulously. While suspensions are an obvious deterrent, platforms are almost always hesitant to implement them because they aren’t profitable. This is reflective of a deeper structural problem that is harder to grapple with: the nature of the relationship that users have with platforms.
In his research paper on virtual liberty, Jack M. Balkin argues that two things structurally control a virtual world: “code and contract.” The code, or the design, constructs the virtual landscape, but it is the contract, or the “terms of service,” that defines a player’s relationship with the platform providing that space. Tech companies are notorious for evading responsibility, especially when it comes to their users' safety or privacy. Moreover, since most of them are US-based, they are legally shielded (under Section 230 of the US Communications Decency Act) from liability for what third parties post and do on their platforms. In 2018, Nicholas Suzor raised concerns over how tech companies framed their “terms of service” documents. He argued that these documents have a constitutional function, but in their current form they are largely designed to protect platforms and offer little to no protection to users. Suzor suggests the need for a more democratic internet, calling this approach “Digital Constitutionalism.”
Digital Constitutionalism refers to the political project of rethinking power structures and arriving at a framework of governance and rights in the context of shared online spaces. As private owners of shared social spaces that are occupied by billions of people on a daily basis, platforms are offering way more than a simple service or tech product – they essentially govern integral parts of people’s daily lives. And their contracts and relationship with users should account for this.
Many digital rights activists believe that the first step towards better digital governance is to understand that the people who inhabit these spaces are not merely users or consumers – they have rights that are almost akin to citizenship. The key to bringing about this shift in the terms of engagement with platforms is establishing transparency, Datta says. It is pointless to have reporting systems or community guidelines if all the data collected through them is locked up. An interesting exercise in transparency would be for platforms to release monthly reports to their users that account for the number of complaints received, the categories they fell into, which ones were resolved, which ones weren’t, and why. This kind of contractual obligation is crucial to building trust.
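As an illustration of how modest an ask this is, the monthly report Datta describes needs little more than a simple aggregation. This sketch, again in Python, uses hypothetical field and category names:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Complaint:
    category: str         # e.g. "stalking", "voice-chat abuse"
    resolved: bool
    resolution_note: str  # why it was, or wasn't, acted on

def monthly_transparency_report(complaints: list[Complaint]) -> dict:
    """Aggregate a month's complaints into a publishable summary."""
    unresolved = [c for c in complaints if not c.resolved]
    return {
        "total_received": len(complaints),
        "by_category": dict(Counter(c.category for c in complaints)),
        "resolved": len(complaints) - len(unresolved),
        "unresolved": len(unresolved),
        # Publishing the reasons, not just the counts, is what builds trust.
        "unresolved_reasons": [c.resolution_note for c in unresolved],
    }
```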
It is also fairly well-evidenced at this point that non-dominant genders face abuse online. Most big platforms have data that corroborates this, but it is not being used to make any meaningful policy intervention. Used effectively, Datta believes, this data could drive a lot of interface-level changes. For starters, users who are likely to be vulnerable to harassment could be informed of the reporting mechanisms and available resources right on the landing page. It instantly signposts to them that the platform is interested in hearing their concerns and acting on them, she explains.
As the biggest players in tech plough more and more money into building virtual worlds, the problem of sexual violence is only going to worsen. And tackling this problem requires platforms to be aware that they aren’t going to be occupied by a “neutral” set of users. “We really need to do away with thinking that all users are exactly the same, and have exactly the same needs – because that is a myth,” Datta says. This in itself would lead to a huge shift in culture.
*Names of interviewees have been changed to protect their identity.
Bio: Anjali Menon is a writer and visual artist from Bombay. She writes about gender, violence, the internet, and books.