From Revenge Porn To Deepfakes, The Tech Tackling Gendered Violence Online

Technology has opened the door to new forms of harassment. Can it now help to tackle those issues?



By Ruchira Sharma | 08 Mar 2023
8 mins read time

This year, the UN theme for International Women's Day is "innovation and technology for gender equality". Taking this as a starting point, woo sets out to ask how new technology could improve the fight for gender equality across the most intimate parts of women's lives.

In 2014, more than 100 celebrities’ naked pictures were stolen and plastered across the internet. An anonymous hacker had worked out the celebrities' Apple iCloud login details, and within hours intimate images of Jennifer Lawrence and Rihanna, among many others, were on 4chan and Reddit. The event, dubbed ‘Celebgate’, was one of the first large-scale insights into the emerging future of digital sex crimes.

Around this time, a 13-year-old girl at my school also had a nude leaked by an ex-boyfriend. That photo found itself on the screens of countless phones, including those of much older students, who horrifically shamed her. My friends and I didn’t have the language to describe what happened to her back then, but now we do: ‘revenge porn’, i.e. the distribution of a private sexual image of someone without their consent. The same year, the UK announced a law to tackle the crime, with those convicted facing a maximum sentence of two years in prison.

Now, non-consensual image sharing is just one example of the online gender-based crimes that are routine for girls and women. In 2023, we’re exposed to cyberflashing, cyberstalking and online rape threats daily. A 2021 Ofsted review warned that nearly 90% of girls, and nearly 50% of boys, said being sent explicit pictures or videos of things they did not want to see is something that happens to them. Similarly, a YouGov poll from the same year, commissioned by the dating app Bumble, found that almost half of women aged 18 to 24 had received unsolicited sexual images within the previous year.

The ways in which women can be harassed evolve alongside developments in tech. Research company Sensity AI estimates that between 90 and 95% of all online deepfake videos are non-consensual porn, and around 90% of those videos feature women. Technology has opened the door to new forms of gendered harassment. Can it now help to tackle those issues?

Several start-ups and apps have been designed to help survivors record sexual assault. Take Project Callisto, a digital tool that makes reporting sexual assault safer and more supportive. Co-founded by Jess Ladd, who was sexually assaulted while at a California university, its encrypted record form allows survivors to document what happened to them and decide what steps they may want to take next. Vault Platform, a start-up founded in 2018, similarly enables anyone experiencing misconduct in the workplace to record a private, time-stamped report that is stored as evidence on a blockchain, so it cannot be tampered with.
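
Vault’s exact implementation isn’t public, but the tamper-evidence idea can be sketched simply: chain each time-stamped record to the hash of the one before it, so editing any earlier entry invalidates every hash that follows. The Python below is a minimal illustration of that principle, with invented names; it is not Vault’s actual code.

```python
import hashlib
import json
import time

GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

def record_hash(record: dict, prev_hash: str) -> str:
    """Hash the record together with the previous entry's hash, so
    altering any earlier entry invalidates every hash after it."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

class EvidenceLog:
    """Append-only, time-stamped report log (illustrative only)."""

    def __init__(self):
        self.entries = []  # list of (record, hash) pairs

    def append(self, report_text: str) -> str:
        prev_hash = self.entries[-1][1] if self.entries else GENESIS
        record = {"timestamp": time.time(), "report": report_text}
        digest = record_hash(record, prev_hash)
        self.entries.append((record, digest))
        return digest

    def verify(self) -> bool:
        """Recompute the chain; any edit to a stored record makes a
        recomputed hash disagree with the stored one."""
        prev_hash = GENESIS
        for record, stored in self.entries:
            if record_hash(record, prev_hash) != stored:
                return False
            prev_hash = stored
        return True
```

A public blockchain adds distribution and independent witnesses on top of this, but the core guarantee is the same: a report’s content and timestamp can be verified later, and silent edits show up immediately.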

Whilst recording and reporting survivor testimonies is key to a safer society, is it possible for tech to go one step further? Could it preempt assault or harassment before it happens at all?

"On a daily basis, we are dealing with and combating major human rights issues"
Maria Allgaier, co-founder of Freyja

Not Your Porn is a sex-positive movement fighting to protect non-consenting adults, teens and sex workers from image-based sexual abuse. It was started in 2019 by Kate Isaacs after a friend discovered intimate images of herself had been uploaded to Pornhub without her consent. Co-founder Elena Michael tells woo that designing new technology to preempt and tackle the sexual crimes women face online is “essential” to ensuring women are safe in the future.

The organisation has worked with a handful of new platforms and start-ups to ensure they’re safe by design. One is Freyja, a new social media platform that aims to reduce sexual violence and improve sex education. It gives a voice to marginalised groups, including sex educators and adult performers, and its founders acknowledge the overlap between education and adult entertainment. As a result, Freyja offers a safe environment where people can find both.

Co-founder Maria Allgaier says Freyja was originally designed to improve sex education, but it grew beyond that plan. “On a daily basis, we are dealing with and combating major human rights issues such as revenge porn, homophobia, transphobia, child abuse, monopolisation, battling taboos, and destigmatising sex work,” she says. “Freyja is much more than just a company, we are a social movement fighting for great change.”

In many ways, Freyja has not had to retrospectively tackle issues on its platform, because it was designed to be safe thanks to consultations with organisations such as Not Your Porn. “Freyja have got a whole board of people from the industry that voiced their concerns, they've got a standard of operating and they've really thought about every single thing,” says Michael. She argues this is what the future of adult content platforms should look like.

But the reality is that there is a disconnect between the more ethical, smaller start-ups and the huge platforms that have been a wild west for stolen adult content and online misogyny. “We need to be facilitating those smaller companies, as well as trying to regulate and change the cultures inside the bigger companies,” Michael explains. She commends platforms like Bumble, which she argues “are really on it” and “were super forward thinking with cyberflashing”, but makes the point that the UK cannot leave safety measures to companies’ goodwill. It must set official standards that online platforms are forced to meet.

Bumble was the first dating app to explicitly moderate cyberflashing. In 2019, the platform launched Private Detector, a feature that uses artificial intelligence to automatically detect and blur nude images. Recipients are then alerted so they can choose to view, delete or report the image. Similarly, in 2021 dating app Tinder launched its Are You Sure? feature, which uses machine learning to flag potentially offensive messages. It detects inappropriate language, defined as overtly sexual or violent, and prompts senders with an automated message asking them to reconsider.
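
Bumble has since open-sourced a version of the Private Detector model, though its production pipeline isn’t public. The sketch below shows the general blur-and-ask flow such a feature implies; the `lewd_score` stub and the 0.8 threshold are placeholders invented for this example, not Bumble’s actual API.

```python
from PIL import Image, ImageFilter  # Pillow imaging library

BLUR_THRESHOLD = 0.8  # assumed cut-off; real systems tune this on data

def lewd_score(image: Image.Image) -> float:
    """Placeholder classifier. A production system would run a trained
    image model here; this stub returns a fixed score so the flow runs."""
    return 0.9

def moderate_incoming_image(path: str) -> dict:
    """Score an incoming image; if it looks explicit, deliver a blurred
    preview and let the recipient choose to view, delete or report it."""
    image = Image.open(path).convert("RGB")
    if lewd_score(image) >= BLUR_THRESHOLD:
        preview = image.filter(ImageFilter.GaussianBlur(radius=30))
        return {"image": preview, "flagged": True,
                "actions": ["view", "delete", "report"]}
    return {"image": image, "flagged": False, "actions": []}
```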

“Bumble has been taking steps to tackle cyberflashing for years,” said Nima Elmi, Head of Public Policy for Europe at Bumble, adding that the company has campaigned for new legislation in the US and the UK. Any new law, she argued, would need to be based on non-consent, “irrespective of the sender’s intentions”, as this “is the emerging international standard that we’re seeing across the United States” and effectively recognises the harm to the recipient.

Last year it was announced that cyberflashing would become a new criminal offence under the Online Safety Bill, with perpetrators facing up to two years in prison. The change means that anyone who sends a photo or film of a person’s genitals, for the purpose of their own sexual gratification or to cause the victim humiliation, alarm or distress, could be prosecuted. But the offence is measured by the perpetrator’s intent to cause distress rather than by the victim’s non-consent, and intent can be very hard to prove in court. Many sex crime policies face the same issue, such as the 2015 image-based abuse law drawn up to tackle revenge porn. “The problem with that law is that the threshold, which is intended to cause distress, is very difficult to prosecute,” says Michael. This means victims may struggle to build a case, and perpetrators can dodge punishment for committing the crime. Evidently, technology can be harnessed to proactively tackle harms against girls and women, but it relies on effective regulation working alongside it.

Whilst laws are arguably slow to catch up, lots of smaller companies are driving the use of technology in interesting and important ways. Yannick Schuchmann is the tech developer behind Am I In Porn, a search-engine start-up that helps users find non-consensual content of themselves on the internet.

“We have a database of faces and use face recognition to make a representation of a face and store these faces in the database,” he tells woo. “And when you upload a selfie of yours, we use that same face recognition to create a representation and use it to match it against our database to find any similarities.”
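
Am I In Porn’s actual stack isn’t public, but the flow Schuchmann describes matches a standard face-embedding search: map each face to a fixed-length vector, then compare distances. A minimal sketch, using the open-source face_recognition library as a stand-in for the real system:

```python
import face_recognition  # open-source library, used here as a stand-in
import numpy as np

def encode_face(image_path: str) -> np.ndarray:
    """Map a face photo to a 128-dimensional embedding. Only this
    numeric representation needs to be stored, not the image itself."""
    image = face_recognition.load_image_file(image_path)
    encodings = face_recognition.face_encodings(image)
    if not encodings:
        raise ValueError(f"no face found in {image_path}")
    return encodings[0]

def search_database(selfie_path: str, database: list,
                    tolerance: float = 0.6) -> list:
    """Compare a selfie's embedding against every stored embedding and
    return indices of likely matches. The 0.6 distance threshold is the
    library's conventional default, not a figure from the article."""
    query = encode_face(selfie_path)
    distances = face_recognition.face_distance(database, query)
    return [i for i, d in enumerate(distances) if d <= tolerance]
```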

Like many of the organisations cropping up in this space, Am I In Porn came about after a friend of the three founders became a victim of revenge porn four years ago and was disappointed by the police’s response to her case. Now, Am I In Porn is working on an integrated system where users can pull up content of themselves and generate a report to download or send to the relevant authorities to build a legal case.

Am I In Porn’s struggle, however, comes down to money. Schuchmann says funding has been an issue, as many backers don’t want to be associated with pornography, even for a venture that is explicitly against non-consensual content. Legislation is key to ensuring women are safe, but so is funding for start-ups effectively tackling harms against girls and women. Without it, their longevity is at risk, and the potential for a safer future is jeopardised.

The onus shouldn’t be on tech to educate and stop communities from harming women, but for now the wave of start-ups and apps filling that chasm is promising. The problem is that safety by design is just one element of what we need to address harm against girls and women; it should work in tandem with the law, with tech’s transparency over the scale of its issues, and with ongoing scrutiny of the larger platforms that have a monopoly on society.

“This field is relentless,” says Michael. “In some ways we’re not enough. The work has to be collaborative.”