Illustration by Tactical Tech, with visual elements from Yiorgos Bagakis and Alessandro Cripsta.

This article was written by Safa Ghnaim in collaboration with Goethe-Institut Brazil and originally published on DataDetoxKit.org. An edited version is republished by Global Voices under a partnership agreement. 

Artificial Intelligence (AI) tools, especially those that generate images, videos, and audio, have been marketed as “creativity apps” for individuals and “efficiency tools” for businesses. Yet there are few safeguards over how these tools are actually used, or over the harms that their artificially created visuals can cause.

It may come as no surprise that AI tools are making the problem of online harassment even worse, notably through the creation and sharing of non-consensual intimate imagery (NCII): intimate photos or videos, including nudity or sexually suggestive or explicit images exposing someone’s real or AI-generated body, shared without their permission.

NCII affects many people — not just the celebrities you might hear about in the media — and it’s not easy to deal with. Even platforms and law enforcement are struggling to keep up.

The problem is bigger than you might think

The technology is advancing so quickly that it now takes only one image of someone (even a totally wholesome picture) to create sexually explicit content using one of many AI-powered tools.

While certain AI tools supercharge the problem of harassment by making it easy to create NCII of anyone, other AI tools are being used to tackle it. However, AI-powered tools trained to detect AI-generated images are not perfect, and much of the work of identifying such images and taking them down still falls on the people working as content moderators.

One of the most high-profile cases of 2024 involved sexually explicit AI-generated deepfake images of Taylor Swift. The images first appeared on 4chan and, within minutes, spread like wildfire across various social media sites. One image was viewed over 47 million times before it was removed. These images may still be circulating online, since there is no real way to wipe them off the internet completely.

But this is not an isolated case. According to a 2021 study, 41 percent of over 10,000 survey respondents in the US said that they had personally experienced a form of online harassment. Of the respondents under 35 years old, 33 percent of women and 11 percent of men said they had specifically experienced sexual harassment online.

On a similar note, a 2023 analysis of over 95,000 deepfake videos found that up to 98 percent of them were deepfake pornography, and, of those, 99 percent of the individuals targeted were women. Other vulnerable groups, such as minors and LGBTQ+ people, are also disproportionately victims of online sexual harassment.

What steps can you take to protect yourself from this kind of harassment?

Online platforms do have some guardrails in place to help you lock down your information from unwanted eyes. While these tips won’t build you an impenetrable fortress, they can make it harder for bullies or those trying to do harm to get to you.

Every platform is different, with its own settings and options available to users. As an example, here are a few things you can do to tighten controls on Instagram and TikTok:

  • Set your profile to “private.” On platforms like Instagram and TikTok, you can set your profile to private so that, in most cases, only people you approve as followers can see what you share. Keep in mind that others can still see the comments you make on other people’s posts and may even still be able to send you messages. Learn how to set your profile to private on Instagram and TikTok.
  • Remove followers or block people. If someone is giving you a hard time or making you feel uneasy, you can remove them as a follower or block them altogether. But if you know the person in real life, you’ll need a different strategy. Learn how to block people on Instagram and TikTok.

It takes a village

Given how many people are targeted by online harassment, it seems logical to assume there are a lot of harassers, too. But in fact, it takes only one bad actor to do widespread harm. Would you consider someone who re-shares NCII a harasser, even if they didn’t originally generate the images? In Taylor Swift’s case, it took only a few people to create the NCII, but the images wouldn’t have gone viral without many people sharing them.

So, how can you be an ally to someone who is being targeted and harassed? No matter which social media platform you use, it will likely have a “report” function. On apps like Instagram, you can report specific posts or entire profiles or accounts if you spot anything that looks like abuse or harassment. Reporting is a good way to flag problematic people or content, and Instagram won’t share the identity of the person reporting with the person being reported.

When you report something on Instagram, the platform may remove the post or may warn, deactivate, or ban the profile or account, depending on what happened and whether it goes against the Community Guidelines. It’s worth noting that Meta’s Community Guidelines are not always helpful, and post takedowns and account bans have caused recent controversies.

If you personally know the person who is being targeted, gently reach out to them if you feel comfortable doing so. It might be that they had no idea this was happening and could react with strong distress, anger, or sadness. If you feel prepared to support them, you can offer them resources (like those linked at the end of this guide) and help them monitor and document the harassment.

Even though you most likely will not want to look at the harassment again, it can be helpful to document it before it’s taken down. Consider taking a screen recording or screenshot of the post or comment, making sure it includes the other person’s account name and the date. Save the documentation somewhere safe and out of sight on your phone or computer; on some phones, for example, you can set up a password-protected “secure folder.”

Documentation, when done well, can be useful if the person being targeted decides to bring the issue to law enforcement and needs evidence.

It’s important that the person being targeted decides what they want to do. Do they want to contact law enforcement, get a lawyer, or reach out to their school, university, or workplace? Or would they rather keep it as confidential as possible? In a situation of NCII especially, so much choice has been taken away from them, so make sure to support them in regaining control.

Know where to go for help

If you or someone you know is targeted with NCII, know that there are dedicated organizations out there that are ready to help. You don’t have to deal with it alone. Here are just a few that support English speakers:

  • Chayn (worldwide): Chayn provides resources and support to survivors of gender-based violence.
  • StopNCII (worldwide): StopNCII has a bank of resources as well as a tool that can help you get non-consensual intimate images taken down.
  • Take Back the Tech (worldwide): Take Back the Tech offers explainers and resources like Hey Friend, with tips on how you can support your friends when they are targets of harassment.
  • RAINN’s National Sexual Assault Hotline (USA): RAINN provides a private hotline where you can chat online or call a staff member who has undergone crisis training.
  • Take It Down (USA): Take It Down helps you, step-by-step, to file a removal request for NCII.
  • Cyber Civil Rights Initiative (CCRI) (USA): CCRI includes step-by-step guidance and lists of US-based attorneys and laws. They have a list of international resources, too.
  • Revenge Porn Helpline (United Kingdom): The Revenge Porn Helpline gives advice to adults who have been targeted.
  • Umi Chatbot (Australia): The Umi Chatbot is a quick way to get information about how to deal with NCII. The website also has resources about collecting evidence and reporting.

For more tips and resources on how to deal with online harassment, visit DataDetoxKit.org.

 




