
FREQUENTLY ASKED QUESTIONS

If you’re hearing about this for the first time, image-based sexual abuse (IBSA) is the creation, threatened sharing, sharing, or use of sexually explicit or sexualized recordings (still images or videos) without the consent of the person depicted and/or for purposes of exploitation. It can involve images and videos produced through hidden cameras, deepfakes or editing, AI, hacking and leaking, child sexual abuse, rape, sex trafficking, catfishing, sextortion, and prostitution.

These images can circulate quickly, appearing on pornography websites, social media, and in search engine results, or shared without consent across other platforms, often before the victim even realizes it has happened.

The cruel irony is that there are currently no federal laws requiring pornography websites and other tech platforms to verify the age and consent of every person depicted in the images they host. This is a massive gap in the law, and one that can and must be rectified immediately. These platforms already have the technology required to remove nonconsensual content but simply choose not to implement it.

Image-based sexual abuse (IBSA) can inflict serious, immediate, and often irreparable harm on victims and survivors, including mental, physical, financial, academic, social, and reputational harm.

IBSA can greatly endanger the physical and mental safety, health, and wellness of victims and survivors. Those targeted by IBSA may be threatened with physical and sexual assault; stalked online and at their homes and workplaces; and harassed both online and offline. They also report a range of health burdens, from headaches and sleeplessness to anxiety, depression, post-traumatic stress disorder, and suicidal ideation.  Tragically, some victims of IBSA have died by suicide.

Victims and survivors of IBSA may also face extreme financial risks.  Their career opportunities may diminish if they miss school days, transfer institutions, or discontinue their education.  They might receive reduced income due to missed workdays, job termination, or lost opportunities for new employment or promotions.

Other extreme financial costs include housing relocation; fees for lawyers, therapists, security services, or reputation management services that monitor and remove images; and costs for new technology, including new digital devices or home security systems.  Victims of sextortion may also be coerced by the perpetrator into making significant financial payments.

Additionally, some victims and survivors experience reduced moral support from family, friends, significant others, and their community.  They may also encounter stigma or blame for the abuse, both by wider society and their personal support networks.

In 2017, a nationwide study found that 1 in 12 American adult social media users had been victims of IBSA and 1 in 8 had been threatened with IBSA.

The same study revealed that those who self-identified as women were significantly more likely (about 1.7 times as likely) to have been targets of IBSA compared to those who self-identified as men. While IBSA affects both women and men, evidence to date indicates that the majority of victims are women, and that women victims often face more serious consequences as a result of victimization. IBSA – like domestic violence, rape, and sexual harassment – thus disproportionately harms women and undermines gender equality.

Notably, however, in recent years there has been a significant increase in the number of male victims, specifically in financial sextortion scams.  For further details, see our Foundation RA Bulletin on Sextortion Scams here.

Following our 2017 study, Foundation RA and Florida International University collaborated on what may be the nation’s first large-scale, peer-reviewed research on sextortion during the height of the COVID-19 pandemic. Our research was funded by the National Science Foundation, and the first article was published in Victims & Offenders journal in January 2022. These findings revealed that those who experienced sexual intimate partner violence (IPV) prior to the pandemic were more at risk for sextortion during the pandemic. Of those who previously experienced sexual IPV:

  • Native Alaskan and Indigenous North American women were 6.77 times more likely than white women to experience sextortion during the pandemic.
  • African American women were 7.33 times more likely than white women to experience sextortion during the pandemic.

The same research revealed that those who faced sexual intimate partner violence (IPV) before the pandemic and identified as bisexual (8.9%) or lesbian (7.1%) reported the highest rates of sextortion. This observation aligned with Foundation RA’s earlier 2017 study, which showed that 17.9% of bisexual women reported having been targets of IBSA, a higher rate than any other group surveyed. Preliminary analysis from the 2017 study also revealed that respondents who identified as gay men and bisexual men appeared to be at great risk, though additional research is needed.

Lastly, findings also showed that participants aged 18–29 were the most likely to report sextortion victimization during the pandemic, at 5.4%.
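For readers interpreting figures like “6.77 times more likely,” comparisons of this kind are typically reported as odds ratios from a study’s statistical models (an assumption here, not a detail stated in this FAQ). As a minimal sketch using purely hypothetical counts, not the study’s data, an odds ratio compares the odds of an outcome in one group against a reference group:

```python
def odds_ratio(group_cases, group_noncases, ref_cases, ref_noncases):
    """Odds ratio: the odds of an outcome in one group divided by
    the odds of the same outcome in a reference group."""
    group_odds = group_cases / group_noncases
    ref_odds = ref_cases / ref_noncases
    return group_odds / ref_odds

# Hypothetical counts for illustration only (NOT from the study):
# one group reports 20 cases vs. 100 non-cases; the reference group
# reports 30 cases vs. 1000 non-cases.
print(round(odds_ratio(20, 100, 30, 1000), 2))  # prints 6.67
```

In this made-up example the first group’s odds (0.2) are about 6.67 times the reference group’s odds (0.03); the study’s own figures come from its published analyses, not from numbers like these.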

Image-based sexual abuse (IBSA) refers to the distribution of private, sexually explicit images of individuals without their consent. This includes both images originally obtained without consent (e.g., by hacking phones, using hidden cameras, or recording sexual assaults) and images that were originally consensually obtained (e.g., within the context of an intimate relationship) and later distributed without consent. IBSA is also sometimes referred to as Nonconsensual Intimate Imagery (NCII) or as Nonconsensual Pornography (NCP).

IBSA is frequently referred to by the misleading term “revenge porn,” and Foundation RA discourages the use of this label. The word “revenge” suggests that the victim-survivor did something to cause the violation of their intimate privacy, which inappropriately places blame on the victim instead of the offender. Further, the phrase “revenge porn” erases situations where the offender is motivated not by personal grievance but by financial gain, social status, gratification of voyeuristic impulses, entertainment, or simply a failure to consider the victim-survivor’s humanity at all. Many perpetrators are not motivated by revenge or by any personal feelings toward the victim.

Synthetic IBSA refers to visual material that is digitally manipulated using machine learning algorithms to make it appear that a person is nude, partially nude, or engaged in sexual conduct. The image, however, is not “real.”

Synthetic IBSA is often referred to as “deepfake” imagery and sometimes as “digital forgeries.”

Sextortion, or sexual extortion, is the threat to distribute an individual’s real or synthetic intimate material without that person’s consent if the person does not comply with certain demands. The offender typically attempts to coerce the person to pay money, send more nude or sexually explicit images, perform sex acts, stay in an abusive relationship, relinquish custody of children, or take some other action against that person’s wishes.

Child sexual exploitation/abuse material (CSEM or CSAM) is a visual depiction of an individual under 18 years of age who is nude, partially nude, or engaged in sexual conduct, even if that individual is now an adult.

*Please note that the information provided on Foundation RA’s website is primarily intended for adult victims and survivors of image-based sexual abuse. In the United States, the  National Center for Missing and Exploited Children (NCMEC) is the best source of assistance for minors.

Please visit our Step by Step Guide, which includes information, resources, and step-by-step instructions for victims and survivors of IBSA, synthetic IBSA, and sextortion.

If you are outside of the United States, this roster of organizations may be most helpful for you.

How to Find the Images

You may have heard from someone that your intimate images are online, but you might be unsure which sites are hosting them.  Below are a few suggested ways that you can search for those images.

  • First, you can search for your name using a search engine, like Google or Bing, to see if any offending material is in the results.
  • Another option is to conduct a reverse image search if you have access to the photo of concern.  One site that offers this is Google, and their instructions can be found here:  Reverse Image Search. If you are concerned about an intimate video, you can take a screenshot of various frames of the video and then use the reverse image search function.

If you took the images yourself

If you snapped the photos or took the video yourself, you own the copyright to them. To further protect yourself, you can take the extra step of registering your images with the U.S. Copyright Office: https://copyright.gov/registration/. Copyright gives you the authority to demand that sites remove your images based on copyright infringement (also known as a DMCA takedown).

If you want to submit your own DMCA takedown notices, you may need to find contact information for the site owner. You can do this by searching the domain name on DomainTools.com. Remember, you want to contact the site owner and the host, not the “registrar” shown in the WHOIS information listed.

If you would like additional information regarding the copyright and DMCA process, the expert team at Without My Consent has created this guide that may be useful.

How to Request Removal

Even if someone else took the image and you do not own the copyright, you can still ask a website to remove an intimate image if you did not consent to its distribution.

If you have documented the images, or if you prefer to skip that step, you can move on to requesting image removal from websites. Remember, if the images are removed before legal professionals collect digital forensic data, crucial evidence could be destroyed. Consider this carefully before requesting image removal.

Below is a list of tech companies, their policies, and their removal procedures. Please note that while we strive to provide up-to-date information, tech companies may relocate their pages from time to time.


Many nonconsensual or abuse videos are edited and reuploaded to porn sites and passed off as “consensual” or “verified” content, when the people in the videos are not actually the people claiming the content. In recent undercover videos exposing major porn sites like Pornhub, site representatives admit that they do not effectively verify the age, consent, and ID of each person appearing in content. This can make it nearly impossible to have abuse content removed from these platforms, causing permanent long-term damage and repeated traumatization to victims.

Perhaps you have images or videos of yourself or your partner on your phone, computer or other electronic device. These may be extremely personal and were never intended to be shared publicly. Hackers can easily access this content and then sell it for profit or post it online.

Hackers can compromise a device at any moment: they can access your camera through your cell phone, computer webcam, or nanny cam, and can break into your personal information over public networks. No one is entirely safe from hackers on the internet. The results can be devastating for victims, both financially and emotionally, and some victims have died by suicide because of these crimes.

This type of abuse is also growing rapidly in prevalence as technology expands and hidden cameras become more and more difficult to detect. The camera could simply be placed somewhere in a room by an intimate partner, or it could be hidden in a public place like a restroom, a locker room, or even a hotel room. Most people assume they have privacy in certain areas when, in fact, there could be a hidden camera almost anywhere.
