Perpetrators of AI sexual abuse often view their actions as a joke, new research shows
A survey of over 7,000 people in Australia, the United Kingdom, and the United States found that 3.2% of respondents reported creating, sharing, and/or threatening to share sexual deepfakes. Men, younger adults, non-white respondents, and individuals with a disability were more likely to engage in this behavior. 18% of respondents reported deliberately viewing such images, most often out of curiosity. The research was published in Computers in Human Behavior.

Sexual deepfakes are synthetic sexual images, videos, or audio recordings created or altered with AI or other digital tools. They are usually created to make it appear that a real person is naked, engaged in sexual activity, or saying sexual things, even though this did not actually happen. A sexual deepfake can use a real person's face, body, voice, or likeness and combine it with fabricated sexual content.

Many sexual deepfakes are nonconsensual, meaning the person depicted did not agree to the creation or sharing of the material. Nonconsensual sexual deepfakes can be used for harassment, humiliation, blackmail, revenge, or sexual exploitation. They can …

