FBI warns of increasing use of AI-generated deepfakes in sextortion schemes

The FBI on Monday warned of the increasing use of artificial intelligence to generate phony videos for use in sextortion schemes that attempt to harass minors and non-consenting adults or coerce them into paying ransoms or complying with other demands.

The scourge of sextortion has existed for decades. It involves an online acquaintance or stranger tricking a person into providing a payment, an explicit or sexually themed photo, or some other inducement by threatening to release compromising images already in the scammer's possession to the public. In some cases, the images are real and were obtained from someone the victim knows or from an account that was breached. Other times, the scammers only claim to have explicit material without providing any proof.

After convincing victims their explicit or compromising pictures are in the scammers’ possession, the scammers demand some form of payment in return for not sending the content to family members, friends, or employers. In the event victims send sexually explicit images as payment, scammers often use the new content to keep the scam going for as long as possible.
