When deepfakes were being talked about last year, the speculation was that they would be used for political mischief. Instead it's 'mostly porn'; what a surprise!…:
[…] The research comes from cyber-security company Deeptrace. Its researchers found 14,698 deepfake videos online, compared with 7,964 in December 2018.
They said 96% were pornographic, often with a computer-generated face of a celebrity replacing that of the original adult actor in a scene of sexual activity.
While many of the subjects featured were American and British actresses, the researchers found that South Korean K-Pop singers were also commonly inserted into fake videos, highlighting that this is a global phenomenon.