Focused Monitoring... Resolution to Request Correction for 614 Cases

K-Pop Idol Deepfake Damage... Korea Communications Standards Commission Urges Agencies to Actively Report

[Asia Economy Reporter Cha Min-young] The Korea Communications Standards Commission announced on the 18th that, through focused monitoring, it had identified a total of 614 cases of sexually explicit fake content, such as 'deepfake' videos, produced and distributed using the likenesses of K-pop idol singers, who have recently gained worldwide popularity, and that it has decided to request corrective measures (access blocking) against them.


Deepfakes represent a new type of crime in which artificial intelligence (AI) technology is abused to create fake pornographic videos, which are then sold or used to blackmail victims.


All 614 cases blocked this time involved sexually explicit fake content using the likenesses of female idol singers. Of these, 418 cases (68.1%) were deepfake videos distributed on illegal pornography sites, while 196 cases (31.9%) were synthesized images circulated through social networking services (SNS).


The Korea Communications Standards Commission stated, "Editing, synthesizing, processing, or distributing content in a form that could cause sexual desire or shame, against the will of the person depicted and without their consent, is a clear criminal act under Article 14-2 of the Act on Special Cases Concerning the Punishment of Sexual Crimes."



The Commission added, "Since there are limits to eradicating illegal information through public regulation alone, the production and distribution of sexually explicit fake content, which began with misguided fandom, must be stopped immediately. When illegal content such as sexually explicit fake videos is identified, it is essential that agencies take the lead and actively report it to the Commission, in order to protect victims' personality rights and support their recovery."


This content was produced with the assistance of AI translation services.

© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
