[The Editors' Verdict] Evaluation of Digital Sex Crime Legislation and Platform Liability

Following the recent Telegram Nth Room incident, the Sexual Violence Punishment Act was amended to make possession of illegal sexual footage punishable by up to three years' imprisonment or a fine of up to 30 million won. The Youth Protection Act was also amended to allow public disclosure of the identities of individuals who commit sexual crimes against children and adolescents, even non-aggravated ones.


Although there may be debates regarding the principle of proportionality, this reflects a societal consensus on the need for strong punishment of digital sex crimes. Furthermore, to preemptively suppress digital sex crimes, amendments were made to the Telecommunications Business Act and the Information and Communications Network Act, obligating internet platform companies such as portals to take technical and administrative measures to prevent the online distribution of illegal footage.


Internet companies oppose this, arguing that it forces private businesses to censor private spaces. In response, the government clarified that the information subject to deletion or access blocking includes illegal footage, illegally edited materials, and sexual exploitation materials involving children and adolescents that are publicly distributed, and does not include users' private conversations. Businesses are only obligated to prevent distribution upon reports or deletion requests and are not required to conduct independent monitoring.


From the platform's perspective, it may seem unfair to be tasked with policing users' illegal activities, since the enforcement and punishment of crimes are the state's duty and authority. However, with the widespread expansion of the internet and the countless illegal activities occurring online, it has become difficult for the state alone to control these acts, prompting legislation that places take-down obligations on platforms.


The basis for imposing such obligations includes the fact that platforms gain economic benefits by operating virtual spaces, possess the technical capability to delete illegal content, and should bear responsibility as administrators for illegal acts occurring within the spaces they manage.


Already, the Copyright Act recognizes that online service providers, as intermediaries, can be held liable in addition to direct infringers. In particular, webhard service providers are required to take technical measures to block illegal transmissions of copyrighted works upon the rights holders' requests.


The Telecommunications Business Act mandates that platforms must immediately delete or block access to illegal footage publicly distributed online once they clearly recognize its distribution. For webhard service providers, technical measures to prevent illegal distribution are compulsory.


The Information and Communications Network Act allows platforms to delete or temporarily block illegal information that invades privacy or is defamatory, either upon the victim's request or at the platform's own discretion. Ultimately, the amended laws differ from existing law in two respects: the platforms' duty to act has been expanded from illegal footage to all digital sex crime materials, and the technical-measures obligation, previously imposed only on webhard service providers, now extends to general platform operators.


On this point, the Supreme Court has long ruled on platform liability for copyright infringement. It has held that responsibility arises not only when a platform receives a specific, individual request from a victim to delete or block a post, but also when the platform is clearly aware of the posted content or could obviously recognize its existence, and when managing and controlling that content is technically and economically feasible. In other words, liability requires both recognizability and controllability, which aligns with the amended laws.


However, in the process of specifying the scope of obligated businesses and the content of technical measures in enforcement ordinances, careful consideration must be given to the burden on businesses and the practical feasibility of fulfilling obligations, thereby balancing the public interest of preventing digital sex crimes with the private interests of private platform operators.



Lee Seong-yeop, Professor at Korea University Graduate School of Technology Management and Deputy Director of the Cyber Law Center


This content was produced with the assistance of AI translation services.

© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
