Seo-yeon Park was lying beside her partner in a motel room near Sinchon, a lively neighborhood in the South Korean capital Seoul, when she was stirred awake by something moving near the foot of her bed.
A young man was standing over her, his face hidden behind a smartphone. He moved the phone from one hand to the other, readying a new angle as Seo-yeon’s partner slept at her side. Seo-yeon leapt up, and the intruder ran off. She chased him out of the motel into the streets, but he was too fast, disappearing down a side street.
She figured he had picked the lock or gotten in some other way. “I was very angry because my wallet was there and my money was there, too,” Seo-yeon told me. But he didn’t want her money. All he took was her photo.
She rushed to the motel owner, urging him to call the police and asking if she could look at closed-circuit surveillance camera footage from the motel manager’s office. But the owner offered little help, telling her there was no such footage. She later learned that he’d lied to her and shared the video from the incident with the police. The indifference she met was telling.
At only 17, Seo-yeon had reason to believe that she was the target of a digital sex crime and that the man would publish the photo of her, asleep, on one of the many thousands of sites that host illegal photographs and videos of women. Few institutions were available to assist her. No cameras, no government officials and no law enforcement agency offered much help, even though attacks like this were becoming commonplace. Three months later, the intruder was arrested and sentenced, but because he was a teenager, he was released on probation without serving time.
For many young people in Korea, this story will sound familiar. Despite years of public outrage and legislative efforts to curb digital sex crimes, the country remains home to a profitable industry that exploits non-consensual images of women, many of them underage, and even coerces them into sexual acts that are filmed and distributed online. This type of covert filming even has its own name: “molka,” a Korean shorthand for hidden camera that refers to both the device and the footage it captures.
몰카
[moːɾkʰaː]
In 2018, a man was found to possess 20,000 illegally captured videos when he was arrested for installing spy cameras in motel rooms. The country’s now-former president, Moon Jae-in, soon thereafter acknowledged that illegal spy cameras had become “part of daily life.” That same year, thousands took to the streets, demanding legislative action on molka crimes as part of the global #MeToo movement. But today, stories about cameras installed for illegal filming still make headlines weekly.
Some of Seo-yeon’s friends soon became targets of digital sex crimes too, their intimate images captured by pinhole cameras lurking in bathrooms, subway stations and motel rooms, then leaked online. Most often those images and videos were taken by strangers. Other times they were distributed across social media by embittered former partners. Seo-yeon herself never found out what happened to the photos taken of her. She did not want to know.
Instead, she wanted to find a way to stop these crimes from happening. Seo-yeon formed a group called Digital Sex Crimes Out, an organization that, from 2017 through early 2022, sought harsher laws against illegal filming and digital sex crimes in South Korea. She went by the nom de guerre Ha Yena for her activist work educating the public and law enforcement about the real-world consequences of those digital crimes: they endangered children, triggered stalking incidents and provoked immense psychological harm. Sometimes they ended in suicide.
As Ha Yena, she became part of a small but significant network of people in South Korea who are fighting to prevent digital sex crimes, sometimes at the price of enacting laws with questionable implications for privacy. Well into the era of #MeToo, Seo-yeon and her contemporaries found themselves at a crossroads between privacy protection and crime prevention, echoing the many battles that have played out as more governments around the world introduce legislation meant to curb online crime.
Korea is a society of advanced technology — it boasts some of the most robust internet infrastructure in the world — but it is also a place where custom and tradition have a powerful influence on social norms and public policy. Technology and its uses continually outpace political and social reforms.
South Korea’s highly digitized society and lightning-fast internet speeds make it easy to circulate illicit footage. Once a file is on the internet, it can be difficult, if not impossible, to remove for good. In one criminal case, illegal videos and photos were posted online and made accessible for a monthly subscription fee. Molka is appealing not just for its salacious content but also for its profitability. Two estimates suggest non-consensual videos shared online can fetch between $1,667 and $4,167 per gigabyte of footage, roughly an hour and a half of recordings.
According to the Seoul Metropolitan Government, there were at least 75,431 closed-circuit television cameras operating in the city as of December 2020, about one camera for every 132 residents. The country has a legal framework for protecting identifying information about individuals, but there are significant exceptions that allow law enforcement and other agencies to keep relatively close watch on people of interest. Surveillance carries an air of routine. People seem to accept it as a part of daily life, a necessity for the relative security it ensures against violent crime and robbery, and for the contact tracing that aided South Korea’s Covid-19 response.
But if surveillance seems ever present and acceptable in Seoul, a cross-cutting culture of privacy also prevails: Vehicle windows are typically tinted for UV protection but also for privacy, and rare is the invited house guest, no matter the intimacy of one’s relationship to a family. The illusion of control over one’s personal domain is routinely undercut by those thousands of closed-circuit security cameras.
And although fines are often levied against those who distribute or are caught with molka footage, they rarely seem to dissuade further crimes, setting an entire country on edge. Parents who allow their daughters to live outside the family home before marriage, a rarity in traditional Korean families, tell their children to get apartments on top floors to avoid being filmed through first-floor windows or hallway cameras.
In 1997, the South Korean department store chain Hyundai (of motor vehicle renown) installed dozens of cameras in the bathrooms of its buildings in Seoul, after executives cited incidents of thieves rushing into restrooms to hide merchandise in handbags. Public criticism was swift, and the cameras came down. But soon, the use of cameras across the country boomed. Electronics became cheap and easily available in shops and stalls around Seoul and other major cities. And by the early 2000s, most South Koreans were carrying that same equipment — mobile phones — in their pockets.
The push-and-pull between privacy and security is nothing new. As with any technology, some cameras were installed with legitimate intentions — to monitor private property, oversee patients at nursing homes or monitor babies as they slept — while others had more dastardly uses. Cameras could be used for spying on employees in break rooms and bathrooms, or given as gifts, in the form of a hidden camera alarm clock, to an unsuspecting colleague who could then be tracked.
National police data shows that in 2010, between 1,100 and 1,400 spy camera crimes were committed. By 2018, that number had grown to 6,400. Of the 16,201 people arrested between 2012 and 2017 for making illegal recordings, 98% were men, while 84% of the people recorded during that period were women. Prominent cultural figures, including K-pop stars, were accused and convicted of trafficking in such footage around this time.
The Digital Sex Crime Victim Support Center, established in April 2018, helps targets of digital sex crimes by deleting videos and providing additional support for criminal and civil investigations, medical care and legal assistance. Of those who sought the center’s help between 2018 and 2021, more than 76% were women, with the largest share in their teens and twenties.
Molka crimes became a central theme for South Korea’s #MeToo movement. Oftentimes perpetrators would walk away with time served and negligible fines while their videos continued to circulate on the internet. In the summer of 2018, upwards of 70,000 women took to the streets of Seoul to demand an end to molka crimes and protest the lackadaisical response from government and the judiciary.
People convicted of molka crimes can face up to five years’ imprisonment and fines of over $26,000, but data suggests that they rarely face such steep penalties. From 2014 to 2016, over 60% of those charged with digital sex crimes received fines averaging less than $2,200.
The question of how to prosecute these crimes and stamp out their long-tail effects has been more complicated than one might imagine. Child abuse images are illegal in Korea, as is pornography. In criminal cases involving pornography, all parties involved in its creation — including those who appear in a film or an image — are considered responsible. Digital sex crimes are largely handled in the same way as illegal pornography. Police have begun to show some awareness in cases where people had no knowledge that they were being filmed. But the prevalence of these incidents has laid bare the popular assumption that targets of molka crimes can somehow be blamed for what has happened to them.
Sexism and bias among law enforcement seem to be a contributing factor. A 2021 report by Human Rights Watch found that during police investigations, officers cast doubt on those who reported being filmed without consent, suggesting that targets had somehow invited or provoked these incidents. Officers would berate people for wearing provocative clothing or sending images to their intimate partners, things the authorities believed they shouldn’t have done in the first place.
Digital sex crimes reached new highs during quarantine restrictions at the peak of the pandemic. In one case, a man was arrested and sentenced to 10 months’ imprisonment for filming a woman’s shower stall inside a Covid-19 quarantine ward at a hospital in Wonju.
The uptick could also be attributed to the country’s quick implementation and adoption of 5G networks. South Korea has one of the highest internet penetration rates in the world, with 98% of the country’s population online at the start of 2022.
Today, a casual search on Naver, Korea’s answer to Google, yields dozens of cases in which young people are seeking help because of digital sex crimes, often describing run-ins with law enforcement officials who are unsympathetic or clueless about the damage that can be done in a virtual space.
“Some guy was following me at the bus stop and I heard his smartphone camera shutter go off,” one poster wrote on May 15, 2020, “but he denies he took any photos of me and deleted all of the photos so there’s no proof of them. So I’m trying to find a lawyer to represent my case to protect other victims like me.”
Another wrote on October 26, 2021: “I was at a motel w/ my girlfriend and noticed a camera across the street and reported it to the police. They caught him in three weeks. The police IDed the footage and confirmed it was my gf in the video. … How can I check if the illegal video was distributed somewhere?”
Another wrote on June 7, 2022: “My bf illegally filmed me after I got out of shower in a motel. I can’t get it out of my mind. I feel so ashamed and guilty. …If I press charges to the police will I know the status of the investigation?”
That these posters were willing to report these crimes at all was a sea change in attitude from just a few years ago. While most lawmakers may still be catching up, many young people and technology experts have adopted more nuanced perspectives on where culpability should lie and how justice might be sought for targets.
Two years after the break-in at Seo-yeon’s motel room, student activist Seo-hui Choe was on her phone late into the evening. As a member of a group called Project ReSET (Reporting Sexual Exploitation in Telegram), Seo-hui would use a VPN and various fake identities to log into chat rooms on Telegram, Discord and the popular Korean chat app KakaoTalk where sexually exploitative videos were being shared.
For more than a year, she had been following media reports about videos of sexual assault and child pornography circulating through private, encrypted messaging apps. Phishing-style attacks and social manipulation — catfishing, online dating, promises of K-popesque stardom — led users to produce exploitative content, which was then used to blackmail them into handing over more images and videos.
The male-dominated chat rooms and online communities were reported to authorities, but police largely ignored the threat. Student journalists and activists, like Seo-hui, began gaining access to the rooms and reporting what they saw to Telegram and the police. The messaging app, according to Seo-hui, did nothing. And the authorities said they were powerless to pursue an international company like Telegram. The company did not respond to requests for comment for this article, but its terms of service do prohibit the distribution of illegal pornography on publicly viewable channels.
Seo-hui reviewed and reported the disturbing footage she found. She would see young women’s photos and videos trafficked and offered for sale, with payments accepted in cryptocurrency.
A deepening divide between young men and women was soon exploited to help elect the conservative Yoon Suk-yeol to the presidential Blue House in early 2022: He ran on a platform of anti-feminism that promised to abolish Korea’s Ministry of Gender Equality and Family. Campaign promises aside, a ministry spokesperson said in an email that “the new government will make it a national task of guaranteeing the right to be forgotten and strengthening protection and support of victims of digital sex crimes.”
Meanwhile, the monitoring became a recurring nightmare for Seo-hui. Her work began to be noticed, and she became a target for abuse. Some online demonized her and her fellow activists, believing them to be radical, dangerous feminists in the employ of the Ministry of Gender Equality and Family and the previous presidential administration.
“After reporting these things, I was supposed to sleep, but all I could remember were the victims and the footage,” Seo-hui told me. “My blankets felt like they were on fire.” Her skin would crawl. Sometimes she would cry.
The media reports and Seo-hui’s own work led to continued revelations about the existence of these communities. In two group chats in particular, participants were distributing sexually exploitative videos and blackmailing dozens of women into sharing private videos online. Some of the footage included rape. At least 103 individuals, 26 of whom were minors, had their videos or images sold to over 60,000 people. The chat rooms, known as the “Nth Room” and “Doctor’s Room,” were eventually shut down, and the users behind the channels arrested and convicted. On November 26, 2020, Cho Ju-bin, the 26-year-old “Doctor” who controlled the eponymous chat room, was sentenced to 40 years in prison for blackmail and sexual exploitation.
Shocking as the case was, this was not the first time that the harms and real-world impacts of digital sex crimes, to say nothing of the difficulties of regulating and prosecuting them, should have been apparent. There was already a precedent for pursuing and preventing such crimes. The website Soranet, described in headlines around the world as “South Korean porn,” was taken offline in 2016, due in part to a joint operation between the Dutch government and U.S. Immigration and Customs Enforcement: the website’s servers had been hosted in the United States before moving to the Netherlands, where they were seized. Soranet’s co-founder was sentenced to four years in prison, a sentence many campaigners criticized as too light. But taking down a website and arresting its founders had little effect on the proliferation of such material. Images kept being uploaded faster and shared more widely.
As Seo-hui scrolled through chat room after chat room, she saw little hope that this latest wave of outrage and police action would change much, even with reports of arrests and prosecutions leading primetime news broadcasts. The laws, in the aftermath of the Nth Room scandal, still seemed to treat sexual exploitation as a form of pornography, with all parties considered co-conspirators.
“People didn’t understand that sharing illegal sexual exploitation videos is a crime,” Seo-hui said. “So we wanted to educate people on this issue. It’s not porn, but sexual exploitation. After Nth Room we were given a lot of promises that weren’t kept. So we were just relaying messages from the victims to the police.”
The videos, she wanted to impress upon the police and the public, were a violation; they were nonconsensual and had to be treated as serious crimes, not as the consequences of naivety and debauchery. Seo-hui says the Nth Room was neither an anomaly nor a turning point in bringing about real change and accountability for the molka crime industry.
“It’s just the tip of the iceberg,” she told me. “It happened before, and it’s happening still.”
Since 2020, Seo-hui has stopped monitoring the internet for examples of digital sex crimes. The vicarious trauma of witnessing those crimes took a toll on her. And she reached a point where she felt powerless to effect change. If she stopped monitoring and reporting for a minute, she told me, dozens more rooms and tertiary conversations would have cropped up by the time she returned. If she stopped for one night, putting her phone away so that she could rest before another day of classes, thousands more would appear across the web.
Seo-yeon, too, felt disheartened and diminished by the prospect of an ever-increasing number of digital sex crimes and a society that showed little respect for women. She disbanded Digital Sex Crimes Out in part because of the growing resistance to the group’s work and the risk to her personal safety.
“There is inequality online,” she said. “But nowadays I just avoid those environments altogether so I think less about inequality.”
Instead, Seo-yeon decided to focus on her career as a software engineer as a form of resistance. “I wanted to understand computer technology in order to understand how to push for laws to prevent digital sex crimes,” she told me. Seo-yeon says there is still much work to be done. “Just because we have a new law, that doesn’t mean everything is functioning well now,” she said about her work advising Korean courts on digital sex crime prosecutions.
“I don’t think it’s just a Korean issue. In countries like the United States and the United Kingdom, illegal sex videos are big business,” she said. “Everyone lives in the digital age.”
Seo-yeon is not alone in her belief that this is not just a Korean issue. It may be that Korea is simply further along than most other countries when it comes to the quality of its technological infrastructure and the omnipresence of cameras.
“I see these women as the canaries in the coal mine in a way,” Heather Barr, the associate director of the Women’s Rights Division at Human Rights Watch and the author of a report on digital sex crimes in South Korea, told me. “I think that what’s happening — this particular issue — is very dystopian, but also a sign of where the rest of us may be going.”
Behind a locked door, off a nondescript lobby in the ritzy Gangnam district of Seoul, is the office of the Santa Cruise company, self-described as a provider of “digital laundry services.”
The prevalence of molka crimes in Korea has given rise not only to groups like ReSET, but also to an industry of digital reputation managers. For roughly $2,000 a month the company does its best to wipe those digital traces from existence. Some customers have 10-year subscriptions. Others pay in three-, six- and 12-month intervals.
“I didn’t set out to do this job from the beginning,” Kim Ho-jin, the CEO of Santa Cruise, said. Santa Cruise began as a model and talent agency. But soon his clients came under attack online, with accusations and rumors flung across Google and Naver. “They were not able to go to school and were in and out of a mental hospital because of malicious comments.”
Kim began filing requests to the search engines to remove the material and had some success. People with similar problems began to seek his help. Kim now counts scores of entertainers, K-pop stars and company executives among his clients.
Today, more than a quarter of Santa Cruise’s business comes from people who believe they are targets of digital sex crimes and want to manage their online reputations. Each month, Kim’s team of young researchers sends clients a report of the data found and deleted. Teenagers comprise roughly half of Kim’s business, and twenty-somethings represent about 30% of his clientele.
Digital entertainment culture is paramount for many teens and young adults in South Korea, who derive social value and even their belief systems from idols on social media or television. In the last several years, this blurring of digital and physical existence preceded the suicide deaths of prominent K-pop figures like Goo Hara, Kim Jong-hyun and Jang Ja-yeon, all of whom died in their late twenties after being illicitly filmed in private or by partners during sexual acts. The videos were then distributed or streamed online.
Hate speech and derogatory comments online have also led to suicide deaths. Korea’s strictly hierarchical culture may have deep roots, but it has a profound hold in the digital world, where it is mirrored by the reward-driven, facile acceptance or rejection of people online, and young people are ill-equipped to differentiate between the two worlds. K-pop stars have also used spy cameras to film unsuspecting romantic partners or strangers, incidents that make these kinds of crimes seem somehow acceptable or even cool.
In 2019, the K-pop star Seungri, of Big Bang, and a nightclub owner were found guilty in a scandal that involved spy camera videos, prostitution and embezzlement, among a slew of other offenses. Seungri was alleged to have embezzled $951,000 but was sentenced to just three years in prison, later cut on appeal to one and a half years along with a smaller fine.
The issue has also entered the national zeitgeist through television, where one popular program depicted young men gifting a hidden spy camera to a colleague (“Business Proposal”) and another featured an episode in which the molka victim dies by suicide (“Hotel Del Luna”).
“These young people have a lot of power. So the problem is not so much what these people do but how society responds to them,” Dr. Pamela B. Rutledge, a social scientist and director of the Media Psychology Research Center in California, told me.
“You see something in the media and then you do it,” she said. “You see something, you process that in your psychosocial environment, and then you watch to see what happens to that person, all the while assessing whether that’s something you can actually do, but also to see whether they are rewarded or punished.”
At the Santa Cruise offices, there was little to suggest anything beyond a desire to rid the internet of its ability to wreak lasting havoc over a mistake or regret. But a game of blame and shame played out nevertheless. For teenagers who cannot afford to pay for services provided by Santa Cruise and who are too ashamed, embarrassed or worried about telling their parents they need help, Kim offers his services pro bono, with one caveat. They are made to write a “reflection letter” about digital citizenship and the choices that led them to him.
“Even the victims are to be blamed,” Kim told me. “The fact that they film themselves is wrong in the first place. They have to recognize that. The people who fell victim to spy cams are also to be blamed because they weren’t being careful enough.” He added, “If they don’t agree to write these letters, I don’t delete the illegal content for them. So they have to agree with me.”
The crimes surfaced and reported by both Santa Cruise and ReSET provided ammunition for the passage of what became known colloquially as the “Nth Room law.” In the aftermath of the Nth Room case, amendments to Korea’s Telecommunications Business Act brought new illegal content filtering and transparency reporting requirements for big social media companies. But most people did not realize what the law would mean until it went into force and filtering notifications began to appear on their phones.
In group or public chats, if you uploaded a video of anything — from a cute cat to an oblivious naked person sleeping in their bed — you would receive a notification that looked something like this: “According to the new Telecommunications Business Act, the Korea Communications Standard Commission is reviewing the content to see if it’s illegal.”
Another similar message read: “Identification and restriction of illegally filmed content: videos. Compressed files that are sent through group open chat rooms will be reviewed and restricted from sending if it is considered an illegally filmed content by related law. You may be penalized if you send illegal filmed content, so please be cautious while using the service.”
Public outcry over censorship soon overshadowed memories of the Nth Room case and the sexual crimes committed against scores of women and girls. One of the law’s biggest opponents, Open Net Korea, a nonprofit that works to maintain freedom and openness for internet users, had filed a constitutional complaint long before the outrage reached the public and political spheres, arguing that the policy infringed on the public’s freedom of expression and right to know.
“When something gets done in the National Assembly, I think it appeases the general public, and we get to move on from it,” Jiyoun Choe, a legal counsel at Open Net, told me at the organization’s office in Seoul. “But we shouldn’t move on unless it’s actually been taken care of and solved, which it’s not really being done right now.”
The filtering law went into effect in 2021, but most major companies based outside of Korea have yet to fully implement the process, due in part to some of the technical hurdles it presents. Under the law, companies can either put in place their own filtering systems that will prevent illegal content from being posted and distributed, or they can use a system built by Korea’s Communications Standards Commission. Those that choose to use their own systems must have them vetted and approved by the Commission. Filtering is a mandatory requirement for all websites operating in Korea that handle more than 100,000 daily users and offer some way for users to post original content.
The Commission’s system mimics software built by Microsoft that major tech companies like Google and Meta use to combat child sexual abuse and trafficking. It assigns a unique number — similar to a barcode, but known in technical terms as a hash value — to photos and videos that contain illegal images so that they can be more easily found and removed from the web whenever they are re-shared or posted. This information is then placed in a database maintained by the Commission, which prevents the recirculation of footage found by the police or reported by users, like those working at Santa Cruise and ReSET.
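In technical terms, the scheme is a blocklist lookup. Below is a minimal sketch in Python of the general idea, not the Commission’s actual system: the function names and the blocklist entries are hypothetical, and it uses an ordinary cryptographic hash, which only matches byte-identical copies, whereas tools like Microsoft’s PhotoDNA compute perceptual hashes designed to survive resizing and re-encoding.

```python
import hashlib

# Hypothetical blocklist standing in for the Commission's database:
# hash values of footage already ruled illegal.
BLOCKED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block(path: str) -> bool:
    """Flag an upload whose hash matches a known-illegal entry."""
    return file_hash(path) in BLOCKED_HASHES
```

The design explains both the system’s power and its limits: a database match can stop a known file from recirculating, but footage that has never been hashed, including newly created material, passes through untouched.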
But, of course, the system cannot prevent new crimes. “It doesn’t criminalize the actual activity itself,” Open Net’s Jiyoun said. “This law itself will not be effective in preventing people from going back on Telegram, or even deeper into the internet, to continue doing what they were doing altogether,” she said.
“It just asked the company to restrict what’s being shared.” In so doing, she said, “we are supposed to blindly trust that they’re doing their jobs. We don’t have information on how often [the hash database is] updated, or how they presume to know if this content was created illegally. So there’s a lot of problems with transparency.”
And the burden the law has placed on companies is twofold: some fear they are signing onto a future censorship apparatus, given that the Korean government developed the software. What’s more, the additional server and networking capacity required could unduly burden smaller social media platforms or special interest forums that cannot afford the extra costs.
While foreign companies like Meta and Alphabet were given an extension to implement their own connection to the government’s database of hash values, Pinterest was the only foreign company operating in Korea that agreed to use the government’s proprietary software to vet its users’ content. (Pinterest did not respond to Coda Story’s request for comment or additional information.)
“Many broadcasters misunderstand filtering technology. You think that when a video is uploaded, we look at the content and filter it,” said Kim Mi-jeong of the Illegal and Harmful Digital Content Response Division of the Consumer Policy Bureau at the Korea Communications Commission, which oversees the Korea Communications Standards Commission. “That’s not the case. There is already a video that [the Commission] has determined is illegal. When a user uploads a video, technically, only feature information is compared to determine whether the video is illegal or not. So that’s a huge misunderstanding.”
In a statement, the Commission said it only targets “public bulletin boards, not private messages,” though many of these public boards also include anonymous chat rooms.
“Now, people have come to realize that anyone can be punished if they take pictures of someone else’s body,” said Lee Young-mi, an attorney and director at the Korean Women’s Bar Association. Lee noted that the benefit of such a law is that it shifts public perception on right and wrong, challenging long-held beliefs in a society that remains hierarchical and patriarchal.
“In terms of reducing the number of children who [are exploited], I think it’s very good. It’s positive,” Lee said of the filter law. But, she added, companies like Apple are not cooperating enough with law enforcement and government requests to turn over data and information. She said companies should be less concerned with privacy and more concerned with investigating criminals who use the technology, whether it be hardware or software, to do harm.
Soo-jong Lee, a criminal psychologist at Kyonggi University, told me that even with the enactment of the law, it was difficult to change the culture. She explained how, as a side effect of K-pop culture, people are seeking stardom through random chats and illicit messages. At the same time, “our culture also blames the victims,” she said.
“We say the world has changed; we say [blaming victims] is not acceptable. But it happens only on the surface. Below that surface, there is still a sense of purity that people care about. There must be a discriminatory view of women in particular.”
The filter law became political fodder during the recent presidential election. It divided the ruling and opposition party candidates, with People Power Party presidential candidate Yoon writing on Facebook, “If videos of cute cats and loved ones are also subject to censorship, how can such a country be a free country?” The Democratic Party presidential candidate Lee Jae-myung said, “All freedoms and rights have limits.” The divisions within the culture were being drawn along both gender and free speech lines, with many young men feeling the law was an overreaction.
People innovate faster than laws can be written and enforced. And a hurried approach to national or internet security can create problems down the road: however well-intended the system’s use is now, it could set the country on a path to future misuse.
“As there have been cases of malfunctions in censorship, I think that the law is poor and lax, and if you want to prevent the distribution of illegal things, I think that follow-up measures through reporting are usually correct,” one male teenager at Seoul National University told me. He did not want to give his name. “I think that controlling and censoring things in advance can lead to a system where all users are seen as potential criminals.”
Seo Ji-hyun, a former public prosecutor at the Tongyeong branch of the Changwon District Prosecutors’ Office, became widely recognized as a pioneer of Korea’s #MeToo movement after she said in a live television interview that she was groped by her superior. Seo felt that the filter law was a step in the right direction, though ineffective. Those who criticized it, she said, missed the point that the subject matter needed to be addressed, and any measures to further crack down on sexual exploitation, trafficking and digital sex crimes were welcome.
She also believes that well-drafted and thought-out legislation can spur social and civic change. Yet, despite her own work on the Ministry of Justice Digital Sex Crimes Task Force this past year, little has changed. Her team made 11 policy recommendations to the National Assembly on how to prosecute and handle digital sex crime cases. Among those recommendations was a plan for how to effectively seize and prevent the redistribution of illegal videos. But only one recommendation, to unify an application process for survivor support, was implemented in April 2022.
It may be years before Open Net Korea’s constitutional court complaint is resolved. Jiyoun said that the law does not help targets enough and, in the meantime, places “a lot of the onus on the corporations, which could be detrimental to the internet.”
“Having [social media companies] be responsible for any content even entering their platforms would just give the companies incentive to not allow more content onto their platforms, which would be bad for democracy,” Jiyoun said. “Companies would have to cover their liability by not allowing anything to be uploaded to the internet. That would hinder the role of the internet as a vessel for information, where people can whistleblow or participate in the #MeToo movement, everything that is needed for democracy to thrive.”
Dressed in matching yellow vests, So-yeon Park and Bo-min Kim drove to the Wonju Hanji Theme Park, a cultural center devoted to traditional Korean printing methods and paper pressed from the bark of mulberry trees. The park’s managers greeted them with a warmth reserved for old friends.
So-yeon placed a yellow A-frame resembling a “Caution Wet Floor” sign outside the women’s restroom and made sure no one was inside. Bo-min set a hard plastic storage case on the sink vanity and pulled out a pair of spy camera detectors. The two women set to work.
“This is a new building, so there are no holes for installing a hidden camera,” So-yeon said as she pushed open all of the stall doors and held the camera detector to her eye.
The flashing red lights flickered across toilet bowls and the walls of each stall, waiting to bounce off hidden cameras and transmitters. She then pointed the instrument at the ceiling, checking the air conditioner unit and the extinguisher system. The two women placed small, circular blue stickers over anything that resembled a hole.
“If we see a trash can, a lighter or a soda bottle, we look into it,” Bo-min said, noting that spy cameras “can look like just about anything.”
Son-hae Young is no stranger to these mechanisms. The founder of Seohyun Security, a company that specializes in removing illegal cameras, wiretaps and other tracking tools, Son-hae trains police officers and corporations on security precautions and works with a team to sweep hotels and school buildings. “Inspection is done with the national budget here. It is necessary to let people know how hidden cameras are being modified and how fast the technology is changing,” he told me.
“There is no other place in the world,” he said, “where elementary, middle and high schools are regularly inspected for hidden cams.”
From a brown paper bag he pulled a chalkboard eraser, a bottle of Coca-Cola, a clock, a mirror and a USB storage drive. All of them contained hidden cameras.
In September 2018, in response to tens of thousands of #MeToo protesters holding banners proclaiming, “My body is not your porn,” the Seoul City Government announced it would step up public bathroom inspections, assigning 8,000 employees to check the city’s more than 20,000 bathrooms daily, up from the previous 50 employees and monthly inspections. By law, South Korean cell phones must now emit loud shutter noises when a photo is taken, a feature that cannot be deactivated. Teams are also deployed to check bathrooms and locker rooms in schools.
Today, Seoul’s 8,000-strong camera detection crew has been all but disbanded. The official line was that the crews found few cameras, given how rapidly inspection teams removed any devices they discovered. They now run spot checks, sometimes teaming up with a security company like Seohyun Security, to check restrooms twice monthly. But on President Yoon’s watch, digital sex crimes are expected to rise substantially, even as underreporting remains a problem.
Various South Korean federal and municipal agencies responsible for efforts to curb the rise in spy cameras in public places — including the Seoul City Government’s Women’s Safety Business Team and the Department of Women and Family — declined to comment for this article.
As we drove back to City Hall, So-yeon told me she was proud of being a deterrent and felt that the visibility of her team reduced such crimes, at least in her city. Bo-min’s daughter lives in Seoul. “She is very afraid to go to the bathroom on the subway or anywhere public,” Bo-min said. “We are very proud, and she is very proud that we are doing this work.”
Back in Seoul, on the red line at the Jeongja subway station, a train glided in. The platform doors opened and a queue of commuters stepped out of the car before those waiting to board stepped in, everyone’s politeness on public display.
The doors closed behind the passengers. A large electronic eye, the prototypical vision of HAL 9000, appeared as the subway car doors met — a public service poster. “Ban illegal filming. Your shutter sound can lead to jail time,” the text below the eye read, followed by a stark hashtag warning: #iwillwatchyou.
No further explanation is required for riders who can never be certain of who is being watched and who is doing the watching.
To report a crime involving non-consensual intimate images, or learn more about how to support survivors, visit https://stopncii.org.