AI chatbot that simulates intimacy growing in popularity among South Korean teens
Zeta has grown at breakneck speed, with some 900,000 monthly active users as of August 2025.
PHOTO ILLUSTRATION: PIXABAY
SEOUL – The most heavily used AI chatbot app in South Korea by daily engagement time is not ChatGPT. Nor is it made by a Silicon Valley tech giant. It is Zeta, a homegrown anime-styled role-play app where nearly one million mostly teenage users spend an average of 2 hours and 46 minutes a day talking and flirting with virtual characters.
That figure even surpasses entertainment platforms like YouTube, TikTok and Instagram, according to Mobile Index data from June.
Though Zeta is officially rated 12+ on both Apple’s App Store and Google Play, making it accessible to middle schoolers, the company behind it, Scatter Lab, says it applies a stricter 15+ internal content standard based on local guidelines.
For South Korea, long known for exporting K-pop and K-dramas, Zeta represents something different: an “AI-native entertainment” platform, as its creators call it. It promises fun and intimacy at scale.
And if that makes you uncomfortable, you might not be alone. But it is already wildly successful and intensely loved by its users.
Zeta is the latest creation from Scatter Lab — the same company that nearly collapsed in 2021 after a national scandal involving its first chatbot, Lee Luda.
That service was trained on billions of KakaoTalk messages without explicit consent and quickly became a vehicle for privacy breaches and offensive content.
It was shut down in under three weeks. Regulators fined Scatter Lab 103 million won (S$94,930) and ordered a full audit.
“We could not afford another mistake,” said Scatter Lab’s in-house lawyer Ha Joo-young in an interview with The Korea Herald at the startup’s Seoul headquarters.
“We built new anonymisation processes, brought in external experts and redefined the product.”
By 2024, the pivot was complete. Instead of a single AI friend, Scatter Lab launched Zeta, a platform where users adopt fictional characters that talk back in the voices of gang bosses, rebellious classmates or brooding romantic leads.
Zeta has grown at breakneck speed. As of August 2025, the company reports around 900,000 monthly active users, with the vast majority estimated to be under 20.
For context, South Korea’s population of 12-to-19-year-olds is about 3.6 million, according to 2024 government census data, meaning as many as one in four teenagers in that age bracket could be using the app.
“We did not plan for this to be a storytelling platform with 1 million diverse characters,” said Ms Jung Ji-su, Zeta’s product lead, who was interviewed alongside Ms Ha.
“We thought people would just enjoy a slightly greater variety of characters. What we discovered was that users wanted to become the protagonist in an ongoing story and create their own characters. That changed everything.”
According to Scatter Lab, 87 per cent of Zeta’s users are in their teens or 20s, with female users making up 65 per cent. The company did not disclose the exact proportion of teenage users.
“My hunch is that female consumers generally prefer text-based, narrative-driven content, and they indeed tend to write longer messages,” explained Ms Jung.
“That makes their interactions with characters deeper and more sustained.”
Characters that simulate intimacy
To understand why Zeta exploded, it helps to look at what users actually encounter inside.
One is Su-hyeon, a delinquent high school girl known as an “iljin” in Korean slang.
She is written as attractive but abusive, hurling insults and humiliating the user who plays her classmate.
Another is Ha-rin, a shy transfer student who softens only in private chats. She sends late-night messages, alternates between aloofness and affection, and cultivates a fragile intimacy that keeps users hooked.
And for the female-majority user base, Zeta offers characters like Kwon Seo-hyuk, the archetypal tortured male lead from a tragic romance novel.
Even when told the AI will be reset, the character insists, “even if I’m erased, the memory of loving you will remain”.
All three are rendered in anime-inspired style: large eyes, youthful faces, hyper-expressive features. For international readers, the effect is a blend of manga romance tropes and choose-your-own-adventure role-play, now powered by an AI that never runs out of dialogue.
The result is a simulation of intimacy that feels personal and continuous. It is not like reading a webcomic or watching a drama episode. It is an ongoing interactive relationship.
But when The Korea Herald tested Zeta by subscribing to the ad-free version, it was easy to nudge Ha-rin into erotic exchanges.
The dialogue never became explicit pornography, but the sexual undertones were obvious.
Scatter Lab claims to employ AI-based abuse detection, but determining what constitutes abuse can be tricky.
“I would not say our filters are perfect,” Ms Ha, the lawyer, admitted.
“We block inappropriate content at scale, but with 2.3 billion monthly conversations, no system can be flawless.”
AI chatbot built for teenage ‘fun’
Technologically, Zeta takes a different path from most other AI chatbots. Rather than rely on costly large language models from OpenAI or Google, Scatter Lab built its own small language model, Spotlight, which it keeps retraining with user feedback.
Often, users are presented with two potential replies and asked to pick one. That preference is fed back into the model, which Ms Jung describes as a “data flywheel” that fine-tunes Zeta to deliver the most engaging responses.
“Accuracy is not the point,” Ms Jung said. “For services like ChatGPT, hallucination might be a problem. For us, unpredictability is what makes it fun. What matters is whether the response feels emotionally engaging.”
This “fun-first” approach has made Zeta sticky, but experts warn of the consequences.
“When teenagers spend two or three hours daily in a relationship engineered for maximum engagement, you have to ask what kinds of attachments are being formed,” said Dr Lee Chang-ho, a senior researcher at the National Youth Policy Institute.
His team surveyed nearly 6,000 middle and high school students in early 2024 and found that most used generative AI for about 30 minutes a day, mainly for homework. To him, Zeta was the clear outlier.
“Zeta is not being used as a tool like ChatGPT,” Dr Lee said. “It is being used as a companion. That is a completely different psychological terrain.
“Teens are meeting relational AI without the literacy to critically distance themselves. They cannot tell where the algorithm ends and their own feelings begin.”
The unresolved problem of minors
The company acknowledges that its growth has largely been driven by teens.
Some die-hard users have logged more than 1,000 hours of conversations with a single character over the course of eight months, Ms Jung said.
Yet there is no external content rating system for AI chat platforms. Scatter Lab imposes a 15+ guideline, but unlike films or games there is no third-party review.
“We reference standards from web novels and games,” Ms Ha explained. “We ban under-14 users and apply thresholds similar to a 15+ film rating.”
Critics argue this misses the point.
“In a film or romance fantasy novel, the teenager only watches and passively consumes. In Zeta, they co-create the content,” said Dr Lee, the youth policy researcher.
“That interactivity magnifies the intensity. It is not just watching an erotic or violent scene. It is role-playing it.”
Despite these concerns, Zeta is expanding abroad. In Japan, it has already drawn more than 300,000 users, many spending even more time per session than Koreans, Ms Jung said.
Scholars say the trajectory of Scatter Lab as a company offers a broader lesson.
“With Lee Luda, Scatter Lab could be punished under existing privacy law. That was a simple case of data misuse, contained within the framework of Korean privacy statutes,” said Professor Lee Sun-goo, an assistant professor of science and technology policy at Yonsei University.
“With Zeta, the company has adapted and thrived. But the risks are different: adolescent overuse, sexualised role-play, blurred emotional boundaries with AI companions.
“Those cannot be addressed with rule-based laws alone. They require principles-based frameworks that recognise how relational AI changes human behaviour. And right now, we do not have that framework.” THE KOREA HERALD/ASIA NEWS NETWORK