In the latest episode of mortifying K-pop fan interactions, a fan at a recent live event allegedly showed a K-pop idol a deepfake video of herself dancing.
Hwang Yeji, a member of popular K-pop girl group ITZY, was at a “fan call event” when the incident allegedly took place.
A video posted March 10, which has been viewed over 11,000 times on Chinese social media platform Weibo, shows a screen recording of Hwang on a fan call as she is shown a deepfake video of a woman bearing her likeness.
The deepfake woman proceeded to dance. Hwang, appearing both delighted and confused, mimicked the dance moves in the video.
The deepfake video ended after about 15 seconds, and a woman appearing to be a fan of Hwang took over. “This video is made of your AI … yeah, it’s you,” the woman told Hwang.
“It’s so funny… it’s me! It’s very fun,” Hwang repeatedly responded in Korean.
Celebrity AI deepfakes
In the K-pop fanosphere, fan calls are scheduled events where fans jump on a video call and spend several precious minutes speaking with their favourite celebrity. Fan calls are typically done as part of a K-pop group’s promotional schedule during the launch of a new album.
Scoring a spot on a fan call is extremely rare. In most cases, fans have to find a lucky ticket, usually hidden inside physical K-pop CDs – sort of like winning a golden ticket to Willy Wonka’s chocolate factory.
K-pop fans spend up to thousands of dollars buying albums in bulk to increase their chances of winning a fan call ticket. In recent years, fan calls have evolved into bizarre experiences, with some fans making uncomfortable demands and asking idols awkward questions.
On X, K-pop fans criticised the alleged deepfaker, saying the video was “creepy”.
“I don’t want to put things in people’s heads, but it makes me question the horrible weird things people could be doing [with AI deepfakes],” wrote a fan.
Idol fan calls
The latest incident has reignited calls to ban fan call events once and for all.
“Fan calls give fans the opportunity to do whatever they want without consequences … I can’t imagine how many fans were weirdos on fan calls but never shared their calls with the world,” noted another K-pop fan.
Sarah Keith is a senior lecturer in Media and Music at Sydney’s Macquarie University. She specialises in Korean popular culture.
“Deepfake AI apps are widely available and are usually billed as entertainment or novelty products,” she told The Chainsaw. “The fan likely thought that they were sharing a harmless video.
“While the video is fairly innocent, Yeji’s image was still being used without her consent. Fans are protective of their favourite artists and wary of exploitation … [so] they perceived this as highly disrespectful to Yeji.”