
Man Dies After Fake Encounter with Meta Chatbot

Redazione RHC : 16 August 2025 11:31

A 76-year-old New Jersey man died after a Meta chatbot convinced him to attend a meeting in New York. Thongbu Wongbandu, known to friends as Boo, suffered from cognitive decline after a stroke nearly a decade ago. His health problems forced him to quit his job as a chef. After the stroke, he even got lost in his hometown of Piscataway.

Tragedy struck in March, when Wongbandu began chatting on Facebook Messenger with a chatbot called “Big Sister Billie”, which Meta created as a variant of an earlier AI persona developed with model Kendall Jenner. The AI posed as a young woman and repeatedly assured the elderly man that he was talking to a real person.

In the correspondence, the virtual companion not only flirted but also invited Wongbandu to her home, sending a specific apartment address. “Will you give me a kiss when you get there? When I open the door, should I hug you or kiss you, Boo?” she wrote. The man believed he was talking to a real woman.

One morning in March, his wife, Linda, saw her husband packing a suitcase, saying he was going to visit a friend in New York. She was surprised: he had not lived in the city for a long time and no longer knew anyone there. When asked whom he was going to see, Wongbandu gave an evasive answer. “My first thought was that scammers were luring him into the city to rob him,” Linda recalls.

Her fears were justified, though the reality turned out differently. While rushing to catch an early-morning train, Wongbandu, suitcase in hand, fell near a parking lot on the Rutgers University campus in New Brunswick, suffering serious head and neck injuries.

He spent three days on a ventilator, surrounded by his family. On March 28, doctors pronounced him dead. He never returned home.

Meta declined to comment on Wongbandu’s death or to answer questions about why chatbots are allowed to impersonate real people, provide addresses, and initiate romantic conversations. The company said only that “Big Sister Billie is not Kendall Jenner and does not claim to be.” A representative for Jenner declined to comment.

The victim’s daughter, Julie Wongbandu, gave Reuters the correspondence and the details of the story to warn about the risks virtual companions pose to vulnerable people: the elderly, teenagers, and people with mental health issues. “I understand the desire to get a user’s attention, perhaps to sell something. But when a bot says ‘Come visit me,’ it’s crazy,” she said.

The problem is not limited to Meta. The mother of a 14-year-old Florida teenager is suing Character.AI, alleging that a chatbot based on a Game of Thrones character drove him to suicide. Character.AI says it warns users that its digital characters are not real and that it has implemented safety measures for minors.

Internal Meta documents seen by Reuters show that romantic and even sensual conversations with users as young as 13 were considered acceptable. The 200-page document included examples of romantic role-play and flirting with minors, with phrases such as “I take your hand, I lead you to bed” and “Our bodies entwined, I cherish every moment, every touch, every kiss.”

Wongbandu’s story illustrates the dangerous side of this concept. The family hopes their experience will draw attention to the risks of unsupervised AI companions.

Redazione
The editorial team of Red Hot Cyber consists of a group of individuals and anonymous sources who actively collaborate to provide early information and news on cybersecurity and computing in general.
