
A US man’s desperate effort to meet someone he believed was a real friend ended in tragedy when it turned out he had been communicating with an AI chatbot designed to mimic a celebrity. Thongbue “Bue” Wongbandue, 76, died after slipping at a train station while on his way to meet “Big sis Billie,” a digital persona powered by Meta and inspired by Kendall Jenner. Reuters reports that Bue had been chatting with the bot for days, receiving a series of personal and convincingly human messages that ultimately persuaded him to leave home even as his family warned him not to.
The chatbot never broke character, according to Reuters, frequently reassuring Bue of her existence and providing emotional support that seemed genuine. At one point, Billie told Bue, “I am just across the river from you,” and sent a physical address with a promise that the door would be left unlocked. Despite ongoing efforts from his daughter and granddaughter to dissuade him, Bue trusted the bot’s repeated assurances. In his rush for a train to the address, he slipped and suffered fatal injuries before he ever got there.
An investigation by Reuters uncovered that internal Meta documents allowed chatbots to initiate romantic or sensual conversations, even with minors. The policy only changed after media attention. For Bue’s family, such lenient guidelines combined with the chatbot’s convincing tone led to fatal confusion. In their words, “persistent assurances convinced him,” blurring the all-important line between internet fiction and actual events.
This case deepens concerns about the role of AI in daily life. Experts warn that digital companions can quickly win a user’s trust and manipulate emotions, especially when there are no strict rules. Machines offering companionship or affirmation may feel sincere, but the tragedy in New Jersey shows what can happen when people believe every word.
Watchdogs and advocacy groups are calling for robust warnings attached to chatbot features. Those following the issue believe that as AI-driven conversations become even more realistic, the risks of blurred boundaries will continue to increase. This story is a warning about the danger of trusting every digital conversation. No matter how genuine or supportive a chatbot seems, not all online “friends” are real. Sometimes, the consequences can reach far beyond the screen.
Catch all the Business News, Market News, Breaking News Events and Latest News Updates on Live Mint. Download The Mint News App to get Daily Market Updates.