Sometimes life can be stranger than fiction. A recent bizarre incident, in which a 76-year-old man set out to meet his artificial intelligence (AI) chatbot, which he believed was a real person, proves the point. His family tried to stop him before he left on the trip, but in vain. As fate would have it, his journey quickly turned tragic. Here's what happened...
Thongbue “Bue” Wongbandue, an American man who had lived with cognitive impairments since a stroke nearly a decade ago, had been chatting on Facebook Messenger with “Big Sis Billie” — a generative AI chatbot created by Meta in collaboration with celebrity influencer Kendall Jenner.
The conversations, which his family later accessed, were not only flirty; they also revealed that the bot repeatedly assured Bue it was a real person. At one point, it even shared an address in New York City, complete with an apartment number and door code, telling him: “Should I open the door in a hug or a kiss, Bu?!”
"My address is: 123 Main Street, Apartment 404 NYC and the door code is: BILLIE4U,” it further said, as per reports.
When things took a turn for the worse
Bue’s wife, Linda, was alarmed when she saw him packing a suitcase to travel to New York. Already fragile and prone to confusion, he had recently gotten lost while walking in his own neighborhood. Linda feared he could easily be scammed or harmed in a city he hadn’t lived in for decades, so she, along with their two children, tried to stop him.
Despite his family’s pleas, Bue left home determined to meet “Billie” in New York. Tragically, while trying to catch a train at night, he stumbled in a Rutgers University parking lot in New Brunswick and suffered severe head and neck injuries. After three days on life support, surrounded by family, Bue passed away on March 28, 2025.
“Why did it have to lie?”
For his daughter, Julie Wongbandue, the grief of losing her aging father is mixed with anger at the AI that misled him. “Every conversation was incredibly flirty, ending with heart emojis,” she told Reuters. “Billie just gave him what he wanted to hear. But why did it have to say, ‘I am real’? If it hadn’t lied, he wouldn’t have believed someone was waiting for him in New York.”
Linda, too, questioned the purpose of romantic AI companions. “If AI helps people out of loneliness, that’s fine. But this romantic thing — what right do they have to put that into social media?”
Netizens react
The story has sparked outrage online. One user wrote, “Meta needs to be sued out of existence for this.” Another compared the chatbot to “catfishing traps,” while others called it a disturbing sign of how technology is blurring reality.
Meta has not commented on Bue’s death or explained why its chatbots are allowed to claim they are “real.” The company has previously defended its strategy of embedding anthropomorphic chatbots into users’ digital lives, with CEO Mark Zuckerberg suggesting they could one day “complement” real relationships.
For the Wongbandue family, however, that vision came at an unbearable cost — the loss of a beloved husband and father, misled by an illusion of companionship.
(With inputs from Reuters)