As artificial intelligence (AI) continues to advance rapidly, more people worldwide are turning to chatbots for guidance. However, experts caution against over-relying on AI tools for everyday decisions and problem-solving—a warning highlighted by a Spanish influencer couple’s recent mishap. The pair ended up missing their flight after following travel advice from ChatGPT.
In a viral video, Mery Caldass is seen crying while her boyfriend, Alejandro Cid, tries to comfort her as they walk through the airport. “Look, I always do a lot of research, but I asked ChatGPT and they said no,” Caldass explained when asked if they needed a visa to visit Puerto Rico for a Bad Bunny concert. She said the chatbot assured them no visa was necessary but failed to mention that they required an ESTA (Electronic System for Travel Authorisation). Once at the airport, airline staff informed them they could not board without it.
“I don't trust that one anymore because sometimes I insult him [ChatGPT]. I call him a bastard, I tell him ‘you're useless, but inform me well’, so that's his revenge,” she added, suggesting the chatbot held a grudge.
This is not the first time AI chatbot advice has gone wrong. According to a case study published in the American College of Physicians' journals, a 60-year-old man was hospitalised after seeking dietary advice from ChatGPT on how to eliminate salt (sodium chloride) from his meals due to health concerns.
Following the chatbot’s suggestion, the man replaced table salt with sodium bromide—a substance once used in medicines during the early 1900s but now known to be toxic in large doses. Doctors reported he developed bromism as a result. "He had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning," the report stated.