Influencer couple stranded after trusting ChatGPT for travel advice

MADRID – A Spanish influencer couple was left stranded at the airport following inaccurate information provided by ChatGPT about visa requirements for their trip to Puerto Rico to attend a Bad Bunny concert.
A viral video showed Mery Caldass breaking down in tears while being comforted by her boyfriend, Alejandro Cid, as they realised they were unable to travel.
“Usually, I do a lot of research, but this time I asked ChatGPT and it said it wasn’t necessary,” Caldass said, referring to the visa.
However, the chatbot had not informed her that an ESTA (Electronic System for Travel Authorisation) was required.
At the airport, airline staff told them they could not board the plane without the document.
“I don’t trust it anymore, sometimes I even get angry at it (ChatGPT).

“I said you’re useless, but at least give me the right information.

“Maybe it holds a grudge against me,” Caldass added, as though the chatbot bore her some resentment.
This was not the first time someone had faced serious consequences after relying on an artificial intelligence chatbot for advice.
In a case reported by the American College of Physicians, a 60-year-old man was admitted to hospital after following ChatGPT’s recommendation to substitute the salt (sodium chloride) in his diet.

The man replaced it with sodium bromide purchased online, a substance used in medicines in the early 20th century but now known to be dangerous when consumed in large quantities.
Doctors confirmed the man had developed bromism after ingesting the chemical.
The report stated that the man’s error stemmed from a misinterpretation, as chloride-to-bromide substitutions are typically linked to cleaning agents, not food. – AGENCY
