AI hallucination is costing UK businesses customers every day
Right now, as you read this, someone in your area is asking ChatGPT a question about a product or service that you sell. ChatGPT is generating an answer. And there is a very real possibility that the answer contains completely wrong information about your business.
AI hallucination is the technical term for when artificial intelligence tools generate information that sounds confident, reads as authoritative, but is factually incorrect. When this happens to your business, the consequences are immediate and measurable. A potential customer asks "What time does [your business] close?" and ChatGPT tells them 6pm when you actually close at 9pm. They do not come. You never know why. A customer asks Perplexity "Does [your business] offer [specific service]?" and Perplexity says no when the answer is yes. You have just lost a sale to a competitor without ever knowing it happened.
The most dangerous thing about AI hallucination is that you cannot see it happening. There is no analytics dashboard for it. There is no notification. There is no way to know unless you actively test it. And most businesses never do.
AI tools present hallucinated information with the same confidence and authority as accurate information. The customer has no way to tell the difference. They trust the AI, they act on the wrong information, and your business pays the price.
As more people switch from Google to AI tools for their daily queries, the damage from hallucinated information multiplies. Every wrong answer is a potential customer lost. Every fabricated detail is trust destroyed before you even had a chance to earn it.
These are the kinds of errors we find in every single AI Hallucination Audit we conduct. They are not rare edge cases. They are happening to businesses across the United Kingdom right now.
A Birmingham restaurant found ChatGPT was telling customers they closed at 9pm when they actually close at 11pm. They estimated losing 15 to 20 evening covers per week before discovering the error.
A plumbing company in the West Midlands discovered Perplexity was telling users they did not offer emergency callouts — their most profitable service line. The AI had confused them with a different company.
A solicitor in Birmingham city centre found Google Gemini was directing potential clients to an old office address they had moved away from two years ago. Clients were arriving at an empty building.
A dental practice found ChatGPT was citing reviews that did not exist — including a negative review that had never been written by any real patient. The AI had fabricated the review entirely.
Every one of these errors was costing the business real money, real customers and real reputation damage. And in every case, the business owner had no idea it was happening until we showed them.
"The most expensive lie is the one you do not know is being told about you."
AI hallucination about your business is not random. It happens for specific, identifiable reasons — and understanding those reasons is the first step to fixing the problem.
Outdated training data. Large language models like ChatGPT are trained on massive datasets that have a cutoff date. If your business changed its hours, services, pricing or location after that cutoff, the AI is working with old information. Even models with internet access often rely on cached or indexed data that can be months or years out of date.
Conflicting signals across the web. If your opening hours say one thing on Google Business Profile, something different on your website, something else on Yelp and yet another thing on your Facebook page, AI tools have to guess which one is right. They often guess wrong. Information inconsistency across platforms is the single biggest cause of AI hallucination about local businesses.
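To see why conflicting listings confuse AI tools, it helps to check your own signals. The sketch below is illustrative only — the platform names and hours are placeholders, not real data — but it shows the basic idea: gather the hours you advertise on each platform and flag any disagreement, because every mismatch is a chance for an AI tool to pick the wrong answer.

```python
# Illustrative sketch: do your listed opening hours agree across platforms?
# Platform names and hours below are placeholders for your own listings.
listings = {
    "website": "Mon-Sat 9:00-21:00",
    "google_business_profile": "Mon-Sat 9:00-21:00",
    "yelp": "Mon-Sat 9:00-18:00",  # stale entry left over from old hours
    "facebook": "Mon-Sat 9:00-21:00",
}

def find_inconsistencies(listings):
    """Group platforms by the hours they advertise. More than one group
    means AI tools have conflicting signals to choose from."""
    groups = {}
    for platform, hours in listings.items():
        groups.setdefault(hours, []).append(platform)
    return groups

groups = find_inconsistencies(listings)
if len(groups) > 1:
    print("Conflicting hours found:")
    for hours, platforms in groups.items():
        print(f"  {hours!r} listed on: {', '.join(platforms)}")
```

The same check works for any fact an AI tool might repeat: address, phone number, service list. One source of truth, mirrored everywhere, leaves nothing to guess.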
Weak entity signals. If your business does not have a clear, well-defined presence as an entity across multiple trusted data sources, AI tools struggle to distinguish you from similarly named businesses. They may merge your information with a competitor, confuse you with a business in a different city, or simply make up details to fill gaps in their knowledge.
Missing structured data. Your website may contain all the correct information, but if it is not structured in a way that AI tools can easily parse — using schema markup, clear headings and machine-readable formats — the AI may misinterpret or ignore it entirely.
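Schema markup means publishing your key facts in the schema.org vocabulary so machines can read them unambiguously. Below is a minimal sketch that generates LocalBusiness structured data as JSON-LD; every business detail in it is a placeholder, and you would embed the printed output in your page inside a `<script type="application/ld+json">` tag.

```python
import json

# Minimal sketch of schema.org LocalBusiness structured data (JSON-LD).
# All business details here are placeholders -- substitute your own.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Restaurant",
    "telephone": "+44 121 000 0000",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Birmingham",
        "postalCode": "B1 1AA",
        "addressCountry": "GB",
    },
    # Machine-readable opening hours remove any ambiguity about closing time.
    "openingHoursSpecification": [{
        "@type": "OpeningHoursSpecification",
        "dayOfWeek": ["Monday", "Tuesday", "Wednesday",
                      "Thursday", "Friday", "Saturday"],
        "opens": "09:00",
        "closes": "23:00",
    }],
}

print(json.dumps(business, indent=2))
```

With hours, address and phone number stated in this machine-readable form, an AI tool no longer has to infer them from free text on the page.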
The good news is that every one of these causes is fixable. The bad news is that if you do not fix them, the hallucinations will continue and potentially get worse as AI tools become more widely used.
The AEO-REX AI Hallucination Audit is a comprehensive, manual investigation into exactly what AI tools are saying about your business. Here is what you get.
Find out exactly what ChatGPT, Perplexity and Gemini are telling your potential customers about you. You might not like what you find.
Check If AI Is Lying About Your Business