Hallucination Detection
What Are Hallucinations?
In the context of AI visibility, a hallucination is any factually incorrect claim an AI platform makes about your brand. This includes wrong pricing, outdated feature descriptions, confusion with a competitor, or entirely fabricated information. Hallucinations are one of the biggest risks of AI search — potential customers are getting wrong answers about your business, and you might not even know it.
How Detection Works
During every audit, 99Visibility compares AI platform responses against the brand data you've provided in your account settings. This includes:
- Your brand name and correct spelling
- Your product or service description
- Your pricing (plans, tiers, amounts)
- Your key features and capabilities
- Your team or leadership information (if provided)
When an AI response contains a claim that contradicts your verified data, it's flagged as a hallucination. The more complete your brand profile is, the more accurately 99Visibility can detect issues.
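The comparison described above can be sketched as a simple check of an AI-claimed fact against your verified brand data. This is a minimal illustration only: the profile structure, the `check_pricing_claim` function, and the matching logic are assumptions for the example, not 99Visibility's actual implementation.

```python
# Minimal sketch: flag an AI pricing claim that contradicts verified brand data.
# The profile structure and matching logic are illustrative assumptions.

verified_profile = {
    "brand_name": "Acme Analytics",          # placeholder brand
    "pricing": {"Starter": 49, "Pro": 99},   # USD per month
    "features": {"dashboards", "alerts", "api access"},
}

def check_pricing_claim(plan: str, claimed_price: int, profile: dict) -> str:
    """Return a severity label for a pricing claim made by an AI platform."""
    actual = profile["pricing"].get(plan)
    if actual is None:
        return "info"      # plan not in the profile; worth monitoring
    if claimed_price != actual:
        return "critical"  # wrong pricing can directly cost customers
    return "ok"

# An AI response claims the Starter plan costs $29/mo when it is really $49/mo:
print(check_pricing_claim("Starter", 29, verified_profile))  # critical
print(check_pricing_claim("Pro", 99, verified_profile))      # ok
```

The sketch also shows why a complete profile matters: a claim about a plan that is missing from the profile can only be surfaced as an observation, not verified.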
Severity Levels
Critical
Wrong information that could directly cost you customers or damage your reputation. Examples:
- Incorrect pricing (e.g., AI says $29/mo when you charge $49/mo)
- Competitor confusion (e.g., AI attributes a competitor's feature to your brand)
- Fabricated claims (e.g., AI says you offer a feature you don't have)
- Factually wrong descriptions of your core product
Critical alerts appear prominently on your Dashboard and trigger an email notification.
Warning
Outdated or partially incorrect information that should be corrected but isn't causing immediate harm. Examples:
- Outdated feature descriptions (e.g., references a feature you've renamed or retired)
- Stale pricing from a previous pricing structure
- Slightly inaccurate company description
Info
Minor issues or observations worth monitoring but not urgent. Examples:
- AI describes your brand in overly generic terms
- Minor wording differences that don't affect accuracy
- Missing context (AI is technically correct but incomplete)
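As a rough mental model, the three levels can be treated as a mapping from issue type to urgency. The categories and their assignments below are illustrative assumptions, not 99Visibility's exact classification rules.

```python
# Illustrative mapping from detected issue type to severity level.
# Both the category names and the assignments are assumptions for this sketch.
SEVERITY = {
    "wrong_pricing": "critical",
    "competitor_confusion": "critical",
    "fabricated_claim": "critical",
    "outdated_feature": "warning",
    "stale_pricing": "warning",
    "generic_description": "info",
    "missing_context": "info",
}

def severity_of(issue_type: str) -> str:
    # Default unknown issue types to "info" so nothing is silently dropped.
    return SEVERITY.get(issue_type, "info")

print(severity_of("wrong_pricing"))  # critical
```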
Common Hallucination Types
Wrong Pricing
The most common hallucination. AI platforms frequently report outdated or incorrect pricing, especially if your pricing has changed since the model's training data was collected. This is particularly dangerous because potential customers may make purchasing decisions based on wrong prices.
Outdated Features
AI describes features you've discontinued, renamed, or significantly changed. This usually happens when your product has evolved but AI training data hasn't caught up.
Competitor Confusion
AI mixes up your brand with a competitor — attributing their features to you or vice versa. This is common in crowded markets where multiple brands solve similar problems.
Fabricated Claims
AI confidently states something about your brand that has no basis in reality. This is the classic "hallucination" — the model generates plausible-sounding but entirely false information.
Fixing Hallucinations
When 99Visibility detects a hallucination, it generates an Accuracy Fix recommendation. General strategies include:
- Add structured data. Use Product schema with correct pricing and Organization schema with an accurate description. Structured data is the most direct way to give AI platforms correct information.
- Update your content. Ensure your website clearly states the correct information in easily extractable formats (headings, lists, tables).
- Publish authoritative content. Press releases, updated "About" pages, and FAQ pages with definitive answers help override incorrect training data.
- Create comparison pages. If AI confuses you with a competitor, a well-structured comparison page clarifies the differences.
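The structured-data strategy can be illustrated with a JSON-LD Product snippet, generated here in Python for clarity. The brand name, description, price, and URL are placeholder values; swap in your own verified details.

```python
import json

# Illustrative schema.org Product markup with correct pricing.
# All values below are placeholders, not real brand data.
product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Acme Analytics Pro",
    "description": "Analytics platform with dashboards, alerts, and API access.",
    "offers": {
        "@type": "Offer",
        "price": "49.00",
        "priceCurrency": "USD",
        "url": "https://example.com/pricing",
    },
}

# Embed the output on your site inside a <script type="application/ld+json"> tag.
print(json.dumps(product_schema, indent=2))
```

Keeping this markup in sync with your live pricing page gives AI platforms a machine-readable source of truth to cite instead of stale training data.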
After implementing fixes, run a new audit to check whether the hallucinations persist. Some corrections take effect within days (especially on Perplexity and Google AI), while others may take longer on platforms that rely on training data (ChatGPT, Claude).
For more on how AI platforms decide what to say, see How LLMs Choose What to Cite.