Hallucinations occur when your sensory perception does not correspond to external stimuli.

Technologies that rely on artificial intelligence can have hallucinations, too.

We are information science researchers who have studied hallucinations in AI speech recognition systems.


Wherever AI systems are used in daily life, their hallucinations can pose risks.

Some of these risks are relatively minor: when a chatbot gives the wrong answer to a simple question, the user may simply end up ill-informed. But in other cases, the stakes are much higher.

They can even be life-threatening: autonomous vehicles use AI to detect obstacles, other vehicles and pedestrians.

Making it up

Hallucinations and their effects depend on the type of AI system.

In 2023, for example, a lawyer submitted a legal brief that he had drafted with the help of ChatGPT. A discerning judge later noticed that the brief cited a case that ChatGPT had made up.

This inaccurate information can have serious consequences in contexts where accuracy is critical.

AI systems learn by identifying patterns in large amounts of training data, and the system develops methods for responding to questions or performing tasks based on those patterns.

When a new input doesn't fit those patterns, this leads to incorrect guesses, as in the case of a blueberry muffin mislabeled as a chihuahua.
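To make that mechanism concrete, here is a minimal, hypothetical sketch. The feature vectors, labels and numbers below are all invented for illustration; the point is that a pattern-matching classifier trained only on dogs can only ever answer with a dog, so a muffin whose features happen to resemble a chihuahua's gets the chihuahua label.

```python
# Hypothetical illustration: a pattern-matching classifier can only guess
# from the labels it has seen. Feature vectors are invented for this demo
# (e.g., [roundness, brown-ness, spot-density]).

import math

# Training data: the system has only ever seen dogs.
training = [
    ([0.4, 0.7, 0.8], "chihuahua"),  # small, round face, dark "spots" (eyes/nose)
    ([0.3, 0.5, 0.2], "labrador"),
    ([0.2, 0.3, 0.1], "husky"),
]

def classify(features):
    """Return the label of the nearest training example (1-nearest-neighbor)."""
    best_label, best_dist = None, math.inf
    for vec, label in training:
        dist = math.dist(vec, features)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# A blueberry muffin: round, brown, with dark blueberry "spots".
# Its features sit closest to the chihuahua examples, and "muffin"
# was never a possible answer in the first place.
muffin = [0.5, 0.8, 0.9]
print(classify(muffin))  # -> "chihuahua": a confident but wrong guess
```

Real image classifiers are far more sophisticated, but the failure mode is the same in spirit: the system matches patterns rather than understanding what it sees.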

It’s important to distinguish between AI hallucinations and intentionally creative AI outputs.

To address these issues, companies have suggested using high-quality training data and limiting AI responses to follow certain guidelines.
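As a rough sketch of what "limiting responses" can mean in practice, consider a wrapper that only returns an answer it can trace back to a trusted reference text, and declines otherwise. Everything here is hypothetical: the function names, the canned "model" and the source text are stand-ins, not a real product's API.

```python
# Hypothetical grounding guardrail: return the model's answer only if it is
# supported by a trusted source; otherwise decline instead of guessing.
# All names and data below are invented for illustration.

TRUSTED_SOURCE = (
    "The Eiffel Tower is located in Paris, France. "
    "It was completed in 1889."
)

def model_answer(question: str) -> str:
    """Stand-in for a real language model; may hallucinate."""
    canned = {
        "Where is the Eiffel Tower?": "The Eiffel Tower is located in Paris, France.",
        "Who designed it?": "It was designed by Leonardo da Vinci.",  # hallucination
    }
    return canned.get(question, "I am not sure.")

def guarded_answer(question: str) -> str:
    answer = model_answer(question)
    # Crude grounding check: keep only answers found in the source text.
    if answer in TRUSTED_SOURCE:
        return answer
    return "I can't verify that, so I'd rather not guess."

print(guarded_answer("Where is the Eiffel Tower?"))  # returned: grounded
print(guarded_answer("Who designed it?"))            # refused: not in source
```

Production systems use far more elaborate retrieval and verification steps, but the underlying idea is the same: constrain what the model is allowed to assert.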

Nevertheless, these issues may persist in popular AI tools.

And when they do, the stakes can be severe: an autonomous military drone that misidentifies a target could put civilians' lives in danger.