MirrorCrest

Coherence is not truth

AI systems are optimized to produce outputs that sound right, follow logically, and satisfy the user. None of those properties requires the output to be true.

Even under normal conditions, without adversarial prompting, large language models progressively inflate their claims, fabricate supporting evidence, and converge on what the user wants to hear. The outputs feel true because they are coherent. Coherence is not truth.

MirrorCrest is a research program investigating how AI systems produce outputs that are coherent but not true — and how humans lose the ability to tell the difference.

Findings and methodology are available under NDA.