  • Author: Kees van Deemter

“Lying” in computer-generated texts: hallucinations and omissions

There is huge excitement about ChatGPT and other large generative language models, which produce fluent, human-like texts in English and other human languages. But these models have two big drawbacks: their texts can be factually incorrect (hallucination), and they can leave out key information (omission).
