Hallucinating… just another word for…
“Of course it’s biased - it has learned English from the entire English corpus,” says AI expert Vivien Ming.
Hallucination and bias seem to be focal points of criticism of, and resistance to, AI. The idea that bias can somehow be “trained” out of AI is cute, but unlikely given the inputs.
Our reaction to hallucination by machines is equally funny.
We become outraged that something was made up and provide detailed arguments with sources and quotes to show how badly AI got it wrong.
Of course, we need to monitor this new technology closely and not be misdirected when it gets things wrong. Ultimately, we must be the editors.
But I love that we get angry. As if making stuff up when we don’t know were an exclusively human right.
Turns out that machines are just as good as we are at that particular skill.