18 May 2025

Do AI Bots Dream of Electric Trucks? - part the second

Well... maybe some are coming around to the solution to hallucination: check against ground facts. Zeynep Tufekci recounts the shambles of The Musk Ox's AI. And a funny read it is.
Large language models, the kind of generative A.I. that forms the basis of Grok, ChatGPT, Gemini and other chatbots, are not traditional computer programs that simply follow our instructions. They're statistical models trained on huge amounts of data.
[my emphasis]
OK, so there's the first point: none of this shit is deterministic. We all know this, but few will admit it. One small step for mankind... and all that.
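To see why "statistical model" means "not deterministic" in practice, here's a minimal sketch of how chatbots pick their next word: temperature-scaled sampling over token probabilities. The logits and the tiny vocabulary are made up for illustration; real models do this over tens of thousands of tokens, but the principle is the same. Only the temperature-zero (greedy) case is deterministic, and production chatbots generally don't run that way.

```python
import math
import random

def sample_token(logits, temperature, rng):
    """Sample a token index from logits via temperature-scaled softmax.

    temperature == 0 means greedy (argmax) decoding -- the only
    deterministic case. Anything above zero is a weighted coin flip.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw from the cumulative distribution.
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(logits) - 1

# Hypothetical next-token logits over a three-word vocabulary.
logits = [2.0, 1.5, 0.5]
rng = random.Random(42)

greedy = {sample_token(logits, 0, rng) for _ in range(100)}
sampled = {sample_token(logits, 1.0, rng) for _ in range(100)}
print(greedy)   # one token, every single time
print(sampled)  # several distinct tokens across runs
```

Same model, same input, different answers: that's the machinery underneath every chatbot reply.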

She then opines
Companies have developed various methods to try to rein them in, including relying on "system prompts," a kind of last layer of instructions given to a model after it's already been developed. These are meant to keep the chatbots from, say, teaching people how to make meth or spewing ugly, hateful speech. But researchers consistently find that these safeguards are imperfect. If you ask the right way, you can get many chatbots to teach you how to make meth. L.L.M.s don't always just do what they're told.
She further opines on methods to deter hallucination
But it's not that straightforward, and therein lies perhaps the most dangerous, thorny truth about L.L.M.s. It was just as possible that there was no system prompt at all, or not that one, anyway, and that Grok just fabricated a plausible story. Because that's exactly what L.L.M.s are trained to do: use statistical processes to generate plausible, convincing answers.
So, in all, we can speculate that some AI purveyors have taken steps to curtail computer-generated disinformation. We can, just as clearly, speculate that other purveyors of AI seek to generate it. I'll leave it to you, gentle reader, to decide where The Musk Ox's Grok fits in. And we know, if one takes these missives seriously, that the cure for hallucination and disinformation is to merge the RDBMS with correlation processing. Building the ground fact database is likely a bit more work than just letting massive amounts of storage and compute grind through the entirety of the innterTubes (of course, we know by now that The Musk Ox is a chemically driven hallucinating meat sack). But that is the job.
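What would merging the RDBMS with the statistical machinery even look like? Here's a minimal sketch, under assumptions of my own: a hypothetical subject/predicate/object fact table in SQLite, and a `check_claim` function that grades a model's assertion against it rather than letting the model improvise. The schema, the sample facts, and the function name are all illustrative, not anyone's actual system.

```python
import sqlite3

# A toy ground-fact store: subject/predicate/object triples in SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE facts (subject TEXT, predicate TEXT, object TEXT)")
conn.executemany(
    "INSERT INTO facts VALUES (?, ?, ?)",
    [
        ("water", "boils_at_sea_level_c", "100"),
        ("earth", "orbits", "sun"),
    ],
)

def check_claim(subject, predicate, claimed_object):
    """Grade a model's claim against the fact table.

    Returns 'confirmed', 'contradicted', or 'unknown'. The crucial
    case is 'unknown': no ground fact means flag it, not fake it.
    """
    row = conn.execute(
        "SELECT object FROM facts WHERE subject = ? AND predicate = ?",
        (subject, predicate),
    ).fetchone()
    if row is None:
        return "unknown"
    return "confirmed" if row[0] == claimed_object else "contradicted"

print(check_claim("earth", "orbits", "sun"))         # confirmed
print(check_claim("earth", "orbits", "moon"))        # contradicted
print(check_claim("grok", "invented_by", "nobody"))  # unknown
```

The hard part, as said above, isn't the lookup; it's developing and curating the fact table in the first place. That is the job.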
