Discussion about this post

Timothy Burke

"Collapse" might be a strong word but there are two aspects of hallucination chains that really do seem to portend that possibility. The one is that the kind of chain you described in the post is essentially fully automated--those chains can form without any human being involved in the process. (Even the "internet bait" query that kicked it off is substantially fueled by bots on Reddit at the moment.) Generative AI trainers can make some interventions if and when that looping starts to get out of control, but the scale of this behavior is so potentially huge and rapid that I could see it being almost impossible to interrupt if it's left alone too long.

I think the more pressing problem is that the info-slop that hallucination chains could generate at large scales is also a serious disincentive to the human beings who create higher-quality information in various forms, especially in repositories like Wikipedia that generative AI RAG systems and deep-research lookups depend upon. If human information and knowledge producers simply pull out of those kinds of spaces and retreat into much more siloed and firewalled archives, where they're paid for the value of what they contribute and where there's some protection from slop, that could lead to a really rapid collapse of the kinds of information that wider publics presently expect to find online.
