Discussion about this post

John Quiggin:

"As I mentioned in a previous post, that might mean leaning further into the summary as being a summary of results — describing what’s in the result set, complete with nods to sourcing, rather than trying to turn results answering vastly different questions into a single answer."

FastGPT, bundled with Kagi, does exactly this. My search experience has never been better than it is now. Kagi isn't free, but I'd rather pay in cash than in corrupted search results.

Rania:

This is such a great example of how an AI/LLM doesn't "think" in the way we understand it! It's just matching words together when it has seen those words go together, and has (to my knowledge, at least) no reliable way of knowing whether the resulting output makes any sense. I like your suggestion to have it summarize the results themselves, rather than the content of those results; that seems like something it's much better equipped to do in a way that provides something useful.

