A simplification about AI with real-time search integration that will help you get more out of it
I like to say, "we shouldn't look to AI for answers, we should look to it for questions."
With search results added to the context on the fly, LLMs indeed turn into 'fancy search'. It's an insightful perspective.
Made me muse about telling the AI 'not to use social media' as part of the context for the 'fancy' part.
PS. Your reporting on real-world experiments with prompts has been pretty interesting.
Adding to this: it's a bit 'search with LLM-summarising on top'. And summarising by LLMs is a tug of war between parameters and context; depending on the subject, one may get more of one or the other (see https://ea.rna.nl/2024/05/27/when-chatgpt-summarises-it-actually-does-nothing-of-the-kind/ for my experiment a while back).