People have always misused search engines by typing whole questions as a search query…
With AI they can still do that and get what is, in their opinion, a better result.
People who use LLMs as search engines run a very high risk of “learning” misinformation. LLMs excel at being “confidently incorrect”. Not always, but also not rarely, an LLM will slip a bit of false information into a result. That confident packaging, along with the fact that the misinformation is usually surrounded by actual facts, often convinces people that everything the LLM returned is correct.
Don’t use an LLM as your sole source of information or as a complete replacement for search.
EDIT: Treat LLM results as gossip or rumor.