• A Sharky Anthro@fedia.io
    19 days ago

    By design, LLM makers did indeed want to produce this effect in other people: get them hooked on LLM usage and convert them into paying customers, since addicted people will pay to get a fix from their complaisant hallucination engine. Thankfully, I was pretty unimpressed by my brief experimentation with LLMs. I knew going in that they weren't reliable, and the amount of lies passed off as truth in the few queries I submitted confirmed it. Checking each answer carefully and finding them all lacking is what kept me safe.