• BlameThePeacock@lemmy.ca · 23 days ago

    Eliminating programmers will be possible when we figure out how to eliminate the engineers who design buildings.

    Only a true AGI will be able to do that, and while LLMs feel like a step towards AGI, they still lack the continuous-learning component an AGI would require. The way current systems are trained simply doesn’t allow them to accept and adopt new information continuously.

      • entwine@programming.dev · 22 days ago
        • Take a human and have him study every single repo on GitHub

        • Take an AI and train it on every single repo on GitHub

        Which of those two will continue to make novice mistakes like SQL injection and XSS vulnerabilities?

        These AI “coding agents” aren’t learning or thinking. They’re just natural language statistical search engines, and as such they’re easy to anthropomorphize. Future generations will laugh at us, kinda like how we laugh at old products that contained cocaine, asbestos, lead, uranium, etc.
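        To make the SQL injection point above concrete, here’s a minimal sketch (hypothetical `users` table, Python’s stdlib `sqlite3`) of the novice mistake versus the fix a developer learns once and stops repeating:

        ```python
        import sqlite3

        def find_user_unsafe(conn, name):
            # Novice mistake: string interpolation lets input become SQL.
            # A payload like "x' OR '1'='1" matches every row.
            return conn.execute(
                f"SELECT id FROM users WHERE name = '{name}'"
            ).fetchall()

        def find_user_safe(conn, name):
            # Parameterized query: the driver treats input as data, not SQL.
            return conn.execute(
                "SELECT id FROM users WHERE name = ?", (name,)
            ).fetchall()

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
        conn.executemany("INSERT INTO users VALUES (?, ?)",
                         [(1, "alice"), (2, "bob")])

        payload = "x' OR '1'='1"
        print(len(find_user_unsafe(conn, payload)))  # 2 — every row leaks
        print(len(find_user_safe(conn, payload)))    # 0 — payload is just a weird name
        ```

        A person who has studied why the first version fails doesn’t keep writing it; a statistical model trained on millions of repos containing both versions will keep emitting both.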

          • entwine@programming.dev · 22 days ago

            > But by definition they are learning and it is not conceptually different from how we learn.

            (citation needed)

            “Machine learning” is neither mechanically nor conceptually similar to how humans learn, unless you take a uselessly broad view and define learning as “thing goes in, thing comes out”. By that definition, a simple CRUD app learns too.