• Furbag@lemmy.world · +18 · 8 hours ago

    Investors don’t care about games as art, they care about games as a vehicle for making money.

    If they are pushing for AI in games, it’s because they think it will make them money, not because they think it will be good for games.

  • ShaggySnacks@lemmy.myserv.one · +51 · 11 hours ago

    Yeah, I’d say that’s one of the reasons they don’t like it! Others include the use of artists’ work without consent, environmental issues, the quality of AI output, and the feeling that automating culture production can only result in what is now commonly called "AI slop".

    That sums up perfectly why people hate AI in culture. AI can be very useful in science, medicine, engineering, and similar professions, when it is built upon a very specific data set. There is no conscious reasoning behind why the AI did what it did when it makes art.

    Generative AI is just slop. It takes previous works and repackages them according to what the code says. When people make art, there are hundreds of micro-decisions that people make. Those micro-decisions are gone when AI makes it. Gabi Belle did a great video on why people hate AI art. https://youtu.be/QtZDkgzjmQI

    • INeedANewUserName@piefed.social · +24 / −2 · 10 hours ago

      AI is generally only considered useful in professions people aren’t actually familiar with. In other words, in its current form it isn’t useful to actual experts in anything.

      • webadict@lemmy.world · +24 · 9 hours ago

        “Generative AI is great at doing everything I suck at, but it’s completely terrible at the things I actually know!”

        Too many people think this and do not seem to understand that it is pretty shitty at everything. Well, except getting people to kill themselves, I guess. It’s pretty good at doing that.

        • webadict@lemmy.world · +4 · 8 hours ago

          Cue the serial killer telling me that I don’t know what I’m talking about and that they could get people to kill themselves so much better and easier.

        • Beth@piefed.social · +2 · 8 hours ago

          I was watching Ryan Hall and his little AI bot the other day. It occasionally goes off the rails… Weird how he keeps trying though. Sometimes it’s a bit entertaining, but if something I was using malfunctioned that much, I would not consider it a useful tool.

        • jj4211@lemmy.world · +2 · 8 hours ago

          The silver lining for the AI companies is that there are a lot of real humans getting real money who are also really shitty at what they are paid to do.

      • jj4211@lemmy.world · +3 · 8 hours ago

        Coincidentally, Hollywood is pretty good at portraying every profession except the one I know!

  • LostWanderer@fedia.io · +27 · 11 hours ago

    Good, those dirty fuckers don’t deserve accolades or rewards for peddling their lies about the capabilities of LLMs (which are limited, because these are just tools). It’s honestly better that creative endeavors like games development stay human-led, because LLM garbage is so flat and empty. Humanity might have tricked rocks into carrying out complex calculations and other operations using silicon and electricity… but we haven’t taught them to think or feel. Human beings with lived experiences should be the only people involved in the creative and technical aspects of games development.

    I hope they eventually take the L on peddling LLMs as AI and move on to normal grifts I can point and laugh at. ROFL

  • pixxelkick@lemmy.world · +2 / −23 · 10 hours ago

    keep AI out of games

    Good luck, it’s here to stay, get used to it lol.

    Anyone who thinks the average developer isn’t using AI heavily in their code is delusional; it’s been baked into every major IDE for like 2 years now.

    It’s in there, it’s permeated every layer of game dev, it works when you use it right, and the only time people care is when you make it obvious (i.e. including it in the final art of the game).

    But no one even blinks an eye at all the other layers AI is used in unless you announce it.

    You should just assume every game you play made after 2024 has chunks of it that are AI-generated. The plot, writing, code… it’s in there, and you prolly haven’t even noticed.

    • november@piefed.blahaj.zone · +9 · 8 hours ago

      Good luck, it’s here to stay, get used to it lol.

      So are we. Get used to it.

      You should just assume every game you play made after 2024 has chunks of it that are AI-generated. The plot, writing, code… it’s in there, and you prolly haven’t even noticed.

      Oh, we’ve noticed that AAA game quality is shittier than ever, trust me.

    • jj4211@lemmy.world · +2 · 8 hours ago

      While people may be opposed even in theory to tamer things like a little code completion, there’s plenty of room to very obviously notice GenAI slop.

      If people use an LLM to generate text, they tend to make too much of it, and it shows in how off-putting it is. An LLM may be able to generate a modest amount of text without being noticed, but people will put in a two-liner and get pages of garbage back and use that.

      And of course, GenAI textures are famously off-putting. Maybe you can have a ‘generic metal texture’ and no one will notice, but try for specific details and it generally gets caught.

      It is possible that human output that is similarly crappy gets mistaken for GenAI output, but oh well, slop is slop either way. It’s just that GenAI extends the slop to unbelievable magnitude.

      • Leon@pawb.social · +1 · 6 hours ago

        While people may be opposed even in theory to tamer things like a little code completion, there’s plenty of room to very obviously notice GenAI slop.

        I mean, there’s the regular “can you really sell code you don’t own” kind of thing going for it. The companies have stolen all sorts of data: voices, music, raster, vector, video, books, film. It’d be shocking if they hadn’t also scraped all the code that’s out there on the web.

        Some of that is perfectly fine to alter and sell. A lot of it isn’t. There are plenty of FOSS licenses that are restrictive in the sense that you’re free to use and change the code, but you can’t alter its license, and in many cases you can’t sell it.

        So when an LLM produces code based on that, what applies?

        Then there’s the obviously broader problem of ex-developers turned vibe coders coming out of the woodwork talking about how they can’t code anymore. I’ve heard people at my company joking about this, and the notion scares me. The idea that they’ve outsourced their thinking and problem-solving skills to the point that they’re now incapable of it is terrifying.

        I don’t know why anyone would willingly do that.

        • jj4211@lemmy.world · +1 · 3 hours ago

          Well, unless you declare AI consumption fair use, only public domain is fair game, since every single license requires at least attribution. The courts regrettably seem to be buying the line that they are merely “learning” like a human and therefore exempt from the rules. All this ignoring that if a human reproduces something they “learned” close enough, they are on the hook for infringement, and in the AI scenario the codegen user has no sane way to know if the output is substantive and close enough to training material to count, since the origins are so muddled.

          I just don’t understand the “real” developer to vibe coding transition. Like, it really sucks, even Opus 4.6, at being completely off the leash. I don’t understand how anyone can take what it yields as-is if they ever knew how to specifically get what they want. I know people who might be considered “coding adjacent” who are enthusiastic at seeing a utility brought to life, though usually what they get is not quite what they wanted, and they get frustrated when it doesn’t work right and no amount of prompting seems to get the thing to fix it. They were long intimidated by “coding”, but an LLM is approachable. Many of these folks have “scripted” far more convoluted stuff than many “coders”, yet they are intimidated by coding.

  • brucethemoose@lemmy.world · +15 / −27 · edited · 11 hours ago

    I mean, AI in games can be neat.

    As a specific example, consider Rimworld mods that generate conversations for characters, flesh out bios, and make portraits based on their in-game traits. For free, on lightweight community finetunes that run on your PC.

    …I like that. I like how it’s tightly integrated and a good fit, yet also “optional flavor,” not the foundation of a game.

    What no one wants is AI Bro bullshit like:

    …A group discussion about how the games industry can “capitalize on shifting trends in customer engagement.”

    • verdigris@lemmy.ml · +47 / −1 · 11 hours ago

      No thanks, I don’t want all of the descriptions and dialogues to be low-quality semi-plagiarized nonsense blabber just to fill space. I don’t want modders spending their precious time massaging slop to be fairly relevant when they could just make bespoke content instead.

      If a part of a game isn’t worthy of human attention, let it be boring or non-existent or an afterthought.

      • leoj@piefed.zip · +16 / −1 · 11 hours ago

        Right?

        AI companies stole the collective knowledge, creative juices, and artistic endeavors of the internet, which was shared freely to expand human knowledge and artistry.

        Now they wonder why we aren’t willing to pay for their repackaged and pillaged slop…

        I see a world where knowledge slowly gets hidden behind the choke hold of AI answers and paywalled sources. How do we hold the line?

      • brucethemoose@lemmy.world · +7 / −5 · 11 hours ago

        No thanks, I don’t want all of the descriptions and dialogues to be low-quality semi-plagiarized nonsense blabber just to fill space. I don’t want modders spending their precious time massaging slop to be fairly relevant when they could just make bespoke content instead.

        That’s the thing. It can’t be bespoke content, unless it’s a quest mod. Rimworld situations are so dynamic that they rarely fit the “mould” of something written ahead of time. Hence the placeholder dialogue you often see in base Rimworld is already autogenerated “nonsense blabber just to fill space”.

        That… and have you ever used small LLMs finetuned for writing? While not perfect, it’s nothing like the slop you’d get out of, say, an OpenAI model. The finetuning datasets are open, and some of the base model datasets are open, too.

        • jj4211@lemmy.world · +3 · 8 hours ago

          My experience with the ‘look how amazing it is at writing’ is being exceedingly bored by the prattling on without substance.

          Like sure the style and structure can be less obviously bad, but it is still ultimately senselessly padding out a short prompt into a mountain of words that say no more than what the short prompt conveyed in the first place.

          If I want to dwell on some imagery, I can and have set down a book and just contemplated what I read and let it fill my mind. I don’t need a ton of words to force me to linger.

          • brucethemoose@lemmy.world · +1 · 8 hours ago

            It wouldn’t be long monologues. It’s short bits of conversation, or maybe 1 sentence descriptions.

            Again, throw everything you think you know about chat models out of your head. Throw everything related to multi-turn conversation and prompt engineering out.

            The prompt would look like a mess of programming variables: Rimworld skill levels and passions, traits, injuries, clothes and their state, logs of events, maybe a plot of entities around them. It would condense a bunch of information down (to, say, a reasonable quip of dialogue this character would say), which is what text modeling was supposed to do before these stupid chatbots came in and spammed everything up.
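            The condensing step described above can be sketched roughly like this (every field name and the prompt wording are invented for illustration; this is not any actual mod’s code):

```python
# Rough sketch: flatten a character's game state into a one-shot prompt
# for a local text model. All field names here are invented.
character = {
    "name": "Dima",
    "skills": {"shooting": 7, "cooking": 2},
    "passions": ["art"],
    "traits": ["pessimist", "night owl"],
    "injuries": ["scratched leg"],
    "recent_events": ["raid repelled", "meal was awful"],
}

def build_prompt(state):
    # One "key: value" line per variable, with the instruction on top.
    lines = [f"{key}: {value}" for key, value in state.items()]
    return (
        "Write one short line of dialogue this character would say.\n"
        + "\n".join(lines)
    )

print(build_prompt(character))
```

            The point is that the model’s input is condensed game state rather than a multi-turn chat, and the output is one short line, not pages of prose.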


            I get the sentiment that, sometimes, imagination is better. I like to read, or write out stories stuck in my head.

            …But sometimes I’d rather play a game.

            • jj4211@lemmy.world · +1 · 3 hours ago

              I just meant, in response to the “models for writing” point, that they are verbosity engines, and even in a reading scenario where one might want to pointlessly dwell on something, we don’t need a wall of text to do so.

              But sure, you can have short dialogue for background characters, though I’m not sure I would care about the flavor text being ever so slightly bespoke versus the usual short throwaway lines. By the time you’ve fed all those stats, factoids, and events into the model, it feels like you’ve already done way more work than writing a couple of lines of throwaway background dialogue.

        • Hackworth@piefed.ca · +3 · 11 hours ago

          I could see that becoming the standard. Games ship with a switch in Settings that turns the LLM features on or off, with a field to either enter your API key or point it at a local model.
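          One way such a settings screen could look under the hood, sketched as a hypothetical config fragment (every key name here is invented, not from any real game):

```toml
# Hypothetical game settings fragment; all key names are invented.
[llm]
enabled  = false                        # LLM features off by default
backend  = "local"                      # "local" or "api"
endpoint = "http://localhost:8080/v1"   # where a local model server listens
api_key  = ""                           # only consulted when backend = "api"
```

          Keeping it a plain on/off switch also means players who want nothing to do with LLM content can opt out entirely.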

          • brucethemoose@lemmy.world · +2 / −5 · 10 hours ago

            I mean, a paid API key shouldn’t be the default. It shouldn’t even be an option, if you ask me. It should default to a community “horde” of folks playing the game, and prompt you to host an LLM and/or generate some responses for other users if you wish to.

            Kinda like the Fediverse. Or the AI Horde, but for a specific game: https://aihorde.net/

            I really don’t want one more drop of traffic redirected to OpenAI. They’re like a cancer in the machine learning community.

    • James R Kirk@startrek.website · +10 / −2 · 11 hours ago

      I think it could be cool for background NPC dialogue in big open RPGs like Skyrim. Imagine if townsfolk could have realistic conversations and interactions like bartering over goods, etc. Nothing major or plot-dependent, obviously. Just something more natural than a handful of repeated, scripted and prerecorded phrases.

      I would compare it to ray tracing. Ray tracing means the artists don’t have to plan out every single light beam, and the result is actually more realistic than if they had. A tactfully used LLM could mean they don’t have to script out every line of background dialogue and also achieve a more realistic result.

      • jj4211@lemmy.world · +2 · 8 hours ago

        I think the critical thing would be to identify “background content” so that you don’t spend forever trying to tease actionable info out of a background character.

        That’s the biggest thing: while an LLM can do “flavor text”, it’s not very good at making sure that characters reliably convey specific, relevant details to the player.

        I don’t know about “more realistic” though; LLM game demos can often go pretty far out of character. Like a medieval-setting NPC discussing coding. Or in one demo, the character talked about how they had just come in from a walk outside, but they were chained in a dungeon cell. Another character talked about how the developer wrote them this way. Keeping an LLM “on the rails” of a scenario can break down.

      • brucethemoose@lemmy.world · +6 / −1 · 11 hours ago

        Imagine if townsfolk could have realistic conversations and interactions like bartering over goods, etc. Nothing major or plot-dependent, obviously. Just something more natural than a handful of repeated, scripted and prerecorded phrases.

        There’s already an in-development Skyrim mod for that. Many sandbox games have mods for exactly this.

        I haven’t tried the Skyrim one though; I haven’t been in the Skyrim scene for a while. And TBH, some of the mods use pretty sad or sloppy LLMs by default.