

That’s not what an LLM is. That’s part of how it works, but it’s not the whole process.


LLMs are extensions of the techniques developed for autocomplete in phones. There’s a direct lineage.
That’s not true.


The quotes are because “AI” doesn’t exist. There are many programs and algorithms being used in a variety of ways. But none of them are “intelligent”.
And this is where you show your ignorance. You’re using the colloquial definition of intelligence and applying it incorrectly.
By definition, a worm has intelligence. The academic, or biological, definition of intelligence is the ability to make decisions based on a set of available information. It doesn’t mean that something is “smart”, which is how you’re using it.
“Artificial Intelligence” is a specific term we typically apply to an algorithm that’s been modelled after the real-world structure and behaviour of neurons and how they process signals. We take large amounts of data to train it, and it “learns” and “remembers” those specific things. Then, when we ask it to process new data, it can make an “intelligent” decision about what comes next. That’s how you use the word correctly.
Your ignorance didn’t make you right.
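
To make that concrete, here’s a minimal sketch of the idea in Python. It’s a single artificial neuron with made-up toy data, nothing like a production model: it gets “trained” on a few labelled examples and then makes a decision about inputs it was never shown.

```python
import math
import random

# A single artificial neuron: it sums weighted inputs and passes the total
# through a sigmoid "activation", loosely modelled on how a biological neuron
# fires (or doesn't) based on the signals it receives.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def neuron(weights, bias, inputs):
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(total)

# Made-up training data: pairs of numbers, labelled 1.0 when the first number
# is bigger than the second, 0.0 otherwise.
training_data = [
    ([0.9, 0.1], 1.0),
    ([0.7, 0.3], 1.0),
    ([0.6, 0.2], 1.0),
    ([0.2, 0.8], 0.0),
    ([0.1, 0.6], 0.0),
    ([0.3, 0.9], 0.0),
]

random.seed(0)
weights = [random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)]
bias = 0.0
learning_rate = 0.5

# "Training" is just nudging the weights, over and over, so the neuron's
# output gets closer to the labels on the examples it has seen.
for _ in range(5000):
    for inputs, target in training_data:
        output = neuron(weights, bias, inputs)
        # Gradient of the squared error, pushed back through the sigmoid.
        grad = (output - target) * output * (1.0 - output)
        weights = [w - learning_rate * grad * x for w, x in zip(weights, inputs)]
        bias -= learning_rate * grad

# The trained neuron now makes a decision about inputs it has never seen.
print(neuron(weights, bias, [0.8, 0.4]))    # should come out close to 1.0
print(neuron(weights, bias, [0.25, 0.75]))  # should come out close to 0.0
```

An LLM is the same basic idea scaled up to billions of these weighted connections and trained on text instead of toy number pairs.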


Does it run on something that’s modelled on a neural net? Then it’s AI by definition.
I think you’re confusing AI with “AGI”.
Oh, what’s that you’re using? It’s Linux? Sure, that’s fine, just make sure the age verification check works on it.
Wait, what do you mean you have “root access”? Why do you keep repeating “it’s my hardware and I own it”? You removed the age check system? You can do that? Hey, he’s not supposed to be able to do that!
Colorado proposes bill to ban open source operating systems
Speaking as a parent, and as a systems and web developer of both open source and proprietary software: this would single-handedly be one of the most damaging things to ever happen to the world of personal computing.
It’s a horribly bad opinion. It’s the same old problem as client-side anti-cheat: you can’t trust the hardware. If the user has full access to the computer, then they can do whatever they want with it. This is a core issue in security modelling. So what’s the answer? Try to lock down the system. This is why anti-cheat software for video games has more access to your computer’s hardware than you do as a user: full access to every single file, data in memory, webcams, things on screen, etc.
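
Here’s a minimal sketch of that trust problem in Python. The age-check scenario and the function names are invented for illustration; the only point is that any check running on hardware the user controls can be replaced by the user.

```python
# Hypothetical client-side age gate (names and logic invented for illustration).
# Imagine it scanning an ID, checking a face, whatever.
def client_side_age_check() -> bool:
    return False  # the user has not verified their age


# The user has root on their own machine, so they control the code that runs.
# Swapping in a replacement is all it takes to "pass" the check.
def patched_age_check() -> bool:
    return True


client_side_age_check = patched_age_check


def remote_service_grants_access(client_says_adult: bool) -> bool:
    # A remote service that trusts this flag is trusting a value produced
    # on hardware it does not control.
    return client_says_adult


print(remote_service_grants_access(client_side_age_check()))  # True: bypassed
```

The only way to make a check like that stick is to take control of the machine away from its owner, which is exactly the locked-down direction described below.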
What’s going to happen if it becomes mandated that age checks must happen in the OS? We’re going to get computers so locked down that you won’t be able to open a .txt file without some kind of authentication check.
No thanks. I’m happy to avoid every single age-check required service.