• 0 Posts
  • 2 Comments
Joined 6 months ago
Cake day: September 9th, 2025

  • We don’t necessarily need to know how animal brains work to achieve AGI, and it doesn’t necessarily have to work anything like animal brains.

    100% agree. I was definitely thinking inside the box (inside the brain, really) when I went down that path.

    I think a better way to explain my thinking is that LLMs cannot operate like a human brain because they fundamentally lack almost all of its qualities. Like humans, they are good but not perfect at logic, but they completely lack creativity, intuition, imagination, emotion, and common sense: the very qualities that would make something AGI.

    Without humans understanding how our brains produce those qualities, it will be very hard to achieve AGI. But again, it was very wrong of me to think we need to translate our brains into code to achieve AGI.


  • From reading all the comments in the community, it’s amazing (yet not surprising) how many managers have fallen for the marketing around these LLMs. LLMs have gotten people from all levels of society to accept the marketing without ever considering the actual results for their own use cases. It’s almost as if the sycophantic nature of LLMs has completely blinded people to rational judgment, just because the product is shiny and spoke to them in a way no one has in years.

    On a surface level, LLMs are cool, no doubt, and they do have some uses. But past that, everyone needs to accept their limitations. LLMs by nature cannot operate the same way a human brain does. AGI is such a long shot because of this, and marketing LLMs as AGI is a scam. How can we attempt to recreate the human brain as AGI when we are not close to mapping out how our own brains work in a way that could be translated into code, let alone the simpler brains in the animal kingdom?