Quote:
Originally Posted by White Out 403
I reconnected with an old friend and we talked about AI. He's super on board and told me all about his use of agents and how he's basically a programming superhero now with this tool, doing things he never thought possible. Later in the call, after we had some space from it, I asked how his business was going. "Not good."
I think there's a real, chasm-like gap between how "neat" AIs can be and how useful they actually are. So many people want to believe with all their heart that we're just a couple of years from AGI, when it seems like we're closer to LLMs getting worse than to achieving anything approaching AGI. We're already seeing expectations shift, and now people are reframing what AGI even is.
Understanding context isn't something you can program. There's a real squishy part of our meat brains that's wired to handle this stuff.
|
I couldn't agree more. "Neat" is exactly the word I would use to describe AI. It does a lot of cool stuff, but it never seems to do exactly what I want it to do. As a search engine, an LLM is pretty damn good. As for actually doing things properly all by itself? That seems a ways off to me. All the mind-blowing AI stuff I see feels like it involved a significant amount of human intervention, which is still cool, but not AGI cool.