Claude has no self awareness

“Claude has no self-awareness. It doesn't know what it's thinking about; what it tells you it's doing is completely disconnected from what it's actually doing. I'd say that self-awareness is a precondition for consciousness, so this model is nowhere near conscious. The example also tells us that all the talk about emergent features in large language models is nonsense. Claude doesn't learn how to do maths, despite the fact that it has access to thousands of textbooks and algorithms. All it does is token predictions. Yes, it uses intermediate steps that you can interpret as internal reasoning, but it's still just token predictions. It hasn't developed an abstract math score or anything.”

(Sabine Hossenfelder)

Full video on YouTube: AGI never?

Lambert Heller ✖️ https://biblionik.org