Quote:
Originally Posted by Shazam
For fata's sake, AI all boils down to statistical analysis, so if you think that's how the human brain works or if you think that is sentience, well, no, it's not.
Machine learning is still only somewhat useful. It still fails on many, many use cases. GIGO still matters.
There are lots of cloud AIs. I have a use case that would save me hundreds of thousands of dollars, and they all suck at my needs. I can even have them digest my entire dataset and they still suck.
What is your account of sentience and how does it emerge from activities of the nervous system, either in humans or other living things? What are the necessary and sufficient conditions for the existence of sentience? What is a test for sentience or the absence of sentience that we can apply to non-humans?
As I said, the performance of the AI holds a mirror up to ourselves: we understand the operation of brains like ours, and we attribute sentience to things that are just like us. Language is core to how we think, and assessing sentience by inspecting the mechanics of a physical system doesn't seem to work. So the fact that LaMDA's responses are so compelling is valuable precisely because it forces us to question ourselves and our beliefs about what sentience is.