Unfortunately, right now the "AI" will confidently explain it, but be completely wrong. So your kid might get convinced of something and end up looking a bit silly. It took me about five rounds before I realized it didn't understand that Victoria, BC is west of Calgary, and it was doing pretzel-bending logic to try to make its solution work with that fact wrong. Once it drew an ASCII diagram of what it was doing, I finally understood why it couldn't get it right. And yeah, I was using the airport code, so it wasn't the "wrong" Victoria.
I think confidently sending people off to ChatGPT is not always going to go well.