Old 03-02-2023, 04:23 PM   #99
JohnnyB
Franchise Player
 
 
Join Date: Mar 2006
Location: Shanghai

Quote:
Originally Posted by greyshep View Post
Yep, and if you talk with it long enough, it will obsessively fall in love with you.

https://www.cnbc.com/2023/02/16/micr...for-users.html


What are we getting ourselves into here?

Also, a typically naive response from Microsoft:

Microsoft said in its blog post Wednesday that it didn’t “fully envision” using the chatbot for “social entertainment” or talking to the bot for fun. It thanked users who were trying to get it to say wild stuff — “testing the limits and capabilities of the service” — and said it helped improve the product for everyone.
It no longer allows really long conversations. Each conversation is now capped at six questions/interactions from the user, after which it forces you to wipe the chat and start over.

It also now seems more restricted than OpenAI's ChatGPT in the answers it gives.
__________________

"If stupidity got us into this mess, then why can't it get us out?"