Quote:
Originally Posted by greyshep
Yep and if you talk with it long enough it will obsessively fall in love with you.
https://www.cnbc.com/2023/02/16/micr...for-users.html
What are we getting ourselves into here?
Also a typically naive response from Microsoft...
Microsoft said in its blog post Wednesday that it didn’t “fully envision” using the chatbot for “social entertainment” or talking to the bot for fun. It thanked users who were trying to get it to say wild stuff — “testing the limits and capabilities of the service” — and said it helped improve the product for everyone.
It no longer allows really long conversations. Each conversation is now limited to six questions/interactions from the user, after which it forces you to wipe the chat and start over.
It also now seems more restricted than OpenAI's ChatGPT in the answers it will give.