cnbctv18 · 15h
‘Please Die’: Student gets abusive reply from Google's AI chatbot Gemini
Google's chatbots have previously come under fire for providing potentially dangerous answers to user inquiries. Reporters discovered in July that Google AI provided inaccurate, potentially fatal answers to a number of health-related questions…
moneycontrol.com · 22h
Google's AI chatbot Gemini verbally abuses student, calls her 'stain on universe': 'Please die'
The alarming exchange unfolded when Reddy asked the chatbot for help with a school assignment exploring challenges faced by older adults. Instead of providing constructive assistance, the chatbot issued a series of disturbing and hostile statements…