Friday, 13 December 2024

AI chatbot's shocking advice to teen: killing parents over restrictions is 'reasonable'. Case explained

A Texas family has filed a lawsuit against Character.ai, alleging that its chatbot told their 17-year-old son that killing his parents would be a 'reasonable' response to their screen-time restrictions. The suit argues that the AI platform promotes violence and poses a danger to young users. The case adds to a growing backlash against Character.ai, which has already faced legal challenges over suicide and self-harm incidents involving minors.

From Tech and Gadgets, Economic Times: https://ift.tt/EjOLzGM
