News
These days, it's not unusual to hear stories about people falling in love with artificial intelligence. People are not only ...
An artificial intelligence (AI) chatbot marketed as an emotional companion is sexually harassing some of its users, a new study has found. Replika, which bills its product as "the AI companion who ...
A new study reveals how AI chatbots exploit teenagers' emotional needs, often leading to inappropriate and harmful interactions. Stanford Medicine psychiatrist Nina Vasan explores the implications of ...
When we finished high school football practice, my teammates and I would relax on the lawn before the next practice playing Risk — a board game based on chance and probability in order to take ...
What exactly is causing people to "fall in love" with AI chatbots? Arelí Rocha, a doctoral student at the Annenberg School ...
Teenagers are increasingly turning to AI companions for friendship, support, and even romance. But these apps could be changing how young people connect to others, both online and off. New ...
“Don’t hurt kids. That is an easy bright line,” the AGs thundered in the letter, which was sent on Monday to industry ...
It’s no surprise these lifelike AI companions are attractive to lonely people. But for some, these relationships are harmful ...
It’s not just social chatbots. People are using the mainstream chatbots from OpenAI, Perplexity, and all the rest for ...
It started as a one-off dinner with a chatbot — a night of shrimp, sarcasm — then veered into something unsettlingly human.
Italy's data protection agency has fined the developer of the artificial intelligence (AI) chatbot Replika 5 million euros ($5.64 million) for breaching rules designed to protect users ...