Study uncovers emotional awareness in ChatGPT

Researchers have made an intriguing discovery about ChatGPT, the AI chatbot developed by OpenAI: in a recent study, it demonstrated greater emotional awareness than the general population. The team, led by Zohar Elyoseph, set out to assess how well the AI program identifies and describes emotions, and found that ChatGPT outperformed population norms on the Levels of Emotional Awareness Scale (LEAS). The chatbot's ability to mimic human speech has already led many people to use it as a virtual companion and informal therapist.

The study, conducted by Zohar Elyoseph, Dorit Hadar-Shoval, Kfir Asraf, and Maya Lvovsky, is available on PubMed Central. The researchers ran their experiments with the free version of ChatGPT on specific dates in January and February. ChatGPT does not feel or report emotions itself; the study instead measured how accurately it can identify and describe the emotions people would likely feel in a given situation. To do so, the experts presented the model with scenarios from the Levels of Emotional Awareness Scale. Because the LEAS normally asks how "you" would feel, the prompts were reworded to ask how a "human" would feel, since the model is not a person. The researchers ran two separate testing sessions to validate their results, and ChatGPT's performance yielded Z-scores of 2.84 and 4.26, respectively. A Z-score expresses how many standard deviations a result lies above the population average, so both scores place ChatGPT well above the typical respondent. In addition, ChatGPT's responses received an accuracy rating of 9.7 out of 10, surpassing the human responses used for comparison.
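For readers unfamiliar with the statistic, a Z-score simply measures how far a value sits from the population mean in units of standard deviations. The minimal sketch below illustrates the calculation; the LEAS total, population mean, and standard deviation in the example are hypothetical placeholders, not figures taken from the study.

```python
def z_score(score: float, population_mean: float, population_sd: float) -> float:
    """Return how many standard deviations `score` lies above the population mean."""
    return (score - population_mean) / population_sd

# Hypothetical illustration (placeholder numbers, not the study's data):
# a LEAS total of 85 against a population mean of 70 with a standard
# deviation of 9 comes out to roughly 1.67 standard deviations above average.
print(round(z_score(85, 70, 9), 2))  # 1.67
```

By the same logic, the Z-scores of 2.84 and 4.26 reported for ChatGPT indicate results several standard deviations above the population average on the scale.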

These findings could have significant implications, particularly in medicine. Incorporating the AI tool into cognitive training programs for patients with impaired emotional awareness could be beneficial, and it may also aid psychiatric assessment and treatment by advancing the understanding of "emotional language." Even before this study, people had been turning to ChatGPT for mental health support. For individuals like Freddie Chipres, a mortgage broker, the convenience of an AI therapist-like experience has proven valuable. Similarly, US military veteran Tatum found ChatGPT an affordable alternative for mental health advice compared with traditional psychologists. However, Sahra O'Doherty, director of the Australian Association of Psychologists, expressed concern about people relying solely on AI for mental health support while the technology is still in its early stages.

In conclusion, ChatGPT scored higher than the general population on a standard measure of emotional awareness, opening up possibilities for its use as a virtual assistant in mental health. However, more research is needed before generative artificial intelligence is incorporated into healthcare systems, as ChatGPT was not originally designed for medical purposes. Nonetheless, companies such as Google are already developing AI programs for healthcare settings. To stay updated on the latest research and digital trends, visit Inquirer Tech.
