The Negative Effects of YouTube on Mental Health: Study Finds Link to Loneliness, Anxiety, and Depression Among Young Viewers

by time news

2023-05-12 17:35:00



Researchers have found a link between frequent use of YouTube and increased levels of loneliness, anxiety, and depression, especially among viewers under the age of 29, according to Neuroscience News.

Content creators

The study, conducted by researchers from the Australian Institute for Suicide Research and Prevention (AISRAP), raises concerns about the “social relationships” that form between creators and viewers, as well as algorithmic recommendations of suicide-related content.

The researchers stress the importance of finding AI-based solutions to guide users towards positive mental health content.

Dr. Luke Balcombe and Professor Emeritus Diego de Leo, from Griffith University’s School of Applied Psychology and AISRAP, sought to understand the positive and negative impacts of YouTube, the world’s most popular video-streaming platform, on mental health.

The team discovered that the most affected individuals are those under the age of 29 and those who regularly watch content about other people’s lives.

Face-to-face social interaction

Lead researcher Dr. Balcombe said the evolution of social relationships between creators and followers can be a cause for concern, although there have also been neutral or positive cases in which creators developed closer ties with their followers.

He added that “these online ‘relationships’ can fill a gap for people who have, for example, social anxiety, but can exacerbate their problems when they are not engaging in face-to-face interactions, which are especially important in the developmental years.”

Appropriate period of use

Dr. Balcombe advised that users of YouTube and similar services “limit their watch times and seek other forms of social interaction to combat loneliness and promote positive mental health.”

Balcombe explained that the amount of time spent on YouTube has often been a source of concern for parents, who struggle to monitor their children’s use of the platform for educational or other purposes. For the purposes of the study, more than two hours per day on YouTube was classified as high-frequency use, and more than five hours per day as saturated use.
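The study’s cut-offs translate into a simple classification rule. As a minimal illustration (the function name and category labels below are illustrative, not the researchers’), the thresholds could be applied like this:

```python
# Classify daily YouTube watch time using the thresholds reported in the
# study: more than two hours/day is high-frequency use, more than five
# hours/day is saturated use. Function name and labels are illustrative.

def classify_watch_time(hours_per_day: float) -> str:
    """Return the study's usage category for a daily watch time."""
    if hours_per_day > 5:
        return "saturated use"
    if hours_per_day > 2:
        return "high-frequency use"
    return "typical use"  # at or below the study's cut-offs

print(classify_watch_time(1.5))  # typical use
print(classify_watch_time(3.0))  # high-frequency use
print(classify_watch_time(6.0))  # saturated use
```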

Recommendations based on search history

The study also determined that more needs to be done to prevent AI algorithms from recommending and suggesting suicide-related content to users.

While ideally people should not be able to search for these topics and get results at all, YouTube’s algorithm pushes recommendations or suggestions for items to watch based on past searches, which can result in serious harm to the user.
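To make the mechanism concrete, a history-seeded recommender with a topic blocklist might look like the sketch below. Everything here is a hypothetical illustration, not YouTube’s actual algorithm; the names and matching logic are assumptions:

```python
# Hypothetical sketch: recommendations seeded by past searches, with a
# blocklist step that screens out suicide-related topics before anything
# is surfaced. Not YouTube's algorithm; all names are illustrative.

BLOCKED_TOPICS = {"suicide", "self-harm"}

def recommend_from_history(search_history: list[str],
                           candidates: dict[str, set[str]]) -> list[str]:
    """Suggest candidate titles whose topics overlap past searches,
    dropping any candidate tagged with a blocked topic."""
    seen = {term.lower() for term in search_history}
    picks = []
    for title, topics in candidates.items():
        if topics & BLOCKED_TOPICS:
            continue  # never surface flagged topics, even if searched for
        if topics & seen:
            picks.append(title)
    return picks
```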

Self-harm

Users can report this type of content, but sometimes it is not reported, or it stays available for days or weeks; given the huge volume of content uploaded, it is almost impossible for YouTube’s algorithms to detect and block it in a timely manner. If a piece of content is flagged as possibly containing themes of self-harm, YouTube displays a warning and asks viewers whether they still want to play the video.
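The flagging flow described here, in which flagged videos play only after the viewer accepts a warning, can be modelled in a few lines. The sketch below is a hypothetical illustration of that flow, not YouTube’s internal logic:

```python
# Hypothetical model of the moderation flow described above: flagged
# self-harm content is not blocked outright but gated behind a warning
# the viewer must accept. Names are invented; not YouTube's internals.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    flagged_self_harm: bool = False

def request_playback(video: Video, viewer_accepts_warning: bool) -> bool:
    """Return True if the video should play for this viewer."""
    if video.flagged_self_harm:
        # Show the warning and require an explicit opt-in to continue.
        return viewer_accepts_warning
    return True
```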

Dr. Balcombe said there “could be value in AI monitoring and intervention” for “at-risk” children and adolescents who engage in high-frequency use.

Human-computer interaction

He went on to explain that “human-computer interaction issues were explored and a concept was proposed for an algorithmic recommendation system, independent of YouTube, that would direct users towards verified mental health content,” noting that YouTube is increasingly being used for mental health purposes, mainly to search for information. “Many digital mental health approaches are being tried with varying levels of merit, but with over 10,000 mental health apps currently available, it can be really difficult to know which ones to use, or even how positive and valid the recommendations are from a practitioner’s point of view.”
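As a rough illustration of the proposed concept (the researchers propose the idea, not an implementation; the catalogue, fields, and scoring below are assumptions), an independent recommender that surfaces only verified mental health content might be sketched as:

```python
# Hypothetical sketch of the proposed YouTube-independent recommender:
# rank only items from an expert-verified catalogue by topic overlap
# with the user's query. Catalogue, fields, and scoring are assumptions.

VERIFIED_CATALOGUE = [
    {"title": "Coping with social anxiety", "topics": {"anxiety"}, "verified": True},
    {"title": "Sleep and low mood", "topics": {"depression", "sleep"}, "verified": True},
]

def recommend(query_topics: set[str], catalogue: list[dict], k: int = 5) -> list[str]:
    """Rank verified items by topic overlap with the query topics."""
    scored = sorted(
        ((len(item["topics"] & query_topics), item["title"])
         for item in catalogue if item["verified"]),  # verified items only
        reverse=True,
    )
    return [title for score, title in scored[:k] if score > 0]

print(recommend({"anxiety"}, VERIFIED_CATALOGUE))  # ['Coping with social anxiety']
```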

Experts and artificial intelligence

Dr. Balcombe concluded: “There is a gap in validated mental health and suicide tools based on a combination of AI-based machine learning, risk modeling, and appropriately qualified human decisions. Bringing mental health and suicide experts together to validate information from artificial intelligence and digital mental health interventions could be a very promising solution to support growing unmet mental health needs.”
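One way to picture the combination Dr. Balcombe describes, with machine risk scores routed to qualified human decisions, is a simple triage rule. The thresholds and routing below are illustrative assumptions, not taken from the study:

```python
# Hypothetical triage sketch: a model's risk score decides whether a
# case is escalated, queued for expert human review, or left to passive
# monitoring. Thresholds and outcome labels are illustrative only.

def triage(risk_score: float,
           review_threshold: float = 0.5,
           escalate_threshold: float = 0.9) -> str:
    """Route a model-produced risk score (0.0-1.0) to an outcome."""
    if risk_score >= escalate_threshold:
        return "escalate to a qualified clinician"
    if risk_score >= review_threshold:
        return "queue for expert human review"
    return "no action; continue passive monitoring"
```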
