Since launching in 2005, YouTube has changed how people watch videos online. There’s no denying YouTube’s convenience, offering users endless content on virtually any topic. However, researchers from Griffith University suggest that spending too much time on YouTube can damage mental health. A team from the Australian Institute for Suicide Research and Prevention reports that habitual YouTube users show higher levels of anxiety, loneliness, and depression.
Dr. Luke Balcombe from Griffith University’s School of Applied Psychology set out to understand both the negative and positive mental health implications of the world’s busiest streaming platform. Notably, those experiencing the most damaging effects were users under 29 years old and those who regularly watched content about other people’s lives.
He explained that the parasocial relationships the platform encourages between followers and content creators may be cause for concern, though there were a few positive or neutral instances where creators developed closer relationships with their followers. In a university release, Dr. Balcombe said, “These ‘online relationships’ can fill a gap for people who, for example, have social anxiety. However, it can exacerbate their issues when they don’t engage in face-to-face interactions, which are especially important in developmental years. We recommend individuals limit their time on YouTube and seek other forms of social interaction to combat loneliness and promote positive mental health.”
Study authors add that many parents are especially concerned about how much time their kids spend on YouTube. It’s common for moms and dads to say it’s tough to constantly monitor their children’s use of the platform, whether for educational or other purposes.
During this study, the team classified more than two hours daily on YouTube as “high-frequency use” and more than five hours daily as “saturated use.” Researchers also concluded that much more needs to be done to prevent suicide-related content from being recommended to users through suggested-viewing algorithms. YouTube’s algorithm offers videos based on previous searches, potentially sending users further down a disturbing “rabbit hole.”
While users can report this type of content, it often goes unreported. Even when it is reported, a harmful video may remain online for days or weeks; given the sheer volume of content passing through the platform, it is nearly impossible for YouTube’s algorithms to catch all of it. If a piece of content is flagged as potentially promoting self-harm or suicide, YouTube displays a warning and asks users whether they want to continue watching the video.
Dr. Balcombe continued, “With vulnerable children and adolescents who engage in high-frequency use, there could be value in monitoring and intervention through artificial intelligence. We’ve explored human–computer interaction issues and proposed a concept for an independent-of-YouTube algorithmic recommendation system which will steer users toward verified positive mental health content or promotions.”
He added, “YouTube is increasingly used for mental health purposes, mainly for information seeking or sharing, and many digital mental health approaches are being tried with varying levels of merit, but with over 10,000 mental health apps currently available, it can be overwhelming knowing which ones to use, or even which ones to recommend from a practitioner point of view.”