
A new study has revealed that Gen Z’s online usage has jumped since last year to a record six hours per day. The study, conducted by Ofcom, monitored the smartphone and internet usage of 10,000 volunteers and found that time spent online among individuals aged 18 to 24 rose from four hours and 36 minutes to six hours and one minute in just one year. Gen Z women had the highest consumption, spending six hours and 36 minutes online, while Gen Z men spent five hours and 28 minutes. “Across all adult age groups, women are spending more time online – on smartphones, tablets and computers – than men, clocking up an extra 33 minutes more each day,” the study noted. Teenage girls were found to be at greater risk of encountering content related to eating disorders.

TikTok, YouTube, and Snapchat were the big draws for younger users, with Facebook appealing more to older users. “We know that young people spend much of their lives online. Young adults spend a lot more time on YouTube, an hour on TikTok, and not much less on Snapchat. If you look at Facebook, Facebook tends to be used by much older users,” said Ian Macrae, Ofcom’s director of market intelligence. The study also found that 22 percent of those aged eight to 17 have lied on social media apps about being 18, putting them at risk of accessing more adult content.

The high internet usage amongst young users has been a source of concern for regulators like Ofcom, with increased social media use being associated with higher levels of anxiety and depression. That concern led to the passing of the Online Safety Act in the UK in 2023. This month, Ofcom will release the first set of rules that social media companies will need to comply with in order to avoid fines. The Act aims to curb some of the harmful content that Gen Z encounters. “Following the publication of guidance by Ofcom, major platforms will need to proactively offer adult users optional tools to help them reduce the likelihood that they will encounter certain types of content,” the guidelines state. “These categories of content are set out in the Act and include content that does not meet a criminal threshold but encourages, promotes or provides instructions for suicide, self-harm or eating disorders. These tools also apply to abusive or hate content, including where such content is racist, antisemitic, homophobic, or misogynist. The tools must be effective and easy to access.”
