(Sharecast News) - The UK communications services regulator Ofcom has called on social media and video-sharing platforms to improve their age checks to keep young children protected from adult content.

In an open letter published on Thursday, Ofcom said platforms that set minimum age thresholds - such as 13 - must no longer rely solely on self-declaration methods to enforce the rules, and must instead use technological solutions to keep those under age off their sites.

These technologies include facial age estimation, digital ID, or one-time photo matching.

"As self-declaration is easily circumvented, this means underage children can easily access services that have not been designed for them. This puts under-13s at risk by allowing their information to be collected and used unlawfully, without the protections they are entitled to," the open letter said.

Ofcom said it has written to platforms, starting with TikTok, Snapchat, Facebook, Instagram, YouTube and X, asking them to show how their individual age assurance checks meet these expectations.

"Platforms need to be ready to demonstrate what they're doing to keep underage children out and safeguard those children that are old enough to access their services," said Paul Arnold, the chief executive of the Information Commissioner's Office.

Last month, Reddit was fined £14.5m and Imgur owner MediaLab nearly £250,000 for failing to implement age checks and for processing children's personal information unlawfully "in a way that potentially exposed children to inappropriate, harmful content".