Ofcom, the communications watchdog, is calling on social media platforms to combat online grooming by no longer suggesting children as “friends” by default. The recommendation is part of Ofcom’s initial guidance on complying with the Online Safety Act, which focuses on tackling illegal content, particularly child abuse online. The guidance emphasises measures to protect children, including changing default settings so that children are not added to suggested friend lists, keeping children’s location information confidential, and preventing unsolicited messages from people outside their contacts.
Figures from Ofcom reveal that over 10% of 11- to 18-year-olds have received naked or semi-naked images online. The draft code of practice under the Online Safety Act covers areas such as “child sexual abuse material (CSAM),” grooming, and fraud. Ofcom seeks feedback from tech platforms on its proposed plans, which also involve resource allocation for content moderation teams.
The guidance includes the use of hash-matching technology to detect CSAM. This technology converts images into numerical hashes and compares them against a database of hashes of known CSAM images. The method does not apply to private or encrypted messages, however, and Ofcom asserts that it is not proposing anything that would compromise encryption. Controversial powers in the act allowing the scanning of private messages for CSAM will not be consulted on until 2024 and are unlikely to be enforced until around 2025.
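The matching step described above can be sketched in simplified form. Real deployments (such as Microsoft’s PhotoDNA) use perceptual hashes designed to survive resizing and re-encoding; this illustration uses an exact cryptographic hash purely to show the lookup mechanism, and all names and data here are hypothetical:

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Convert an image's raw bytes into a fixed-length numerical hash."""
    return hashlib.sha256(image_bytes).hexdigest()


# Hypothetical database of hashes of known flagged images.
# In practice this would be a curated industry database, not inline data.
known_hashes = {image_hash(b"previously-flagged-image-bytes")}


def matches_known_database(image_bytes: bytes) -> bool:
    """Flag an upload if its hash appears in the known-hash database."""
    return image_hash(image_bytes) in known_hashes


print(matches_known_database(b"previously-flagged-image-bytes"))  # True
print(matches_known_database(b"some-new-image-bytes"))            # False
```

Because the comparison is against hashes rather than the images themselves, the platform never needs to store or redistribute the original material to perform the check.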
While the encryption debate continues, Ofcom encourages encrypted messaging companies to explore ways to combat child abuse on their platforms. The guidance represents Ofcom’s initiative to address online safety challenges and protect young users from potential risks on social media platforms.