Ofcom Proposes Children's Safety Code With AV Requirements

LONDON—Ofcom, the United Kingdom's communications regulator, has published its draft "Children’s Safety Codes of Practice," which set out how web platforms serving U.K. users are expected to comply with the law, including the recently adopted Online Safety Act.

The agency has opened a consultation allowing stakeholders to submit feedback on the draft code; responses are due by July 14.

According to Ofcom, digital platforms must adopt more than 40 "practical steps" to comply with the agency's child safety provisions. Age verification and other checks are to be deployed across platforms to block minors from seeing harmful content, including material dealing with suicide and self-harm, as well as sexually explicit content.

The vast majority of Ofcom's guidance pushes for stronger age verification protocols across the board, signaling that users in the United Kingdom may have to submit government identification or some other form of personally identifiable information to an age-checking software provider.

Age assurance and age estimation technologies are also likely to become far more commonplace under Ofcom's draft code.
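To illustrate how such a handoff might be structured, here is a minimal, purely hypothetical Python sketch; the function, fields and provider flow are invented for illustration and do not describe any particular vendor's API. The design point worth noting is data minimization: the platform retains only a pass/fail token, never the identity document itself.

```python
import uuid
from dataclasses import dataclass

@dataclass
class AgeCheckResult:
    verified: bool   # provider asserts the user is 18 or over
    method: str      # e.g. "government_id" or "facial_age_estimation"
    reference: str   # opaque token the platform stores instead of the document

def request_age_check(user_id: str, method: str) -> AgeCheckResult:
    """Stand-in for a call to a third-party age-assurance provider.

    In a typical deployment the platform redirects the user to the provider,
    which inspects the ID or estimates age from a selfie, then returns only a
    pass/fail result plus a reference token -- the platform itself never
    retains the underlying identity document."""
    # Provider interaction elided; hardcoded response for illustration only.
    return AgeCheckResult(verified=True, method=method, reference=str(uuid.uuid4()))
```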

The regulations also target algorithms, including recommender systems. Harmful material must be "filtered out or downranked in recommended content" pages. In effect, these are age-appropriate design rules that would require mainstream platforms like Instagram and TikTok to overhaul their user experiences and restrict minors' access to certain features.

The draft code also clarifies what constitutes "harmful content" under the Online Safety Act.

Pornography that is otherwise legal and consensual is classified as "primary priority content" (PPC), deemed harmful under the Online Safety Act.

PPC also covers content that promotes suicide, self-harm and eating disorders among youth.

"Priority content" (PC) is also harmful but deals with abuse, racism, acts of violence, dangerous social media challenges and the promotion of potentially harmful substances. "Non-designated content" (NDC) is harmful content that regulated platforms identify in their legally mandated risk assessments.

"Under our proposals, any service which operates a recommender system and is at higher risk of harmful content must also use highly-effective age assurance to identify who their child users are," reads an Ofcom statement. "They must then configure their algorithms to filter out the most harmful content from these children’s feeds, and reduce the visibility and prominence of other harmful content."

Note that "recommender systems" are defined by Ofcom as "algorithms which provide personalised recommendations to users."

"We want children to enjoy life online," says Dame Melanie Dawes, the chief executive of Ofcom. "But for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change."

Civil society organizations in the U.K. have raised serious concerns about the potential human rights and privacy implications of Ofcom's proposed code for both adults and minors.

"Adults will be faced with a choice: either limit their freedom of expression by not accessing content or expose themselves to increased security risks that will arise from data breaches and phishing sites," said Jim Killock, executive director of U.K.-based free speech advocacy organization Open Rights Group (ORG), in a statement. "We are also concerned that educational and help material, especially where it relates to sexuality, gender identity, drugs, and other sensitive topics, may be denied to young people by moderation systems."

ORG also expressed concern that major technology platforms and services could block U.K. users entirely rather than implement the code's new age-checking requirements.

"Some overseas providers may block access to their platforms from the U.K. rather than comply with these stringent measures," Killock said.