Aylo, IWF Publish U.K.-Focused Adult Industry Code of Conduct

LONDON/MONTREAL—Aylo, the parent company of adult tube site Pornhub, in partnership with the United Kingdom-based Internet Watch Foundation (IWF), today announced new industry standards developed for adult platforms operating in the U.K.

The two organizations say they've developed an adult entertainment industry-specific code of conduct, which they refer to as the "Standard of Good Practice" for platforms with U.K. users.

According to a press release, an advisory committee of experts was formed by the Internet Watch Foundation to help adult industry companies in the U.K.'s digital space create a "zero tolerance" environment for child sexual abuse material (CSAM) on their platforms.

IWF and Aylo announced their partnership over 18 months ago, and the recommendations outlined in the Standard of Good Practice are a direct product of that collaboration.

"I was pleased that the IWF and Aylo have been able to work together to establish stringent and clear standards for material on adult sites," said Sir Richard Tilt, IWF's former chair and the leader of this expert advisory committee. "Success of the new standards will depend on close auditing to ensure a high level of compliance."

The recommendations outlined in the Standard of Good Practice, also referred to as the Adult Content Standards, establish six principles that guide the baseline standard of compliance for countering CSAM online.

Platforms that publish adult content in the United Kingdom's digital space should "adopt a zero-tolerance approach to child sexual abuse; ensure transparency, with enforceable terms of service; operate with accountability, with clear reporting mechanisms; embrace technological tools and solutions; collaborate with specialists; and embrace regulatory and safety initiatives, including voluntary principles."

Based on these principles, the code of conduct provides platforms with a recommended baseline standard for compliance with U.K. law and regulators:

a. Adult services must comply with legislation and regulations. This includes requirements to assess the level of risk on their services and embrace safety by design as an approach to mitigate harms in the configuration of their platform or service.

b. Adult services must adopt mechanisms to detect and prevent known child sexual abuse material (CSAM), as deemed appropriate by the IWF. 

c. Adult services must ensure that they do not carry any materials in breach of the British Board of Film Classification's (BBFC) standards for an R18 certification, including:

  • Material which is in breach of the criminal law;
  • Material (including dialogue) likely to encourage an interest in sexually abusive activity which may include adults role-playing as non-adults;
  • The portrayal of sexual activity which involves real or apparent lack of consent; any form of physical restraint which prevents participants from indicating a withdrawal of consent;
  • The infliction of pain or acts which are likely to cause serious physical harm, whether real or (in a sexual context) simulated. Some allowance may be made for non-abusive, consensual activity;
  • Penetration by any object likely to cause physical harm;
  • Sexual threats, humiliation or abuse which do not form part of a clearly consenting role-playing game.

d. Adult services must publish transparency reports every six months.

e. Adult services must establish and maintain a dedicated portal to facilitate accessible and secure communication between law enforcement and the platform regarding specific investigations, ensuring a channel for victim redress is readily available.

f. Adult services must have a clear reporting function for users to flag harmful content.

g. Adult services must subject anyone publishing or appearing in material on the platform to age verification measures confirming they were over 18 years old at the time of production, before content is permitted to be published.

h. Adult services must ensure that consent is secured for all individuals appearing in content. All individuals must be allowed to withdraw consent at any time. Cases which involve professional contracts should be assessed on a case-by-case basis.

i. Adult services must not support technologies that obscure the content of messaging and other communications in ways that would inhibit moderation.

j. Adult services must not adopt, or encourage the adoption of, technologies that can be used to bypass content filtering and content blocking mechanisms, whether for accessing their services or hosting them.

The document also sets out a higher standard for platforms looking to exceed the baseline:

a. Human moderators must review all content before it can be published on the platform.

b. Human moderators must be well-supported.

c. Adult services must apply tools and changes to all historical content and accounts where reasonably possible, even if this requires removing old content.

d. Adult services should display clear and effective deterrence messaging when a user attempts to search for words and terms associated with underage content and other types of illegal content. This should be supplemented by a message encouraging users to contact an appropriate support organization.

e. Adult services should seek to engage with other relevant organizations to use their tools, participate in initiatives, and seek their expertise.

f. Adult services should invest in horizon scanning for future issues and share intelligence with other platforms about threats to children on their sites or services.

IWF also provides supporting documentation and an official standards document, along with citations showing how specific elements of these standards allow adult platforms to meet certain Online Safety Act requirements, including age verification for all adult users in the United Kingdom.

Many of the requirements are already common practice across the vast majority of the adult industry. For example, the baseline standard calls for record-keeping measures such as confirming that performers appearing in content are at least 18 years old, documenting their consent to appear, and making those records accessible to law enforcement and regulators for inspection.

United States law already requires adult entertainment platforms and studios to retain age and identity records through a designated custodian of records, typically an attorney or senior executive.

Aylo's involvement with IWF and this advisory board is noteworthy, given years of controversy surrounding accusations of profiting from non-consensual imagery and videos. Well before its acquisition by private equity firm Ethical Capital Partners, Aylo began voluntarily reporting to the U.S.-based National Center for Missing & Exploited Children (NCMEC) CyberTipline program.

Aylo also partners with the NCMEC program Take It Down, which provides an anonymous takedown service that can be used to remove CSAM. It also participates in the StopNCII.org program, which is similar to Take It Down. StopNCII.org is operated by the South West Grid for Learning (SWGfL). IWF is the United Kingdom's equivalent to NCMEC.

IWF, SWGfL, and Aylo representatives are joined on the expert advisory committee that developed these standards by others from government agencies, civil society, and academia. These organizations include the British Board of Film Classification, the UK Home Office, the National Crime Agency, and academics from Middlesex University and Exeter University.

"Trust and safety should be a priority for every platform, adult, and mainstream, and together, we have a collective responsibility in shaping a brighter, safer online experience," said David Cooke, Aylo's senior director for trust and safety regulations and partnerships, in the IWF press release. "We will continue to work with the IWF to encourage all image-sharing platforms to uphold these new standards."

In a BBC report on the new standards, an Aylo spokesperson indicated that the company intends to abide by the age verification requirements imposed by the Online Safety Act—but with reservations. 

"We will always comply with the law, and the Online Safety Act is no different, but it is clear from other jurisdictions that site-level age verification does not prevent children from accessing adult content online while creating data privacy issues and increasing exposure to unmoderated, harmful material," the Aylo spokesperson is quoted as saying.

The BBC story went on to paraphrase the Aylo spokesperson as voicing support for laws and regulations that require device manufacturers to sell their products with parental filters enabled, noting that Aylo "says it hopes that instead of age checks on sites themselves, governments will mandate parental controls on phones and laptops."

AVN previously reported that Aylo and Ethical Capital Partners have informally endorsed device filtering by default at the retail level.