FSC Europe Petitions EU Commission on Behalf of Sex Workers

BERLIN – The European Union is about to implement a new set of rules to govern digital services platforms and protect users' online rights, but Free Speech Coalition's European branch is petitioning the EU to demand that sex workers, sexuality professionals, artists and educators no longer be excluded or discriminated against. The organization is proposing a 10-step program to create a safer digital space that protects the rights of those working within the sexuality sphere—people who are already among society's most marginalized—and to demand transparency from the platforms, which, as the European Commission itself states, “have become integral parts of our daily lives, economies, societies and democracies.”

Sex in almost all its guises is being repressed in the public online sphere and on social media like never before. Accounts focused on sexuality—from sexuality professionals, adult performers and sex workers to artists, activists, LGBTIQ folks, publications and organizations—have been and continue to be deleted without warning or explanation by private companies, which face little regulation and are currently able to add and enforce discriminatory changes to their terms and conditions without accountability to those affected. Additionally, in many cases it has proven impossible for users to have their accounts reinstated—accounts that are often vitally linked to their ability to generate income, network, organize and share information.

At the same time as sexual expression is being erased from digital spaces, new legislation is being passed in the European Union to safeguard internet users’ online rights. The European Commission’s Digital Services Act (DSA) and Digital Markets Act contain upgraded rules governing digital services, aimed in part at building a safer and more open digital space. These rules will apply to online intermediary services used by millions every day, including major platforms such as Facebook, Instagram and Twitter. Among other things, they call for greater transparency from platforms, better protections for consumers and greater empowerment of users.

With the DSA promising to “shape Europe’s digital future” and “to create a safer digital space in which the fundamental rights of all users of digital services are protected,” Free Speech Coalition believes it's time to demand that this future include those working, creating, organizing and educating in the realm of sexuality. As we consider what a safer digital space can and should look like, it’s also time to challenge the pervasive and, frankly, puritanical notion that sexuality, a normal and healthy part of our lives, is somehow harmful, shameful or hateful.

The DSA calls for “effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions.” In addition to this, Free Speech Coalition Europe demands the following:

• Platforms need to put in place anti-discrimination policies and train their content moderators so as to avoid discrimination on the basis of gender, sexual orientation, race or profession. The same community guidelines need to apply as much to an A-list celebrity or mainstream media outlet as they do to a stripper or queer collective;

• Platforms must give users the reason a post has been deleted or an account restricted or deleted. Shadowbanning is an underhanded means of suppressing users’ voices; users should have the right to be informed when they are shadowbanned and to challenge the decision;

• Platforms must allow users to request a review of a content moderation decision. Moderation must take place in the user's own jurisdiction rather than in an arbitrary one that may have different laws or customs; e.g., a user in Germany should not be banned on the basis of reports and moderation decisions made in the Middle East, but should instead be reviewed by the European moderation team;

• Decision-making on notices of reported content, as specified in Article 14 of the DSA, should not be handled by automated software, as such systems have been shown to delete content indiscriminately; a human should make the final judgement;

• The notice of content described in Article 14.2 of the DSA should not immediately make a platform liable for that content, as stated in Article 14.3, since such liability entices platforms to delete reported content indiscriminately in order to avoid it, which in turn enables organized hate groups to mass-report and take down users;

• Platforms must establish a department (or, at the very least, a dedicated contact person) within the company to handle complaints regarding discrimination or censorship;

• Platforms must provide a means for users to indicate whether they are over the age of 18, as well as a means for adults to hide their profiles and content from children (e.g., marking profiles as 18+), and must give users the option to mark certain content as “sensitive”;

• Platforms must not reduce the features available to those who self-identify as adult or adult-oriented (i.e., those who have marked their profiles as 18+ or their content as “sensitive”). These profiles should appear as 18+ or “sensitive” when accessed by visitors who are not logged in or have not set their age, but should not be excluded from search results or appear as nonexistent;

• Platforms must set clear, consistent and transparent guidelines about what content is acceptable. However, these guidelines cannot outright ban users focused on adult themes; for example, a platform could ban highly explicit pornography (e.g., sexual intercourse videos which show penetration), but the creator should still be able to post an edited video that doesn’t show penetration;

• Platforms cannot outright ban content intended for adult audiences, unless a platform is specifically for children or more than 50% of its active users are children.

The Free Speech Coalition petition went live online today, March 26.