Many people are concerned about the impact nude images may have on children. Some want a blanket ban on nudity, while others argue that artistic depictions of the body should be protected rather than treated as sexual content.

Blanket bans on bare chests are drawing increasing criticism from human rights groups, including Meta’s Oversight Board, which argue that such bans clash with cultural expectations and impede the right to expression of women, trans, and nonbinary people.

1. Be aware of your own cultural biases

When it comes to nudity moderation, context is key. What is considered indecent in one culture may be seen as ordinary self-expression in another. This is why it’s important for image moderators to be aware of how their own personal and cultural views shape their judgment of whether an image is inappropriate.

For example, a photo of a woman breastfeeding a baby or with her nipples visible might be categorized as indecent in the US but not in other countries. This can lead to women being censored for doing something natural and healthy.

This is why it’s essential to give users more individualized tools to control their feeds. For instance, rather than banning consensual adult nudity outright, platforms could let people block it in their settings. This would allow users to see content that matches their preferences while still preserving a safer experience for those who opt out.
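One way to picture this kind of opt-in control is a per-user preference check applied after an image has already been classified. The sketch below is purely illustrative: the label names and the FeedPreferences settings are hypothetical, not any platform’s real configuration.

```python
from dataclasses import dataclass

# Hypothetical moderation labels a classifier might attach to a post.
# The names are illustrative and not tied to any specific vendor.
ADULT_NUDITY = "adult_nudity"
SUGGESTIVE = "suggestive"

@dataclass
class FeedPreferences:
    """Per-user settings controlling which flagged content is hidden."""
    hide_adult_nudity: bool = True        # default to the safer setting
    hide_suggestive_content: bool = False

def should_hide(post_labels: set[str], prefs: FeedPreferences) -> bool:
    """Return True if the post should be filtered out of this user's feed."""
    if prefs.hide_adult_nudity and ADULT_NUDITY in post_labels:
        return True
    if prefs.hide_suggestive_content and SUGGESTIVE in post_labels:
        return True
    return False

# Example: a user who opted in to consensual adult nudity still sees the post.
prefs = FeedPreferences(hide_adult_nudity=False)
print(should_hide({ADULT_NUDITY}, prefs))  # False
```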

2. Don’t let algorithms judge people’s bodies

Social media sites use algorithms to detect nudity in images and videos, but these algorithms are not always accurate and can be biased against specific groups of people. This is a serious problem that has led to the removal of content that highlights transgender and non-binary bodies or advocates for women’s rights, such as the image of Petrenko.

This bias shapes how social media sites regulate their communities. Platform policies frame a wide range of nudity as sexual and therefore objectionable, a framing that erases the cultural, political, and social dimensions involved.

This bias can also affect how images are categorized. For example, a photo might be labeled “nude” yet excluded from the Explicit Nudity category because it shows a woman in lingerie. It can be difficult to understand how such a system reaches its decisions, so brands need to consider what they want their users to see and how those expectations shift across cultures and geographical areas.
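To make the categorization issue more concrete, here is a small sketch using Amazon Rekognition (the moderation API discussed below). It calls the DetectModerationLabels operation and prints each label alongside its parent category, which is how lingerie-style content can surface under a suggestive parent rather than under Explicit Nudity. The bucket and file names are placeholders.

```python
import boto3

# Assumed setup: AWS credentials configured and an image stored in S3.
rekognition = boto3.client("rekognition")

response = rekognition.detect_moderation_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "uploads/photo.jpg"}},
    MinConfidence=50,  # only return labels the model is at least 50% sure about
)

# Each label carries a ParentName, so lingerie-like content typically shows up
# under a suggestive parent category rather than under "Explicit Nudity".
for label in response["ModerationLabels"]:
    parent = label["ParentName"] or "(top-level)"
    print(f"{parent} > {label['Name']}: {label['Confidence']:.1f}%")
```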

3. Don’t rely on automation alone

Social media companies have a clear commercial interest in keeping their platforms free of explicit nudity, depictions of sexual activity and content encouraging sexual encounters. This is because social media platforms sell their audiences to advertisers.

However, a blanket ban on bare breasts and other forms of nudity is controversial because it clashes with cultural expectations and impedes the right to expression of women, trans, and non-binary people. Moreover, it can cause offence and lead to misinterpretation, since the same image can be read differently in different contexts.

For example, an e-commerce site may allow partial nudity in images and videos that are contextually appropriate for the product being sold, such as a swimsuit that shows significant cleavage. However, an automated tool like Amazon Rekognition might fail to understand the nuances of such an image and misjudge whether it counts as explicit nudity in that context. This is why human moderators remain essential to the success of social media moderation tools.
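A minimal sketch of that human-in-the-loop approach might look like the following: confident detections are rejected automatically, clearly clean images pass, and everything in between is routed to a human moderator. The thresholds and the triage_image function are assumptions for illustration, not a recommended production setup.

```python
import boto3

rekognition = boto3.client("rekognition")

# Illustrative thresholds; real values would be tuned per platform and category.
REJECT_THRESHOLD = 90.0   # confident enough to reject automatically
REVIEW_THRESHOLD = 60.0   # below this, treat the image as clean

def triage_image(image_bytes: bytes) -> str:
    """Classify an image as 'approve', 'reject', or 'human_review'."""
    response = rekognition.detect_moderation_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=REVIEW_THRESHOLD,
    )
    explicit = [
        label for label in response["ModerationLabels"]
        if label["Name"] == "Explicit Nudity" or label["ParentName"] == "Explicit Nudity"
    ]
    if not explicit:
        return "approve"  # nothing explicit detected above the lower threshold
    if max(label["Confidence"] for label in explicit) >= REJECT_THRESHOLD:
        return "reject"
    # Ambiguous cases (e.g. swimwear product shots) go to a human moderator,
    # who can weigh the commercial context the model cannot see.
    return "human_review"
```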

4. Don’t be afraid to demand context

Nudity moderation has become a critical issue because of its broader impact on networked social life. Social media platforms’ de-contextualized policies around nudity and depictions of sex run counter to basic civil liberties.

Moreover, the underlying technology behind most image moderation APIs is often unable to capture the nuances that distinguish nudity from sexual content. For example, a photo of a child in a bathing suit may not appear explicit or NSFW on its own, but it could be used by threat actors as an attack vector to target the family.

Similarly, a photo of an areola or nipple may not be sexually explicit, but it can be repurposed by threat actors to attack women and minors. To combat this, it’s important to have tools that provide context about why an image is or is not NSFW.
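One simple way to surface that context is to turn raw label output into a human-readable explanation. The helper below assumes Rekognition-style label dictionaries; the function name, wording, and the example result are illustrative assumptions.

```python
def explain_flags(moderation_labels: list[dict]) -> str:
    """Build a short, human-readable explanation from moderation labels.

    Expects Rekognition-style dicts with 'Name', 'ParentName', and 'Confidence'.
    """
    if not moderation_labels:
        return "No sensitive content detected."
    reasons = [
        f"{label['Name']} ({label['Confidence']:.0f}% confidence, "
        f"category: {label['ParentName'] or 'top-level'})"
        for label in sorted(moderation_labels, key=lambda l: -l["Confidence"])
    ]
    return "Flagged because the classifier detected: " + "; ".join(reasons)

# Example output for a hypothetical result:
labels = [{"Name": "Female Swimwear Or Underwear",
           "ParentName": "Suggestive",
           "Confidence": 87.2}]
print(explain_flags(labels))
```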