In social media, we are barraged daily by hundreds if not thousands of photographic images. Facebook and Twitter feeds show not only lunch plates and newborn nephews but also “selfies” taken to display a new tattoo, a change of hair color or, in some cases, scars from a recent surgery. Instagram, a social medium owned by Facebook, serves more specifically as a photographic stream and therefore potentially documents personal experiences in an even more graphically representative way.

Users of social media choose to post graphic images for different reasons. Drawing attention to an illness or social issue is a very different motivation from posting an image for its sexually explicit nature, yet can both choices lead to the same outcome: being banned from a social networking site? How have social media companies changed their policies over time to reflect changing moral standards, and how do those policies affect users?

Unfriended For Showing Scars

A recent Salon article, “Unfriended for Showing Her Scars: A ‘breast cancer preventer’ opens up,” describes how Beth Whaanga posted seminude photos of her body after a total bilateral mastectomy and total hysterectomy.

“Within hours of the photos going up, they’d been reported to Facebook. Encouragingly, Facebook has said it will not remove the images, but Whaanga meanwhile says she has been unfriended more than a hundred times. Over a hundred people looked at her and walked away.”

Although Facebook did not ban the photos, it was members of her own social network who chose to unfriend her as a result of the images. But she found support as well: Her “Under the Red Dress” Facebook page has nearly 60,000 likes, and her message that “your scars aren’t ugly; they mean you’re alive” resonates on her website.

How Facebook Handles Graphic Content

Facebook has only recently come to allow graphic scar photos. The company courted controversy with its initial ban of photographer David Jay’s The Scar Project (slogan: “breast cancer is not a pink ribbon”). NBC reported that a change.org petition prompted the account’s reinstatement, noting that Facebook spokeswoman Alison Schumer said Facebook has “long allowed mastectomy photos to be shared on Facebook, as well as educational and scientific photos of the human body and photos of women breastfeeding.”

"We only review or remove photos after they have been reported to us by people who see the images in their News Feeds or otherwise discover them," Schumer told NBC. "On occasion, we may remove a photo showing mastectomy scarring either by mistake, as our teams review millions of pieces of content daily, or because a photo has violated our terms for other reasons."

Self-harming scars are another matter entirely. Unfortunately, a number of websites and blogs glorify teen self-harm. Facebook expressly discourages any self-harm content, stating in its policy:

“Facebook takes threats of self-harm very seriously. We remove any promotion or encouragement of self-mutilation, eating disorders or hard drug abuse. We also work with suicide prevention agencies around the world to provide assistance for people in distress.”

Facebook also initially banned the account of a woman who used a different kind of graphic image to tell a story. When D. M. Murdock posted a disturbing image of African girls undergoing “virginity testing” in order to raise awareness of child abuse, her Facebook account was disabled.

“I posted the uncensored, shocking photo on Facebook because it is important to see the utter indignity these poor girls must suffer – this horrible abuse is now being done in the West,” Murdock wrote. “How can we battle it, if we can’t see what it is?”

Facebook eventually reversed the permanent ban after a social media campaign and petition called for the account’s reinstatement.

Facebook’s community standards are clear on its policy of dealing with graphic images:

“Facebook has long been a place where people turn to share their experiences and raise awareness about issues important to them. Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses or acts of terrorism. In many instances, when people share this type of content, it is to condemn it. However, graphic images shared for sadistic effect or to celebrate or glorify violence have no place on our site.

“When people share any content, we expect that they will share in a responsible manner. That includes choosing carefully the audience for the content. For graphic videos, people should warn their audience about the nature of the content in the video so that their audience can make an informed choice about whether to watch it.”

Are there times when trigger warnings are necessary for graphic image content on social media sites? On Twitter, the hashtag #triggerwarning is used to flag content that readers might find sensitive. Author Noah Berlatsky addresses the issue in his piece “Does This Post Need a Trigger Warning?”

“I can see the virtues in the arguments both for and against trigger warnings — and partially as a result, I feel like both may be framed in overly absolutist terms. To me, it seems like it might be better to think about trigger warnings not as a moral imperative but rather as a community norm. Warnings exist, after all, in dialogue with reader expectations. On some forums, people may expect to be warned about difficult content.”

Trigger warnings are a good idea on social media because of the public way in which content is shared. When an author is writing on a sensitive topic, whether suicide, eating disorders, or violence, it is professional and courteous to warn readers in advance that they might find the material distressing.

When Images Are Intentionally Graphic

Overly sexual images posted on Facebook and its subsidiary Instagram are screened and deleted quickly; Twitter is more permissive of pornographic images.

The cases above are examples of graphic imagery used to tell a story and misinterpreted as inappropriate by social media sites. But what about the effects of graphic images posted with every intention of being inappropriate?

Bans on porn vary by social media site. Instagram and its parent Facebook are very clearly on the side of family-friendly: Facebook requires account holders to be at least 13 years old.

Facebook’s nudity and pornography policy is zero-tolerance; its standards note “a strict policy against the sharing of pornographic content and any explicitly sexual content where a minor is involved. We also impose limitations on the display of nudity. We aspire to respect people’s right to share content of personal importance, whether those are photos of a sculpture like Michelangelo’s David or family photos of a child breastfeeding.”

Video social networking site Vine and its parent company Twitter have always taken a more relaxed approach to adult content, though things heated up recently when both social media sites became embroiled in a teen pornography controversy that prompted a complete policy change.

CNN asked in a report, “Does Twitter’s Vine Have a Porn Problem?”

“On Twitter's end, the anything-goes aspect of Vine jibes with the site's overall philosophy. Compared to Facebook, which believes social sharing is best when tied to a user's true identity and real-world networks, Twitter allows its users to register under fake names and has fought governments and law-enforcement agencies seeking user information. As such, Twitter has taken a more hands-off approach on adult content. It's not hard to hunt down hash tags its users are employing to share adult content on a daily basis. (#TwitterAfterDark becomes a trending topic on the site nearly every day — clicker beware).”

CNN’s “porn problem” question was answered not long after, when an anonymous teen posted a video of himself having sex with a Hot Pocket snack wrap. In its follow-up piece, “Twitter Bans Porn Videos on Vine,” CNN reported that Apple (which controls whether an app can be downloaded from its App Store) had forced Vine to raise its minimum age from 12 to 17, ultimately causing Vine’s parent company, Twitter, to ban porn on the video site. The company’s announcement stated, “We’re making an update to our Rules and Terms of Service to prohibit explicit sexual content. For more than 99 percent of our users, this doesn’t really change anything. For the rest: we don’t have a problem with explicit sexual content on the Internet –– we just prefer not to be the source of it.”

The changes at the major social media sites raise the question: Who are the Internet’s moral police? Whose job is it to review the millions of pieces of content that hit the Web day after day and make determinations about what is appropriate? Policy changes after petitions and forced age minimums show that the sites are listening to public outcry that reflects a wide diversity of moral standards.

Whether the effect of posting a graphic image to a social media site is positive (creating an impetus for social change) or negative (for example, causing a young person to be denied a job because of his or her Facebook profile photos), it’s clear that posting polarizing images can have an impact beyond an individual’s social media profile.

Mary T McCarthy

Mary McCarthy is Senior Editor at SpliceToday.com and the creator of pajamasandcoffee.com. She has been a professional writer for over 20 years for newspapers, magazines, and the Internet. She teaches classes at The Writer’s Center in Bethesda, Maryland, and guest-lectures at the University of Maryland’s Philip Merrill College of Journalism. Her first novel, The Scarlet Letter Society, debuted this year, and her second novel will be released in 2015.
