The Internet used to be for porn (at least, more so than it is now), and then the safety nets went up to keep young and pure eyes (and ears) away from such content. And it looks like the same thing is happening with social media. The recent fallout over Vine and its NSFW problem, however, has made us realize one thing: Porn just follows one of the primary rules of the internet – adapt.
When Vine first appeared, it was released into the wild with an encouraging push from Twitter, all bright-eyed and bushy-tailed – and unregulated. You could search for tags as obvious as “NSFW” or the more blatant “boobs” and find results… lots of results. And notoriously, thanks to a supposed human error, a vine called “Dildoplay” made it onto the app’s Editor’s Picks list (you can’t help but wonder how the resulting meeting went down).
It looked like Vine was at risk of going the way of the PostSecret app, which is to say out of the iOS App Store swiftly after its launch thanks to pornographic, gruesome, and even threatening images that a small team of moderators (in PostSecret’s case, largely made up of volunteers) just couldn’t keep up with. The App Store rules say that apps which “contain user generated content that is frequently pornographic” will get the boot, and they mention ChatRoulette by name as an example of what doesn’t fly.
Vine’s terms of service didn’t even prohibit explicit imagery or nudity – and still don’t (though they do note that threats are prohibited). While the company maintains the right to “respond to user support requests,” nowhere does it explicitly say that the explicit is not OK. That hasn’t changed, but the ability to use tags to find porn has been dropped.
Since Vine’s early days, hashtags like #boobs and #smut have returned nothing but sad-face “nothing to see here” results. #Sex101 brings up a strangely adorable vine of stuffed animals posed in various coital positions, but no actual human body parts.
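That kind of blanket tag suppression is simple to picture. Here’s a minimal sketch, in Python, of how a hashtag search blocklist might work; the tag list and function names are hypothetical illustrations, not Vine’s actual code.

```python
# Hypothetical sketch of a hashtag search blocklist, in the spirit of
# Vine's tag suppression. Names and structure are illustrative only.
BLOCKED_TAGS = {"nsfw", "boobs", "smut"}  # stored normalized and lowercase

def search_by_tag(tag, index):
    """Return the vines indexed under a tag, or an empty
    'nothing to see here' result if the tag is blocked."""
    normalized = tag.lstrip("#").lower()
    if normalized in BLOCKED_TAGS:
        return []  # the sad-face empty result described above
    return index.get(normalized, [])
```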
But there are still loopholes. For example, the SuicideGirls account shows off full-frontal shots of tattooed models. Scrolling through, the first few Vines play a series of very adult-themed images… but get four or five videos in, and suddenly you see warnings about sensitive content that you have to tap through to view. Clearly, the Vine team is having trouble keeping up with what’s rolling in. While some of the more obvious hashtags aren’t pulling up the desired Vines, others are still rife with very adult videos (want to see for yourself? Search for #xxxvine or #xxxxxx and be prepared for what shows up).
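Those sensitive-content warnings work differently from the tag blocklist: instead of removing a post, they put an interstitial in front of it. A rough sketch of that gating logic, assuming a simple flagged/unflagged state per post (the names here are made up, not Vine’s real API), might look like this:

```python
# Hypothetical sketch of a tap-to-view gate for sensitive content.
# The Post fields and render flow are assumptions, not Vine's real API.
from dataclasses import dataclass

@dataclass
class Post:
    video_url: str
    flagged_sensitive: bool = False  # set by moderators or user reports

def render(post, tapped_through=False):
    """Hide a flagged post behind a warning until the viewer taps through."""
    if post.flagged_sensitive and not tapped_through:
        return "This post may contain sensitive content. Tap to view."
    return f"<video src='{post.video_url}'></video>"
```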
For an app that depends on the App Store, runs a relatively small staff to rifle through content, and has a visual nature that tends to attract lewd images, Instagram sets the standard for flagging, finding, and pulling adult-themed pictures.
Today, Instagram has a 12+ rating in the iTunes store for, among other things, “Infrequent/Mild Sexual Content or Nudity.” Tags not allowed: #NSFW, #porn, and the like. And even the porn_stars and SuicideGirls accounts are more Victoria’s Secret than Penthouse. Instead of relying on an army of moderators, Instagram took out the obvious tags and then relies on the “Flag for Review” button.
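In other words, the heavy lifting is crowdsourced. As a rough illustration (the threshold, names, and data structures here are guesses, not Instagram’s actual system), flag-for-review moderation can be as simple as counting distinct reporters and acting once enough come in:

```python
# Hypothetical sketch of community flagging with automatic removal,
# loosely modeled on Instagram's "Flag for Review" button. The threshold
# and field names are assumptions, not Instagram's real logic.
from collections import defaultdict

FLAG_THRESHOLD = 5        # assumed number of distinct flags before action
flags = defaultdict(set)  # photo_id -> set of user_ids who flagged it

def flag_for_review(photo_id, user_id, photos):
    """Record a flag; hide the photo and queue an owner warning once
    enough distinct users have reported it."""
    flags[photo_id].add(user_id)  # a set, so repeat flags don't stack
    if len(flags[photo_id]) >= FLAG_THRESHOLD:
        photos[photo_id]["visible"] = False    # the automatic removal
        photos[photo_id]["warn_owner"] = True  # triggers a warning like the one below
```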
Still, the masses are curious, so one user tried to see what he could get away with: he posted a pic of boobs to Instagram as @thebreastsofayounglady and reported the results. Within a day and a half, Instagram came calling with a warning:
It has come to our attention that one or more of the photos you’ve shared on Instagram violates our Community Guidelines, which can be found here.
In short, we ask that you:
- Don’t share photos that aren’t yours.
- Don’t share photos that show nudity or mature content.
- Don’t share photos of illegal content.
- Don’t share photos that attack an individual or group, or violate our Terms of Use.
Any violating images flagged by members of the Instagram community have been automatically removed, and we strongly suggest deleting any additional content on your account that may not fall in line with the above guidelines or our Terms of Use.
It seems that while plenty of users don’t mind a little T&A, others are policing pretty effectively.
Instagram has done an admirable job of cleaning itself up – but the point is that users will continue to post adult content, even if it keeps getting pulled. There are too many loopholes and too many people to keep porn off social networks entirely. Vine will have to learn from its predecessor’s experience if it wants to stay partnered with Twitter and maybe even shed that 17+ rating from the App Store.
As more and more visual-sharing apps take over social networking (and buddy up with big platforms like Facebook and Twitter), self-censorship is going to be incredibly important. And as we adapt and find ways to slip our dirty minds into these image-heavy feeds, they’ll need to keep coming up with ways to warn us or block the content entirely. The Internet – and by extension, social media – will always be for porn, but now it’s also going to come with child locks attached.