Why Periscope making users into moderators is doomed to fail


Recently, the mobile livestreaming app Periscope announced that it will enlist users as moderators to deal with commenters leaving abusive comments. As the Forbes article I linked to points out, this is not a good idea. I would go as far as calling it the inmates running the asylum.

I’m not saying that all Periscope users are obnoxious trolls, but we’re talking about the venue where watchers gave likes to an alleged livestreamed rape. In addition, if history is any indicator, leaving moderation to the users always ends with the moderators being the most abusive. Craigslist’s ‘community policing’ comes to mind. The classifieds site claims that it can’t afford to hire moderators and lets users flag ads that violate its terms of service, except that’s not what the users do. Instead, they use the flagging ability as their personal harassment tool. A number of businesses flag the ads of competitors, and racists flag any ad that’s not in English, among other abuses of the privilege. Let’s also not forget all the ads for illegal items and services that never get flagged.

While giving users who stream more robust tools to moderate their own streams is commendable, letting stream comments be subject to mob rule will not only make the experience less enjoyable but could potentially be a business killer for Periscope.

The people who keep child porn off of your favorite sites


How Child Porn And The Other Awfulest Things Ever Get Scrubbed From The Internet:

Whenever there is any kind of child porn bust, the people I feel sorriest for are, of course, the children who appear in those images. However, there are other people affected by cases like these whom we should also remember. Think about the law enforcement officers who have to go through all of it when it’s seized as evidence, cataloging every single piece and frame. They must die a little, or a lot, each time they have to do that. At least, I would imagine, they get some satisfaction from busting a child porn ring or putting some of these scumbags away for a long time. Then there are the people who have to deal with almost the same abject horror but get very little in the way of rewards. I’m talking about the people who review media uploaded to sites such as Facebook, MyYearbook and YouTube.

The article I linked to from BuzzFeed goes into great detail about how employees at these sites are underpaid, and while some do receive counselling from their employers, some say it’s not enough for what they have to endure. For the people who work for the NCMEC, it’s supposedly even worse.

I implore you to read the entire article to see what they have to go through on a constant basis. As dead as I may be inside, I think that would eat away at the very last parts of me that still feel. They are stronger men and women than I.

One thing I noticed about this article is that neither craigslist nor backpage uses any of these kinds of services. Just sayin’.

Facebook monitors its content, why can’t classifieds sites?


Facebook provides rare peek at how site is policed:

Recently, Facebook allowed the media a glimpse at its content monitoring structure. Facebook says that it employs hundreds of people to keep the site free of harassment, porn, spam and threats, among other things.

In a posting accompanying the chart on Tuesday, Facebook explained that its User Operations group comprises four teams to handle the different types of incident reports: a safety team, a hate and harassment team, an access team and an abusive content team.

Now, Facebook is one of the most popular sites on the internet, boasting a user base in the hundreds of millions. So why can’t a somewhat smaller site, for example backpage or craigslist, employ dozens of people to monitor its listings for inappropriate content? And I don’t just mean content that doesn’t pass their lawyers’ sniff tests for illegalities. I mean questionable material that anyone with an ounce of common sense knows is bad news.

On craigslist, for instance, if an ad advertising for ‘bois’ or ‘K9’ appeared in the casual encounters section, a moderator could make sure it never saw the light of day.

On the other hand, if a moderator for backpage saw an ad for an ‘escort’ offering ‘young stuff’, that ad could be rejected with a vengeance.

Don’t let either of these sites fool you with their poor-me blues. Both sites can easily afford it; they just choose not to. Why? Because that would eat into their multi-million dollar profits. You know, for two organizations that try to represent the little guy, they sure do act like the soulless corporations that they often rebel against.

So what if children are being molested or women are being sold into sexual slavery. There are profits to be had.

Facebook’s porn police

Walking the Cyberbeat:

This is a great article from Newsweek about how Facebook employs a staff to keep pornographic content off of its users’ profiles.

If Facebook can employ a staff to keep porn off the profiles of its millions of users, couldn’t craigslist hire a staff to review the ads in its erotic services section? From what I understand, craigslist makes a ton more money than Facebook does.

I’m thinking Jim doesn’t want anything cutting into his profits.