Facebook Admits To ProPublica That They Made “Mistakes” On 50% Of Reported Posts, Contact These Reporters To Let Them Know What Facebook’s Done To Turtleboy
Check out this article from ProPublica:
We asked Facebook about its handling of 49 posts that might be deemed offensive. The company acknowledged that its content reviewers had made the wrong call on 22 of them.
Basically what ProPublica did was try to paint the picture that Facebook allows hate speech to stay up, which is really only half the story. For instance:
Facebook’s community standards prohibit violent threats against people based on their religious practices. So when ProPublica reader Holly West saw this graphic Facebook post declaring that “the only good Muslim is a fucking dead one,” she flagged it as hate speech using the social network’s reporting system.
Facebook declared the photo to be acceptable. The company sent West an automated message stating: “We looked over the photo, and though it doesn’t go against one of our specific Community Standards, we understand that it may still be offensive to you and others.”
Both posts were violations of Facebook’s policies against hate speech. But only one of them was caught by Facebook’s army of 7,500 censors — known as content reviewers — who decide whether to allow or remove posts flagged by its 2 billion users. After being contacted by ProPublica, Facebook also took down the one West complained about.
Facebook does indeed let a lot of hate speech go. Didi Delgado posted the other day that we should murder white people and give their money out to black people. Many of you reported that post, but received a message back from Facebook telling you that it didn’t violate their standards. It’s not that a person at Facebook is allowing this to happen. It’s that real people don’t actually review these things.
This was the most important part of that:
After being contacted by ProPublica, Facebook also took down the one West complained about.
Facebook only fixes the issue when they get negative press about it. This happens to millions of users, and unless you write for a major media outlet like ProPublica, they won’t find out about it, and you get fucked.
What ProPublica did was ask readers to submit things they reported to Facebook to figure out what Facebook’s censorship policies were all about. But what they failed to realize was the most important part about the whole story – they have no policies, because no one works at Facebook. If they had policies then things like this wouldn’t be removed, and our pages wouldn’t be shut down:
We asked Facebook to explain its decisions on a sample of 49 items, sent in by people who maintained that content reviewers had erred, mostly by leaving hate speech up, or in a few instances by deleting legitimate expression. In 22 cases, Facebook said its reviewers had made a mistake. In 19, it defended the rulings. In six cases, Facebook said the content did violate its rules but its reviewers had not actually judged it one way or the other because users had not flagged it correctly, or the author had deleted it. In the other two cases, it said it didn’t have enough information to respond.
Yup, according to Facebook, nearly HALF of the 49 reported posts that were given to them (22 of 49) were taken down or kept up because of a "mistake." This is what they do. This is how they lie. When almost half of your "human beings" make a "mistake," it proves that human beings are not reviewing the posts. Computers are.
But the angle ProPublica chose to go after was the “Facebook is allowing hate speech” angle. This is why I wanted to avoid turning this into a left-right thing. Because it distracts from the most important issue that everyone is ignoring – no one works at Facebook. No one reviews your reported posts. It’s all luck. Unless you can get a major media outlet to expose them:
In several instances, Facebook ignored repeated requests by users to delete hateful content that violated its guidelines. At least a dozen people, as well as the Anti-Defamation League in 2012, lodged protests with Facebook to no avail about a page called Jewish Ritual Murder. However, after ProPublica asked Facebook about the page, it was taken down.
All they have to do is press a button. They just won't do it unless they get egg on their face.
Here’s the biggest lie:
Without an appeals process, some Facebook users have banded together to flag the same offensive posts repeatedly, in the hope that a higher volume of reports will finally reach a sympathetic moderator. Annie Ramsey, a feminist activist, founded a group called “Double Standards” to mobilize members against disturbing speech about women. Members post egregious examples to the private group, such as this image of a woman in a shopping cart, as if she were merchandise.
Facebook said it takes steps to prevent mass reporting, a tactic used not only by “Double Standards” but by other advocacy groups, from influencing decisions. It uses automation to recognize duplicate reports, and caps the number of times it reviews a single post, according to a Facebook official.
BULL-SHIT. Just a bald-faced lie. If mass reporting didn't work, then these Nazis wouldn't be holding up the messages they got from Facebook, telling them that mass reporting was effective, like they were trophy animals:
It’s slowly starting to come out. Media outlets like ProPublica respond when they get a lot of messages from people. So if you have a minute, respectfully contact the two women who wrote this article, and let them know about what is happening to our page. Whatever you do, don’t turn it into a “Facebook is run by libtards trying to censor muh First Amendment” bullshit. That’s a great way for them to ignore you. Tell them the truth – Turtleboy is a media outlet that’s being censored because online mobs are mass reporting pictures of Christmas Trees. This is how we fight back. We use other media outlets to expose them until we get our page back. Tired of reading our blogs about this? Fix it by contacting these people. Then we can get our page back and get back to business as usual.