Facebook shares more users' personal information with moderators than it has so far acknowledged, it has been revealed. The revelation comes to light after the social network was slammed last week for employing third-party content moderators in the developing world for one dollar an hour.
New evidence has now shown that these moderators, who have to deal with distressing images and messages, can clearly see the names of the people who upload the 'offensive' content, the subject of the image or the person tagged in a photo, and the person who reported the content. Moreover, there are currently no security measures to prevent these moderators from taking screenshots of people's personal photos, videos and posts.
"On Facebook, the picture alone is not the content. In evaluating potential violations of our rules it is necessary to consider who was tagged and by whom, as well as additional content such as comments. Everything displayed is to give content reviewers the necessary information to make the right, accurate decision," The Telegraph quoted a Facebook spokesperson as saying in defence of the charges levelled against the company.
The statement comes after a former Facebook content moderator, Amine Derkaoui, shared several screenshots of what these outsourced workers see when deciding whether a piece of content is suitable for the site.
Derkaoui, who was employed through oDesk, the outsourcing firm Facebook uses to hire content moderators, claimed that there was no decent security in the content system at all: so much information was displayed that viewing each report was like "looking at a friend's Facebook page". Philip James, a privacy specialist at law firm Pitmans, called on Facebook to improve the security around the content system.