As Mark Twain observed: “A lie can travel halfway around the world while the truth is putting on its shoes.” And that was a long time before the web. Which brings us to a meme that was propagating through social media last week. Its essence was an assertion that Facebook monitored – and stored – not only the stuff that its subscribers post on their Facebook pages, but even stuff that they started to type and then deleted! Shock, horror!
The source of the meme was an article by Jennifer Golbeck in the venerable online magazine Slate. “The code in your browser that powers Facebook,” it states, “still knows what you typed – even if you decide not to publish it. It turns out that the things you explicitly choose not to share aren’t entirely private. Facebook calls these unposted thoughts ‘self-censorship’.”
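For what it’s worth, the technical claim in that sentence is unremarkable. Any script a page loads can see what is sitting in a text box while you type it, long before you press “Post” – or decide not to. Here is a minimal sketch of the idea in TypeScript; it is purely illustrative and emphatically not Facebook’s code, and the choice of a bare textarea is mine:

```typescript
// Illustrative only: any script running on a page can observe a text box.
const composer = document.querySelector("textarea"); // any text box on the page

if (composer) {
  composer.addEventListener("input", () => {
    // At this point the page's own JavaScript already "knows what you typed",
    // whether or not you ever press "Post". Here we only log the length,
    // but nothing stops a script from reading composer.value in full.
    console.log(`Current draft length: ${composer.value.length} characters`);
  });
}
```

Whether anything observed this way is ever transmitted back to a server is a separate decision – which is where Facebook’s data use policy, discussed below, comes in.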
Creepy, eh? The peg for Ms Golbeck’s article was an academic paper published by two Facebook researchers, Sauvik Das and Adam Kramer. They were interested in studying “last-minute self-censorship” on Facebook, ie content that is filtered by its author after it has been written. Their motivations were not, of course, purely academic.
As they put it: “Last-minute self-censorship is of particular interest to SNSs as this filtering can be both helpful and hurtful. Users and their audience could fail to achieve potential social value from not sharing certain content, and the SNS loses value from the lack of content generation.” Which, being translated, reads: “Anything that Facebookers fail to post means lost revenue for us.”
Well, at least they were being open about it. Das and Kramer studied about 5 million Facebook users over a period of 17 days. Their findings are mildly interesting but not surprising. Some 71% of the sample self-censored at least one post or comment over the experimental period, for example, so “self-censorship is common”. Big deal: anyone who’s ever typed anything into a text box could have told them that. Posts are self-censored more often than comments (again, big deal: posts require original composition; comments are generally just reactive). Males censor more posts than females, especially if their audience is predominantly male. And so on.
If I’d been refereeing the paper I’d have put it in the “unexciting but worthwhile” category. The researchers’ methodology seems OK – the data were anonymized and the content of the self-censored posts was not monitored (just the fact that they were deleted) – and the paper is clearly written, with its findings presented with due reservations.
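To make that methodological point concrete: logging of this sort can, in principle, report only the fact that a draft was abandoned, never its contents. A hypothetical sketch, again in TypeScript – the element ids, the endpoint, the five-character threshold and the ten-minute window are all illustrative assumptions of mine, not details taken from the paper:

```typescript
// Hypothetical sketch: only the *fact* of an abandoned draft leaves the
// browser, never the text itself. Ids, endpoint and numbers are assumptions.
const box = document.querySelector<HTMLTextAreaElement>("#composer");
const form = document.querySelector<HTMLFormElement>("#post-form");

const MIN_CHARS = 5;            // treat anything shorter as not a real draft
const WINDOW_MS = 10 * 60_000;  // how long to wait before calling it "censored"

let posted = false;
let timerStarted = false;

form?.addEventListener("submit", () => {
  posted = true;                // the draft was published after all
});

box?.addEventListener("input", () => {
  if (timerStarted || (box?.value.length ?? 0) < MIN_CHARS) return;
  timerStarted = true;
  setTimeout(() => {
    if (!posted) {
      // Report a single anonymous flag; the draft text is never sent.
      navigator.sendBeacon("/metrics/self-censorship",
                           JSON.stringify({ censored: true }));
    }
  }, WINDOW_MS);
});
```

Something along these lines would be consistent with the researchers’ claim to have recorded only whether a post was abandoned, not what it said – though, as we shall see, nothing in the technology itself enforces that restraint.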
So why the fuss? It all seems to hinge on Ms Golbeck’s assertion that: “It turns out that the things you explicitly choose not to share aren’t entirely private.” This has understandably alarmed people and caused steam to issue from some orifices – for example those of an excitable Daily Kos columnist. “I’m astonished at how wrong author Jennifer Golbeck gets this,” he fumes. “Facebook does not ‘monitor our unshared thoughts’ … It does not ‘collect the text you type’ or ‘automatically analyze’ your ‘unposted thoughts’. There is no rational way to ‘connect this to all the recent news about NSA surveillance’. There is not a single significant word or phrase in this story that is supported by the information provided. It is completely, categorically, profoundly, utterly wrong.”
Er, not quite. While it’s true that the researchers did not read the censored posts, there’s no reason to suppose that Facebook’s software doesn’t, as a matter of course, do so. And this is something that Ms Golbeck tried to confirm. She points out that in Facebook’s data use policy it’s made clear that the company collects information you choose to share or when you “view or otherwise interact with things”. Typing and deleting text in a box, she argues, could be considered a type of interaction, though very few of us would expect that data to be saved. But when she contacted the company for clarification, a representative told her that “the company believes this self-censorship is a type of interaction covered by the policy”.
Indeed, given that the essence of Facebook’s business is “strip-mining human society”, to use Professor Eben Moglen’s colorful phrase, it would be surprising if the company viewed any online activity engaged in by its wretched subscribers as lying beyond its corporate reach. In that sense, Mr Zuckerberg’s empire is the spiritual heir of the Chicago pork-packers of the 1920s, of whom it was memorably said that they “used every part of the hog except the grunt”. The only difference is that Facebook can now also utilize the grunt.
Source: The Guardian