Facebook Always Trying to Alter People's Behavior, Says Former Data Scientist
At Facebook, it seems we are their lab rats; social engineering is the name of their game. And whatever private info they are selling or giving to our government, it apparently doesn't make them lose any sleep at night as they are laughing all the way to the bank.
Of course, what else should we expect from a company that got millions of dollars in startup money indirectly from the CIA? -W.E.
See also: Facebook Saves Everything You Type - Even if You Don't Publish It
See also: Facebook May Add User Photos into Facial Recognition Database
See also: America’s New Normal: Mass Surveillance, Secret Courts and Death to Whistleblowers
See also: Facebook, Twitter And Other Social Media Sites Are An Elaborate CIA Spying Scheme
See also: This Is How Facebook Is Tracking Your Internet Activity
See also: FBI Wants Backdoors to Facebook, Twitter Immediately
See also: Big Brother Monitoring your Social Activity?
We're hardly ever fast to wake up to what might be going on with our data. But once in a while, we're suddenly roused and make a noise.
Such has been the case with the revelations surrounding Facebook's manipulation of the News Feeds of almost 700,000 people in order to see if it would make them happier or sadder, depending on the content presented.
Now, a former member of Facebook's Data Science team has revealed that, for much of its existence since 2007, the team operated with seemingly little supervision.
Andrew Ledvina, who was on Facebook's team from February 2012 to July 2013, told the Wall Street Journal: "There's no review process, per se. Anyone on that team could run a test. They're always trying to alter people's behavior."
This, if true, might make for a profound surprise to those who somehow believed their data was, indeed, their data.
Ledvina suggested that tests were conducted with such regularity that some scientists worried that the same people's data was being analyzed more than once.
Since the controversial study on human emotions, Facebook has reportedly tightened its procedures. However, since 2007, the Data Science team has reportedly run hundreds of experiments without users' consent or even knowledge.
In 2012, the company created a 50-person panel of experts in areas such as data security and privacy. (The company won't release the names of these experts.) From the beginning of this year, members of this panel have reviewed all research beyond standard product testing.
A Facebook spokesperson said: "We are taking a very hard look at this process to make more improvements."
Clearly some see great benefits in attempting to understand human behavior better through such constant and everyday activity as Facebook posting.
However, after COO Sheryl Sandberg's expressions of regret and reassurance during a TV interview in India, many questions remain.
During the interview, she said: "Facebook cannot control emotions of users. Facebook will not control emotions of users."
However, my understanding of the results of the experiment, conducted by Facebook and researchers at Cornell and UC San Francisco, is that they showed Facebook can manipulate people's moods.
Indeed, the research report said that though the mood changes seemed small, the effects "nonetheless matter."
This was because "given the massive scale of social networks such as Facebook, even small effects can have large aggregated consequences."
The pace at which social behavior has changed and moved online has inevitably caused enormous amounts of data to be amassed, often in the hands of very few. Facebook isn't alone in seeking to find truths in that data.
But the potential economic (putting people in a bad mood and then showing them ads for a pick-me-up) and political (skewing news or even moods for one political side or another) dangers are, even if only theoretical, still evident.
The impression given by Ledvina's comments is of a morass of data so inviting to scientists that they paid little regard to the feelings of the people who generated that data.
Perhaps now, though, there might be a greater debate about whether protections need to be far greater than they seem to have been.