Facebook Conducts Mind-Control Experiments On Over 700K Users
Facebook, in a response to the complaint, said Thursday it has always asked its users for permission to use their data to improve the service.
A U.K. regulator that handles data protection has said it is looking into the experiment. A deputy regulator for data protection in Ireland, where Facebook has its international headquarters, has said the country is also looking into the matter.
Sheryl Sandberg, Facebook’s No. 2 executive, said Wednesday during a trip to India that the study was “part of ongoing research companies do to test different products” and was “poorly communicated.” Facebook has said that since the study, it has implemented stricter guidelines on its research.
The company said that after the feedback on the study, "We are taking a very hard look at this process to make more improvements."
Until recently, the Data Science group operated with few boundaries, according to a former member of the team and outside researchers. At a university, researchers likely would have been required to obtain consent from participants in such a study. But Facebook relied on users' agreement to its Terms of Service, which at the time said data could be used to improve Facebook's products. Those terms now say that user data may be used for research.
"There's no review process, per se," said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. "Anyone on that team could run a test," Mr. Ledvina said. "They're always trying to alter people's behavior."
Thousands of Facebook Inc. users received an unsettling message two years ago: They were being locked out of the social network because Facebook believed they were robots or using fake names. To get back in, the users had to prove they were real.
In fact, Facebook knew most of the users were legitimate. The message was a test designed to help improve Facebook's anti-fraud measures. In the end, no users lost access permanently.
The experiment was the work of Facebook's Data Science team, a group of about three dozen researchers with unique access to one of the world's richest data troves: the movements, musings and emotions of Facebook's 1.3 billion users.
The little-known group was thrust into the spotlight this week by reports about a 2012 experiment in which the news feeds of nearly 700,000 Facebook users were manipulated to show more positive or negative posts. The study found that users who saw more positive content were more likely to write positive posts, and vice versa.
Facebook said that since the study on emotions, it has implemented stricter guidelines on Data Science team research. Since at least the beginning of this year, research beyond routine product testing is reviewed by a panel drawn from a group of 50 internal experts in fields such as privacy and data security. Facebook declined to name them.
Company research intended to be published in academic journals receives additional review from in-house experts on academic research. Some of those experts are also on the Data Science team, Facebook said, declining to name the members of that panel.
A Spokesman Said Facebook Is Considering Additional Changes
Since its creation in 2007, Facebook's Data Science group has run hundreds of tests. One published study deconstructed how families communicate, another delved into the causes of loneliness. One test looked at how social behaviors spread through networks. In 2010, the group measured how "political mobilization messages" sent to 61 million people caused people in social networks to vote in the 2010 congressional elections.
Many of Facebook's data scientists hold doctoral degrees from major universities in fields including computer science, artificial intelligence and computational biology. Some worked in academic research before joining Facebook.
Adam Kramer, the lead author of the study about emotions, said in a 2012 interview on Facebook's website that he joined the company partly because it is "the largest field study in the history of the world." Mr. Kramer, who has a doctorate in social psychology from the University of Oregon, said that in academia he would have had to get papers published and then hope that someone noticed. At Facebook, "I just message someone on the right team and my research has an impact within weeks, if not days."
Much of Facebook's research is less controversial than the emotions study, testing features that will prompt users to spend more time on the network and click on more ads. Other Internet companies, including Yahoo Inc., Microsoft Corp., Twitter Inc. and Google Inc., conduct research on their users and their data.
The recent ruckus is "a glimpse into a wide-ranging practice," said Kate Crawford, a visiting professor at the Massachusetts Institute of Technology's Center for Civic Media and a principal researcher at Microsoft Research. Companies "really do see users as a willing experimental test bed" to be used at the companies' discretion.
Facebook's team has drawn particular interest because it occasionally publishes its work in academic journals that touch on users' personal lives, including the study about positive and negative posts.
"Facebook deserves a lot of credit for pushing as much research into the public domain as they do," said Clifford Lampe, an associate professor at the University of Michigan's School of Information who has worked on about 10 studies with Facebook researchers. If Facebook stopped publishing studies, he said, "It would be a real loss for science."
Dr. Lampe said he has been in touch with members of the Data Science team since the controversy erupted. "They've been listening to the arguments and they take them very seriously," he said.
Mr. Ledvina, the former Facebook data scientist, said some researchers debated the merits of a study similar to the one that accused users of being robots, but there was no formal review, and none of the users in the study were notified that it was an experiment.
"I'm sure some people got very angry somewhere," he said. "Internally, you get a little desensitized to it."
Facebook Inc. is being investigated by the U.K.’s data-protection authority after a study showed a psychological experiment influenced what users saw in their news feeds, raising fresh privacy concerns.
A company researcher apologized on June 29 for a test in January 2012 that altered the number of positive and negative posts that almost 700,000 users saw on their online feeds of articles and photos. Disclosure of the experiment prompted some members to express outrage on Twitter about the research as a breach of privacy.
Regulators may want to examine whether Facebook users should have been informed of the experiment and what the company’s purpose was in collecting information, said Paul Van den Bulck, a lawyer at McGuireWoods LLP in Brussels.
The U.K. Information Commissioner’s Office, or ICO, said yesterday it will speak with Facebook and work with the Irish Data Protection Commissioner, the company’s lead regulator in Europe, to learn more about the circumstances.
The Irish regulator, which governs Facebook’s compliance with EU privacy law, “has been in contact with Facebook on privacy issues including consent in relation to the research” and expects a comprehensive report from the company, said John O’Dwyer, a spokesman for Ireland’s authority.
Facebook “communicated poorly” about the experiment, Chief Operating Officer Sheryl Sandberg said today at a New Delhi event to promote her book “Lean In: Women, Work and the Will to Lead.”
The probe of the social network was reported earlier by the Financial Times. The ICO is investigating whether the company broke data-protection laws, though it’s too early to tell what part of the law Facebook may have infringed, the paper reported.
“It’s clear that people were upset by this study and we take responsibility for it,” said Richard Allan, a spokesman for Facebook in the U.K., in an e-mailed statement. “We want to do better in the future and are improving our process based on this feedback. The study was done with appropriate protections for people’s information and we are happy to answer any questions regulators may have.”
According to a study published June 17 in the Proceedings of the National Academy of Sciences, the number of positive and negative posts that users saw on their news feeds was changed in January 2012. People shown fewer positive words were found to write more negative posts, while the reverse happened with those exposed to fewer negative terms, according to the trial of randomly selected Facebook users.
The data showed that online messages influence readers’ “experience of emotions,” which may affect offline behavior, the researchers said.
In a statement on June 29, Facebook said that none of the data in the study was associated with a specific person’s account. Research is intended to make content relevant and engaging, and part of that is understanding how people respond to various content, the Menlo Park, California-based company said.
“We carefully consider what research we do and have a strong internal review process,” Facebook said at the time. “There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”
"It's completely unacceptable for the terms of service to force everybody on Facebook to participate in experiments," said Kate Crawford, visiting professor at MIT's Center for Civic Media and principal researcher at Microsoft Research.
Ms. Crawford said the episode points to a broader problem in the data science industry. Ethics are not "a major part of the education of data scientists and it clearly needs to be," she said.
Asked a Forbes.com blogger: "Is it okay for Facebook to play mind games with us for science? It's a cool finding, but manipulating unknowing users' emotional states to get there puts Facebook's big toe on that creepy line."
Slate.com called the experiment "unethical" and said "Facebook intentionally made thousands upon thousands of people sad."
Mr. Kramer defended the ethics of the project. He apologized for wording in the published study that he said might have made the experiment seem sinister. "And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it," he wrote on Facebook.
Facebook also said the study was conducted anonymously, so researchers could not learn the names of the research subjects.
Mr. Kramer said that the content—both positive and negative—that was removed from some users' news feeds might have reappeared later.
The emotional changes in the research subjects were small. For instance, people who saw fewer positive posts reduced the number of their own positive posts by only a tenth of a percent.
Comments from Facebook users poured in Sunday evening on Mr. Kramer's Facebook page. They ranged widely, from people who had no problem with the study to those who thought Facebook should respond by donating money to help people who struggle with mental health issues.
"I appreciate the statement," one user wrote. "But emotional manipulation is emotional manipulation, no matter how small of a sample it affected."
Monty Henry, Owner
• Video is Recorded Locally To An Installed SD Card (2GB SD Card included)
• Email Notifications (Motion Alerts, Camera Failure, IP Address Change, SD Card Full)
• Live Monitoring, Recording And Event Playback Via Internet
• Back-up SD Storage Up To 32GB (SD Not Included)
• Digital Wireless Transmission (No Camera Interference)
• View LIVE On Your SmartPhone!
* Nanny Cameras w/ Remote View
* Wireless IP Receiver
* Remote Control
* A/C Adaptor
* 2GB SD Card
* USB Receiver
FACT SHEET: HIDDEN NANNY-SPY (VIEW VIA THE INTERNET) CAMERAS
* Transmission Range of 500 ft Line Of Sight
* Uses 53 Channels Resulting In No Interference
* 12V Power Consumption
* RCA Output
* Supports SD Cards Up To 32GB
* 640x480 / 320x240 up to 30fps
* Image Sensor: 1/4" Micron Sensor
* Resolution: 720x480 Pixels
* S/N Ratio: 45 db
* Sensitivity: 11.5V/lux-s @ 550nm
* Video System: NTSC
* White Balance: Auto Tracking
* You Buy Our DVR Boards And We'll Build Your Products! (Optional)
Our New Layaway Plan Adds Convenience For Online Shoppers
Phone: 1-888-344-3742 Toll Free USA
Local: (818) 344-3742
Fax: (775) 249-9320