A study published a while ago has become a topic of hot discussion – see e.g. here: http://qz.com/227869: Facebook filters the news feed to surface the content that its algorithms determine to be most relevant for the individual. This filter was tweaked for a scientific study that tried to figure out what impact more positive or more negative news has on the individual. Not surprisingly, emotional content led to higher engagement and had a contagious emotional effect.
This kind of use is considered to be covered by the terms of service that each user agrees to when signing up for Facebook. Nonetheless, Facebook has been hit by criticism, and many comments on the web are full of sarcasm (e.g. “how could you be surprised?”).
One key concept here from an ethical perspective is what is called “informed consent”: when you agree to the terms of service, are you aware of what these terms could mean in all their consequences? Any company that aims at a high profile of social and corporate responsibility should take this into account from an information governance perspective. This is a key obligation for the new role of the Chief Data Officer (CDO).
Secondly, my claim would be that there is something we might call “implicit consent” that a company can draw on to rule out what is a no-go with respect to data privacy. If we are looking at the website of a shop, say Amazon, we would not be surprised to learn that the product recommendations are not there for our benefit only (e.g. a better service), but also for the benefit of the shop (more sales). In a way we recognize and expect that there is an attempt to influence our behavior, and most of us think we can deal with it. At least we haven’t seen much of a heated debate about that. The difference with the Facebook story is that we read about users being made guinea pigs in an attempt to influence their emotional state. This touches a deeper level and sphere of the self than the behavioral one – a region of our self that we are far more keen to protect. There is no way to assume an implicit consent for that. At least not in the European culture that I live in.