Tag Archives: big data experience

What to learn from Facebook’s secret psycho test

A study published a while ago has become a topic of hot discussion now – see e.g. here: http://qz.com/227869: Facebook filters the news feed to surface the content its algorithms determine to be most relevant for the individual. For a scientific study, this filter was tweaked to figure out what impact more positive or more negative news has on the individual. Not surprisingly, emotional content led to higher engagement and had a contagious emotional effect.

This kind of use is considered to be covered by the terms of service that each user agrees to when signing up for Facebook. Nonetheless, Facebook has been hit by criticism, and many comments on the web are full of sarcasm (e.g. “how could you be surprised?”).

One key concept here from an ethical perspective is what is called “informed consent”: when you agree to the terms of service, are you aware of what these terms could mean in all their consequences? Any company that aims at a high profile of social and corporate responsibility should take this into account from an information governance perspective. This is a key obligation for the new role of the Chief Data Officer (CDO).

Secondly, my claim would be that there is something we might call “implicit consent” that a company can rely on to rule out what is a no-go with respect to data privacy. If we are looking at the website of a shop, say Amazon, we would not be surprised to learn that the product recommendations are not there for our benefit only (e.g. a better service), but also for the benefit of the shop (more sales). In a way we recognize and expect that there is an attempt to influence our behavior, and most of us think we can deal with it. At least we have not seen much of a heated debate about that. The difference with the Facebook story is that we read about users being made guinea pigs in an attempt to influence their emotional state. This touches a deeper level and sphere of our self than the behavioral level, a region of our self that we are far more keen to protect. There is no way to assume an implicit consent for that. At least not in the European culture that I live in.

Comments on Big Data Market Forecast 2012-2017 and Big Data Adoption Barriers

Wikibon just came out with a forecast (Big Data Vendor Revenue and Market Forecast) which underlines my last post: The hype is over, big data is getting real.

Quote:

“In the enterprise space in particular, the combination of a better understanding of the use cases for Big Data and more mature product and service offerings resulted in a significant percentage of Big Data early adopters graduating from small, proof-of-concept projects to large-scale, production-level deployments.”

It also talks about the adoption barriers. These revolve around three major themes:

  1. Lack of Data Scientists
  2. Moving to higher levels of maturity as an analytic enterprise
  3. Lack of application development tools and services

It’s not a surprise that all these difficulties still persist, as we’re still in an early phase of adoption from an innovation perspective. Over time, all the adoption barriers mentioned there will be overcome. However, I do not believe we will get there by focusing on these barriers per se. Let’s re-frame it this way: In the early days of the automobile, every driver needed to be his own mechanic. In the early days of the PC, the early adopters were extremely knowledgeable about everything – they even built their systems by assembling the components themselves (as I did, too ;-)). This kind of capability is analogous to what is expected from a data scientist: a Jack of all trades with a scientific foundation in math, statistics, and computer science, programming skills across a diverse set of tools and languages, and specific insights into the topic at hand. Over time we will not need that many data scientists of that profile, as technology will mature and the market will consolidate.

Till then, two options for the enterprise: sit and wait until others have made big data adoption more accessible and palatable – OR relentlessly focus on a business scenario where going beyond the data analysed so far will expand your analytic capabilities. Pick the solution or technology to make it work now, but do not expect to define your big data standards now and forever. It may well be that you will have to enlarge or change the technology foundation 2-3 years from now. By then you will have earned some early benefits and developed a staff with far more experience to build on for the next phase of your big data journey.

Concluding remark: If you go through the adoption barriers mentioned above, it is obvious that the focus is on big data per se. That focus is wrong. The focus has to be on business opportunities that can be exploited by advancing our analytic capabilities; technology considerations are an afterthought. This is what helped the early adopters move from a big data pilot to a large-scale implementation.