So as more and more of our lives are captured digitally, Big Data Analytics is going to become increasingly ubiquitous. As a society we have the opportunity to choose whether we want the insights we generate to be used to exploit and manipulate us, or to enrich our lives.
How we approach Privacy will have the single greatest impact on that outcome.
Let’s imagine a future where every piece of information you see is filtered by an algorithm. You only see what it wants you to see. Your favorite social networking site, which you log into daily to check your news feed, needs to grow advertising revenue from diet products.
To do this it “primes the pump” by feeding you articles tailored to reinforce your body image issues. Then, when it observes a decrease in your self-confidence, it pushes targeted advertising out to you, to grab you while you’re at a low ebb.
The social networking site gets its advertising revenue, its client sells lots of diet products, and you end up feeling bad about yourself. It could have been a lot worse, I promise you.
But in reality it doesn’t have to be like this, and I don’t think we want to live in a world like this.
Social, mobile, and cloud technologies are connecting us as never before, and if we want to benefit from being part of this new global human network, then we need to accept that much of our data is out there and is out of our immediate control.
The reaction of many governments is to suggest that we simply turn off the data tap, either by preventing data from being gathered or by wrapping it in complicated regulatory frameworks.
The problem with this approach is that in this interconnected world, data leaks between people: the data someone else shares can generate insight about you, and perhaps insight that you really don’t want anyone else to have or to use.
Data ownership is meaningless; it’s who the insight refers to that counts.
Defining data as public or private is also a challenge, because analytics can take public data, like social media and web data, and derive a private insight: for example, modeling your daily habits in order to predict your physical location.
On the other hand, analytics can take private data, such as your GPS location, and generate a public insight that never exposes uniquely identifiable information.
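The talk doesn’t describe how such a public insight would be computed, but as a minimal sketch of the idea, here is one common pattern: coarsen private GPS points into grid cells and publish only the counts, suppressing any cell too small to hide an individual. The function name, cell size, and threshold `k` are my own illustrative assumptions, not anything from the talk.

```python
from collections import Counter

def public_heatmap(gps_points, cell=0.01, k=5):
    """Aggregate private (lat, lon) points into coarse grid-cell counts.

    Cells with fewer than k people are suppressed, so the published
    result never exposes a uniquely identifiable individual.
    """
    counts = Counter(
        (round(lat / cell) * cell, round(lon / cell) * cell)
        for lat, lon in gps_points
    )
    # Publish only cells whose population meets the suppression threshold.
    return {cell_id: n for cell_id, n in counts.items() if n >= k}
```

With this rule, a busy city block shows up as a count in a heatmap, while a lone commuter’s route simply never appears in the output.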
So what do we do in this new interconnected world, where we face this privacy spaghetti?
There is no easy solution to privacy and different scenarios require different approaches.
However, knowing what is possible with data analytics, the types of insight we can infer from the seemingly innocuous, and how we can “fill in” missing data, I believe a key challenge that we need to address is one of transparency.
In much the same way as consumers have decided that it’s socially unacceptable to buy from companies that pollute the environment, we could choose, as individuals and as a society, whether we really want to engage with organizations that are not open and transparent about how they collect, analyze, and use information about us.
And a culture of transparency could not only give us access to these insights, but could also then enable us to exercise appropriate control.
So, how would transparency work in the real world? I’d like to share with you the project I’ve been working on within IBM for the last couple of years.
IBM has been using social and collaboration technologies since long before they were popular, so we are sitting on probably one of the largest and longest-standing enterprise social networks on the planet.
Our challenge is that on the one hand we’ve got an engaged, active network of employees, all looking to maximize the benefit they get from their social and collaboration investment, and on the other a management team keen to understand what the network is saying about their employees and the business as a whole.
When I was asked to build a system that would analyze our enterprise social network, I decided to take a Privacy by Design approach, and before we wrote a line of code we defined the philosophy that would guide our subsequent design decisions. It had three principles:
Personal Empowerment: knowledge is power; we will put “actionable insight” into the hands of all employees.
So employees get access to these new insights, and they get to choose whether they want to share them with anyone else.
Management gets access to aggregated analysis that allows drill-down into subsets of the network, but not to a uniquely identifiable individual.
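The talk doesn’t say how the drill-down floor was implemented; as a rough illustration of the principle, a query layer can simply refuse to answer once a filtered subset gets small enough to point at an individual. The record fields, the `engagement` metric, and the `MIN_GROUP_SIZE` value here are hypothetical placeholders of my own.

```python
import statistics

MIN_GROUP_SIZE = 10  # assumed floor; below this, results are withheld

def drill_down(records, **filters):
    """Return aggregate stats for a subset of the network, but refuse
    to answer when the subset is so small that drilling down would
    effectively identify an individual."""
    subset = [r for r in records
              if all(r.get(key) == value for key, value in filters.items())]
    if len(subset) < MIN_GROUP_SIZE:
        return None  # too few people: withhold rather than expose anyone
    scores = [r["engagement"] for r in subset]
    return {"count": len(subset), "mean": statistics.mean(scores)}
```

A manager can keep narrowing filters (by geography, by division, and so on) and always gets honest aggregates, right up until the answer would stop being an aggregate.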
While this approach may seem restrictive, and some people thought I was insane when I initially suggested it (“we’ll do all this analysis, but we’re not going to show it to you”), the upside has been really significant for us, on a number of different fronts.
So the first thing is, by defining very simple principles that don’t require a law degree to decipher, we’ve built a trusting relationship with our employees.
By being open and transparent with them, we’ve been able to generate dialogue, and it has completely changed the conversation around how to use and generate value from social and collaboration data, which for me has been one of the most rewarding parts of the project.
I’ve seen employees who would naturally be suspicious of such an analytics system not only proactively request to join, but offer to share more data and really engage in the conversation.
The other thing is that by putting employees in control of the analytics, we’ve demonstrated respect, and that whole combination of engagement, respect, and trust has meant we can create a new relationship with our employees: corporate programs now have access to new insights, but in a way that is respectful of and sensitive to employees.
Just to give an example of this: a few months ago, an advocacy program at IBM reached out to me, wanting access to the analytics we had generated.
When I explained that the analytics was private to each employee and I couldn’t share it, they were initially disappointed.
However, when we looked in more detail at what the program needed, we recognized that we could give them so much more than analytics.
We could give them an opportunity to really engage and build relationships with the IBMers they wanted to recruit.
Our system could accurately map the needs of our users to the needs of the program, but by reaching back to the employees first, before we shared anything, we not only demonstrated that this program respected the employees and their privacy, but also ensured that the recommendations we gave only included the IBMers who really wanted to be part of the program.
In much the same way as we have given employees access to, and control of, the analytics we generate about them, why should consumers have any less transparency from the services they use?
I know personally, I would like to know what assumptions my favorite retailer might be making about me and what they are doing with those insights.
Data analytics is going to be key to our future.
And if we want society to really benefit from it, we need to take this journey together.
If we can embrace privacy instead of fighting it,
And if, instead of looking for the easy solution, we look for the best solution for all participants,
Then maybe we can avoid this tug of war between the citizen on the one hand, who doesn’t want to be digitally stalked and manipulated, and the organization on the other, whose very survival may depend on its ability to harvest and generate value from this data.