I've recently experienced an interesting contrast in my work. I spent some time thinking about the potential of "big data" in creating responsive government services that can have a positive impact on life in a modern metropolis like Chicago or NYC. This was followed immediately by a project that explored the implications of data collection and storage on our individual sense of privacy in the context of socio-technical systems (1). It was a jarring contrast, and it made me think about the role designers play in shaping the sense of privacy users associate with a system.
The rate of creation of these systems has accelerated dramatically in the past 20 years. Looking at the landscape of product and service systems being created for both end consumers and workplaces, one would have a hard time finding a socio-technical system that does not operate under the premise that the more information it can collect and retain about the user, the better (Nissenbaum, 36). Good examples are Quantified Self systems, or workplace productivity management systems with social interaction elements built in. They are essentially behavior modification tools that allow users to understand something about their behaviors, with the goal of shifting that behavior toward a more “desired” state.
I'd like to argue that it is not always our job as designers to correct users' behavior and, furthermore, that the assumption that access to and control over information about the self is worth trading away for this “new desired state” requires a deeper contextual inquiry into the user's information norms (Nissenbaum, 129).
I think this idea has implications for our work as designers. I suspect we have all, at some point in our work, fallen victim to this assumed “hope that was placed in information predicting and shaping human behavior” (Nissenbaum, 44). Regardless of your position on the ability of systems to deliver on this promise, many of you might agree that this "hope" has fueled an unquenchable thirst for more and more information. We are living in an infinitely more connected world, where information sharing is no longer an option but a ubiquitous state of being. For example, while writing this article I am creating a personal data exhaust that is being used by many systems foreign to me in both existence and purpose.
We, the designers and users of these systems, seem to be suffering from what behavioral economists call “coherent arbitrariness”: when a product or service is introduced in a particular way, it becomes a permanent point of reference, one against which we compare all future relevant solutions. It is illustrative of the momentum every idea gains once it is introduced. So, in a roundabout way, I think this unquenchable thirst, and the momentum it has created, has led us to continuously re-anchor to a lesser and lesser expectation of privacy. This manifests itself in systems designed to penetrate ever deeper into our lives in search of more personal information, to deliver on the promise of predicting and shaping human behavior.
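To make the re-anchoring dynamic concrete, here is a toy numeric sketch (my own illustration, not a claim from the privacy literature): assume each new generation of products collects some fixed percentage more personal data than the prevailing reference point, and that new level then becomes the baseline users compare against. The 10% figure and the function name are purely hypothetical.

```python
# Toy model of "re-anchoring": each new system collects `creep` (here 10%)
# more personal data than the current anchor, and that level then becomes
# the new reference point users compare future systems against.

def reanchor(baseline: float, steps: int, creep: float = 0.10) -> list[float]:
    """Return the sequence of reference points over `steps` product generations."""
    anchors = [baseline]
    for _ in range(steps):
        anchors.append(anchors[-1] * (1 + creep))
    return anchors

anchors = reanchor(baseline=1.0, steps=10)
# After ten generations the "normal" level of collection is ~2.6x the original,
# yet at each step the increase over the current anchor felt small.
print(f"{anchors[-1]:.2f}")
```

The point of the sketch is only that small, locally reasonable steps compound: no single product feels like a dramatic loss of privacy relative to its predecessor, but the cumulative drift is large.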
Part of the reason this is happening is simply that it is difficult to visualize the trade-offs you are making by “allowing” these systems to do their thing. To use behavioral economics terminology, the true cost of the convenience of Amazon's predictive algorithm or Google's robust analytics is hidden from us; it happens below our line of visibility. The nature of payment in these systems has changed as well, since the currency is now personal information.
This has created a cognitive degree of separation, much like the way a credit card payment is separated from our idea of money. For example, I don't usually think twice about logging in to another service with my Facebook or Gmail account, because I don't have access to information that would allow me to process what this decision means to me in the long term. I would go even further and say that we don't even consider this a transaction anymore: how does one directly compare the value of one's personal information to Google against the benefit of having access to a world of information and a powerful tool to maneuver it? The task becomes too complex, so we resort to the default, which is not to think about it.
The real trade-off you are making, however, is convenience and access to things in the present (present gains) versus a diminished ability to manage your individual identity in the future (future losses). The predictive nature of these services implies that they will know what you want (based on your behavior and that of people like you) before you do, making them detrimental to your ability to control and express your own identity. This is just a simple case of shopping for goods online, but if you apply the same logic to a situation where Quantified Self principles are at play, it doesn't take a huge leap of imagination to see how this might be problematic if, say, Nike's FuelBand struck a data-sharing partnership with a large insurance company. The point is that the design of these systems too often focuses exclusively on usability and on removing barriers to adoption.
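The present-gains/future-losses asymmetry can be sketched with the quasi-hyperbolic (beta-delta) discounting model from behavioral economics. All the numbers below are hypothetical, chosen only to show the mechanism: a modest immediate convenience can subjectively outweigh a much larger privacy loss that arrives years later, because the future loss is heavily discounted.

```python
# Quasi-hyperbolic (beta-delta) discounting: a payoff `years` from now is
# perceived today as beta * delta**years of its face value. beta < 1 models
# present bias; delta is the per-year discount factor. Parameters are
# illustrative, not empirical estimates.

def present_value(amount: float, years: int,
                  beta: float = 0.5, delta: float = 0.8) -> float:
    """Perceived value today of a payoff arriving `years` from now."""
    if years == 0:
        return amount          # immediate payoffs are felt in full
    return beta * (delta ** years) * amount

convenience_now = 10.0         # small, immediate benefit of one-click login
privacy_loss_later = 100.0     # much larger loss of identity control, 10 years out

print(present_value(convenience_now, 0))       # felt in full: 10.0
print(present_value(privacy_loss_later, 10))   # discounted to ~5.4
```

Under these (hypothetical) parameters the 100-unit future loss shrinks to about 5.4 perceived units, so the 10-unit immediate convenience wins, even though the undiscounted trade is clearly bad. That is the shape of the decision a "log in with Facebook" button asks users to make.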
Of course, as designers we cannot predict the future arrangement of companies and how they will share information about users. One thing we can do, however, is recognize that we don't know whether the people we are designing for would be okay with this trade-off if they understood how to compare its two sides. I think a new approach to the research and design of socio-technical systems could lead to more participatory meaning-making in big data management: one that considers and engages users earlier and more frequently in the research and design process, leading to more transparency and to value that works within our information norms instead of breaching them.
I don't presume to have answers to this whale of a problem. What I do know is that as human beings we evaluate privacy, and information, based on the context in which it is shared. What constitutes an invasion of privacy is essentially a breach of what Helen Nissenbaum calls a contextually relative informational norm. Whether we "experience" them in the moment or not, these breaches are a constant part of modern life and have profound effects on many aspects of our everyday lives. They are caused by designs of socio-technical systems that continue to re-anchor our sense of a right to privacy to a new, more invasive vision of user-centeredness in information technology. This is a vision of the future rooted in a relatively shaky faith in the predictive power of big data, one that makes it seem reasonable to allow companies to peek into your living room (via gaming console cameras), into your children's bedrooms (via smart thermostats), and into your personal finances. As designers, we must do a better job of developing approaches to understanding users' information norms, and of leveraging that knowledge to create smarter, more contextually relevant socio-technical systems.
(1) Whitworth, Brian, and Adnan Ahmad. “Socio-Technical System Design.” The Encyclopedia of Human-Computer Interaction, 2nd Ed. Interaction Design Foundation, 2012. Web. 15 Dec. 2013.
(2) Nissenbaum, Helen. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford: Stanford University Press, 2010. PDF.