This chapter, authored by Lanah Kammourieh, Thomas Baar, Jos Berens, Emmanuel Letouzé, Julia Manske, John Palmer, and Patrick Vinck, will appear in a book edited by Linnet Taylor and Luciano Floridi.
A recent presentation of the material can be found here: APC Group Privacy 2015. The chapter's abstract follows.
Big Data has blurred the boundaries between individual and group data. Through the sheer number and richness of databases and the increasing sophistication of algorithms, the “breadcrumbs” left behind by each one of us have not only multiplied to a degree that calls our individual privacy into question; they have also created new risks for groups, which can be targeted and discriminated against without their knowledge, or even without the knowledge of the data analysts. This challenges us to enrich our approach to privacy. Where individual privacy might once have sufficed to rein in state and corporate surveillance and the neighbors’ curiosity, and to give individuals a measure of control over their reputations and security, today it can leave groups vulnerable to discrimination and targeting and, what’s more, leave them unaware of that risk. The concept of group privacy attempts to supplement individual privacy by addressing this blind spot.
Group privacy is not, however, without complications of its own. Indeed, creating a simple, one-dimensional group privacy right is no silver bullet: such a right can only provide effective protection where there is a group possessed of legal personality able to enforce it before a (domestic or international) court or tribunal. Yet Big Data’s specificity lies precisely in its ability to extract valuable information on passive groups with no such self-awareness or capacity. Thus, on the one hand, a group privacy right can help active, structured groups assert their informational self-determination and protect their own interests. On the other, it must be supplemented by additional protections that recognize and address the privacy interests of passive groups that emerge only at the data-analysis stage.
This points us towards a multi-pronged approach to strengthening the protection of privacy. Traditional avenues, including conventions on the international plane and legislation in the domestic legal sphere, are indispensable for reaffirming the importance of privacy and opening public debate about its application to groups. These should focus not only on setting the conditions for lawful data collection, but also on limiting and sanctioning risky downstream uses of such data.
The introduction of harmonized regulation on data sharing could also afford users a greater measure of control over their own data and increase transparency in the way our myriad “breadcrumbs” of information are used. At the same time, the private sector must be harnessed: both to help develop technology in a direction that ensures greater accountability for privacy breaches, and to encourage the social responsibility of businesses where local privacy laws are weak.
Lastly, none of these changes can have meaningful impact without increased data literacy across the board, so that individuals become more aware of the impact of their actions not only on their own safety, but also on the safety of others. Improved privacy protection is not an impediment to the vast potential of Big Data; rather, it is the condition for that potential to be unleashed in a responsible and socially beneficial way.