Blog (updated 15 June 2016): There’s a battle of words going on. The battle is about the definition of “privacy”, and it’s been going on for centuries. Somehow we’ve led ourselves to believe that the definition of privacy we all think we share is something intrinsically connected to the individual. But actually it’s not. Although privacy as such is something only the individual can claim (corporations and states cannot), the individual has always been strikingly absent from the very construction of the concept.

On the 13th of June, the UN’s new privacy rapporteur Joe Cannataci visited Copenhagen, Denmark for the first time in his term as rapporteur, to discuss the influence of business practices on the individual’s right to privacy. He was invited by DataEthics.eu and the Danish Institute for Human Rights. At the morning workshop with representatives from key organisations and leading experts in Denmark, he pointed out that there is actually no binding and universally accepted definition of privacy. And this in fact impedes the legal enforcement of the right to privacy. Because how do we regulate a specific business conduct, such as a predictive analytics tool used on big data collected on customers, if we do not share the same understanding of the problems it raises with regard to the individual’s right to privacy?

Numb by definitions

What Joe Cannataci is referring to is that even though we do formally have a right to privacy described in internationally agreed conventions and declarations, including the right to data protection, we have never formally agreed upon the very content of this human right. We’ve silently and informally agreed upon a variety of cultural (and heavily interest-driven) definitions. In the commercial internet era, for example, we’ve accepted the terms and conditions of the “Digital Declarations” (to quote the retired, but still very active, Harvard professor Shoshana Zuboff) made by the big data corporations (and governments) without questioning their roots and embedded interests.

So we are left with all these competing definitions, which at best leave us (the citizens and the enforcers) numb. Before the advent of the internet, privacy was the absence of interference from states (in human rights legal language, a “negative” right). We slowly found out that this also included “positive obligations” of governments to ensure this right in law and enforcement. These obligations were defined and tested in a range of legal cases (see e.g. the case law of the ECHR). And we’ve added legal instruments such as the GDPR and Convention 108 in Europe.

Oil, Needles and Genes

But still today, in our public-by-default digital reality, governments tend to define privacy as the needle’s right not to be found in their decrypted and accessible haystack of data (the mass/bulk surveillance model). And the largest corporations define privacy as all the stuff you agree to share (“the privacy settings”, the so-called “consent model” – which actually shouldn’t be called a “privacy policy” but a “data sharing policy”) – meaning, in essence, all your data to them, no privacy for you. But how does the individual fit into these definitions? We are seeing a rising movement in user demands and actions: ad and content blockers, increasing use of privacy tools, even a migration of users to more privacy-friendly services (or at least ones that users perceive as more privacy-friendly). But we don’t hear people’s voices. What are their privacy needs, demands, identity and developmental requirements? What type of privacy empowers them?

There’s a battle of discourses going on, and it is very difficult to see the individual, the citizen, in all of this. Is privacy something we can trade with? Is our data the new oil? Is it the property of our governments, which can choose whenever they want to look for us in a gigantic haystack of online data? Or is it something completely different – the genes of a person, for example, and the analysis of them: the genetics of the digital age? (And how do we phrase the enforcement question? Do we need rules for the trading tools that deal in our valuable data oil? Or do we need an ethics for digital data genetics?)


A formally binding Universal Definition of Privacy

In his latest report, Joe Cannataci urges the creation of a formally binding Universal Definition of Privacy: “A priority issue such as up-dating legal instruments through an expanded understanding of what is meant by the right to privacy would seem to be an essential starting point. There appears to be a consensus amongst several stakeholders that one of these legal instruments could take the form of an additional protocol to Art. 17 of the ICCPR wherein the SRP is being urged ‘to promote the opening of negotiations on this additional protocol during his first mandate’” (p. 19).

He goes on to describe the right to privacy as an “enabling right”, interlinked with human dignity and the ability to develop one’s personality freely and unhindered. That sounds very much like a definition of privacy from the individual’s point of view. At last we hear this definition… and at last we see an actual move towards formally agreeing upon a definition based on the interests of the individual (whose privacy and life this actually concerns). At last we are moving towards empowered enforcement. (And let’s please be quick, because just around the corner await the complex privacy implications of new intelligent, autonomous systems.)

Data Ethical Businesses Define Privacy as an Enabling Right

There is a rising movement within business development that is responding to the privacy needs and demands of individuals. These new businesses are innovating from the point of view of the individual. This is, strangely enough, a new definition of privacy – but only strange because it is actually a new thing to take the “privacy needs” of the individual into account when privacy is defined in the corporate world and society in general. And it’s even done in a very business-like manner: these businesses prioritize the privacy demands of the individual just like any other customer demand a business would normally cater to when it explores the market, innovates and develops. We are, for example, seeing new business models that are not based on the monetization of personal data; new tech developed to minimize data collection, not maximize it; ethical businesses that go beyond mere compliance with the law; innovation driven by privacy-by-design principles.

These new businesses will take part in the construction of a formally binding universal definition of privacy.
