Data protection impact assessments (DPIA), also referred to as Privacy Impact Assessments (PIA), are compulsory under the new EU General Data Protection Regulation (GDPR). But ensuring an organization's or business's data protection compliance now and in the future is the bare minimum. In a world of data networks that increasingly shape and define everything from our identities and economies to our politics, we also need to consider the broader ethical and social implications of data processing.
Data Ethics: Renegotiating Human Agency in Industry Data Innovation
In the wake of a rapid technological development in which digitally collected, stored and processed data have become defining factors in society, human and ethical dilemmas emerge. This study explores a social process that is taking place to negotiate global standards, roles, rights and responsibilities to create a new trust system to manage the risks of a data-saturated environment with an emphasis on ethics, human empowerment and agency.
Discussions of privacy are often framed in terms of struggling against those who threaten it: governments, corporations or other authorities. But it’s not just an ‘activist’ fight to make the case for privacy: it’s just better business.
Today it’s a competitive edge for companies to respect user privacy and users’ right to control their own data. The organizations that view data ethics as a social responsibility – that place the same importance on data as they do on environmental awareness and respect for human rights – will win in the market.
We ask artificially intelligent systems to make order in our messy modern realities, but very rarely do we question what type of order. This is a call for an historical awareness of the social systems that we are building: Which social, cultural norms, values and interests do they represent, reinforce and enact?
The first European Data Ethics Forum is organized by DataEthics and Dansk IT. The mission is to showcase best cases within privacy tech and data-ethical companies and to share knowledge in the field. Speakers include, among others, Evelyn Rupert from Goldsmiths College in London, Siim Teller from Wire in Berlin, John C. Havens from IEEE’s global initiative on ethically aligned design and Raegan McDonald from Mozilla.
The IAPP is the largest and most comprehensive global information privacy community and resource. Founded in 2000, the IAPP is a not-for-profit organization that helps define, support and improve the privacy profession globally.
– by Gry Hasselbalch (based on panel debate participation, Ethics, Observational Platforms, Mathematics and Fundamental Rights, CPDP, Brussels, 2017)
A few years ago a social media company decided to do some experiments on its users. Filling their news feeds with positive or negative stories, it measured hundreds of thousands of people’s emotional reactions. When the story surfaced there was, of course, a public outcry, and the company found itself in a situation where it had to show that it cared. So a spokesperson apologized publicly. However, she didn’t apologize because what they did had been ethically questionable; they were just sorry that it had been “poorly communicated”.
Legally you can do a lot with data right now, and a lot is done with data that is not necessarily in the best interest of the individual. And this is the point where we revisit ethics – when the laws, social awareness and formal systems in place are not enough.
“Legally you can do a lot with data right now, and a lot is done with data, that is not in the best interest of the individual. This is where we revisit ethics. When the laws and formal systems are not enough.”
Our lives are lived in data. Data crossing borders and connected in virtual space. Most often, it appears, we live in open and too easily accessible data networks. States and corporations are watching us through data, and we are watching each other through data. What does individual privacy mean in this data saturated environment?
We are living in an era defined and shaped by data. Data makes the world go round. It is politics, it is culture, it is everyday life and it is business. Our data-flooded era is one of technological progress, with tides rising at an unprecedented pace. Roles, rights and responsibilities are being reorganized and new ethical questions posed. Data ethics must and will be a new compass to guide us.
by Gry Hasselbalch and Pernille Tranberg
First appeared in DailyDot 23rd October
BLOG: How come Facebook knows that I have a mother with Alzheimer’s, a woman asked the other day. She was sure that she hadn’t provided the social medium with such sensitive information. But when quizzed a bit further, she confirmed that she had googled the illness and been browsing the Alzheimer’s Association’s website. Both websites “gossip” about visitors’ behavior behind their back to other websites, including Facebook, via third-party cookies.
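The “gossiping” described above can be sketched in code. The following is a minimal simulation, not any real tracker’s implementation: when a page embeds a resource from a third party (a pixel, script or social widget), the visitor’s browser sends that third party a request carrying the page’s address in the `Referer` header together with the third party’s own cookie, so the same ID turns up across unrelated sites. The domain and URLs below are hypothetical.

```python
def third_party_request(page_url: str, tracker_cookie: str) -> dict:
    """Simulate the request a browser makes to an embedded third-party tracker."""
    return {
        "Host": "tracker.example",          # hypothetical third-party domain
        "Referer": page_url,                # the page the visitor is actually reading
        "Cookie": f"uid={tracker_cookie}",  # the same ID is sent from every site
    }

# Because the cookie ID is identical on every site that embeds the tracker,
# the third party can stitch separate visits into one browsing profile:
profile = {}
for page in ["https://alzheimers-association.example/what-is-alzheimers",
             "https://search.example/?q=alzheimers+symptoms"]:
    request = third_party_request(page, tracker_cookie="u123")
    profile.setdefault(request["Cookie"], []).append(request["Referer"])

print(profile["uid=u123"])
```

Two visits to unrelated sites end up in the same profile entry, which is how sensitive interests can be inferred without the visitor ever volunteering the information.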
Blog (updated 15 June 2016): There’s a battle of words going on, the battle is about the definition of “privacy”, and it’s been going on for centuries. Somehow we’ve led ourselves to believe that the definition of privacy that we all think we share is something intrinsically connected to the individual. But actually it’s not. Although privacy as such is in fact only something the individual can claim (corporations and states cannot), the individual has always been very absent in the very construction of the concept.
– by Gry Hasselbalch
This January the European Data Protection Supervisor presented his new “Ethics Advisory Group”, a group of experts that will help him “reconsider the ethical dimension of the relationships between human rights, technology, markets and business models and their implications for the rights to privacy and data protection in the digital environment.” He is not the first European decision maker or thought leader to bring forward ethics as a guiding principle in the digital age. Over the last year digital ethics, and in particular data ethics, has become the “talk of the town” in Europe. Based on the realisation that laws have not kept pace with the development of digital technologies, technologists, academics, policymakers and businesses are today revisiting cultural values and moral systems in search of a new ethical framework for the digital age.
– by Gry Hasselbalch
In her new article “How the machine ‘thinks’: Understanding opacity in machine learning algorithms” (January 2016), Jenna Burrell from the UC Berkeley School of Information discusses methods to investigate opacity in algorithms. Once a technical, opaque word belonging to the sphere of computer scientists and programmers, “algorithm” has today become a commonly used buzzword in business discourse. So much so that discussions about “big data” in an informed business community will always include a reference to the “Algorithmic Economy”: a new business adventure based on finding patterns in data, creating profiles, predicting and responding to data, making meaning out of data and transforming it into value.
A new 2016 report from the Danish Consumers’ Council, “Digital Challenges for Consumers in Denmark” by Gry Hasselbalch, maps key challenges for Danish consumers in the digital era. Rapid digital adoption in Denmark has created a number of challenges for Danish consumers. In particular, automatic data collection and correlation performed by both public and private actors challenge consumer privacy. Laws, consumers’ skills, as well as public institutions’ and private businesses’ conduct, have not progressed in a way that adequately protects and empowers consumers in a digital market and public sphere. The report also points to solutions. There is a need for an updated regulatory data protection framework, a development of consumer skills that provides consumers with background knowledge of the life of and interests in their data, and the advancement of privacy-by-design solutions in public institutions and private businesses.
BLOG: The long-awaited EU data protection reform agreed on by the European Union late Tuesday night stipulates, among other things, that companies cannot process the data of children and young people under the age of 16 without their parents’ consent.
– by Gry Hasselbalch
How can we question the ethics of a service if we don’t have access to the details of how it is designed to act on data? How can we put a health warning on a product if we don’t know the ingredients?
TALKS & EVENTS: “How can you put a health warning on a product if you don’t even know the ingredients?” Talking about Data Ethics at the Internet Governance Forum 2015.
TALKS & EVENTS: The Internet Governance Forum (IGF) is a series of annual conferences organized by the UN. It brings together representatives from various stakeholder groups in discussions on public policy issues relating to the Internet. The IGF informs those with policy-making power in both the public and private sectors. At their annual meeting delegates discuss, exchange information and share good practices with each other.
TALKS & EVENTS: On 29th October, representatives of toy companies and tech critics met at the European Commission Safer Internet Forum to discuss the evolving Internet of Toys and its data-ethical implications. The fact that we are talking about data ethics with toy makers at this early stage of the development of an internet of toys is yet another symptom of the paradigm shift in business development, where privacy and data ethics are increasingly perceived as competitive parameters.