Artificially intelligent technologies are complex data processing systems that pose several ethical challenges. We should consider the data intensity of these technologies and find solutions to the ethical implications in legislation, design and society in general.
We develop intelligent systems to create order in our messy contemporary reality, but very rarely do we put demands on what kind of order they create. Data processing algorithms can be described as the language of the big data age, creating structure and meaning out of unstructured data. This language is not independent of the context in which it is used; it is an expression of given cultural and social norms, values and priorities. Artificial intelligence is therefore not a free agent with free will, acting inscrutably on data from its own computer logic. These are social systems that represent and amplify community values and specific interests. The main message is therefore that this is a technology we create and can influence. Viewed through this lens, those who design the systems are designers of social systems more than designers of objective mathematical systems. Therefore, as early as the design phase, the social and ethical consequences of the data processing systems being developed can be analysed and assessed.
My teammates at DataEthics.eu and I finally found the time to develop a set of data ethics principles and guidelines. They are based on years of work and hands-on experience with data handling and practices in organisations and businesses, and they combine our legal and humanistic knowledge. Our point of departure is an anthropocentric view of the world. Check out the principles, a detailed questionnaire and a FAQ on data ethics here. They may be reproduced freely as long as DataEthics.eu is clearly credited with a link to our website.
Data protection impact assessments (DPIA), also referred to as Privacy Impact Assessments (PIA), are compulsory under the new EU General Data Protection Regulation (GDPR). But ensuring an organization's or business's data protection compliance now and in the future is the bare minimum. In a world of data networks that increasingly shape and define everything from our identities to our economies and politics, we also need to consider the broader ethical and social implications of data processing.
BLOG: If we want to design ethical AI that benefits human evolution, we need a way of talking about it that respects our human values and quirks.
Data Ethics: Renegotiating Human Agency in Industry Data Innovation
In the wake of rapid technological development in which digitally collected, stored and processed data have become defining factors in society, human and ethical dilemmas emerge. This study explores a social process that is taking place to negotiate global standards, roles, rights and responsibilities to create a new trust system to manage the risks of a data-saturated environment with an emphasis on ethics, human empowerment and agency.
Discussions of privacy are often framed in terms of struggling against those who threaten it: governments, corporations or other authorities. But it’s not just an ‘activist’ fight to make the case for privacy: it’s just better business.
Today it is a competitive edge for companies to respect user privacy and users' right to control their own data. The organizations that view data ethics as a social responsibility – placing similar importance on data as they do on environmental awareness and respect for human rights – will win in the market.
We ask artificially intelligent systems to make order in our messy modern realities, but very rarely do we question what type of order. This is a call for an historical awareness of the social systems that we are building: Which social, cultural norms, values and interests do they represent, reinforce and enact?
The IAPP is the largest and most comprehensive global information privacy community and resource. Founded in 2000, the IAPP is a not-for-profit organization that helps define, support and improve the privacy profession globally.
– by Gry Hasselbalch (based on panel debate participation, Ethics, Observational Platforms, Mathematics and Fundamental Rights, CPDP, Brussels, 2017)
A few years ago a social media company decided to run an experiment on its users. Filling their news feeds with positive or negative stories, it measured hundreds of thousands of people's emotional reactions. When the story surfaced, there was of course a public outcry, and the company found itself in a situation where it had to show that it cared. So a spokesperson apologized publicly. However, she didn't apologize because what they had done was ethically questionable; they were just sorry that it had been "poorly communicated".
Legally you can do a lot with data right now, and a lot is done with data that is not necessarily in the best interest of the individual. And this is the point where we revisit ethics – when the laws, social awareness and formal systems in place are not enough.
“Legally you can do a lot with data right now, and a lot is done with data, that is not in the best interest of the individual. This is where we revisit ethics. When the laws and formal systems are not enough.”
Our lives are lived in data. Data crossing borders and connected in virtual space. Most often, it appears, we live in open and too easily accessible data networks. States and corporations are watching us through data, and we are watching each other through data. What does individual privacy mean in this data saturated environment?
We are living in an era defined and shaped by data. Data makes the world go round. It is politics, it is culture, it is everyday life and it is business. Our data-flooded era is one of technological progress, with tides rising at an unprecedented pace. Roles, rights and responsibilities are being reorganized and new ethical questions posed. Data ethics must and will be a new compass to guide us.
by Gry Hasselbalch and Pernille Tranberg
First appeared in DailyDot 23rd October
BLOG: How come Facebook knows that I have a mother with Alzheimer’s, a woman asked the other day. She was sure that she hadn’t provided the social medium with such sensitive information. But when quizzed a bit further, she confirmed that she had googled the illness and been browsing the Alzheimer’s Association’s website. Both websites “gossip” about visitors’ behavior behind their back to other websites, including Facebook, via third-party cookies.
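The "gossip" works through ordinary HTTP mechanics: when a page embeds a third party's script or tracking pixel, the visitor's browser automatically sends that third party a request carrying the address of the page being viewed (the Referer header) and any cookie the third party previously set. A minimal sketch of such a request, using hypothetical domains and identifiers:

```python
# Minimal sketch (hypothetical domains and cookie ids) of the request a
# browser issues when a page embeds a third-party tracking script or pixel.

def build_third_party_request(visited_page: str, tracker_cookie_id: str) -> dict:
    """Headers the browser attaches to an embedded third-party request.

    The Referer header reveals which page the visitor was on, and the
    third-party cookie carries a stable identifier that lets the tracker
    link visits to the same pseudonymous profile across every site that
    embeds its content.
    """
    return {
        "Host": "tracker.example",             # hypothetical third-party tracker
        "Referer": visited_page,               # leaks the URL being visited
        "Cookie": f"uid={tracker_cookie_id}",  # cross-site identifier
    }

# The same cookie id accompanies both visits, so the tracker can connect
# the health-related browsing into a single profile.
visit_1 = build_third_party_request("https://www.alz.org/what-is-alzheimers", "u-4711")
visit_2 = build_third_party_request("https://search.example/?q=alzheimers", "u-4711")
```

This is why two unrelated websites can both feed the same profile: the tracker never needs the visitor to type anything, only to load pages that embed its content.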
Data ethics – the New Competitive Advantage by Gry Hasselbalch and Pernille Tranberg
Blog (updated 15 June 2016): There's a battle of words going on. The battle is about the definition of "privacy", and it's been going on for centuries. Somehow we've led ourselves to believe that the definition of privacy we all think we share is something intrinsically connected to the individual. But actually it's not. Although privacy as such is in fact only something the individual can claim (corporations and states cannot), the individual has always been largely absent from the very construction of the concept.
– by Gry Hasselbalch
This January the European Data Protection Supervisor presented his new "Ethics Advisory Group", a group of experts that will help him "reconsider the ethical dimension of the relationships between human rights, technology, markets and business models and their implications for the rights to privacy and data protection in the digital environment." He is not the first European decision maker or thought leader to bring forward ethics as a guiding principle in the digital age. Over the last year digital ethics, and in particular data ethics, has become the "talk of the town" in Europe. Based on the realisation that laws have not kept pace with the development of digital technologies, technologists, academics, policymakers and businesses are today revisiting cultural values and moral systems as they grope for a new ethical framework for the digital age.
– by Gry Hasselbalch
In her new article "How the machine 'thinks': Understanding opacity in machine learning algorithms" (January 2016), Jenna Burrell from the UC Berkeley School of Information discusses methods to investigate opacity in algorithms. Once a technical, opaque word belonging to the sphere of computer scientists and programmers, "algorithm" has today become a commonly used buzzword in business discourse. So much so that discussions about "big data" in an informed business community will always include a reference to the "Algorithmic Economy": a new business venture based on finding patterns in data, creating profiles, predicting and responding to data, making meaning out of data and transforming it into value.
A new 2016 report from the Danish Consumers' Council, "Digital Challenges for Consumers in Denmark" by Gry Hasselbalch, maps key challenges for Danish consumers in the digital era. Rapid digital adoption in Denmark has created a number of challenges for Danish consumers. In particular, automatic data collection and correlation performed by both public and private actors challenge consumer privacy. Laws, consumers' skills, and the conduct of public institutions and private businesses have not progressed in a way that adequately protects and empowers consumers in a digital market and public sphere. The report also points to solutions: there is a need for an updated regulatory data protection framework, the development of consumer skills that provide consumers with background knowledge of the life of and interests in their data, and the advancement of privacy-by-design solutions in public and private business.
BLOG: The long-awaited EU data protection reform agreed on by the European Union late Tuesday night stipulates, among other things, that companies cannot process the data of children and young people under the age of 16 without their parents' consent.
– by Gry Hasselbalch
How can we question the ethics of a service if we don’t have access to the details of how it is designed to act on data? How can we put a health warning on a product if we don’t know the ingredients?