We are living in an era defined and shaped by data. Data makes the world go round. It is politics, it is culture, it is everyday life and it is business. Our data-flooded era is one of technological progress, with tides rising at an unprecedented pace. Roles, rights and responsibilities are being reorganized, and new ethical questions are being posed. Data ethics must and will be a new compass to guide us.
by Gry Hasselbalch and Pernille Tranberg
First appeared in the Daily Dot, 23rd October
BLOG: “How come Facebook knows that I have a mother with Alzheimer’s?” a woman asked the other day. She was sure that she hadn’t provided the social medium with such sensitive information. But when quizzed a bit further, she confirmed that she had googled the illness and browsed the Alzheimer’s Association’s website. Both websites “gossip” about visitors’ behavior behind their backs to other websites, including Facebook, via third-party cookies.
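To make the “gossiping” concrete, here is a minimal, purely illustrative Python sketch of what a third-party tracking request looks like. The URLs, cookie name and values are hypothetical; the point is that when a page embeds a third-party widget, the browser’s request to that third party carries both a `Referer` header (which page you were reading) and any cookie the third party has already set (who you are).

```python
# Hypothetical illustration: a browser loads a health-information page that
# embeds a third-party social widget. The resulting request to the third
# party reveals both the visited page and the visitor's identity.

request_to_third_party = {
    # The embedded widget's URL (hypothetical)
    "url": "https://social-network.example/plugins/like",
    "headers": {
        # Referer header: which page the user was reading
        "Referer": "https://health-site.example/what-is-alzheimers",
        # Third-party cookie: identifies the user across unrelated sites
        "Cookie": "user_id=1234567890",
    },
}

# From this single request, the third party can record:
# "user 1234567890 visited an Alzheimer's information page"
visitor_id = request_to_third_party["headers"]["Cookie"].split("=")[1]
visited_page = request_to_third_party["headers"]["Referer"]
print(visitor_id, "visited", visited_page)
```

No form was filled in and no information was volunteered; the linkage happens entirely in the plumbing of the web page, which is why visitors rarely know it occurs.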
BLOG (updated 15 June 2016): There’s a battle of words going on. The battle is about the definition of “privacy”, and it’s been going on for centuries. Somehow we’ve led ourselves to believe that the definition of privacy we all think we share is something intrinsically connected to the individual. But actually it’s not. Although privacy as such is something only the individual can claim (corporations and states cannot), the individual has been largely absent from the very construction of the concept.
BLOG: This January the European Data Protection Supervisor presented his new “Ethics Advisory Group”, a group of experts that will help him “reconsider the ethical dimension of the relationships between human rights, technology, markets and business models and their implications for the rights to privacy and data protection in the digital environment.” He is not the first European decision maker or thought leader to bring forward ethics as a guiding principle in the digital age. Over the last year digital ethics, and in particular data ethics, has become the “talk of the town” in Europe. Based on the realisation that laws have not kept pace with the development of digital technologies, technologists, academics, policymakers and businesses are today revisiting cultural values and moral systems in search of a new ethical framework for the digital age.
BLOG: In her new article “How the machine ‘thinks’: Understanding opacity in machine learning algorithms” (January 2016), Jenna Burrell from the UC Berkeley School of Information discusses methods to investigate opacity in algorithms. Once a technical, opaque word belonging to the sphere of computer scientists and programmers, “algorithm” has today become a commonly used buzzword in business discourse. So much so that discussions about “big data” in an informed business community will always include a reference to the “Algorithmic Economy”: a new business venture based on finding patterns in data, creating profiles, predicting and responding to data, making meaning out of data and transforming it into value.
A new 2016 report from the Danish Consumers’ Council, “Digital Challenges for Consumers in Denmark” by Gry Hasselbalch, maps key challenges for Danish consumers in the digital era. Rapid digital adoption in Denmark has created a number of challenges for Danish consumers. In particular, automatic data collection and correlation performed by both public and private actors challenge consumer privacy. Laws, consumers’ skills, as well as public institutions’ and private businesses’ conduct, have not progressed in a way that adequately protects and empowers consumers in a digital market and public sphere. The report also points to solutions: there is a need for an updated regulatory data protection framework, for the development of consumer skills that provide consumers with background knowledge of the life of their data and the interests behind it, and for the advancement of privacy-by-design solutions in public institutions and private business.
BLOG: The long-awaited EU data protection reform agreed on by the European Union late Tuesday night stipulates, among other things, that companies cannot process the data of children and young people under the age of 16 without their parents’ consent.
BLOG: How can we question the ethics of a service if we don’t have access to the details of how it is designed to act on data? How can we put a health warning on a product if we don’t know the ingredients?
TALKS & EVENTS: “How can you put a health warning on a product if you don’t even know the ingredients?” Talking about Data Ethics at the Internet Governance Forum 2015.
TALKS & EVENTS: The Internet Governance Forum (IGF) is a series of annual conferences organized by the UN. It brings together representatives from various stakeholder groups to discuss public policy issues relating to the Internet, and it informs those with policy-making power in both the public and private sectors. At the annual meeting, delegates discuss, exchange information and share good practices.
TALKS & EVENTS: On 29th October representatives of toy companies and tech critics met at the European Commission Safer Internet Forum to discuss the evolving Internet of Toys and its data ethical implications. The fact that we are talking about data ethics with toy makers at this early stage of the development of an Internet of Toys is yet another symptom of the paradigm shift in business development, where privacy and data ethics are increasingly perceived as competitive parameters.
BLOG: Toy manufacturers are today creating intelligent toys that remember, find patterns and respond to data from children. We need a data ethical approach to innovation in the development of an “Internet of Things” for children.
BLOG: “If it’s free, then you are the product.” This statement normally applies to consumers paying for online services with their data. Another version of this is developers using big industry machine learning technologies for free to build services whose real value they don’t own.
AWARENESS RAISING: This guide is adapted for internet users, providing them with insights into their human rights online.
Advisor for organisations on the social and data ethical implications of tech. Co-founder of DataEthics.eu. Independent ethics expert for the EU. Worked with internet policy and digital rights for 10 years in the pan-EU network on youth & internet. Behind several major studies and reports. Author of Data Ethics – The New Competitive Advantage. @mediamocracy, email@example.com
Phone: +39 3512705584 (Italy), +45 29827374 (Denmark)
BLOG: Our destiny is a product. Fate is developed upon and innovated with. Fate is part of an actual machinery. It can be sold and traded with. Fate is something the Destiny Machine produces.
BLOG: Surveillance is the default. We need a change of direction. But waking up society can be a challenge. Did we hit the trucks yet?
It’s a Steve Martin and John Candy farce. Passenger: “He says we’re going in the wrong direction.” Driver shrugs: “Ah, he’s drunk. How would he know where we’re going? Heh heh… what a moron!”
PUBLICATIONS: Pernille Tranberg and Gry Hasselbalch are currently writing a book about data ethics in business development. The book is based on more than 40 business cases worldwide. Expected publication in English and Danish, summer 2016.
(Read an English shorter version of this article here: THE SECOND DIGITAL DIVIDE: PAY FOR PRIVACY AND TRADE WITH PRIVACY)
BLOG: In the future you will be able to buy different degrees of privacy online. If you can afford it.
BLOG: If you weren’t already aware of it, you are being profiled online and your personal data traded in a billion-dollar data industry. Don’t worry, most people don’t know much about this. The personal data market is incomprehensible to the average consumer, mostly because the trades with their data happen without their direct involvement. And this seems to be the main problem when great minds have to come up with innovative solutions to today’s privacy-invasive online business models: consumers are not involved directly in the trade, and they don’t get their cut of the cake. “Pay for Privacy” and “Trade with Privacy” become the norm, presented as the fairest solutions. But fair to whom? Perhaps it’s more a question of a change in fundamental perspective?