A new report, co-authored by the Bennett Institute's Affiliate Researcher Sam Gilbert, explores how tech company executives and policymakers should think about the issues of data ethics arising from their increasingly global activities.

Data ethics[1] is in vogue. Recent years have seen “ethics boards” proliferate, as big tech companies begin to reckon with the moral implications of their products, and the political implications of their power.
However, ethics boards are a far from sufficient response to the ethical questions raised by digital technology – particularly boards that have no formal role in a company’s decision-making. They may even invite accusations of “ethics washing” – that is, disingenuous engagement with ethics as an external affairs strategy, akin to the “greenwashing” marketing of companies with poor environmental records. As Meredith Whittaker points out, even Axon, which manufactures taser weapons, surveillance drones, and AI-enhanced police body cameras, now has an ethics board.
So, what practical questions of data ethics should multinational technology companies and policymakers be most concerned with? Ethics boards aside, what actions should they take? What ethical frameworks should they be using? Is it possible to reconcile the variations in ethical traditions and norms across the many different countries in which these companies operate?
These are the questions animating a new report from the UK-China Global Issues Dialogue Centre, which I have written with Professor Peter Williamson. Beginning with a review of academic literature on data ethics, we identify seven key themes in current discussions:
- Privacy and surveillance
- Bias, discrimination, and injustice in algorithmic decision-making
- Encoding of ethical assumptions in autonomous vehicle systems
- Artificial general intelligence as an existential risk to humanity
- Software user interface design as an impediment to human flourishing
- Job displacement from machine learning and robotics
- Monetary compensation for personal data use
Using data from Google Trends to assess how well these themes reflect the concerns of the general public, we suggest that tech company executives currently place a disproportionate emphasis on the existential risk posed by machine superintelligence. By contrast, too little consideration is being given to more immediate questions of fairness in algorithmic decision-making systems. We also suggest that data ethics is not only a matter of risk mitigation: there are opportunities for tech companies to take positive action on data sharing and job creation that could increase the good they do in the world.
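For readers curious about what such a comparison involves, here is a minimal sketch using the open-source pytrends wrapper for Google Trends. The library choice and the keyword list are illustrative assumptions, not the queries used in the report.

```python
# Minimal sketch: comparing relative Google Trends search interest across
# data-ethics themes with the pytrends library. The keywords below are
# illustrative assumptions, not the queries used in the report.
from pytrends.request import TrendReq

# Google Trends compares at most five terms in a single payload.
themes = [
    "algorithmic bias",
    "data privacy",
    "artificial general intelligence",
    "self-driving car ethics",
    "automation job losses",
]

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(themes, timeframe="today 5-y")

# interest_over_time() returns relative search volume on a 0-100 scale,
# plus an "isPartial" flag for the most recent period.
interest = pytrends.interest_over_time().drop(columns="isPartial")

# Mean relative interest gives a rough ranking of public concern per theme.
print(interest.mean().sort_values(ascending=False))
```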
Looking beyond the West, we reject the relativist view that a global approach to data ethics is impossible, or even meaningless. A case study of Shanghai Roadway, the former China subsidiary of the global information services firm Dun & Bradstreet, shows that the Confucian and Judaeo-Christian traditions can agree on when the collection of personal data is unethical, even when starting from different premises. We are optimistic that “overlapping consensus” between different ethical traditions may be achievable on many questions of data ethics, and we encourage executives and policymakers to pursue it. Practical enablers include contributing expertise to intergovernmental organisation (IGO) consultations and participating in collaborative initiatives like the Partnership on AI.
Word cloud visualization of all article titles in our database of academic literature on digital ethics (n=953). Frequencies are shown in parentheses. Visualization: Tagcrowd.
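As a rough illustration of how such a frequency cloud can be produced (the report used Tagcrowd; the Python wordcloud library and the placeholder titles below are assumptions for illustration only):

```python
# Rough sketch: building a frequency word cloud from article titles with the
# wordcloud library. The titles below are invented placeholders, not entries
# from the report's database of 953 papers.
from collections import Counter
from wordcloud import WordCloud, STOPWORDS

titles = [
    "Example title about privacy and surveillance",
    "Example title about fairness in algorithmic decision-making",
]

# Count how often each non-stopword appears across all titles.
words = [
    word for title in titles
    for word in title.lower().split()
    if word.isalpha() and word not in STOPWORDS
]
freqs = Counter(words)

# Each word is sized in proportion to its frequency, as in the figure.
cloud = WordCloud(width=800, height=400, background_color="white")
cloud.generate_from_frequencies(freqs).to_file("digital_ethics_titles.png")
```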
We also look at the ways philosophical theories of ethics may be applied to data. Following recent work by our colleague Stelios Zyglidopoulos, we identify virtue ethics as a more suitable framework for technology companies’ ethical decision-making than utilitarianism – not least because the “utilitarian calculus” required to weigh up the potential consequences of decisions about data is so complex. Instead, we suggest that the best question for a tech company to ask is “What kind of company should we be?” A notable recent example is the public deliberation on this question at the web infrastructure company Cloudflare, which led it to change its position on providing services to 8Chan (a message board associated with white-supremacist and neo-Nazi ideologies).
By giving ethics boards a formal role in governance structures, and giving individuals transparency and control over how their personal data is collected, stored, and used, tech companies can begin to transcend “ethics washing”. By adopting a virtue ethics-based approach to their own decision-making, broadening the scope of their ethical reflection beyond AI, and looking for overlapping consensus when contributing to the setting of global norms, they can play an even more important role.
___________________________________________________________________________________________________________________________________
The full report, “Data Ethics and Multinational Technology Companies” by Sam Gilbert & Peter J. Williamson, was published by the UK-China Global Issues Dialogue Centre, Jesus College, University of Cambridge.
The report explores how executives and employees at technology companies, policymakers, lawyers, and tech entrepreneurs should think about the issues of data ethics arising from their increasingly global activities.
___________________________________________________________________________________________________________________________________
References
[1] Definition of data ethics: “… a new branch of ethics that studies and evaluates the moral problems related to data (including generation, recording, curation, processing, dissemination, sharing and use), algorithms (including artificial intelligence, artificial agents, machine learning and robots) and corresponding practices (including responsible innovation, programming, hacking and professional codes), in order to formulate and support morally good solutions (e.g. right conducts or right values).” – Luciano Floridi & Mariarosaria Taddeo
The views and opinions expressed in this post are those of the author(s) and not necessarily those of the Bennett Institute for Public Policy.