Ethics of quantification

Ethics of quantification is the study of the ethical issues associated with visible and invisible forms of quantification. These include algorithms, metrics and indicators, and statistical and mathematical modelling, as noted in a review of the sociology of quantification.[1]

According to Espeland and Stevens,[2] an ethics of quantification would naturally descend from a sociology of quantification, especially in an age when democracy, merit, participation, accountability and even 'fairness' are assumed to be best discovered and appreciated via numbers. In his classic work Trust in Numbers, Theodore M. Porter[3] notes how numbers meet a demand for quantified objectivity, and may for this reason be used by bureaucracies or institutions to gain legitimacy and epistemic authority.

For Andy Stirling of the STEPS Centre at Sussex University, there is a rhetorical element to concepts such as 'expected utility', 'decision theory', 'life cycle assessment', 'ecosystem services', 'sound scientific decisions' and 'evidence-based policy'. The instrumental application of these techniques, and their use of quantification to deliver an impression of accuracy, may raise ethical concerns.[4]

For Sheila Jasanoff these technologies of quantification can be labeled 'technologies of hubris',[5] whose function is to reassure the public while keeping the wheels of science and industry turning. The downside of the technologies of hubris is that they may generate overconfidence thanks to an appearance of exhaustiveness; they can preempt political discussion by transforming a political problem into a technical one; and they remain fundamentally limited in processing what takes place outside their restricted range of assumptions. Jasanoff contrasts technologies of hubris with 'technologies of humility',[6] which admit the existence of ambiguity, indeterminacy and complexity, and strive to bring to the surface the ethical nature of problems. Technologies of humility are also sensitive to the need to alleviate known causes of people's vulnerability, to pay attention to the distribution of benefits and risks, and to identify those factors and strategies which may promote or inhibit social learning.

For Sally Engle Merry, who studied indicators of human rights, gender violence and sex trafficking, quantification is a technology of control, but whether it is reformist or authoritarian depends on who has harnessed its power and for what purpose. She notes that, in order to make indicators less misleading and distorting, some principles should be followed:[7]

  • democratize the production of indicators
  • develop in parallel qualitative research to verify the validity of assumptions
  • keep the indicators simple
  • test or adopt multiple framings
  • admit the limits of the various measures

The field of algorithms and artificial intelligence is the regime of quantification where the discussion of ethics is most advanced; see, for example, Weapons of Math Destruction[8] by Cathy O'Neil. While objectivity and efficiency are among the positive properties associated with the use of algorithms, ethical issues arise because these tools come in the form of black boxes.[9] Algorithms thus have the power to act upon data and make decisions, but they remain to a large extent beyond scrutiny.[8][10] The existence of surveillance capitalism is the theme of Shoshana Zuboff's 2019 book.[11] A more militant reading of the dangers posed by artificial intelligence is Resisting AI: An Anti-fascist Approach to Artificial Intelligence by Dan McQuillan.[12]


References

  1. Popp Berman, E. and Hirschman, D., "The Sociology of Quantification: Where Are We Now?", Contemporary Sociology, vol. 47, no. 3, pp. 257–266, 2018.
  2. Espeland, W. N. and Stevens, M. L., "A Sociology of Quantification", European Journal of Sociology, vol. 49, no. 3, pp. 401–436, 2008.
  3. Porter, T. M., Trust in Numbers: The Pursuit of Objectivity in Science and Public Life, Princeton University Press, 1995.
  4. Stirling, A., "How Politics Closes Down Uncertainty", STEPS Centre, 2019.
  5. Jasanoff, S., "Technologies of Humility: Citizen Participation in Governing Science", Minerva, vol. 41, pp. 223–244, 2003.
  6. Jasanoff, S., "Technologies of Humility", Nature, vol. 450, p. 33, 2007.
  7. Merry, S. Engle, The Seductions of Quantification: Measuring Human Rights, Gender Violence, and Sex Trafficking, University of Chicago Press, 2016.
  8. O'Neil, C., Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, Random House Publishing Group, 2016.
  9. Danaher, J. et al., "Algorithmic Governance: Developing a Research Agenda through the Power of Collective Intelligence", Big Data & Society, vol. 4, no. 2, pp. 1–21, 2017.
  10. Kitchin, R., "Thinking Critically about and Researching Algorithms", Information, Communication & Society, vol. 20, no. 1, pp. 14–29, 2017.
  11. Zuboff, S., "Surveillance Capitalism and the Challenge of Collective Action", New Labor Forum, vol. 28, no. 1, pp. 10–29, 2019. doi:10.1177/1095796018819461.
  12. McQuillan, D., Resisting AI: An Anti-fascist Approach to Artificial Intelligence, Bristol University Press, 2022. https://bristoluniversitypress.co.uk/resisting-ai