Subject Guides

Weapons of Math Destruction, Common Read 2022

This guide is intended to help faculty and students engage with the 2022 Common Read book, Weapons of Math Destruction. Special thanks to Vicky Ludas Orlofsky and Courtney Walsh of Stevens Institute of Technology, whose work provided the foundation for this guide.

Ethics in AI

The Ethics of Big Data

"Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that's something only humans can provide. We have to explicitly embed better values into our algorithms, creating Big Data models that follow our ethical lead. Sometimes that will mean putting fairness ahead of profit." (p. 204)

Guides


Articles

Books on the Ethics of Big Data

Off-campus users will be prompted to log in to read these books using their Binghamton username and password.

Ethical Standards

“Algorithms are only going to become more ubiquitous in the coming years. We must demand that systems that hold algorithms accountable become ubiquitous as well” (p. 231).

Scholars from around the world have been discussing ethics in AI and devising standards and principles to guide its ethical use.

The AI4People group (linked below), in surveying the opportunities and risks of AI (see the figure below), developed the following principles, adopting four traditional bioethics principles (1-4) and adding a fifth:

  1. Beneficence: Promoting Well-Being, Preserving Dignity, and Sustaining the Planet
  2. Non-maleficence: Privacy, Security and “Capability Caution”
  3. Autonomy: The Power to Decide (Whether to Decide)
  4. Justice: Promoting Prosperity and Preserving Solidarity
  5. Explicability: Enabling the Other Principles Through Intelligibility and Accountability

[Figure 1: "Overview of the four core opportunities offered by AI, four corresponding risks, and the opportunity cost of underusing AI", Floridi et al., 2018 (see link below)]