Subject Guides
Weapons of Math Destruction, Common Read 2022
This guide is intended to help faculty and students engage with the 2022 Common Read book, Weapons of Math Destruction.
Special thanks to Vicky Ludas Orlofsky and Courtney Walsh of Stevens Institute of Technology, whose work provided the foundation for this guide.
AI in Criminal Justice
“[F]airness is squishy and hard to quantify. It is a concept. And computers, for all of their advances in language and logic, still struggle mightily with concepts. ... So fairness isn’t calculated into WMDs. And the result is massive, industrial production of unfairness. If you think of a WMD as a factory, unfairness is the black stuff belching out of the smoke stacks. It’s an emission, a toxic one.” (pp. 94-95, emphasis in the original)
- Algorithms: Resources from the Marshall Project. The Marshall Project (collection of criminal justice reporting). Last updated May 2021.
- Should We Be Afraid of AI in the Criminal-Justice System? The Atlantic Monthly. June 2019.
- AI: Algorithms and Justice. Berkman Klein Center for Internet & Society (Harvard). Last updated 2019. Collected research on the subject.
- Algorithms Can Be a Tool for Justice–If Used the Right Way. Wired. October 2018.
- Does Big Data Belong in the Courtroom? Pacific Standard. January 2018.
Algorithmic Bias
- How scientists are subtracting race from medical risk calculators. Science. July 2021.
- Kennedy, E. J. (2021). Can data drive racial equity? MIT Sloan Management Review, 62(2), 9–11.
- Obermeyer, Z., Powers, B., Vogeli, C., & Mullainathan, S. (2019). Dissecting racial bias in an algorithm used to manage the health of populations. Science, 366(6464), 447–453.
- Hamilton, M. (2019). The biased algorithm: Evidence of disparate impact on Hispanics. American Criminal Law Review, 56, 1553. Via SSRN.
- How artificial intelligence learns to be racist. Vox. April 2017.
Mathematical Models and Inequality
“[M]athematical models can sift through data to locate people who are likely to face great challenges, whether from crime, poverty, or education. It's up to society whether to use that intelligence to reject and punish them--or to reach out to them with the resources they need. We can use the scale and efficiency that make WMDs so pernicious in order to help people. It all depends on the objective we choose.” (p. 118)
Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble
Call Number: Ebook. ISBN: 1479833649. Publication Date: 2018. Off-campus users will be prompted to log in using their Binghamton username and password.
- Last Updated: Jul 26, 2023 1:39 PM
- URL: https://libraryguides.binghamton.edu/wmd2022