Weapons of Math Destruction, Common Read 2022

This guide is intended to help faculty and students engage with the 2022 Common Read book, Weapons of Math Destruction. Special thanks to Vicky Ludas Orlofsky and Courtney Walsh of Stevens Institute of Technology for providing the foundation for this guide.

AI in Criminal Justice
“[F]airness is squishy and hard to quantify. It is a concept. And computers, for all of their advances in language and logic, still struggle mightily with concepts. ... So fairness isn’t calculated into WMDs. And the result is massive, industrial production of unfairness. If you think of a WMD as a factory, unfairness is the black stuff belching out of the smoke stacks. It’s an emission, a toxic one.” (pp. 94-95, emphasis in the original) 


Algorithmic Bias

Mathematical Models & Inequality
“[M]athematical models can sift through data to locate people who are likely to face great challenges, whether from crime, poverty, or education. It's up to society whether to use that intelligence to reject and punish them--or to reach out to them with the resources they need. We can use the scale and efficiency that make WMDs so pernicious in order to help people. It all depends on the objective we choose.” (p. 118)