Longlisted for the National Book Award | New York Times Bestseller A former Wall Street quant sounds an alarm on the mathematical models that pervade modern life and threaten to rip apart our social fabric. We live in the age of the algorithm. Increasingly, the decisions that affect our lives--where we go to school, whether we get a car loan, how much we pay for health insurance--are being made not by humans, but by mathematical models. In theory, this should lead to greater fairness: Everyone is judged according to the same rules, and bias is eliminated. But as Cathy O'Neil reveals in this urgent and necessary book, the opposite is true. The models being used today are opaque, unregulated, and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination: If a poor student can't get a loan because a lending model deems him too risky (by virtue of his zip code), he's then cut off from the kind of education that could pull him out of poverty, and a vicious spiral ensues. Models are propping up the lucky and punishing the downtrodden, creating a "toxic cocktail for democracy." Welcome to the dark side of Big Data. Tracing the arc of a person's life, O'Neil exposes the black box models that shape our future, both as individuals and as a society. These "weapons of math destruction" score teachers and students, sort résumés, grant (or deny) loans, evaluate workers, target voters, set parole, and monitor our health. O'Neil calls on modelers to take more responsibility for their algorithms and on policy makers to regulate their use. But in the end, it's up to us to become more savvy about the models that govern our lives. This important book empowers us to ask the tough questions, uncover the truth, and demand change. [from the publisher]
How big data and machine learning encode discrimination and create agitated clusters of comforting rage. In Discriminating Data, Wendy Hui Kyong Chun reveals how polarization is a goal, not an error, within big data and machine learning. These methods, she argues, encode segregation, eugenics, and identity politics through their default assumptions and conditions. Correlation, which grounds big data's predictive potential, stems from twentieth-century eugenic attempts to "breed" a better future. Recommender systems foster angry clusters of sameness through homophily. Users are "trained" to become authentically predictable via a politics and technology of recognition. Machine learning and data analytics thus seek to disrupt the future by making disruption impossible. Chun, who has a background in systems design engineering as well as media studies and cultural theory, explains that although machine learning algorithms may not officially include race as a category, they embed whiteness as a default. Facial recognition technology, for example, relies on the faces of Hollywood celebrities and university undergraduates, groups not famous for their diversity. Homophily emerged as a concept to describe white U.S. resident attitudes to living in biracial yet segregated public housing. Predictive policing technology deploys models trained on studies of predominantly underserved neighborhoods. Trained on selected and often discriminatory or dirty data, these algorithms are only validated if they mirror this data. How can we release ourselves from the vice-like grip of discriminatory data? Chun calls for alternative algorithms, defaults, and interdisciplinary coalitions in order to desegregate networks and foster a more democratic big data.
This open access book explores the challenges society faces with big data through the lens of culture rather than social, political, or economic trends, as demonstrated in the words we use, the values that underpin our interactions, and the biases and assumptions that drive us. Focusing on areas such as data and language, data and sensemaking, data and power, data and invisibility, and big data aggregation, it demonstrates that humanities research, which views technology through cultural rather than social, political, or economic frames of reference, resists mass datafication for a reason, and that those very reasons can be instructive for the critical observation of big data research and innovation. The eBook editions of this book are available open access under a CC BY-NC-ND 4.0 licence on bloomsburycollections.com. Open access was funded by Trinity College Dublin, DARIAH-EU and the European Commission.
In her 2017 TED Talk, Cathy O’Neil, author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, illuminates algorithms. O’Neil discusses what goes into creating algorithms and why they may not be as objective as we think, raising questions such as how bias is unintentionally built into and perpetuated by algorithms. O’Neil stresses the importance of transparency and integrity in the use of data.
A PBS interview in which Weapons of Math Destruction author Cathy O’Neil discusses her experience as a Wall Street quant, why she decided to leave the industry, and why she eventually joined the Occupy Wall Street movement, citing the lack of transparency in the current economic system as the culprit behind its downfall.
An NPR interview wherein Weapons of Math Destruction author Cathy O’Neil warns against the dangers of creating a “techno utopia” and provides modern examples of how algorithms have perpetuated inequality.
A C-SPAN discussion between Jennifer Golbeck, director of the Social Intelligence Lab at the University of Maryland, College Park, and Weapons of Math Destruction author Cathy O’Neil, covering the exploitative ways companies use algorithms to increase their profits.
In this animated video produced by The Royal Society for the Encouragement of Arts, Manufactures and Commerce, Cathy O’Neil, author of Weapons of Math Destruction, explains the elements that are used to build an algorithm and how these seemingly objective systems are influenced by the biases of their creators and the historical data they use.
Part one of the NPR TED Radio Hour episode “Can We Trust the Numbers?” in which author Cathy O’Neil talks with Guy Raz about how biases become embedded in algorithms whether or not their creators intend them to be.
A review of Weapons of Math Destruction written for the Center for Digital Ethics and Policy, which was founded through the School of Communication at Loyola University Chicago. The review includes a look into the book’s “idea of a ‘Weapon of Math Destruction,’” the “diversity of damage beyond unfairness and discrimination,” and “the takeaways” from the book.
A review of Weapons of Math Destruction written by Barbara Fister, who says of the book, “Big data is being secretly wielded in ways that increase inequality and unaccountability, and a former quant can tell you what it all means – most entertainingly.”