Discriminating Data
Product details
- ISBN: 9780262548526
- Dimensions: 152 x 229 mm
- Publication Date: 05 Mar 2024
- Publisher: MIT Press Ltd
- Publication Country: United States
- Product Form: Paperback
- Language: English
Delivery/Collection within 10-20 working days
How big data and machine learning encode discrimination and create agitated clusters of comforting rage.
In Discriminating Data, Wendy Hui Kyong Chun reveals how polarization is a goal—not an error—within big data and machine learning. These methods, she argues, encode segregation, eugenics, and identity politics through their default assumptions and conditions. Correlation, which grounds big data’s predictive potential, stems from twentieth-century eugenic attempts to “breed” a better future. Recommender systems foster angry clusters of sameness through homophily. Users are “trained” to become authentically predictable via a politics and technology of recognition. Machine learning and data analytics thus seek to disrupt the future by making disruption impossible.
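The homophily mechanism described above can be made concrete with a small sketch. The code below is an editorial illustration, not anything from the book: a handful of toy users with invented preference vectors are each linked to their single most similar peer, and the resulting network immediately settles into like-with-like pairs, a miniature of the "clusters of sameness" that similarity-driven recommender systems scale up.

```python
# A minimal sketch (not from the book) of homophily-driven linking:
# connect each user only to the most similar other user, and the
# network splits into like-with-like clusters.

# Toy users: binary preference vectors (hypothetical data).
users = {
    "a": (1, 1, 0, 0), "b": (1, 1, 1, 0),
    "c": (0, 0, 1, 1), "d": (0, 1, 1, 1),
}

def similarity(u, v):
    # Fraction of preferences two users share.
    return sum(x == y for x, y in zip(u, v)) / len(u)

# "Recommend" each user the single most similar other user.
links = {}
for name, vec in users.items():
    links[name] = max(
        (other for other in users if other != name),
        key=lambda other: similarity(vec, users[other]),
    )

print(links)  # {'a': 'b', 'b': 'a', 'c': 'd', 'd': 'c'} -- two closed pairs
```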
Chun, who has a background in systems design engineering as well as media studies and cultural theory, explains that although machine learning algorithms may not officially include race as a category, they embed whiteness as a default. Facial recognition technology, for example, relies on the faces of Hollywood celebrities and university undergraduates—groups not famous for their diversity. Homophily emerged as a concept to describe white U.S. resident attitudes to living in biracial yet segregated public housing. Predictive policing technology deploys models trained on studies of predominantly underserved neighborhoods. Trained on selected and often discriminatory or dirty data, these algorithms are only validated if they mirror this data.
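The validation problem in that last sentence can also be sketched. In the toy example below (an editorial construction with invented numbers, not data from the book), "historical" records over-represent one neighborhood simply because that is where patrols were concentrated; a model that does nothing but mirror the skew still scores around 90 percent "accurate" when checked against more of the same records.

```python
# A minimal sketch (an illustration, not the book's method) of why a
# model trained on skewed records looks "valid": the held-out data
# carries the same skew, so mirroring the skew scores well.
import random

random.seed(0)

# Hypothetical records: ~90% come from neighborhood "A" because that
# is where patrols went, not because of any underlying difference.
records = [("A" if random.random() < 0.9 else "B") for _ in range(1000)]
train, test = records[:800], records[800:]

# "Model": always predict whichever neighborhood dominates training.
majority = max(set(train), key=train.count)

accuracy = sum(r == majority for r in test) / len(test)
print(f"predict '{majority}' every time -> {accuracy:.0%} 'accurate'")
```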
How can we release ourselves from the vice-like grip of discriminatory data? Chun calls for alternative algorithms, defaults, and interdisciplinary coalitions in order to desegregate networks and foster a more democratic big data.
Wendy Hui Kyong Chun is Simon Fraser University's Canada 150 Research Chair in New Media, Professor of Communication, and Director of the SFU Digital Democracies Institute. She is the author of Control and Freedom, Programmed Visions, and Updating to Remain the Same, all published by the MIT Press.
Alex Barnett is Group Leader for Numerical Analysis at the Center for Computational Mathematics at the Flatiron Institute in New York. He has published more than 50 research papers in scientific computing, differential equations, fluids, waves, imaging, physics, neuroscience, and statistics.
