SLIDES PHIL1001 W09b - Algorithms & Decisions, Part 2 (2024)
Agenda
1. Recap/revision
1.1. A.I. Algorithms & Human Decisions.
[Figure: a sample machine-learning setup. Training data pairs the inputs (Test Scores, Name of School, Address/Suburb, Parents' Income) with the known outcome "Did they graduate?"]
5/2/24
Some Areas Where Machine Learning Algorithms Are Used for Decision-Making
[Figure: the trained model predicts "Likelihood of Graduating?" from the inputs: Test Scores, Name of School, Extra-Curricular Activities, After-School Job, Address/Suburb, Parents' Income]
(a) Mom was not qualified for the job, but was hired on the basis
of a characteristic that was arbitrary from the P.O.V. of the job.
(b) This was unfair to more qualified candidates, & thus, wrong.
(c) This phenomenon happens all the time, even if not publicly
acknowledged. (Hirers may not even be aware.)
“Algorithmic Bias”
Example 1:
Amazon’s Recruitment
Algorithm
Dastin, Jeffrey (2018). “Amazon Scraps Secret AI Recruiting Tool That Showed
Bias Against Women”. Reuters.
https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G [Accessed 2/5/2022]
Example 2:
A Health-Care Risk-Prediction Algorithm
(b) It predicted that, on average, black and white patients had similar levels
of risk, and so they were referred to the program in similar proportions.
(c) But it was later found that, in fact, a much greater proportion of black
patients than white patients had serious health problems.
(d) So, where a black and a white patient were equally sick,
the white patient was generally assigned a higher risk score,
and was more likely to be referred to the program.
Ledford, Heidi (2019). “Millions Affected by Racial Bias in Health Care Algorithm.”
Nature 574: 608-609. https://www.nature.com/articles/d41586-019-03228-6
(e) The bias was the result of measuring degree of sickness in terms of
how much money was spent on the patient's health care.
Question:
How could these algorithms
be biased if data about
gender & race was excluded?
[Figure: the proxy inputs highlighted: Address/Suburb and Parents' Income]
Johnson (2020).
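The question above can be made concrete with a toy simulation. The sketch below is purely illustrative, with fabricated data and invented group/suburb labels (it is not any real system's data or method): it "trains" the simplest possible model, per-suburb base rates, without ever seeing group membership, yet its predictions still differ by group because suburb correlates with group.

```python
# A minimal, fabricated sketch of "proxy bias": even when the protected
# attribute (here, "group") is excluded from the model's inputs, a
# correlated feature (suburb) can encode it. All numbers are invented
# for illustration only.
import random

random.seed(0)

people = []
for _ in range(1000):
    group = random.choice(["A", "B"])
    # Residential segregation: each group mostly lives in one suburb.
    if group == "A":
        suburb = "north" if random.random() < 0.9 else "south"
    else:
        suburb = "south" if random.random() < 0.9 else "north"
    # Historically biased outcomes, independent of individual merit.
    outcome = random.random() < (0.7 if group == "A" else 0.3)
    people.append((group, suburb, outcome))

# "Train" the simplest possible model on suburb alone.
# Group membership is never an input.
rate = {}
for s in ("north", "south"):
    outcomes = [o for g, sub, o in people if sub == s]
    rate[s] = sum(outcomes) / len(outcomes)

# Average predicted score still differs by group, because suburb
# acts as a proxy for it.
avg = {}
for g in ("A", "B"):
    preds = [rate[sub] for grp, sub, _ in people if grp == g]
    avg[g] = sum(preds) / len(preds)
print(avg)
```

The point of the sketch: removing the sensitive column does not remove the information, so long as another column (address, parents' income, school) statistically stands in for it.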
• In sum: each person has a right to be treated as an individual.
______ (2020). "Algorithms, Agency, and Respect for Persons".
Social Theory and Practice 46(3): 547-572.
• The case of Wisconsin v. Loomis
COMPAS: Correctional Offender Management Profiling for Alternative Sanctions.
[Figure: excerpts from the questionnaire given to the defendant]
Aaron
Darren
Section 3.3
Agency Laundering
______ (2020). "Algorithms, Agency, and Respect for Persons".
Social Theory and Practice 46(3): 547-572.
(Especially pages 556-558 and 561-564.)