AI in Society's Systems
Algorithmic systems have been embedded into hiring, healthcare, criminal justice, housing, and credit — often with documented discriminatory outcomes. These three guides follow the evidence: who gets harmed, why, and what legal rights and practical recourse exist.
44.9%: false positive rate for Black defendants in COMPAS risk scoring
75%: of résumés rejected before a human ever reads them
0.4s: for an algorithm to review and deny a 91-year-old's care
3 guides in this series
Algorithmic Bias and Civil Rights: When AI Gets It Wrong
Algorithms are making decisions about bail, housing, credit, and healthcare — and their mistakes fall hardest on communities least able to fight back. The cases, the data, and your rights.
AI in Hiring and the Workplace: Know Your Rights
Algorithms are screening your résumé, scoring your video interview, and managing your hours — often with documented bias. What you need to know and what you can do about it.
AI and Healthcare Decisions: When Algorithms Deny Your Care
AI is making decisions about your medical care — often without your knowledge, often wrong, and almost always difficult to appeal. What's happening, how to fight it, and what the law requires.
Want CPAI to present this content to your community?
We deliver research-based AI education for advocacy organizations, community groups, and policy audiences.