We Should Test AI the Way the FDA Tests Medicines

Randomized controlled trials can identify and address unintended consequences.

June 28, 2021

Summary. Predictive algorithms risk creating self-fulfilling prophecies and reinforcing preexisting biases, largely because they do not distinguish between correlation and causation. To prevent this, we should submit new algorithms to randomized controlled trials, similar to those the FDA supervises when approving new drugs. Such trials would let us infer whether an AI is making predictions on the basis of causation.

We would never allow a drug onto the market without rigorous testing, not even in the context of a health crisis like the coronavirus pandemic. Why, then, do we let algorithms that can be just as damaging as a potent drug loose on the world without similarly rigorous testing? At the moment, anyone can design an algorithm and use it to make important decisions about people, whether they get a loan, a job, an apartment, or a prison sentence, without any oversight or any evidence-based requirement. The general population is being used as guinea pigs.
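To make the correlation-versus-causation point concrete, here is a minimal simulation sketch (entirely hypothetical data and numbers, not from the article). A hidden confounder drives both who gets the "treatment" and the outcome; a naive comparison of treated versus untreated groups, which is what a predictive algorithm trained on observational data effectively learns, overstates the effect, while random assignment recovers the true one.

```python
import random

random.seed(0)

TRUE_EFFECT = 2.0  # assumed true causal effect of the treatment (hypothetical)

def outcome(treated, confounder):
    # The outcome depends on both the treatment and a hidden confounder.
    return TRUE_EFFECT * treated + 3.0 * confounder + random.gauss(0, 1)

# Observational data: the confounder also drives treatment assignment,
# so a naive difference in means mixes correlation with causation.
obs = []
for _ in range(10_000):
    c = random.random()
    t = 1 if random.random() < c else 0  # assignment depends on the confounder
    obs.append((t, outcome(t, c)))

# Randomized controlled trial: coin-flip assignment is independent of the
# confounder, so the difference in means estimates the causal effect.
rct = []
for _ in range(10_000):
    c = random.random()
    t = random.randint(0, 1)
    rct.append((t, outcome(t, c)))

def diff_in_means(data):
    treated = [y for t, y in data if t == 1]
    control = [y for t, y in data if t == 0]
    return sum(treated) / len(treated) - sum(control) / len(control)

print(f"naive observational estimate: {diff_in_means(obs):.2f}")  # biased upward
print(f"RCT estimate:                 {diff_in_means(rct):.2f}")  # close to 2.0
```

With these particular (made-up) parameters, the observational estimate lands near 3.0 even though the true effect is 2.0; only the randomized trial isolates causation, which is the article's argument for trialing algorithms before deployment.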