
What Will Happen When Your Company’s Algorithms Go Wrong?

Harvard Business

It’s inevitable. 2015: image-tagging software classified black people as gorillas. 2015: a robot for grabbing auto parts grabbed and killed a man. 2015: a medical AI classified patients with asthma as being at lower risk of dying of pneumonia. Check your algorithms for racial, gender, age, and other common biases.
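The blurb urges checking algorithms for bias without showing what a check looks like. Below is a minimal sketch of one common audit, a demographic-parity check, in plain Python; the audit data, group labels, and the 0.2 tolerance are all hypothetical, and a real audit would use your model's actual decisions per protected group.

```python
# Minimal sketch: demographic-parity check on a model's decisions.
# All data, group labels, and the tolerance below are hypothetical.
from collections import defaultdict

def positive_rates(decisions):
    """Share of positive outcomes per group.

    decisions: iterable of (group, outcome) pairs, outcome in {0, 1}.
    """
    totals, positives = defaultdict(int), defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        positives[group] += outcome
    return {g: positives[g] / totals[g] for g in totals}

def parity_gap(rates):
    """Largest difference in positive rates between any two groups."""
    return max(rates.values()) - min(rates.values())

# Hypothetical audit data: (group, model decision).
audit = [("A", 1), ("A", 1), ("A", 0), ("B", 1), ("B", 0), ("B", 0)]
rates = positive_rates(audit)
gap = parity_gap(rates)
print(rates, gap)  # e.g. A: 0.67, B: 0.33 -> gap 0.33

if gap > 0.2:  # hypothetical tolerance
    print("Warning: outcomes differ across groups; investigate before shipping.")
```

Parity in positive rates is only one lens; a fuller audit would also compare error rates per group (false positives and false negatives), since a model can pass this check while still failing some groups more often.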


3 Biases That Hijack Performance Reviews, and How to Address Them

Harvard Business

Similarity bias: we like what is like us. The major downside is that we sometimes use superficial proxies like skin color or gender to decide who’s “one of us” and who’s not. To counter it, a manager can keep a tally of how many meetings a person leads and how many times the person tag-teams with a coworker on a project.


How to Prepare the Next Generation for Jobs in the AI Economy

Harvard Business

AI technologies face ethical dilemmas all the time: for example, how to exclude racial, ethnic, and gender prejudices from automated decisions, or how a self-driving car should balance the lives of its occupants against those of pedestrians. Ethics also deserves more attention at every educational level.
