When Is It Important for an Algorithm to Explain Itself?
Harvard Business Review
JULY 6, 2018
Many efforts to apply machine learning get stuck due to concerns about the “black box” — that is, the lack of transparency around why a system does what it does. Sometimes this is because people want to understand why some prediction was made before they take life-altering actions, as when a computer vision system indicates a 95% likelihood of cancer from an x-ray of a patient’s lung.