How to open the black box – individual prediction explanation

Publication details

  • Event: (Trondheim)
  • Year: 2020
  • Organiser: Department of Mathematical Sciences, NTNU

Why did you, of all people, get a rejection on your loan application? Why is the price of your car insurance higher than your neighbor's? More and more such decisions are made by complex statistical/machine learning models trained on relevant data. Such (regression) models are often referred to as "black boxes" because it is difficult to understand how they work and how they produce different predictions. As these methods become increasingly consequential for individuals in our society, there is a clear need for methods that help us understand their predictions, that is, to "open the black box". In this talk, I will motivate why this is useful and important. I will then discuss how Shapley values from game theory can be used as an explanation framework. To explain predictions correctly, it is crucial to model the dependence between the covariates. I will exemplify this by showing that even a simple linear regression model is difficult to explain when the covariates are highly dependent. Finally, I will present recent work and methodology for modeling such dependence, and show how it leads to more accurate explanations within the Shapley value framework.
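As a minimal illustration of the point about dependent covariates (my own sketch, not material from the talk), the snippet below computes exact Shapley values for a linear model f(x) = β₀ + βᵀx using the conditional contribution function v(S) = E[f(X) | X_S = x_S]. It assumes Gaussian features, so the conditional expectations are available in closed form. With independent features, each Shapley value reduces to the feature's own term βⱼ(xⱼ − μⱼ); once correlation is introduced, credit is shared across features even though the model itself is unchanged.

```python
import itertools
from math import factorial

import numpy as np


def cond_mean(mu, Sigma, known, x_known):
    """E[X_unknown | X_known = x_known] for a multivariate Gaussian."""
    d = len(mu)
    unknown = [i for i in range(d) if i not in known]
    mu, Sigma = np.asarray(mu), np.asarray(Sigma)
    if not known:
        return mu[unknown]
    S_uk = Sigma[np.ix_(unknown, known)]
    S_kk = Sigma[np.ix_(known, known)]
    return mu[unknown] + S_uk @ np.linalg.solve(S_kk, x_known - mu[known])


def shapley_linear(beta0, beta, x, mu, Sigma):
    """Exact Shapley values for f(x) = beta0 + beta @ x, with contribution
    v(S) = E[f(X) | X_S = x_S] under a Gaussian feature distribution."""
    d = len(x)

    def v(S):
        S = list(S)
        out = [i for i in range(d) if i not in S]
        xe = np.empty(d)
        xe[S] = x[S]
        if out:
            xe[out] = cond_mean(mu, Sigma, S, x[S])  # impute by conditioning
        return beta0 + beta @ xe

    phi = np.zeros(d)
    for j in range(d):
        others = [i for i in range(d) if i != j]
        for r in range(d):
            for S in itertools.combinations(others, r):
                w = factorial(len(S)) * factorial(d - len(S) - 1) / factorial(d)
                phi[j] += w * (v(S + (j,)) - v(S))  # weighted marginal contribution
    return phi


beta0, beta = 0.0, np.array([1.0, 1.0])
x, mu = np.array([1.0, 0.0]), np.zeros(2)

# Independent features: each feature is credited only its own term.
print(shapley_linear(beta0, beta, x, mu, np.eye(2)))          # [1.0, 0.0]

# Correlated features (rho = 0.9): credit shifts, although f is unchanged.
Sigma = np.array([[1.0, 0.9], [0.9, 1.0]])
print(shapley_linear(beta0, beta, x, mu, Sigma))              # [1.45, -0.45]
```

In both cases the Shapley values sum to f(x) − E[f(X)] = 1 (the efficiency property), but under dependence the second feature receives a negative contribution purely because its observed value is unusually low given the first. This is the sense in which even a linear model becomes hard to explain when covariates are strongly dependent.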