Introducing a Python package for generating counterfactual explanations for tree-based algorithms
The importance of interpretability in machine learning models is growing as they are increasingly applied in real-world scenarios. Understanding how models make decisions benefits not only the models' users but also those who are affected by the decisions the models make. Counterfactual explanations were developed to address this issue: they allow individuals to understand how they could obtain a desirable outcome by perturbing their original data. In practical terms, a counterfactual explanation can demonstrate actionable steps to those affected by a machine learning model's decision. For example, a person whose loan application was rejected could learn what they could have done to be accepted, which would be useful when improving their next application.
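To make the loan example concrete, here is a toy illustration of what a counterfactual explanation is. The approval rule and all numbers below are invented for illustration; they are not part of CFXplorer or FOCUS.

```python
# Toy illustration of a counterfactual explanation: a made-up loan-approval
# rule and the small change to the input that flips a rejection into an
# approval.

def approve_loan(income: float, debt: float) -> bool:
    """Approve when income is high enough relative to outstanding debt."""
    return income - 0.5 * debt >= 30_000

applicant = {"income": 40_000, "debt": 25_000}
original = approve_loan(**applicant)  # rejected: 40000 - 12500 = 27500 < 30000

# A counterfactual perturbs the original features just enough to change the
# outcome -- here, "reduce your debt by 5,000".
counterfactual = {"income": 40_000, "debt": 20_000}
flipped = approve_loan(**counterfactual)  # approved: 40000 - 10000 = 30000
```

The counterfactual is actionable precisely because it stays close to the applicant's original data: it names a small, specific change rather than a different applicant.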
Lucic et al. proposed FOCUS, which is designed to generate optimal-distance counterfactual explanations from the original data for all instances in tree-based machine learning models.
CFXplorer is a Python package that generates counterfactual explanations for a given model and data using the FOCUS algorithm. This article introduces CFXplorer and showcases how it can be used to generate counterfactual explanations.
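As a hedged sketch of what calling the package looks like: the import path `cfxplorer`, the class name `Focus`, and the `generate(model, X)` method below follow the project README at the time of writing and are assumptions here; check the repository linked below for the current interface and the available hyperparameters.

```python
def generate_counterfactuals(model, X):
    """Return FOCUS counterfactuals for each row of X via CFXplorer.

    Assumed API: `Focus` with default hyperparameters and a `generate`
    method taking a fitted tree-based model and a feature matrix.
    """
    from cfxplorer import Focus  # pip install CFXplorer

    focus = Focus()              # default FOCUS hyperparameters
    return focus.generate(model, X)
```

In this sketch, `model` would be a fitted tree-based model (e.g. a scikit-learn decision tree or ensemble) and the return value a perturbed copy of `X` whose rows receive the opposite prediction.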
GitHub repo: https://github.com/kyosek/CFXplorer
- FOCUS algorithm
- CFXplorer examples
This section briefly introduces the FOCUS algorithm.
Generating counterfactual explanations is a problem that has been addressed by several existing methods. Wachter, Mittelstadt, and Russell formulated this problem as an optimisation framework; however, this…
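The core trick described in the FOCUS paper is to relax each hard tree split into a sigmoid, so the otherwise non-differentiable ensemble admits gradient-based search for counterfactuals. The following is a minimal sketch of that idea for a single one-node "tree"; the threshold, sharpness constant, and search loop are illustrative assumptions, not the package's actual implementation.

```python
import math

# Sketch of FOCUS's relaxation: the hard split 1[x > theta] is approximated
# by sigmoid(sigma * (x - theta)), which is differentiable in x, so a
# counterfactual can be found by gradient steps on the input itself.

THETA = 5.0  # split threshold of a one-node "tree" (illustrative)
SIGMA = 4.0  # sharpness of the sigmoid approximation (illustrative)

def hard_split(x: float) -> float:
    """The original, non-differentiable tree split."""
    return 1.0 if x > THETA else 0.0

def soft_split(x: float) -> float:
    """Differentiable approximation of the split above."""
    return 1.0 / (1.0 + math.exp(-SIGMA * (x - THETA)))

def find_counterfactual(x0: float, target: float = 1.0,
                        lr: float = 0.5, steps: int = 200) -> float:
    """Move x by gradient steps until the soft prediction crosses 0.5."""
    x = x0
    for _ in range(steps):
        p = soft_split(x)
        if (p > 0.5) == (target > 0.5):
            break  # predicted class now matches the target class
        # d soft_split / dx = SIGMA * p * (1 - p); step toward the target
        grad = SIGMA * p * (1.0 - p)
        x += lr * grad if target > 0.5 else -lr * grad
    return x

x_cf = find_counterfactual(4.0)  # start on the "0" side of the split
```

In the full algorithm this relaxation is applied to every split in every tree of the ensemble, and the optimisation objective also penalises the distance between the counterfactual and the original instance, which this sketch omits.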