Introducing a Python package for generating counterfactual explanations for tree-based models
The importance of interpretability in machine learning models is growing as they are increasingly applied in real-world scenarios. Understanding how models make decisions benefits not only the models' users but also those who are affected by those decisions. Counterfactual explanations were developed to address this issue: they show individuals how they could achieve a desirable outcome by perturbing their original data. In the short term, a counterfactual explanation can suggest actionable steps to those affected by a machine learning model's decision. For example, a person whose loan application was rejected could learn what they could have done differently to be accepted, which would help them improve their next application.
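To make the idea concrete, here is a minimal sketch of a counterfactual search (this is an illustration of the concept only, not the FOCUS algorithm): we train a small decision tree on toy "loan" data, then brute-force search over a grid of perturbations for the closest point that flips the prediction to the desirable class. The data, features, and search procedure are all made up for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy "loan" data: features are (income, debt), both scaled to [0, 1];
# label 1 = approved. Entirely synthetic, for illustration only.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))
y = (X[:, 0] - X[:, 1] > 0).astype(int)

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

def counterfactual(x, model, step=0.05, max_radius=1.0):
    """Brute-force search over a grid of perturbations of x for the
    closest point the model classifies as the desired class (1)."""
    best, best_dist = None, np.inf
    deltas = np.arange(-max_radius, max_radius + step, step)
    for dx in deltas:
        for dy in deltas:
            cand = np.clip(x + np.array([dx, dy]), 0, 1)
            if model.predict(cand.reshape(1, -1))[0] == 1:
                dist = np.linalg.norm(cand - x)
                if dist < best_dist:
                    best, best_dist = cand, dist
    return best, best_dist

x_rejected = np.array([0.2, 0.8])  # an applicant the model rejects
x_cf, dist = counterfactual(x_rejected, model)
print("original prediction:", model.predict(x_rejected.reshape(1, -1))[0])
print("counterfactual:", x_cf, "distance:", round(dist, 3))
```

Brute force like this scales exponentially with the number of features; FOCUS instead frames the search as a differentiable optimisation problem, which is what makes it practical for real models.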
Lucic et al. [1] proposed FOCUS, which is designed to generate counterfactual explanations of optimal distance from the original data for all instances in tree-based machine learning models.
CFXplorer is a Python package that generates counterfactual explanations for a given model and data using the FOCUS algorithm. This article introduces CFXplorer and demonstrates how it can be used to generate counterfactual explanations.
GitHub repo: https://github.com/kyosek/CFXplorer
Documentation: https://cfxplorer.readthedocs.io/en/latest/
PyPI: https://pypi.org/project/CFXplorer/
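The basic workflow looks roughly like the sketch below: fit a tree-based model on features scaled to [0, 1], then pass it to CFXplorer to obtain perturbed versions of each instance. The `Focus` class name, the `num_iter` parameter, and the `generate(model, X)` signature are assumptions based on the package documentation; check the docs linked above for the exact API of your installed version.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Synthetic data with features in [0, 1], as FOCUS expects scaled inputs.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(100, 4))
y = (X.sum(axis=1) > 2).astype(int)
model = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

try:
    from cfxplorer import Focus  # pip install CFXplorer

    # Hyperparameter values here are assumptions for illustration.
    focus = Focus(num_iter=100)
    X_cf = focus.generate(model, X)  # perturbed features, one per instance
except ImportError:
    X_cf = None  # CFXplorer not installed; see the PyPI link above
```

If generation succeeds, `X_cf` has the same shape as `X`, with each row perturbed toward the opposite predicted class.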
Table of contents:
- FOCUS algorithm
- CFXplorer examples
- Limitations
- Conclusion
- References
This section briefly introduces the FOCUS algorithm.
The generation of counterfactual explanations is a problem that has been addressed by a number of existing methods. Wachter, Mittelstadt, and Russell [2] formulated this problem as an optimisation framework; however, this…