Mean decrease in impurity (MDI)
The importance of a feature is computed as the (normalized) total reduction of the impurity criterion brought by that feature
Cons: favours high-cardinality features over low-cardinality features, even when the latter carry information the high-cardinality features cannot substitute for
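A minimal sketch of this cardinality bias, assuming scikit-learn's `RandomForestClassifier` and a small synthetic dataset (the feature names and noise levels below are illustrative): a binary feature drives the slightly noisy label, yet a purely random high-cardinality column typically still captures a sizeable share of the MDI importance, because fully grown trees can keep splitting on it to fit the label noise.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 1_000

informative_binary = rng.integers(0, 2, size=n)   # low cardinality, actually drives the label
noise_high_card = rng.normal(size=n)              # high cardinality, pure noise

# label = binary feature, with ~10% of the labels flipped
y = np.where(rng.random(n) < 0.9, informative_binary, 1 - informative_binary)
X = np.column_stack([informative_binary, noise_high_card])

clf = RandomForestClassifier(random_state=0).fit(X, y)

# MDI importances: the noise column tends to get far more credit than it deserves
print(dict(zip(["informative_binary", "noise_high_card"], clf.feature_importances_)))
```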
In tree ensembles (e.g. random forests), the ensemble importance is often obtained by simply averaging the impurity-based importances of the individual trees
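A minimal sketch of that averaging, assuming scikit-learn's `RandomForestClassifier` on the iris dataset: the ensemble's `feature_importances_` should match the (renormalised) mean of the per-tree impurity importances.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Average the per-tree normalised MDI importances, then renormalise
per_tree = np.array([tree.feature_importances_ for tree in forest.estimators_])
averaged = per_tree.mean(axis=0)
averaged /= averaged.sum()

print(np.allclose(forest.feature_importances_, averaged))  # True with scikit-learn's implementation
```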
Difference between permutation feature importance and Gini importance: Gini (impurity-based) importance is determined during training, from how much each feature's splits reduce impurity inside the trees, whereas permutation importance is measured post hoc by shuffling a feature's values and recording the resulting drop in performance, which makes it model-agnostic (the model is treated as a black box).
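A minimal sketch of the two measures side by side, assuming scikit-learn and the breast-cancer dataset (dataset choice and split are illustrative): MDI comes for free from the fitted forest, while `permutation_importance` is computed afterwards, here on held-out data.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Gini / MDI importance: a by-product of training, read off the splits inside the trees
mdi = model.feature_importances_

# Permutation importance: post hoc and model-agnostic; shuffle one column at a time
# on held-out data and measure how much the score drops
perm = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

print(mdi[:5])
print(perm.importances_mean[:5])
```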