%0 Journal Article
%T Heavy-Head Sampling Strategy of Graph Convolutional Neural Networks for q-Consistent Summary-Explanations with Application to Credit Evaluation Systems
%A Xinrui Dou
%J Open Access Library Journal
%V 10
%N 9
%P 1-17
%@ 2333-9721
%D 2023
%I Open Access Library
%R 10.4236/oalib.1110615
%X Machine learning systems have found extensive application as auxiliary tools in domains that require critical decision-making, such as healthcare and criminal justice. The interpretability of these systems' decisions is paramount for instilling trust among users. Recent work has developed the globally-consistent rule-based summary-explanation and its max-support (MSGC) problem, which provide explanations for specific decisions together with pertinent dataset statistics. Nonetheless, globally-consistent summary-explanations with limited complexity tend to have small supports, if any. In this study, we propose a more lenient variant, the q-consistent summary-explanation, which strives for greater support at the expense of slightly reduced consistency. The challenge is that the max-support problem of the q-consistent summary-explanation (MSqC) is significantly more complex than the original MSGC problem, leading to long solution times with standard branch-and-bound (B&B) solvers. We improve the B&B solving process by replacing time-consuming heuristics with machine learning (ML) models, and we apply a heavy-head sampling strategy for imitation learning on MSqC problems by exploiting the heavy-head maximum-depth distribution of B&B solution trees. Experimental results show that, with the heavy-head sampling strategy, the final evaluation results of the trained strategies on MSqC problems improve significantly over previous studies that used uniform sampling strategies.
%K Summary-Explanation
%K q-Consistent
%K Branch-and-Bound
%K Heavy-Head Sampling Strategy
%U http://www.oalib.com/paper/6803519