When dealing with complex and redundant data classification problems, many classifiers cannot provide both high predictive accuracy and interpretability. We also find that least-squares support vector classifiers (LSSVCs) can hardly identify the important instances and features in data, so they cannot give interpretable predictions. Although the LSSVC has the properties of low bias and high robustness, its high variance often leads to poor predictive performance. In this paper, we propose an alternating minimization-based sparse least-squares classifier (AMSLC) approach in the framework of LSSVCs to address these problems. Based on the reconstructed row-wise and column-wise kernel matrices, a sparsity-inducing ℓ0-norm approximation function is introduced into the LSSVC model. By alternately solving two unconstrained quadratic optimization problems, or equivalently two systems of linear equations, AMSLC predicts the class labels of given instances and extracts the fewest important instances and features needed to obtain an interpretable classification. Experimental results on four real credit datasets show that, compared with the SVC, LSSVC, ℓ1-norm SVC (L1SVC), ℓ0-norm SVC (L0SVC), the least absolute shrinkage and selection operator classifier (LASSOC), and the multiple kernel learning SVC (MKLSVC), the proposed AMSLC method generally obtains the best predictive accuracy and an interpretable classification with the minimum number of important instances and features.
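The alternating scheme summarized above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact AMSLC formulation: it assumes a simplified bilinear model with a linear kernel, a hard-thresholding rule as a crude stand-in for the ℓ0-norm approximation, and made-up parameter values (`gamma`, `lam`, `thresh`). Each alternating step reduces to solving one system of linear equations, mirroring the structure described in the abstract.

```python
import numpy as np

def amslc_sketch(X, y, gamma=1.0, lam=1e-2, thresh=0.1, n_iter=5):
    """Illustrative alternating-minimization sketch for a sparse
    least-squares classifier (assumptions, not the paper's exact model)."""
    n, d = X.shape
    s = np.ones(d)          # column-wise (feature) weights
    alpha = np.zeros(n)     # row-wise (instance) coefficients
    for _ in range(n_iter):
        # Step 1: fix s, solve one linear system for alpha.
        K = (X * s) @ X.T   # reconstructed row-wise kernel matrix
        alpha = np.linalg.solve(K + np.eye(n) / gamma, y)
        # Hard-threshold small coefficients: a crude l0-style sparsification.
        alpha[np.abs(alpha) < thresh * np.abs(alpha).max()] = 0.0
        # Step 2: fix alpha, solve one linear system for s.
        M = X * (X.T @ alpha)               # column-wise design: f = M @ s
        s = np.linalg.solve(M.T @ M + lam * np.eye(d), M.T @ y)
        s = np.maximum(s, 0.0)              # keep the kernel PSD (sketch choice)
    f = (X * s) @ X.T @ alpha               # final decision values
    return np.sign(f), alpha, s
```

Instances with `alpha[i] == 0` and features with `s[j] == 0` drop out of the decision function, which is how this kind of row- and column-wise sparsity yields an interpretable classifier.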