XAI-driven Knowledge Distillation of Large Language Models for Efficient Deployment on Low-Resource Devices

Riccardo Cantini, Alessio Orsino, Domenico Talia

Last updated on Dec 5, 2025

Keywords: Large Language Models, Sustainable AI, eXplainable Artificial Intelligence, Knowledge Distillation, Low-Resource Devices, Machine Learning