What is Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias
dc.contributor.advisor | Ioannou, Yani Andrew | |
dc.contributor.author | Mohammadshahi, Aida | |
dc.contributor.committeemember | Far, Behrouz H. | |
dc.contributor.committeemember | Bento, Mariana Pinheiro | |
dc.date | 2025-02 | |
dc.date.accessioned | 2025-01-16T18:01:59Z | |
dc.date.available | 2025-01-16T18:01:59Z | |
dc.date.issued | 2025-01-10 | |
dc.description.abstract | Knowledge Distillation is a commonly used Deep Neural Network (DNN) compression method, which often maintains overall generalization performance. However, we show that even for balanced image classification datasets, such as CIFAR-100, Tiny ImageNet, and ImageNet, as many as 41% of the classes are statistically significantly affected by distillation when comparing class-wise accuracy (i.e., class bias) between a teacher and distilled student, or between a distilled and a non-distilled student model. Changes in class bias are not necessarily an undesirable outcome when considered outside the context of a model’s usage. Using two common fairness metrics, Demographic Parity Difference (DPD) and Equalized Odds Difference (EOD), on models trained with the CelebA, Trifeature, and HateXplain datasets, our results suggest that increasing the distillation temperature improves the distilled student model’s fairness, and that at high temperatures the distilled student’s fairness can even surpass that of the teacher model. This study highlights the uneven effects of distillation on certain classes and its potentially significant role in fairness, emphasizing that caution is warranted when using distilled models for sensitive application domains. | |
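The abstract rests on two standard, well-documented components that can be sketched concretely: the temperature-scaled knowledge-distillation loss of Hinton et al. (2015) and the DPD/EOD fairness metrics it evaluates. The Python/PyTorch sketch below is a minimal illustration of those two pieces under their textbook definitions; it is not the thesis's actual training or evaluation code, and the function names and toy data are hypothetical.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence between temperature-softened teacher and student
    # distributions, scaled by T^2 as in Hinton et al. (2015).
    t = temperature
    soft_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_soft_student = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * t * t

def demographic_parity_difference(y_pred, group):
    # DPD: largest gap in positive-prediction rate across groups.
    rates = [y_pred[group == g].float().mean() for g in group.unique()]
    return (max(rates) - min(rates)).item()

def equalized_odds_difference(y_true, y_pred, group):
    # EOD: largest gap, over groups, in either the true-positive rate
    # or the false-positive rate (fairlearn's convention).
    tprs, fprs = [], []
    for g in group.unique():
        mask = group == g
        tprs.append(y_pred[mask & (y_true == 1)].float().mean())
        fprs.append(y_pred[mask & (y_true == 0)].float().mean())
    return max(max(tprs) - min(tprs), max(fprs) - min(fprs)).item()

# Toy usage with random tensors (assumes every group contains both
# positive and negative examples, which holds for this seed and size).
torch.manual_seed(0)
student, teacher = torch.randn(8, 10), torch.randn(8, 10)
print(distillation_loss(student, teacher, temperature=4.0))
y_true = torch.randint(0, 2, (100,))
y_pred = torch.randint(0, 2, (100,))
group = torch.randint(0, 2, (100,))
print(demographic_parity_difference(y_pred, group))
print(equalized_odds_difference(y_true, y_pred, group))

Raising the temperature argument flattens both softened distributions, which transfers more of the teacher's inter-class similarity structure to the student; the abstract's finding is that this also tends to lower the student's DPD and EOD.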
dc.identifier.citation | Mohammadshahi, A. (2025). What is left after distillation? How knowledge transfer impacts fairness and bias (Master's thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca. | |
dc.identifier.uri | https://hdl.handle.net/1880/120441 | |
dc.language.iso | en | |
dc.publisher.faculty | Schulich School of Engineering | |
dc.publisher.institution | University of Calgary | |
dc.rights | University of Calgary graduate students retain copyright ownership and moral rights for their thesis. You may use this material in any way that is permitted by the Copyright Act or through licensing that has been assigned to the document. For uses that are not allowable under copyright legislation or licensing, you are required to seek permission. | |
dc.subject | Knowledge Distillation | |
dc.subject | Efficient Deep Learning | |
dc.subject | Bias | |
dc.subject | Fairness | |
dc.subject.classification | Artificial Intelligence | |
dc.title | What is Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias | |
dc.type | master thesis | |
thesis.degree.discipline | Engineering – Electrical & Computer | |
thesis.degree.grantor | University of Calgary | |
thesis.degree.name | Master of Science (MSc) | |
ucalgary.thesis.accesssetbystudent | I do not require a thesis withhold – my thesis will have open access and can be viewed and downloaded publicly as soon as possible. |