What Is Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias

dc.contributor.advisor: Ioannou, Yani Andrew
dc.contributor.author: Mohammadshahi, Aida
dc.contributor.committeemember: Far, Behrouz H.
dc.contributor.committeemember: Bento, Mariana Pinheiro
dc.date: 2025-02
dc.date.accessioned: 2025-01-16T18:01:59Z
dc.date.available: 2025-01-16T18:01:59Z
dc.date.issued: 2025-01-10
dc.description.abstract: Knowledge Distillation is a commonly used Deep Neural Network (DNN) compression method, which often maintains overall generalization performance. However, we show that even for balanced image classification datasets, such as CIFAR-100, Tiny ImageNet, and ImageNet, as many as 41% of the classes are statistically significantly affected by distillation when comparing class-wise accuracy (i.e., class bias) between a teacher/distilled student or distilled student/non-distilled student model. Changes in class bias are not necessarily an undesirable outcome when considered outside of the context of a model’s usage. Using two common fairness metrics, Demographic Parity Difference (DPD) and Equalized Odds Difference (EOD) on models trained with the CelebA, Trifeature, and HateXplain datasets, our results suggest that increasing the distillation temperature improves the distilled student model’s fairness, and the distilled student fairness can even surpass the fairness of the teacher model at high temperatures. This study highlights the uneven effects of distillation on certain classes and its potentially significant role in fairness, emphasizing that caution is warranted when using distilled models for sensitive application domains.
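The abstract's two key ingredients can be sketched in a few lines: the temperature-scaled softmax used in Hinton-style knowledge distillation (where a higher temperature T softens the teacher's output distribution) and the Demographic Parity Difference metric. This is a minimal illustrative sketch, not the thesis's implementation; function names, the alpha weighting, and the T² scaling convention are standard KD practice rather than details taken from this record.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T flattens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, label, T=4.0, alpha=0.5):
    """Weighted sum of hard-label cross-entropy and the KL divergence
    between temperature-softened teacher and student distributions.
    The soft term is scaled by T**2 to keep gradient magnitudes stable."""
    p_student = softmax(student_logits)            # T = 1 for the hard loss
    hard_ce = -math.log(p_student[label])
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft_kl = sum(t * math.log(t / s) for t, s in zip(p_t, p_s))
    return alpha * hard_ce + (1 - alpha) * (T ** 2) * soft_kl

def demographic_parity_difference(y_pred, groups):
    """DPD: largest gap in positive-prediction rate across groups."""
    by_group = {}
    for pred, g in zip(y_pred, groups):
        by_group.setdefault(g, []).append(pred)
    rates = [sum(v) / len(v) for v in by_group.values()]
    return max(rates) - min(rates)
```

Note how raising T flattens the teacher's soft targets, which is the mechanism the abstract links to improved student fairness; libraries such as fairlearn provide production implementations of DPD and EOD.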
dc.identifier.citation: Mohammadshahi, A. (2025). What is left after distillation? How knowledge transfer impacts fairness and bias (Master's thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca.
dc.identifier.uri: https://hdl.handle.net/1880/120441
dc.language.iso: en
dc.publisher.faculty: Schulich School of Engineering
dc.publisher.institution: University of Calgary
dc.rights: University of Calgary graduate students retain copyright ownership and moral rights for their thesis. You may use this material in any way that is permitted by the Copyright Act or through licensing that has been assigned to the document. For uses that are not allowable under copyright legislation or licensing, you are required to seek permission.
dc.subject: Knowledge Distillation
dc.subject: Efficient Deep Learning
dc.subject: Bias
dc.subject: Fairness
dc.subject.classification: Artificial Intelligence
dc.title: What Is Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias
dc.type: master thesis
thesis.degree.discipline: Engineering – Electrical & Computer
thesis.degree.grantor: University of Calgary
thesis.degree.name: Master of Science (MSc)
ucalgary.thesis.accesssetbystudent: I do not require a thesis withhold – my thesis will have open access and can be viewed and downloaded publicly as soon as possible.
Files
Original bundle
Name: ucalgary_2025_mohammadshahi_aida.pdf
Size: 6.25 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 2.62 KB
Format: Item-specific license agreed to upon submission