CLUE: Bringing Machine Unlearning to Mobile Devices
Abstract
Class-level machine unlearning has been proposed to address security and privacy issues in deep neural networks (DNNs). However, existing approaches either exhibit low performance or have excessive computation and storage requirements. This makes them inapplicable in mobile computing scenarios, where computation and memory are severely constrained yet unlearning must be performed frequently and effectively. This limitation stems mainly from the use of a retain dataset, i.e., a sub-dataset containing the knowledge that the DNN should preserve after unlearning. In this paper, we propose CLUE, an unlearning algorithm that does not require a retain dataset. Our key idea is to treat inputs from the forget class as out-of-distribution data and to use knowledge distillation to impose this constraint on the updated DNN. We have experimentally evaluated CLUE on ResNet-20, ViT-Base, and ViT-Large DNNs trained on the CIFAR-10, CIFAR-100, and VGGFace2 datasets. We have also implemented CLUE on a Raspberry Pi and compared its power consumption and latency against several existing baselines. We show that CLUE reduces power consumption by 68% and latency by 90% while improving unlearning performance by up to 4.74%.
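To make the key idea concrete, below is a minimal PyTorch sketch of one retain-free unlearning step in the spirit described above: forget-class inputs are treated as out-of-distribution by distilling the updated model toward a uniform soft target on those inputs. The function name, the uniform target, and the temperature are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def unlearn_step(model, forget_batch, optimizer, temperature=2.0):
    """One hypothetical retain-free unlearning step (sketch, not CLUE's exact loss).

    Forget-class inputs are pushed toward a uniform output distribution,
    i.e., the model is distilled to treat them as out-of-distribution data
    carrying no class-specific evidence.
    """
    model.train()
    logits = model(forget_batch)  # (batch, num_classes)

    # Uniform soft target over all classes for forget-class inputs.
    uniform = torch.full_like(logits, 1.0 / logits.size(1))

    # Distillation-style KL loss between temperature-scaled predictions
    # and the uniform target (scaling a uniform target leaves it uniform).
    loss = F.kl_div(
        F.log_softmax(logits / temperature, dim=1),
        uniform,
        reduction="batchmean",
    )

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because this objective touches only forget-class data, no retain dataset needs to be stored or replayed on the device, which is what makes such an approach plausible under mobile memory and compute budgets.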