Stochastic Gradient Descent Algorithms for Machine Unlearning
Machine unlearning algorithms aim to efficiently remove the influence of specific data from a trained model without retraining it from scratch, whether to remove corrupted or outdated data or to respect a user's "right to be forgotten." Certified machine unlearning provides a strong theoretical guarantee, based on differential privacy, that quantifies the probabilistic indistinguishability between the unlearned model and the model retrained on the retained data samples. While several works have proposed unlearning algorithms using second-order methods or full-batch gradient descent, stochastic gradient descent (SGD) algorithms are far more computationally tractable. We introduce new SGD algorithms for certified machine unlearning, emphasizing the importance of black-box algorithms that can be applied to trained models without costly precomputation.
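The idea can be illustrated with a minimal sketch, which is not the authors' algorithm: one common template for SGD-based unlearning is to fine-tune the trained model on the retained data only and then perturb the result with Gaussian noise, which is a standard route to a differential-privacy-style indistinguishability certificate. The toy problem, step counts, and noise scale below are all hypothetical choices for illustration.

```python
import random

random.seed(0)

# Toy 1-D problem: estimate the mean of a dataset by minimizing squared loss.
data = [random.gauss(5.0, 1.0) for _ in range(200)]

def sgd(w, samples, steps=2000, lr=0.05, noise=0.0):
    """Plain SGD on the loss (w - x)^2 / 2; optionally perturb the final
    iterate with Gaussian noise (hypothetical certification mechanism)."""
    for _ in range(steps):
        x = random.choice(samples)
        w -= lr * (w - x)  # gradient of (w - x)^2 / 2 with respect to w
    return w + noise * random.gauss(0.0, 1.0)

w_full = sgd(0.0, data)                # model trained on the full dataset
retained = data[20:]                   # delete the first 20 samples
# "Unlearn": fine-tune the already-trained model on retained data, add noise.
w_unlearned = sgd(w_full, retained, steps=500, noise=0.01)
# Reference model: retrain from scratch on the retained data only.
w_retrain = sgd(0.0, retained)

print(abs(w_unlearned - w_retrain))    # gap vs. the retrained reference
```

The fine-tuning phase starts from the trained model rather than from scratch, which is what makes this kind of procedure "black-box": it needs only the model parameters and the retained data, with no precomputed statistics from training.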
Author(s):
Siqiao Mu | PhD Candidate | Northwestern University
Diego Klabjan | Professor | Northwestern University
Category
Abstract Submission
Description
Primary Track: Data Analytics and Information Systems