Position: Considerations for Differentially Private Learning with Large-Scale Public Pretraining
Abstract
The performance of differentially private machine learning can be boosted significantly by leveraging the transfer learning capabilities of non-private models pretrained on large public datasets. We critically review this approach. We primarily question whether the use of large Web-scraped datasets should be viewed as differential-privacy-preserving. We further scrutinize whether existing machine learning benchmarks are appropriate for measuring the ability of pretrained models to generalize to sensitive domains. Finally, we observe that reliance on large pretrained models may come at the cost of other forms of privacy, since it requires data to be outsourced to a more compute-powerful third party.