Approximate Kernel Density Estimation under Metric-based Local Differential Privacy

Yi Zhou, Yanhao Wang, Long Teng, Qiang Huang, Cen Chen

Research output: Contribution to journal › Conference article › peer-review

Abstract

Kernel Density Estimation (KDE) is a fundamental problem with broad machine learning applications. In this paper, we investigate the KDE problem under Local Differential Privacy (LDP), a setting in which users privatize data on their own devices before sending them to an untrusted server for analytics. To strike a balance between ensuring local privacy and preserving high-utility KDE results, we adopt a relaxed definition of LDP based on metrics (mLDP), which is suitable when data points are represented in a metric space and can be more distinguishable as their distances increase. To the best of our knowledge, approximate KDE under mLDP has not been explored in the existing literature. We propose the MLDP-KDE framework, which augments a locality-sensitive hashing-based sketch method to provide mLDP and answer any KDE query unbiasedly within an additive error with high probability in sublinear time and space. Extensive experimental results demonstrate that the MLDP-KDE framework outperforms several existing KDE methods under LDP and mLDP by achieving significantly better trade-offs between privacy and utility, with particularly remarkable advantages on large, high-dimensional data.
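The MLDP-KDE framework itself is not reproduced here. As background for the problem the abstract describes, the following is a minimal non-private kernel density estimate in Python (Gaussian kernel on 1-D points; all names and parameters are illustrative, not from the paper):

```python
import math

def gaussian_kernel(x, y, bandwidth):
    """Gaussian kernel similarity between two 1-D points."""
    return math.exp(-((x - y) ** 2) / (2 * bandwidth ** 2))

def kde(query, data, bandwidth):
    """Kernel density estimate at `query`: the average kernel
    similarity between `query` and every data point."""
    return sum(gaussian_kernel(query, x, bandwidth) for x in data) / len(data)

# Toy dataset held, in the LDP setting, across users' devices.
data = [0.0, 0.5, 1.0, 1.2]
print(kde(0.5, data, bandwidth=0.5))   # high density near the data
print(kde(5.0, data, bandwidth=0.5))   # low density far from the data
```

In the local-privacy setting studied in the paper, the server never sees `data` directly; each user perturbs their point before submission, and the paper's contribution is answering such KDE queries unbiasedly under mLDP in sublinear time and space via a locality-sensitive hashing sketch.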

Original language: English
Pages (from-to): 4250-4270
Number of pages: 21
Journal: Proceedings of Machine Learning Research
Volume: 244
State: Published - 2024
Event: 40th Conference on Uncertainty in Artificial Intelligence, UAI 2024 - Barcelona, Spain
Duration: 15 Jul 2024 - 19 Jul 2024
