TY - JOUR
T1 - Approximate Kernel Density Estimation under Metric-based Local Differential Privacy
AU - Zhou, Yi
AU - Wang, Yanhao
AU - Teng, Long
AU - Huang, Qiang
AU - Chen, Cen
N1 - Publisher Copyright:
© 2024 Proceedings of Machine Learning Research. All rights reserved.
PY - 2024
Y1 - 2024
N2 - Kernel Density Estimation (KDE) is a fundamental problem with broad machine learning applications. In this paper, we investigate the KDE problem under Local Differential Privacy (LDP), a setting in which users privatize data on their own devices before sending them to an untrusted server for analytics. To strike a balance between ensuring local privacy and preserving high-utility KDE results, we adopt a relaxed definition of LDP based on metrics (mLDP), which is suitable when data points are represented in a metric space and can be more distinguishable as their distances increase. To the best of our knowledge, approximate KDE under mLDP has not been explored in the existing literature. We propose the MLDP-KDE framework, which augments a locality-sensitive hashing-based sketch method to provide mLDP and answer any KDE query unbiasedly within an additive error with high probability in sublinear time and space. Extensive experimental results demonstrate that the MLDP-KDE framework outperforms several existing KDE methods under LDP and mLDP by achieving significantly better trade-offs between privacy and utility, with particularly remarkable advantages on large, high-dimensional data.
UR - https://www.scopus.com/pages/publications/85212180290
M3 - Conference article
AN - SCOPUS:85212180290
SN - 2640-3498
VL - 244
SP - 4250
EP - 4270
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
T2 - 40th Conference on Uncertainty in Artificial Intelligence, UAI 2024
Y2 - 15 July 2024 through 19 July 2024
ER -