M-estimation, noisy optimization and user-level local privacy

With Marco Avella Medina (Columbia University)

We propose a general optimization-based framework for computing differentially private M-estimators. We first show that robust statistics can be used in conjunction with noisy gradient descent or noisy Newton methods to obtain optimal private estimators with global linear or quadratic convergence, respectively. We establish local and global convergence guarantees, under both local strong convexity and self-concordance, showing that our private estimators converge with high probability to a small neighborhood of the nonprivate M-estimators. We then extend this optimization framework to the more restrictive setting of local differential privacy (LDP), where a group of users communicates with an untrusted central server. In contrast to most work, which aims to protect a single data point, we assume that each user possesses multiple data points and focus on user-level privacy, which protects a user's entire set of data points. Our main algorithm is a noisy gradient descent procedure combined with a user-level LDP mean estimation step that privately computes the average gradient across users at each iteration. We will highlight the challenges of guaranteeing user-level LDP and present finite-sample global linear convergence guarantees for the iterates of our algorithm.
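
As a rough illustration of the first part of the abstract, here is a minimal sketch of noisy gradient descent for private M-estimation in the central model: per-sample gradients are clipped (the robust-statistics ingredient that bounds sensitivity) and Gaussian noise is added to each averaged gradient before the step. The function names, the clipping radius `c`, and the noise scale `sigma` are illustrative assumptions rather than the authors' implementation, and the calibration of `sigma` to a target (epsilon, delta) budget is omitted.

```python
import numpy as np

def clip(v, c):
    """Project v onto the L2 ball of radius c (bounds per-sample sensitivity)."""
    norm = np.linalg.norm(v)
    return v if norm <= c else v * (c / norm)

def noisy_gd(grad_fn, data, theta0, eta=0.1, c=1.0, sigma=1.0, T=100, rng=None):
    """Noisy gradient descent sketch: average clipped per-sample gradients,
    then perturb the step with Gaussian noise.

    grad_fn(theta, x) returns the gradient of the loss at one data point x.
    sigma should be calibrated to (epsilon, delta) via the Gaussian mechanism;
    that calibration is deliberately left out of this sketch.
    """
    rng = rng if rng is not None else np.random.default_rng()
    n = len(data)
    theta = theta0.copy()
    for _ in range(T):
        g = np.mean([clip(grad_fn(theta, x), c) for x in data], axis=0)
        # Replacing one data point moves the clipped-gradient mean by at most 2c/n.
        noise = rng.normal(0.0, 2 * c * sigma / n, size=theta.shape)
        theta = theta - eta * (g + noise)
    return theta
```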

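For the user-level LDP setting, a correspondingly naive sketch: each user averages the gradients over their own data points, clips the result, and randomizes it locally before it ever reaches the untrusted server, which then simply averages the noisy reports and takes a step. The paper's actual user-level LDP mean estimation procedure is more refined than this plain Gaussian randomizer; all names below (`user_report`, `ldp_noisy_gd`, `sigma`) are hypothetical placeholders.

```python
import numpy as np

def user_report(grad_fn, theta, user_data, c, sigma, rng):
    """One user's locally private report: average the gradient over the user's
    own points, clip to radius c, and add noise before leaving the device,
    so the entire set of the user's data points is protected at once."""
    g = np.mean([grad_fn(theta, x) for x in user_data], axis=0)
    norm = np.linalg.norm(g)
    g = g if norm <= c else g * (c / norm)
    return g + rng.normal(0.0, sigma, size=g.shape)

def ldp_noisy_gd(grad_fn, users, theta0, eta=0.1, c=1.0, sigma=1.0, T=100, seed=0):
    """Server loop: average the users' noisy gradient reports and take a step.
    The server never sees any raw gradient, only the randomized reports."""
    rng = np.random.default_rng(seed)
    theta = theta0.copy()
    for _ in range(T):
        reports = [user_report(grad_fn, theta, ud, c, sigma, rng) for ud in users]
        theta = theta - eta * np.mean(reports, axis=0)
    return theta
```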