Our research analyses the trade-off between preserving privacy and maintaining statistical accuracy for multivariate data subject to componentwise local differential privacy (CLDP). Under CLDP, each component of the private data is made public through a separate privacy channel. This allows varying levels of privacy protection across components, or the privatization of each component by a different entity, each with its own privacy policy. It also covers practical situations where it is impossible to jointly privatize all components of the raw data. We develop general techniques for establishing minimax bounds that shed light on the statistical cost of privacy in this context, as a function of the privacy levels $\alpha_1, \dots , \alpha_d$ of the $d$ components, and we demonstrate the versatility and efficiency of these techniques through various statistical applications. Additionally, we conduct a detailed analysis of the effective privacy level, exploring how information about a private characteristic of an individual may be inferred from the publicly visible characteristics of the same individual.
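To make the componentwise setting concrete, here is a minimal sketch of one possible CLDP mechanism: each component, assumed bounded in $[0,1]$, is released through its own Laplace channel calibrated to its privacy level $\alpha_j$. This is an illustrative assumption, not necessarily the mechanism studied in the talk; the function name `cldp_laplace` and the boundedness assumption are ours.

```python
import numpy as np

def cldp_laplace(x, alphas, seed=None):
    """Privatize each component x[j], assumed to lie in [0, 1], through its
    own Laplace channel at privacy level alphas[j].

    Illustrative sketch only: for a component bounded in [0, 1] the
    sensitivity is 1, so adding Laplace noise of scale 1/alpha_j makes the
    j-th channel alpha_j-locally differentially private.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    alphas = np.asarray(alphas, dtype=float)
    # Componentwise noise: each coordinate gets its own scale 1/alpha_j,
    # so components can carry different privacy guarantees.
    return x + rng.laplace(scale=1.0 / alphas, size=x.shape)

# Three components with heterogeneous privacy levels alpha = (0.5, 1, 2):
# smaller alpha_j means stronger privacy, hence noisier output.
z = cldp_laplace([0.2, 0.7, 0.5], alphas=[0.5, 1.0, 2.0], seed=0)
print(z.shape)
```

A smaller $\alpha_j$ forces a larger noise scale on that component, which is exactly the accuracy cost that the minimax bounds quantify as a function of $\alpha_1, \dots, \alpha_d$.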
The talk is based on joint work with A. Gloter.