Despite its many useful theoretical properties, differential privacy (DP) has
one substantial blind spot: any release that non-trivially depends on
confidential data without additional privacy-preserving randomization fails to
satisfy DP. This requirement is rarely met in practice, as most data releases
under DP are in fact “partially private” data (PPD). This mismatch poses a
significant barrier to accounting for privacy risk and utility under the
logistical constraints imposed on data curators, especially those working with
official statistics. In
this paper, we propose a privacy definition that accommodates PPD and prove
that it retains properties analogous to those of standard DP. We derive
optimal-transport-based mechanisms for releasing PPD that satisfy our
definition, along with algorithms for valid statistical inference using PPD,
and we demonstrate their improved performance over post-processing methods.
Finally, we apply these methods to a case study
on US Census and CDC PPD to investigate private COVID-19 infection rates. In
doing so, we show how data curators can use our framework to overcome barriers
to operationalizing formal privacy while offering users greater transparency
and accountability.