We introduce Tritium, an automatic differentiation-based sensitivity analysis
framework for differentially private (DP) machine learning (ML). Optimal noise
calibration in this setting requires efficient Jacobian matrix computations and
tight bounds on the L2-sensitivity. Our framework achieves these objectives by
relying on a functional analysis-based method for sensitivity tracking, which
we briefly outline. This approach interoperates seamlessly with static
graph-based automatic differentiation, enabling order-of-magnitude
improvements in compilation times compared to previous work. Moreover, we
demonstrate that optimising the sensitivity of the entire computational graph
at once yields substantially tighter estimates of the true sensitivity compared
to interval bound propagation techniques. Our work complements recent
developments in DP, such as individual privacy accounting, which aim to offer
improved privacy-utility trade-offs, and represents a step towards integrating
accessible machine learning tooling with advanced privacy accounting systems.
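
To make the noise-calibration point concrete, the sketch below shows, under our own assumptions rather than Tritium's actual API, how automatic differentiation yields a Jacobian whose spectral norm bounds the local L2-sensitivity of a released quantity, and how that bound calibrates Gaussian noise. All names here (per_example_fn, l2_sensitivity_bound, calibrate_sigma) are illustrative; note also that a true L2-sensitivity bound must hold over all inputs, which is what the paper's functional analysis-based tracking addresses.

```python
# Minimal sketch (not Tritium's API): autodiff-based local sensitivity
# estimation followed by Gaussian-mechanism noise calibration.
import jax
import jax.numpy as jnp

def per_example_fn(x):
    # Toy differentiable computation standing in for one example's
    # contribution to a released statistic.
    w = jnp.array([0.5, -1.0, 2.0])
    return jnp.tanh(w * x)

def l2_sensitivity_bound(fn, x):
    # Jacobian of the released quantity w.r.t. the example's input.
    # Its spectral norm (largest singular value) bounds how far the
    # output can move per unit L2 change in the input -- a *local*
    # Lipschitz estimate, not a worst-case global bound.
    J = jax.jacobian(fn)(x)
    return jnp.linalg.norm(J, ord=2)

def calibrate_sigma(sensitivity, epsilon, delta):
    # Classical Gaussian-mechanism calibration (valid for epsilon < 1):
    # sigma >= sensitivity * sqrt(2 ln(1.25/delta)) / epsilon.
    return sensitivity * jnp.sqrt(2.0 * jnp.log(1.25 / delta)) / epsilon

x = jnp.array([0.1, -0.3, 0.7])
sens = l2_sensitivity_bound(per_example_fn, x)
sigma = calibrate_sigma(sens, epsilon=0.5, delta=1e-5)
noisy = per_example_fn(x) + sigma * jax.random.normal(jax.random.PRNGKey(0), (3,))
```

JAX is used here only because its function-transformation style resembles the static graph-based automatic differentiation the abstract refers to.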
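Separately, a toy calculation (again our own, not drawn from the paper) illustrates why bounding each node of a graph in isolation, as interval-style propagation does, can be far looser than analysing the composed graph at once: the product of per-layer norms can badly overestimate the norm of the composition.

```python
import jax.numpy as jnp

A = jnp.array([[1.0, 0.0], [0.0, 0.1]])   # first linear layer
B = jnp.array([[0.1, 0.0], [0.0, 1.0]])   # second linear layer

# Bounding layer by layer: ||B||_2 * ||A||_2 = 1.0 * 1.0 = 1.0.
layerwise_bound = jnp.linalg.norm(B, ord=2) * jnp.linalg.norm(A, ord=2)

# Analysing the whole composition at once: ||B @ A||_2 = 0.1,
# ten times tighter than the per-layer product.
end_to_end = jnp.linalg.norm(B @ A, ord=2)

print(layerwise_bound, end_to_end)
```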