Notes on the Bretagnolle-Huber Inequality

The Bretagnolle-Huber inequality bounds the total variation distance between two probability distributions in terms of their Kullback-Leibler divergence. It is tighter than Pinsker's inequality when the KL divergence exceeds two, and it is never vacuous, as shown in this figure from [Canonne2023]. The following is an elementary proof, which … More Notes on the Bretagnolle-Huber Inequality
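The comparison in the excerpt can be checked numerically. A minimal sketch, using the standard forms of the two bounds — Pinsker gives TV ≤ √(KL/2), which is vacuous once KL ≥ 2, while Bretagnolle-Huber gives TV ≤ √(1 − e^(−KL)), which is always strictly below 1:

```python
import math

def pinsker_bound(kl: float) -> float:
    # Pinsker's inequality: TV(P, Q) <= sqrt(KL / 2).
    # Exceeds 1 (hence vacuous, since TV <= 1) once KL >= 2.
    return math.sqrt(kl / 2)

def bretagnolle_huber_bound(kl: float) -> float:
    # Bretagnolle-Huber inequality: TV(P, Q) <= sqrt(1 - exp(-KL)).
    # Strictly below 1 for every finite KL, so never vacuous.
    return math.sqrt(1 - math.exp(-kl))

# Compare the two bounds across a range of KL values.
for kl in (0.5, 1.0, 2.0, 5.0):
    print(f"KL={kl}: Pinsker={pinsker_bound(kl):.3f}, "
          f"BH={bretagnolle_huber_bound(kl):.3f}")
```

For small KL, Pinsker is tighter; past the crossover, Bretagnolle-Huber wins and stays informative where Pinsker has already saturated.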

Improving the Quality of the Responsible AI Conversations

I have been incredibly frustrated with the lack of quality and content in many responsible AI (RAI) conversations. Almost all the (non-academic) RAI meetings I attended these past 12 months involved the speakers repeating words like fairness, accountability, and transparency for basically the entire duration of the meeting, with everyone nodding furiously in agreement about … More Improving the Quality of the Responsible AI Conversations