Privacy-Preserving FL: Beyond 'Data Never Leaves the Device'
"Your data never leaves your device"—the classic federated learning pitch. While technically true (raw data stays local), this statement masks a subtle reality: model updates can leak private information.
Gradient updates, aggregated statistics, and even model predictions can reveal sensitive training data through reconstruction attacks, membership inference, or model inversion. True privacy in federated learning requires rigorous mathematical guarantees, not just architectural promises.
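To make the leakage concrete, here is a minimal sketch (not Octomil code, just a toy model of our own construction): for a single linear layer with squared loss and a batch of one, the weight gradient is the prediction error times the input, and the bias gradient is the error alone, so a server that sees both gradients can divide them to recover the raw input exactly.

```python
import numpy as np

# Toy setup: one linear layer y = w.x + b trained with L = 0.5*(y - y_true)^2.
# All names here are illustrative, not part of any real FL framework.
rng = np.random.default_rng(0)
x_private = rng.normal(size=4)   # the client's "private" training example
w = rng.normal(size=4)
b = 0.0
y_true = 1.0

# The client computes gradients locally; in FL, these are what get shared.
y_pred = w @ x_private + b
err = y_pred - y_true            # dL/dy_pred
grad_w = err * x_private         # dL/dw: error scaled by the raw input
grad_b = err                     # dL/db: the error alone

# A curious server can invert the update: x = grad_w / grad_b.
x_recovered = grad_w / grad_b
print(np.allclose(x_recovered, x_private))  # → True
```

Real models and larger batches make exact recovery harder, but gradient-based reconstruction attacks extend the same idea, which is why the updates themselves must be protected, not just the raw data.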
This post explores the privacy landscape in FL and how Octomil implements provable privacy protections.