
Federated Learning

A distributed training approach that enables AI models to learn from decentralized data without sharing raw data between parties.

What Is Federated Learning?

Federated learning is a machine learning approach where a model is trained across multiple decentralized devices or servers holding local data, without exchanging the raw data itself. Instead, each participant trains the model locally and shares only the model updates (gradients or weights), which are aggregated to improve the global model. This preserves data privacy while enabling collaborative learning across organizations or devices.

The process typically follows a cycle: a central server distributes the current model to participants, each participant trains on their local data, the resulting model updates are sent back and aggregated (commonly through federated averaging), and the improved model is redistributed. This cycle repeats until the model converges.
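The cycle above can be sketched in a few lines. This is a minimal simulation, not a production system: it assumes a simple linear model trained by gradient descent, three simulated participants holding disjoint data, and the standard federated averaging rule (each client's model weighted by its local sample count). All names and parameters here are illustrative.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One participant: train the distributed model on local data only."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w, len(y)  # send back the updated weights, never the raw data

def federated_average(updates):
    """Server: aggregate updates, weighting by local sample count."""
    total = sum(n for _, n in updates)
    return sum(w * (n / total) for w, n in updates)

# Three participants each hold private data generated by y = 2 * x.
rng = np.random.default_rng(0)
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 1))
    clients.append((X, X @ np.array([2.0])))

global_w = np.zeros(1)
for _ in range(30):  # distribute, train locally, aggregate, repeat
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates)

print(global_w)  # converges toward the true coefficient, 2.0
```

Note that only `(weights, sample_count)` pairs cross the network; the raw `(X, y)` data never leaves each client.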

Privacy and Regulatory Benefits

Federated learning directly addresses data sovereignty and privacy regulations like GDPR, which restrict data movement across borders and organizations. Hospitals can collaboratively train diagnostic models without sharing patient records. Financial institutions can build fraud detection systems without exposing transaction data. The approach is often combined with differential privacy and secure aggregation for additional protection.
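Of the two complementary protections mentioned, differential-privacy-style noising is the simpler to illustrate. The sketch below shows the core client-side step: clip the update's L2 norm, then add Gaussian noise before transmission. The `clip_norm` and `noise_std` values are purely illustrative; real deployments calibrate the noise to a target (epsilon, delta) privacy budget.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_std=0.5, rng=None):
    """Clip an update's L2 norm, then add Gaussian noise before sending.

    Bounding the norm limits any one record's influence; the noise makes
    the transmitted update differentially private. Parameters here are
    illustrative, not calibrated to a real privacy budget.
    """
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / norm) if norm > 0 else update
    return clipped + rng.normal(0.0, noise_std, size=update.shape)

update = np.array([3.0, 4.0])  # L2 norm 5.0, so it gets scaled down
noisy = privatize_update(update, rng=np.random.default_rng(0))
```

Secure aggregation addresses a different leak: it cryptographically combines client updates so the server sees only their sum, and it is typically layered on top of noising like the above.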

Enterprise Challenges

Implementing federated learning in practice involves addressing data heterogeneity (participants may have very different data distributions), communication efficiency (model updates must be transmitted over networks every round), and security concerns (model updates can potentially leak information about training data). Despite these challenges, federated learning is becoming essential in industries where sharing data is legally restricted or competitively sensitive but collaborative model improvement would benefit all parties.
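For the communication-efficiency challenge, one widely used family of techniques compresses updates before transmission. The sketch below shows top-k sparsification, one such technique (the function name and the choice of k are illustrative): only the k largest-magnitude entries of an update are kept, so the client can send a few (index, value) pairs instead of the full vector.

```python
import numpy as np

def top_k_sparsify(update, k):
    """Zero out all but the k largest-magnitude entries of an update,
    so only k (index, value) pairs need to be transmitted."""
    sparse = np.zeros_like(update)
    idx = np.argsort(np.abs(update))[-k:]  # indices of the k largest
    sparse[idx] = update[idx]
    return sparse

update = np.array([0.1, -2.0, 0.05, 1.5, -0.3])
compressed = top_k_sparsify(update, k=2)  # keeps only -2.0 and 1.5
```

In real systems this is usually paired with error feedback (accumulating the dropped residual locally for the next round) so the compression does not bias the aggregate.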
