
AI GLOSSARY

Federated Learning

Learning Paradigms

A training approach in which a model is trained across many decentralized devices or servers, each holding its own local data, without that data ever being centralized or shared. Only model updates, not raw data, are sent to a central server for aggregation. Federated learning is particularly valuable in privacy-sensitive contexts, such as training on medical records or personal phone data. Because updates are shared, the approach remains vulnerable to gradient leakage and data poisoning attacks through the update-sharing mechanism.
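The aggregation step described above can be sketched with federated averaging (FedAvg), a common baseline. This is a minimal illustrative simulation, not a production implementation: each "client" here is just a private data shard, the model is a linear regressor, and the function names (`local_update`, `fedavg`) are hypothetical. The key property shown is that only weight vectors, never the raw `(X, y)` data, reach the server.

```python
# Minimal FedAvg sketch: clients train locally on private data and
# send only model weights to the server, which averages them
# weighted by each client's sample count.
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local training: gradient descent on squared error."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg(w, clients):
    """Server step: weighted average of client weights; raw data stays local."""
    total = sum(len(y) for _, y in clients)
    return sum(len(y) / total * local_update(w, X, y) for X, y in clients)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
# Three clients, each holding a private shard of synthetic data.
clients = []
for n in (30, 50, 20):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=n)))

w = np.zeros(2)
for _ in range(20):   # communication rounds
    w = fedavg(w, clients)
print(w)              # should land near true_w
```

Note that the server only ever sees `w`; the vulnerability mentioned above arises because these shared updates can still encode information about the private shards.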
See also: federated analytics, differential privacy, gradient leakage.
