Federated learning (FL) is a decentralized approach in which each device downloads the current model and computes an updated model locally using its own data, rather than sending that data to a central pool. These locally trained models are then sent from the devices back to the central server, where they are aggregated into a single, consolidated, and improved global model that is sent back to the devices. Federated learning makes it possible for AI algorithms to gain experience from a vast range of data located at different sites.

How Does It Work?

Below is a visual representation of how Federated Learning works.

Your phone personalizes the model locally based on your usage and data (A). Updates from other users of the same model are then aggregated (B) to form a consensus change (C) to the shared model, after which the procedure repeats. FL supports GDPR in the sense that the central server never stores or shares any user's raw data; it receives only what has been learnt from each device's update, combines those updates into an improved model, and sends that model back to the devices. This is a repeating process.
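The round described above can be sketched in code. Below is a minimal, simulated example of the federated averaging idea: each "device" trains a simple linear model on its own data (step A), and the server combines the resulting weights, weighted by how much data each device has (steps B and C). The model, function names, and training settings here are illustrative assumptions, not the production algorithm Google deploys.

```python
import numpy as np

def local_update(global_weights, local_data, local_labels, lr=0.1, epochs=5):
    """Step A: a device refines the shared model on its own data.
    The 'model' is a simple linear regressor trained by gradient descent."""
    w = global_weights.copy()
    for _ in range(epochs):
        preds = local_data @ w
        grad = local_data.T @ (preds - local_labels) / len(local_labels)
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """Steps B-C: the server aggregates the updates, weighting each device
    by how many examples it trained on. No raw data ever leaves a device."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Simulate three devices with different amounts of local data.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])        # the pattern hidden in the data
global_w = np.zeros(2)                # the shared model starts untrained

clients = []
for n in (50, 80, 30):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w))

# Repeat the train-locally / aggregate-centrally round several times.
for _ in range(20):
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])
```

After enough rounds the shared weights approach the underlying pattern, even though the server only ever sees model weights, never the devices' data.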

Where Did It Come From?

FL is a fairly new type of learning, introduced by Google in 2016. Billions of mobile device users worldwide generate enormous amounts of data, which would traditionally be collected in data centres, exposing all of those people's private data. Google therefore created a learning technique in which each device trains the model locally; the results from many devices are then combined into a single model built from all of them, without any raw data being shared.

What Is the USP and Benefit of Using FL?

Federated learning provides a privacy-preserving mechanism to effectively leverage those decentralized compute resources inside end devices to train machine learning models.

How Does It Compare to Centralized Learning?

The problem with centralized learning is that a user's data and privacy can be exploited, or mixed with data from other users' devices; FL eliminates this risk.

The Disadvantage of Federated Learning

Because potentially billions of devices may be training the same model at the same time, and because the process relies on wireless connectivity, the convergence time of the federated learning process may be slower than that of centralized training.

What Is FL Capable of in the Future? How Can It Be Used?

Federated learning has the potential to disrupt cloud computing, the dominant computing paradigm today. Machine learning models can be trained without counting on the compute resources owned by giant AI companies and users will not need to trade their privacy for better services.

Further Reading on Federated Learning

If you would like to learn more about Federated Learning, then have a read of some of these articles below:
