Fig. 1 From: Federated Freeze BERT for text classification

The proposed architecture of D-FedFreezeBERT. All clients share the same set of BERT parameters. In each communication round, contributing clients locally train only their aggregation architecture and then send it to the server. The server aggregates these aggregation architectures with any federated learning algorithm and sends the result back to the contributing clients.
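The round described in the caption can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `Client` class, `fedavg` helper, and the choice of FedAvg as the server-side algorithm are all assumptions made for the example; the caption only requires that *some* federated learning algorithm aggregates the clients' aggregation-architecture parameters while the shared BERT stays frozen on each client.

```python
# Hypothetical sketch of one D-FedFreezeBERT communication round.
# Only the small aggregation-architecture parameters travel to the
# server; the frozen, shared BERT weights never leave the clients.

class Client:
    """Toy client holding only its aggregation-architecture parameters."""

    def __init__(self, params):
        self.params = dict(params)

    def train_local(self):
        # Placeholder for local training of the aggregation architecture
        # on top of the frozen BERT encoder.
        return self.params

    def set_params(self, params):
        # Receive the aggregated parameters broadcast by the server.
        self.params = dict(params)


def fedavg(client_params):
    """Average corresponding parameters across clients (FedAvg, assumed)."""
    n = len(client_params)
    keys = client_params[0].keys()
    return {k: sum(p[k] for p in client_params) / n for k in keys}


def communication_round(clients):
    # Each contributing client trains its aggregation architecture
    # locally, then sends only those parameters to the server.
    updates = [c.train_local() for c in clients]
    # The server aggregates the aggregation architectures and sends
    # the result back to the contributing clients.
    global_params = fedavg(updates)
    for c in clients:
        c.set_params(global_params)
    return global_params
```

For example, two clients whose aggregation heads hold `{"w": 1.0}` and `{"w": 3.0}` end the round both holding `{"w": 2.0}`.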