Fig. 2 From: Federated Freeze BERT for text classification. The proposed architecture of C-FedFreezeBERT. All clients share the same set of BERT parameters. Each client passes its local data through BERT to obtain embeddings and sends these embeddings to the server, which uses them to fully train the aggregation architecture. After training is complete, the server sends the final weights of the aggregation architecture back to all clients.
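The client–server flow in the figure can be sketched as follows. This is a minimal illustrative stand-in, not the paper's implementation: `client_embed` substitutes a fixed random projection for the shared frozen BERT encoder, and the server-side "aggregation architecture" is reduced to a simple logistic-regression head trained on the pooled embeddings; all function names and shapes are hypothetical.

```python
import numpy as np

def client_embed(n_samples, dim, seed):
    # Stand-in for the frozen, shared BERT encoder: every client applies
    # the same fixed mapping, so only embeddings (and labels), never raw
    # text, are uploaded to the server.
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n_samples, dim))
    y = (X[:, 0] > 0).astype(float)  # synthetic binary labels
    return X, y

def server_train_head(X, y, lr=0.5, steps=200):
    # The server fully trains the aggregation head on the pooled
    # embeddings (here: plain logistic regression via gradient descent).
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

# Three clients encode their local data and upload the embeddings.
parts = [client_embed(50, 8, seed=s) for s in range(3)]
X = np.vstack([p[0] for p in parts])
y = np.concatenate([p[1] for p in parts])

# The server trains the head once, then broadcasts (w, b) back to all
# clients, who use it on top of their local frozen BERT for inference.
w, b = server_train_head(X, y)
acc = (((X @ w + b) > 0).astype(float) == y).mean()
```

Because BERT stays frozen and identical on every client, only one centralized training pass over the embeddings is needed, rather than iterative federated rounds of full-model updates.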