Fig. 2 | Journal of Big Data

From: Federated Freeze BERT for text classification

The proposed architecture of C-FedFreezeBERT. All clients share the same set of BERT parameters. Each client passes its local data through BERT to obtain embeddings for that data, then sends these embeddings to the server, which uses them to fully train the aggregation architecture. Once training is complete, the server sends the final weights of the aggregation architecture back to all clients.
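The round described in the caption can be sketched as follows. This is a minimal NumPy toy, not the paper's implementation: the frozen encoder is a fixed random projection standing in for BERT, and the "aggregation architecture" is replaced by a simple softmax head trained with gradient descent; all names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
SEQ_DIM, EMB_DIM, N_CLASSES = 16, 8, 2

# Shared frozen encoder: a stand-in for BERT with frozen weights.
# Every client holds the identical parameter matrix W_frozen.
W_frozen = rng.normal(size=(SEQ_DIM, EMB_DIM))

def encode(x):
    # Frozen forward pass: clients map raw inputs to embeddings.
    return np.tanh(x @ W_frozen)

# Step 1: each client embeds its local data and sends only the
# embeddings (never the raw data) to the server.
client_data = [rng.normal(size=(20, SEQ_DIM)) for _ in range(3)]
client_lbls = [rng.integers(0, N_CLASSES, size=20) for _ in range(3)]
server_embs = np.vstack([encode(x) for x in client_data])
server_lbls = np.concatenate(client_lbls)

# Step 2: the server fully trains the aggregation head on the pooled
# embeddings (a toy softmax classifier here, standing in for the
# paper's aggregation architecture).
W_head = np.zeros((EMB_DIM, N_CLASSES))
onehot = np.eye(N_CLASSES)[server_lbls]
for _ in range(200):
    logits = server_embs @ W_head
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    W_head -= 0.5 * server_embs.T @ (p - onehot) / len(server_embs)

# Step 3: the server broadcasts the trained head weights back, so
# every client ends the round with identical aggregation weights.
client_heads = [W_head.copy() for _ in client_data]
```

Because BERT stays frozen, only the small head is trained, and it is trained centrally on embeddings rather than federated gradient averaging, which is what distinguishes this centralized (C-) variant.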
