Refs. | Application area | Method applied | Dataset | Layer size and activation function | Performance evaluation |
|---|---|---|---|---|---|
[47] (2017) | Node Classification | Graph Attention Network (GAT) | CORA | 3-layer GAT with ReLU activation | 76.5% |
[48] (2017) | Traffic prediction | Gated Residual Recurrent Graph Neural Networks | Citation |  ×  | 77.8% |
[49] (2021) | Edge Detection | Sparse Graph Attention Network (GAT) | CORA | 1-layer sparse GAT with ReLU activation | 82.5% |
[50] (2021) | Fault Diagnosis | KNN + GAT | hardware-in-the-loop (HIL) |  ×  | 87.7% |
[47] (2017) | Citation Network Node Classification | GAT | Cora / Citeseer / PubMed | GAT with 64 hidden features (ReLU) | F1-score: 83.0 ± 0.7% / 72.5 ± 0.7% / 79.0 ± 0.3% |
[51] (2021) | Node Prediction | GAT, GATv2 | OGB | LeakyReLU activation function | GAT: 78.1 ± 0.59, GATv2: 78.5 ± 0.38 |
[52] (2019) | Node Embeddings | Signed Graph Attention Network (SiGAT) | Epinions | LeakyReLU activation function | 0.9293 |
[53] (2019) | Node Classification | Heterogeneous Graph Attention Network | IMDB / DBLP / ACM | Random walk-based methods | 10.01 / 84.76 / 64.39 |
[54] (2021) | Node Classification | Hyperbolic Graph Attention Network | Cora / Citeseer / PubMed / Amazon Photo | 8, 16, 32, 64 (number of hidden units in the GNN) | 0.567 / 0.427 / 0.359 / 0.667 |
[55] (2023) | Rumor Detection | GAT and GRU | Weibo and PHEME datasets | Two-layer GAT with 4 attention heads per layer | 97.2% |
[56] (2022) | Disease Prediction | Knowledge Graph Attention Network | Own dataset | Fivefold cross-validation with KGAT | 84.76 |
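Most of the methods in the table build on the same core operation: a GAT layer that scores each neighbour with a shared attention vector, normalizes the scores with softmax, and aggregates the transformed neighbour features. The sketch below is a minimal single-head, dense-adjacency NumPy illustration of that mechanism (following Veličković et al. [47], with the LeakyReLU scoring function several rows above use); the function names and the dense-matrix interface are our own, not from any surveyed implementation.

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    # LeakyReLU with the negative slope used in the original GAT paper
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def gat_layer(H, A, W, a, alpha=0.2):
    """Single-head GAT layer (illustrative sketch).

    H: (N, F) node features
    A: (N, N) adjacency matrix, 1 = edge; self-loops should be included
    W: (F, F2) shared linear transform
    a: (2*F2,) attention vector scoring concatenated node pairs
    Returns the (N, F2) aggregated node representations.
    """
    Wh = H @ W                          # transformed features, shape (N, F2)
    out = np.zeros_like(Wh)
    for i in range(Wh.shape[0]):
        nbrs = np.nonzero(A[i])[0]      # neighbourhood of node i (incl. self)
        # e_ij = LeakyReLU(a^T [Wh_i || Wh_j]) for each neighbour j
        e = np.array([leaky_relu(a @ np.concatenate([Wh[i], Wh[j]]), alpha)
                      for j in nbrs])
        att = softmax(e)                # normalised attention coefficients
        out[i] = att @ Wh[nbrs]         # attention-weighted aggregation
    return out
```

Multi-head variants (e.g., the 4-head layers of [55]) run several such layers in parallel with independent `W` and `a`, then concatenate or average the outputs.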