Per-class results by approach/model:

| Evaluation Measure | RNN-LSTM (class 0) | RNN-LSTM (class 1) | RNN-biLSTM (class 0) | RNN-biLSTM (class 1) | CNN (class 0) | CNN (class 1) |
|---|---|---|---|---|---|---|
| Precision | 0.90 | 0.72 | 0.88 | 0.70 | 0.84 | 0.60 |
| Recall | 0.91 | 0.68 | 0.91 | 0.64 | 0.90 | 0.46 |
| F1-Score | 0.90 | 0.70 | 0.90 | 0.67 | 0.87 | 0.52 |

Overall results by approach/model:

| Evaluation Measure | RNN-LSTM | RNN-biLSTM | CNN |
|---|---|---|---|
| Accuracy | 0.85 | 0.84 | 0.79 |
| Macro average | 0.80 | 0.78 | 0.69 |
| Weighted average | 0.85 | 0.84 | 0.78 |
| Approximate loss | 0.6 | 1.2 | 1.7 |
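As a sanity check on the reported aggregates, the macro averages above are consistent with the unweighted mean of the two per-class F1 scores (e.g. RNN-LSTM: (0.90 + 0.70) / 2 = 0.80). A minimal sketch, using only the F1 values read from the table (the weighted averages cannot be re-derived here because the table does not report class supports):

```python
# Per-class F1 scores (class 0, class 1) read from the table above.
f1_scores = {
    "RNN-LSTM":   (0.90, 0.70),
    "RNN-biLSTM": (0.90, 0.67),
    "CNN":        (0.87, 0.52),
}

for model, (f1_class0, f1_class1) in f1_scores.items():
    # Macro average: unweighted mean of the per-class scores.
    macro_f1 = (f1_class0 + f1_class1) / 2
    print(f"{model}: macro F1 = {macro_f1:.2f}")
```

Running this reproduces the reported macro averages of 0.80, 0.78, and 0.69 (the latter two after rounding).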