Table 3 Evaluation metrics used

From: A comprehensive evaluation of ensemble learning for stock-market prediction

Acronym | Full name | Formula
RMSE | Root mean squared error | \(RMSE = \sqrt{\frac{1}{n}\sum_{i = 1}^{n} \left( t_{i} - y_{i} \right)^{2}}\)
MAE | Mean absolute error | \(MAE = \frac{1}{n}\sum_{i = 1}^{n} \left| t_{i} - y_{i} \right|\)
F1 | F1-score | \(F1 = \frac{2 \times PRE \times REC}{PRE + REC}\)
RMSLE | Root mean squared logarithmic error | \(RMSLE = \sqrt{MSE\left( \log (y_{n} + 1),\ \log (\hat{y}_{n} + 1) \right)}\)
ACC | Accuracy | \(ACC = \frac{TN + TP}{FP + TP + TN + FN}\)
REC | Recall | \(REC = \frac{TP}{TP + FN}\)
PRE | Precision | \(PRE = \frac{TP}{TP + FP}\)
AUC | Area under the ROC curve | \(AUC = \int_{0}^{1} \frac{TP}{TP + FN}\, d\frac{FP}{FP + TN} = \int_{0}^{1} \frac{TP}{P}\, d\frac{FP}{N}\)
MedAE | Median absolute error | \(MedAE\left( y, \hat{y} \right) = median\left( \left| y_{1} - \hat{y}_{1} \right|, \ldots, \left| y_{n} - \hat{y}_{n} \right| \right)\)
EVS | Explained variance score |
Mean | Mean |
STD | Standard deviation |
  1. Where \(y_{i}\) is the predicted value produced by the model, \(t_{i}\) is the actual value, and \(n\) is the total number of samples in the test dataset. TP, TN, FP, and FN denote the numbers of true-positive, true-negative, false-positive, and false-negative values, respectively
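For readers who want to reproduce the regression and classification metrics above, the short Python sketch below implements the formulas directly with NumPy. It is an illustrative example rather than the authors' code; the function and variable names (regression_metrics, classification_metrics, y_true, y_pred) are introduced here for convenience. AUC is omitted because it is most conveniently computed from continuous scores, e.g. with sklearn.metrics.roc_auc_score.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """RMSE, MAE, RMSLE, MedAE, and EVS as in Table 3 (y_true = t_i, y_pred = y_i)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    return {
        "RMSE": np.sqrt(np.mean(err ** 2)),
        "MAE": np.mean(np.abs(err)),
        # RMSLE assumes non-negative values, as is the case for prices.
        "RMSLE": np.sqrt(np.mean((np.log(y_true + 1) - np.log(y_pred + 1)) ** 2)),
        "MedAE": np.median(np.abs(err)),
        # Standard definition of the explained variance score (as in scikit-learn).
        "EVS": 1.0 - np.var(err) / np.var(y_true),
    }

def classification_metrics(y_true, y_pred):
    """ACC, REC, PRE, and F1 from confusion-matrix counts (binary labels in {0, 1})."""
    y_true = np.asarray(y_true, dtype=int)
    y_pred = np.asarray(y_pred, dtype=int)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    acc = (tp + tn) / (tp + tn + fp + fn)
    rec = tp / (tp + fn)
    pre = tp / (tp + fp)
    f1 = 2 * pre * rec / (pre + rec)
    return {"ACC": acc, "REC": rec, "PRE": pre, "F1": f1}

# Illustrative calls with made-up values.
print(regression_metrics([10.0, 12.5, 11.2], [9.8, 12.9, 11.0]))
print(classification_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1]))
```

Mean and STD in the table are ordinary descriptive statistics (np.mean, np.std) and are therefore not repeated in the sketch.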