From: Model fusion of deep neural networks for anomaly detection
Operation layer      Size and activation
Dense                128 (ReLU)
Dropout              0.5
Dense                64 (ReLU)
Dropout              0.2
Dense                32 (ReLU)
Dense (classes)      2, 4, or 5 (Softmax)
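The table above can be sketched as a plain NumPy forward pass, assuming an arbitrary input feature count (the paper does not specify it here); all function and parameter names below are illustrative, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    # Fully connected layer: x @ w + b
    return x @ w + b

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dropout(x, rate, training=False):
    # Dropout is active only during training; at inference it is a no-op
    if not training or rate == 0.0:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)  # inverted dropout scaling

def build_params(n_features, n_classes):
    # Layer widths from the table: input -> 128 -> 64 -> 32 -> classes
    sizes = [n_features, 128, 64, 32, n_classes]
    return [(rng.standard_normal((i, o)) * 0.01, np.zeros(o))
            for i, o in zip(sizes[:-1], sizes[1:])]

def forward(x, params, training=False):
    # Dropout rates from the table: 0.5 after the 128-unit layer,
    # 0.2 after the 64-unit layer, none after the 32-unit layer
    rates = [0.5, 0.2, 0.0]
    for (w, b), rate in zip(params[:-1], rates):
        x = dropout(relu(dense(x, w, b)), rate, training)
    w, b = params[-1]
    return softmax(dense(x, w, b))
```

For example, with a hypothetical 10-feature input and the 5-class output head, `forward(rng.random((4, 10)), build_params(10, 5))` yields a `(4, 5)` matrix whose rows each sum to 1.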