Layer type | # of neurons | # of parameters |
---|---|---|
Input | 125 | 0 |
Dense | 32 | 4032 |
Batch normalization | 32 | 128 |
ReLU activation | 32 | 0 |
Dropout \(P=0.5\) | 32 | 0 |
Dense | 32 | 1056 |
Batch normalization | 32 | 128 |
ReLU activation | 32 | 0 |
Dropout \(P=0.5\) | 32 | 0 |
Dense | 1 | 33 |
Sigmoid activation | 1 | 0 |
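The architecture and its parameter counts can be reproduced with a short sketch; the framework here (PyTorch) is an assumption, not stated in the source. Note that the 128 parameters listed for each batch-normalization layer include the non-trainable running mean and variance (32 each) alongside the learnable scale and shift (32 each), which is the Keras-style counting convention.

```python
import torch.nn as nn

# Sketch of the tabulated architecture (framework choice is an assumption)
model = nn.Sequential(
    nn.Linear(125, 32),   # 125*32 weights + 32 biases = 4032
    nn.BatchNorm1d(32),   # gamma + beta (64) + running mean/var (64) = 128
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(32, 32),    # 32*32 weights + 32 biases = 1056
    nn.BatchNorm1d(32),   # 128, as above
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(32, 1),     # 32 weights + 1 bias = 33
    nn.Sigmoid(),
)

# Learnable parameters (PyTorch counts only gamma/beta for batch norm)
trainable = sum(p.numel() for p in model.parameters())
# Running statistics are buffers; num_batches_tracked is bookkeeping, not a parameter
running_stats = sum(
    b.numel()
    for name, b in model.named_buffers()
    if "num_batches_tracked" not in name
)
total = trainable + running_stats
print(total)  # 5377, the sum of the table's parameter column
```

Summing the table's parameter column (4032 + 128 + 1056 + 128 + 33) gives the same total of 5377.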