| Layer Type | Size/Configuration | Parameters |
|---|---|---|
| LSTM | 256 units | activation = "relu", return_sequences = True, kernel and recurrent regularizer = l1_l2(1e-5, 1e-4) |
| Batch Normalization | - | Normalizes the activations from the previous layer |
| Dropout | 0.5 | Randomly sets input units to 0 at each step to prevent overfitting |
| LSTM | 128 units | activation = "relu", return_sequences = True, kernel and recurrent regularizer = l1_l2(1e-5, 1e-4) |
| Batch Normalization | - | - |
| Dropout | 0.5 | - |
| LSTM | 64 units | activation = "relu", return_sequences = True, kernel and recurrent regularizer = l1_l2(1e-5, 1e-4) |
| Batch Normalization | - | - |
| Dropout | 0.5 | - |
| LSTM | 32 units | activation = "relu", kernel and recurrent regularizer = l1_l2(1e-5, 1e-4) |
| Batch Normalization | - | - |
| Dropout | 0.5 | - |
| Dense | 16 units | activation = "relu", kernel regularizer = l1_l2(1e-5, 1e-4) |
| Batch Normalization | - | - |
| Dropout | 0.5 | - |
| Dense | 1 unit | activation = "sigmoid", kernel regularizer = l1_l2(1e-5, 1e-4) |
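
For reference, a minimal Keras sketch of this stack might look like the following. The input shape (`TIMESTEPS`, `FEATURES`) is a placeholder, since the table does not specify input dimensions; everything else mirrors the table, including the final LSTM dropping `return_sequences` so its last-step output can feed the dense head.

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

# Hypothetical input shape; the table above does not specify it.
TIMESTEPS, FEATURES = 100, 8

# Shared L1/L2 penalty used on every kernel/recurrent weight in the table.
reg = regularizers.l1_l2(l1=1e-5, l2=1e-4)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(TIMESTEPS, FEATURES)),

    layers.LSTM(256, activation="relu", return_sequences=True,
                kernel_regularizer=reg, recurrent_regularizer=reg),
    layers.BatchNormalization(),
    layers.Dropout(0.5),

    layers.LSTM(128, activation="relu", return_sequences=True,
                kernel_regularizer=reg, recurrent_regularizer=reg),
    layers.BatchNormalization(),
    layers.Dropout(0.5),

    layers.LSTM(64, activation="relu", return_sequences=True,
                kernel_regularizer=reg, recurrent_regularizer=reg),
    layers.BatchNormalization(),
    layers.Dropout(0.5),

    # Final LSTM returns only the last timestep (no return_sequences),
    # collapsing the sequence before the dense classification head.
    layers.LSTM(32, activation="relu",
                kernel_regularizer=reg, recurrent_regularizer=reg),
    layers.BatchNormalization(),
    layers.Dropout(0.5),

    layers.Dense(16, activation="relu", kernel_regularizer=reg),
    layers.BatchNormalization(),
    layers.Dropout(0.5),

    # Single sigmoid unit, consistent with binary classification output.
    layers.Dense(1, activation="sigmoid", kernel_regularizer=reg),
])
```

Note that each LSTM layer except the last keeps `return_sequences=True`, which is what allows the layers to be stacked: every layer passes its full per-timestep output sequence to the next.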