Layer | Number of units | Activation function
--- | --- | ---
input | 512 × 512 × 1 |
convolution | 126 × 126 × 48 | ReLU
pooling | 42 × 42 × 48 |
convolution | 38 × 38 × 64 | ReLU
pooling | 12 × 12 × 64 |
convolution | 8 × 8 × 128 | ReLU
pooling | 4 × 4 × 128 |
fully connected | 512 | ReLU
fully connected | 128 | ReLU
fully connected | 7 or 4 | sigmoid or softmax
output | 7 or 4 |
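For concreteness, the sketch below reproduces the layer shapes in the table in PyTorch. The table does not state kernel sizes, strides, or pooling types, so the values used here (a 12 × 12 stride-4 convolution, 5 × 5 stride-1 convolutions, and 3 × 3 / 2 × 2 max-pooling) are assumptions chosen only because they yield the listed feature-map sizes; the class name `ConvNet` is likewise illustrative, not taken from the source.

```python
import torch
import torch.nn as nn


class ConvNet(nn.Module):
    """One plausible realization of the architecture table.

    Kernel sizes, strides, and the choice of max-pooling are assumptions
    that reproduce the listed output shapes, not confirmed hyperparameters.
    """

    def __init__(self, num_classes: int = 7):  # 7 or 4, per the table
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 48, kernel_size=12, stride=4),   # 512×512×1 -> 126×126×48
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=3),        # -> 42×42×48
            nn.Conv2d(48, 64, kernel_size=5, stride=1),   # -> 38×38×64
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=3),        # -> 12×12×64
            nn.Conv2d(64, 128, kernel_size=5, stride=1),  # -> 8×8×128
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2),        # -> 4×4×128
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),                     # 4 * 4 * 128 = 2048
            nn.Linear(4 * 4 * 128, 512),
            nn.ReLU(),
            nn.Linear(512, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),      # 7 or 4 output units
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The table lists softmax (or sigmoid) on the last layer; with
        # nn.CrossEntropyLoss one would normally return raw logits instead.
        return torch.softmax(self.classifier(self.features(x)), dim=1)


if __name__ == "__main__":
    # Shape check against the table: a 512×512 single-channel input
    # should produce one probability per class.
    net = ConvNet(num_classes=7)
    out = net(torch.randn(1, 1, 512, 512))
    print(out.shape)  # torch.Size([1, 7])
```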