Introduction
Material and Methods
Brain Tumor Data
Pre-Processing
Multi-Modal Image Registration
Tumor Segmentation
Data Normalization
Data Augmentation
Convolutional Neural Networks
- Input layer. This layer holds the raw pixel values of the input image after the pre-processing steps have been applied.
- Convolutional layer. This layer is composed of several feature maps along the depth dimension, each corresponding to a different convolution filter. All neurons at the same spatial position (width and height) across the depth are connected to the same receptive field in the input image or, more generally, in the previous layer. This allows capturing a wide variety of imaging features. The depth of the layer, i.e., the number of convolution filters, defines the number of features that can be extracted from each input receptive field. All neurons within a feature map share exactly the same weights, which define the convolution filter. This weight sharing reduces the number of parameters and thus increases the generalization ability of the architecture.
- Activation layer. This layer applies an activation function to each neuron in the output of the previous layer. The rectified linear unit (ReLU), defined as ReLU(x) = max(0, x), is the most common activation function in CNN architectures: it passes positive values through unchanged and thresholds negative values at zero. This layer does not change the size of the previous layer; it simply replaces negative values with 0.
- Pooling layer. Placed after an activation layer, this layer down-samples along the spatial dimensions (width and height). It selects invariant imaging features by reducing the spatial dimension of the convolutional layer. The most popular type is max pooling, which outputs the maximum value of its inputs, thus preserving the most prominent filter responses.
- Fully connected layer. As in classical neural networks, this layer connects every neuron in the previous layer to every neuron in this layer, with one weight per connection. When used as the output layer, each output node represents the score for one possible class.
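The four layer types described above can be sketched as a minimal NumPy forward pass. The filter values, input size, and number of classes below are illustrative assumptions, not the parameters of the proposed multi-scale architecture.

```python
import numpy as np

def conv2d(image, kernel):
    """Convolutional layer (valid mode): every output neuron applies the
    same shared weights (the kernel) to its own receptive field."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Activation layer: replaces negative values with 0."""
    return np.maximum(0, x)

def max_pool(x, size=2):
    """Pooling layer: keeps the largest response in each size x size window."""
    h, w = x.shape[0] // size, x.shape[1] // size
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = x[i * size:(i + 1) * size,
                          j * size:(j + 1) * size].max()
    return out

def fully_connected(x, weights, bias):
    """Fully connected layer: one weight per (input, output) connection;
    each output node is a class score."""
    return x.ravel() @ weights + bias

rng = np.random.default_rng(0)
image = rng.random((8, 8))                    # input layer: raw pixel values
kernel = rng.standard_normal((3, 3))          # shared convolution weights
feat = max_pool(relu(conv2d(image, kernel)))  # 8x8 -> 6x6 -> 6x6 -> 3x3
w = rng.standard_normal((feat.size, 2))       # two output classes (assumed)
scores = fully_connected(feat, w, bias=np.zeros(2))
print(scores.shape)  # (2,)
```

A real architecture stacks many such conv/ReLU/pool blocks with learned kernels; this sketch only fixes the data flow through one block.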
Multi-Scale CNN Parameters
Implementation and Experiments
Implementation
Experiments
Results
| Configuration | Sensitivity | Specificity | Accuracy |
|---|---|---|---|
| 1 | 80.0% | 46.7% | 63.3% |
| 2 | 86.7% | 64.4% | 75.6% |
| 3 | 84.4% | 73.3% | 78.9% |
| 4 | 93.3% | 82.2% | 87.7% |
| Optimizer | Sensitivity | Specificity | Accuracy |
|---|---|---|---|
| SGD | 93.3% | 82.2% | 87.7% |
| RMSprop | 84.4% | 84.4% | 84.4% |
| AdaDelta | 82.2% | 84.4% | 83.3% |
| Adam | 88.8% | 82.2% | 85.5% |
| n = 90 | Predicted: Co-deleted | Predicted: Non-deleted |
|---|---|---|
| Actual: Co-deleted | 42 | 3 |
| Actual: Non-deleted | 8 | 37 |
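As a sanity check, the reported metrics can be recomputed from the confusion matrix above, taking co-deleted as the positive class (the exact accuracy 79/90 rounds to 87.8%, matching the 87.7% reported in the tables up to rounding):

```python
# Counts from the confusion matrix (n = 90, co-deleted = positive class).
tp, fn = 42, 3   # actual co-deleted, predicted co-deleted / non-deleted
fp, tn = 8, 37   # actual non-deleted, predicted co-deleted / non-deleted

sensitivity = tp / (tp + fn)                 # 42/45
specificity = tn / (tn + fp)                 # 37/45
accuracy = (tp + tn) / (tp + tn + fp + fn)   # 79/90

print(f"{sensitivity:.1%} {specificity:.1%} {accuracy:.1%}")
# -> 93.3% 82.2% 87.8%
```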