Final Project Presentation

Fatemehsadat Abadianzadeh


Dataset

  • BraTS 2018 Dataset
  • 210 High Grade Glioma (HGG) Subjects
  • 75 Low Grade Glioma (LGG) Subjects
  • Each Image Dimension: (240, 240, 155)
  • T1, T2, FLAIR, T1c Modalities
  • Whole, Core, and Edema Ground Truth Labels
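Each modality is stored as a compressed NIfTI volume. A minimal loading sketch with nibabel (the path is hypothetical; the actual loading is done by create_data_onesubject_val below):

import nibabel as nib

vol = nib.load('HGG/Brats18_subject/Brats18_subject_flair.nii.gz')  # hypothetical path
data = vol.get_fdata()  # NumPy array of shape (240, 240, 155)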
In [0]:
#@title
# One axial slice of each modality and each ground-truth region (before preprocessing).
import matplotlib.pyplot as plt

plt.figure(figsize=(15, 10))

plt.subplot(241)
plt.title('T1')
plt.axis('off')
plt.imshow(T1[90, 0, :, :], cmap='gray')

plt.subplot(242)
plt.title('T2')
plt.axis('off')
plt.imshow(T2[90, 0, :, :], cmap='gray')

plt.subplot(243)
plt.title('Flair')
plt.axis('off')
plt.imshow(Flair[90, 0, :, :], cmap='gray')

plt.subplot(244)
plt.title('T1c')
plt.axis('off')
plt.imshow(T1c[90, 0, :, :], cmap='gray')

plt.subplot(245)
plt.title('Ground Truth(Full)')
plt.axis('off')
plt.imshow(Label_full[90, 0, :, :], cmap='gray')

plt.subplot(246)
plt.title('Ground Truth(Core)')
plt.axis('off')
plt.imshow(Label_core[90, 0, :, :], cmap='gray')

plt.subplot(247)
plt.title('Ground Truth(ET)')
plt.axis('off')
plt.imshow(Label_ET[90, 0, :, :], cmap='gray')

plt.subplot(248)
plt.title('Ground Truth(All)')
plt.axis('off')
plt.imshow(Label_all[90, 0, :, :], cmap='gray')

plt.show()

Preprocessing

  • Histogram Equalization
  • Normalization to the Mean and Std of Each Slice
  • Normalization to the Mean and Std of the Brain Area
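A minimal sketch of the two normalization variants and the equalization step, assuming 2D float slices (the exact implementation inside create_data_onesubject_val may differ):

import numpy as np
from skimage import exposure

def normalize_slice(img):
    # Zero mean / unit std over the whole slice
    return (img - img.mean()) / (img.std() + 1e-8)

def normalize_brain_area(img):
    # Statistics over nonzero (brain) voxels only, so the black
    # background does not skew the mean and std
    brain = img[img > 0]
    if brain.size == 0:
        return img
    return (img - brain.mean()) / (brain.std() + 1e-8)

def hist_equalize(img):
    # Histogram equalization; returns values scaled to [0, 1]
    return exposure.equalize_hist(img)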
In [0]:
#@title
# Load one subject's four modalities and the segmentation labels,
# with histogram equalization enabled.
# NOTE: create_data_onesubject_val appears to read the globals
# pul_seq / label_num set before each call.
count = 150
pul_seq = 'flair'
Flair = create_data_onesubject_val('/content/drive/My Drive/Colab Notebooks/2018/HGG/', '**/*{}.nii.gz'.format(pul_seq), count, label=False, hist_equ=True)
pul_seq = 't1ce'
T1c = create_data_onesubject_val('/content/drive/My Drive/Colab Notebooks/2018/HGG/', '**/*{}.nii.gz'.format(pul_seq), count, label=False, hist_equ=True)
pul_seq = 't1'
T1 = create_data_onesubject_val('/content/drive/My Drive/Colab Notebooks/2018/HGG/', '**/*{}.nii.gz'.format(pul_seq), count, label=False, hist_equ=True)
pul_seq = 't2'
T2 = create_data_onesubject_val('/content/drive/My Drive/Colab Notebooks/2018/HGG/', '**/*{}.nii.gz'.format(pul_seq), count, label=False, hist_equ=True)
label_num = 5  # whole tumor (Label_full)
Label_full = create_data_onesubject_val('/content/drive/My Drive/Colab Notebooks/2018/HGG/', '**/*seg.nii.gz', count, label=True, hist_equ=True)
label_num = 2  # tumor core (Label_core)
Label_core = create_data_onesubject_val('/content/drive/My Drive/Colab Notebooks/2018/HGG/', '**/*seg.nii.gz', count, label=True, hist_equ=True)
label_num = 4  # enhancing tumor (Label_ET)
Label_ET = create_data_onesubject_val('/content/drive/My Drive/Colab Notebooks/2018/HGG/', '**/*seg.nii.gz', count, label=True, hist_equ=True)
label_num = 3  # all labels (Label_all)
Label_all = create_data_onesubject_val('/content/drive/My Drive/Colab Notebooks/2018/HGG/', '**/*seg.nii.gz', count, label=True, hist_equ=True)

In [0]:
#@title
# The same slices re-plotted after preprocessing.
plt.figure(figsize=(15, 10))

plt.subplot(241)
plt.title('T1')
plt.axis('off')
plt.imshow(T1[90, 0, :, :], cmap='gray')

plt.subplot(242)
plt.title('T2')
plt.axis('off')
plt.imshow(T2[90, 0, :, :], cmap='gray')

plt.subplot(243)
plt.title('Flair')
plt.axis('off')
plt.imshow(Flair[90, 0, :, :], cmap='gray')

plt.subplot(244)
plt.title('T1c')
plt.axis('off')
plt.imshow(T1c[90, 0, :, :], cmap='gray')

plt.subplot(245)
plt.title('Ground Truth(Full)')
plt.axis('off')
plt.imshow(Label_full[90, 0, :, :], cmap='gray')

plt.subplot(246)
plt.title('Ground Truth(Core)')
plt.axis('off')
plt.imshow(Label_core[90, 0, :, :], cmap='gray')

plt.subplot(247)
plt.title('Ground Truth(ET)')
plt.axis('off')
plt.imshow(Label_ET[90, 0, :, :], cmap='gray')

plt.subplot(248)
plt.title('Ground Truth(All)')
plt.axis('off')
plt.imshow(Label_all[90, 0, :, :], cmap='gray')

plt.show()

U-Net Models


Proposed Model


Data Augmentation

Available Data

  • 285 3D Images (210 HGG + 75 LGG)
  • Use the 70 Middle Slices of Each Volume (see the sketch below)
  • 70 × 285 = 19,950 Training Slices
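A small illustration of the middle-slice selection, assuming axial slices along the last axis of a (240, 240, 155) volume loaded as in the nibabel sketch above (indices illustrative):

mid = 155 // 2                                 # slice index 77
middle_slices = data[:, :, mid - 35:mid + 35]  # 70 axial slices per subject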

Basic Transformations Using ImageDataGenerator

  • Shift
  • Mirror
  • Zoom
  • Rotation
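A hedged sketch of wiring these transformations up with Keras's ImageDataGenerator; X_train / Y_train and all parameter values are illustrative, not the exact training configuration:

from keras.preprocessing.image import ImageDataGenerator

data_gen_args = dict(
    rotation_range=10,       # random rotation (degrees)
    width_shift_range=0.1,   # horizontal shift
    height_shift_range=0.1,  # vertical shift
    zoom_range=0.1,          # random zoom
    horizontal_flip=True,    # mirroring
    data_format='channels_first')

image_gen = ImageDataGenerator(**data_gen_args)
mask_gen = ImageDataGenerator(**data_gen_args)

# The same seed keeps image and mask transformations in sync.
seed = 1
train_flow = zip(image_gen.flow(X_train, batch_size=32, seed=seed),
                 mask_gen.flow(Y_train, batch_size=32, seed=seed))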
In [0]:
#@title
# Two augmented FLAIR slices with their ground-truth masks
# (`label` is assumed to hold the matching mask batch).
plt.figure(figsize=(15, 10))

plt.subplot(241)
plt.title('Flair')
plt.axis('off')
plt.imshow(Flair[90, 0, :, :], cmap='gray')

plt.subplot(242)
plt.title('Flair')
plt.axis('off')
plt.imshow(Flair[91, 0, :, :], cmap='gray')

plt.subplot(245)
plt.title('Ground Truth(Full)')
plt.axis('off')
plt.imshow(label[90, 0, :, :], cmap='gray')

plt.subplot(246)
plt.title('Ground Truth(Full)')
plt.axis('off')
plt.imshow(label[91, 0, :, :], cmap='gray')

plt.show()

Advanced Methods

  • Elastic Deformation (see the sketch below)
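A minimal sketch of elastic deformation in the style of Simard et al. (2003), assuming 2D single-channel slices; alpha and sigma are illustrative values, not the exact ones used:

import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def elastic_deform(image, mask, alpha=34, sigma=4, seed=None):
    rng = np.random.RandomState(seed)
    shape = image.shape
    # Smooth random displacement fields, scaled by alpha
    dx = gaussian_filter(rng.rand(*shape) * 2 - 1, sigma) * alpha
    dy = gaussian_filter(rng.rand(*shape) * 2 - 1, sigma) * alpha
    x, y = np.meshgrid(np.arange(shape[1]), np.arange(shape[0]))
    coords = np.reshape(y + dy, (-1, 1)), np.reshape(x + dx, (-1, 1))
    # Warp image and mask with the same displacement field;
    # order=0 keeps the mask labels intact.
    warped_image = map_coordinates(image, coords, order=1).reshape(shape)
    warped_mask = map_coordinates(mask, coords, order=0).reshape(shape)
    return warped_image, warped_mask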
In [0]:
#@title
# FLAIR slices and masks after elastic deformation
# (`label` is assumed to hold the matching mask batch).
plt.figure(figsize=(15, 10))

plt.subplot(241)
plt.title('Flair')
plt.axis('off')
plt.imshow(Flair[90, 0, :, :], cmap='gray')

plt.subplot(242)
plt.title('Flair')
plt.axis('off')
plt.imshow(Flair[91, 0, :, :], cmap='gray')

plt.subplot(245)
plt.title('Ground Truth(Full)')
plt.axis('off')
plt.imshow(label[90, 0, :, :], cmap='gray')

plt.subplot(246)
plt.title('Ground Truth(Full)')
plt.axis('off')
plt.imshow(label[91, 0, :, :], cmap='gray')

plt.show()

Results

Network Architecture

In [0]:
#@title
# Assumed imports and constants (not shown in the original cell):
# standard Keras (TF 1.x era) pieces used below.
from keras import backend as K
from keras.models import Model
from keras.layers import (Input, Conv2D, Conv2DTranspose, MaxPooling2D,
                          Dropout, BatchNormalization, concatenate)
from keras.optimizers import Adam

K.set_image_data_format('channels_first')  # inputs are (1, H, W)

img_size = 128   # slice size, matching the summary below
dropout = 0.2    # varied between 0.1 and 0.2 across experiments
smooth = 1.0     # smoothing term that avoids division by zero


def dice_coef(y_true, y_pred):
    # Soft Dice coefficient between flattened masks
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    return (2. * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)


def dice_coef_loss(y_true, y_pred):
    # Negated Dice: minimizing the loss maximizes overlap
    return -dice_coef(y_true, y_pred)


def unet_model():
    inputs = Input((1, img_size, img_size))
    conv1 = Conv2D(64, (3, 3), activation='relu', padding='same') (inputs)
    # NOTE: with channels_first inputs the channel axis is 1; axis=3
    # normalizes over the width dimension (the BN parameter counts in
    # the summary below reflect this).
    batch1 = BatchNormalization(axis=3)(conv1)
    conv1 = Conv2D(64, (3, 3), activation='relu', padding='same') (batch1)
    batch1 = BatchNormalization(axis=3)(conv1)
    pool1 = MaxPooling2D((2, 2)) (batch1)
    pool1 = Dropout(dropout*0.5)(pool1)
    
    conv2 = Conv2D(128, (3, 3), activation='relu', padding='same') (pool1)
    batch2 = BatchNormalization(axis=3)(conv2)
    conv2 = Conv2D(128, (3, 3), activation='relu', padding='same') (batch2)
    batch2 = BatchNormalization(axis=3)(conv2)
    pool2 = MaxPooling2D((2, 2)) (batch2)
    pool2 = Dropout(dropout)(pool2)
    
    conv3 = Conv2D(256, (3, 3), activation='relu', padding='same') (pool2)
    batch3 = BatchNormalization(axis=3)(conv3)
    conv3 = Conv2D(256, (3, 3), activation='relu', padding='same') (batch3)
    batch3 = BatchNormalization(axis=3)(conv3)
    pool3 = MaxPooling2D((2, 2)) (batch3)
    pool3 = Dropout(dropout)(pool3)
    
    conv4 = Conv2D(512, (3, 3), activation='relu', padding='same') (pool3)
    batch4 = BatchNormalization(axis=3)(conv4)
    conv4 = Conv2D(512, (3, 3), activation='relu', padding='same') (batch4)
    batch4 = BatchNormalization(axis=3)(conv4)
    pool4 = MaxPooling2D(pool_size=(2, 2)) (batch4)
    pool4 = Dropout(dropout)(pool4)
    conv5 = Conv2D(1024, (3, 3), activation='relu', padding='same') (pool4)
    batch5 = BatchNormalization(axis=3)(conv5)
    conv5 = Conv2D(1024, (3, 3), activation='relu', padding='same') (batch5)
    batch5 = BatchNormalization(axis=3)(conv5)
    
    up6 = Conv2DTranspose(512, (2, 2), strides=(2, 2), padding='same') (batch5)
    up6 = concatenate([up6, conv4], axis=1)
    up6 = Dropout(dropout)(up6)
    conv6 = Conv2D(512, (3, 3), activation='relu', padding='same') (up6)
    batch6 = BatchNormalization(axis=3)(conv6)
    conv6 = Conv2D(512, (3, 3), activation='relu', padding='same') (batch6)
    batch6 = BatchNormalization(axis=3)(conv6)
    
    up7 = Conv2DTranspose(256, (2, 2), strides=(2, 2), padding='same') (batch6)
    up7 = concatenate([up7, conv3], axis=1)
    up7 = Dropout(dropout)(up7)
    conv7 = Conv2D(256, (3, 3), activation='relu', padding='same') (up7)
    batch7 = BatchNormalization(axis=3)(conv7)
    conv7 = Conv2D(256, (3, 3), activation='relu', padding='same') (batch7)
    batch7 = BatchNormalization(axis=3)(conv7)
    
    up8 = Conv2DTranspose(128, (2, 2), strides=(2, 2), padding='same') (batch7)
    up8 = concatenate([up8, conv2], axis=1)
    up8 = Dropout(dropout)(up8)
    conv8 = Conv2D(128, (3, 3), activation='relu', padding='same') (up8)
    batch8 = BatchNormalization(axis=3)(conv8)
    conv8 = Conv2D(128, (3, 3), activation='relu', padding='same') (batch8)
    batch8 = BatchNormalization(axis=3)(conv8)
    
    up9 = Conv2DTranspose(64, (2, 2), strides=(2, 2), padding='same') (batch8)
    up9 = concatenate([up9, conv1], axis=1)
    up9 = Dropout(dropout)(up9)
    conv9 = Conv2D(64, (3, 3), activation='relu', padding='same') (up9)
    batch9 = BatchNormalization(axis=3)(conv9)
    conv9 = Conv2D(64, (3, 3), activation='relu', padding='same') (batch9)
    batch9 = BatchNormalization(axis=3)(conv9)

    conv10 = Conv2D(1, (1, 1), activation='sigmoid')(batch9)

    model = Model(inputs=[inputs], outputs=[conv10])

    model.compile(optimizer=Adam(lr=1e-5), loss=dice_coef_loss, metrics=['accuracy', dice_coef])

    return model
  
m = unet_model()
m.summary()



# from keras.utils import plot_model
# plot_model(m, to_file='model.png')
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_8 (InputLayer)            (None, 1, 128, 128)  0                                            
__________________________________________________________________________________________________
conv2d_134 (Conv2D)             (None, 64, 128, 128) 640         input_8[0][0]                    
__________________________________________________________________________________________________
batch_normalization_127 (BatchN (None, 64, 128, 128) 512         conv2d_134[0][0]                 
__________________________________________________________________________________________________
conv2d_135 (Conv2D)             (None, 64, 128, 128) 36928       batch_normalization_127[0][0]    
__________________________________________________________________________________________________
batch_normalization_128 (BatchN (None, 64, 128, 128) 512         conv2d_135[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_29 (MaxPooling2D) (None, 64, 64, 64)   0           batch_normalization_128[0][0]    
__________________________________________________________________________________________________
dropout_57 (Dropout)            (None, 64, 64, 64)   0           max_pooling2d_29[0][0]           
__________________________________________________________________________________________________
conv2d_136 (Conv2D)             (None, 128, 64, 64)  73856       dropout_57[0][0]                 
__________________________________________________________________________________________________
batch_normalization_129 (BatchN (None, 128, 64, 64)  256         conv2d_136[0][0]                 
__________________________________________________________________________________________________
conv2d_137 (Conv2D)             (None, 128, 64, 64)  147584      batch_normalization_129[0][0]    
__________________________________________________________________________________________________
batch_normalization_130 (BatchN (None, 128, 64, 64)  256         conv2d_137[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_30 (MaxPooling2D) (None, 128, 32, 32)  0           batch_normalization_130[0][0]    
__________________________________________________________________________________________________
dropout_58 (Dropout)            (None, 128, 32, 32)  0           max_pooling2d_30[0][0]           
__________________________________________________________________________________________________
conv2d_138 (Conv2D)             (None, 256, 32, 32)  295168      dropout_58[0][0]                 
__________________________________________________________________________________________________
batch_normalization_131 (BatchN (None, 256, 32, 32)  128         conv2d_138[0][0]                 
__________________________________________________________________________________________________
conv2d_139 (Conv2D)             (None, 256, 32, 32)  590080      batch_normalization_131[0][0]    
__________________________________________________________________________________________________
batch_normalization_132 (BatchN (None, 256, 32, 32)  128         conv2d_139[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_31 (MaxPooling2D) (None, 256, 16, 16)  0           batch_normalization_132[0][0]    
__________________________________________________________________________________________________
dropout_59 (Dropout)            (None, 256, 16, 16)  0           max_pooling2d_31[0][0]           
__________________________________________________________________________________________________
conv2d_140 (Conv2D)             (None, 512, 16, 16)  1180160     dropout_59[0][0]                 
__________________________________________________________________________________________________
batch_normalization_133 (BatchN (None, 512, 16, 16)  64          conv2d_140[0][0]                 
__________________________________________________________________________________________________
conv2d_141 (Conv2D)             (None, 512, 16, 16)  2359808     batch_normalization_133[0][0]    
__________________________________________________________________________________________________
batch_normalization_134 (BatchN (None, 512, 16, 16)  64          conv2d_141[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_32 (MaxPooling2D) (None, 512, 8, 8)    0           batch_normalization_134[0][0]    
__________________________________________________________________________________________________
dropout_60 (Dropout)            (None, 512, 8, 8)    0           max_pooling2d_32[0][0]           
__________________________________________________________________________________________________
conv2d_142 (Conv2D)             (None, 1024, 8, 8)   4719616     dropout_60[0][0]                 
__________________________________________________________________________________________________
batch_normalization_135 (BatchN (None, 1024, 8, 8)   32          conv2d_142[0][0]                 
__________________________________________________________________________________________________
conv2d_143 (Conv2D)             (None, 1024, 8, 8)   9438208     batch_normalization_135[0][0]    
__________________________________________________________________________________________________
batch_normalization_136 (BatchN (None, 1024, 8, 8)   32          conv2d_143[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_29 (Conv2DTran (None, 512, 16, 16)  2097664     batch_normalization_136[0][0]    
__________________________________________________________________________________________________
concatenate_29 (Concatenate)    (None, 1024, 16, 16) 0           conv2d_transpose_29[0][0]        
                                                                 conv2d_141[0][0]                 
__________________________________________________________________________________________________
dropout_61 (Dropout)            (None, 1024, 16, 16) 0           concatenate_29[0][0]             
__________________________________________________________________________________________________
conv2d_144 (Conv2D)             (None, 512, 16, 16)  4719104     dropout_61[0][0]                 
__________________________________________________________________________________________________
batch_normalization_137 (BatchN (None, 512, 16, 16)  64          conv2d_144[0][0]                 
__________________________________________________________________________________________________
conv2d_145 (Conv2D)             (None, 512, 16, 16)  2359808     batch_normalization_137[0][0]    
__________________________________________________________________________________________________
batch_normalization_138 (BatchN (None, 512, 16, 16)  64          conv2d_145[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_30 (Conv2DTran (None, 256, 32, 32)  524544      batch_normalization_138[0][0]    
__________________________________________________________________________________________________
concatenate_30 (Concatenate)    (None, 512, 32, 32)  0           conv2d_transpose_30[0][0]        
                                                                 conv2d_139[0][0]                 
__________________________________________________________________________________________________
dropout_62 (Dropout)            (None, 512, 32, 32)  0           concatenate_30[0][0]             
__________________________________________________________________________________________________
conv2d_146 (Conv2D)             (None, 256, 32, 32)  1179904     dropout_62[0][0]                 
__________________________________________________________________________________________________
batch_normalization_139 (BatchN (None, 256, 32, 32)  128         conv2d_146[0][0]                 
__________________________________________________________________________________________________
conv2d_147 (Conv2D)             (None, 256, 32, 32)  590080      batch_normalization_139[0][0]    
__________________________________________________________________________________________________
batch_normalization_140 (BatchN (None, 256, 32, 32)  128         conv2d_147[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_31 (Conv2DTran (None, 128, 64, 64)  131200      batch_normalization_140[0][0]    
__________________________________________________________________________________________________
concatenate_31 (Concatenate)    (None, 256, 64, 64)  0           conv2d_transpose_31[0][0]        
                                                                 conv2d_137[0][0]                 
__________________________________________________________________________________________________
dropout_63 (Dropout)            (None, 256, 64, 64)  0           concatenate_31[0][0]             
__________________________________________________________________________________________________
conv2d_148 (Conv2D)             (None, 128, 64, 64)  295040      dropout_63[0][0]                 
__________________________________________________________________________________________________
batch_normalization_141 (BatchN (None, 128, 64, 64)  256         conv2d_148[0][0]                 
__________________________________________________________________________________________________
conv2d_149 (Conv2D)             (None, 128, 64, 64)  147584      batch_normalization_141[0][0]    
__________________________________________________________________________________________________
batch_normalization_142 (BatchN (None, 128, 64, 64)  256         conv2d_149[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_32 (Conv2DTran (None, 64, 128, 128) 32832       batch_normalization_142[0][0]    
__________________________________________________________________________________________________
concatenate_32 (Concatenate)    (None, 128, 128, 128 0           conv2d_transpose_32[0][0]        
                                                                 conv2d_135[0][0]                 
__________________________________________________________________________________________________
dropout_64 (Dropout)            (None, 128, 128, 128 0           concatenate_32[0][0]             
__________________________________________________________________________________________________
conv2d_150 (Conv2D)             (None, 64, 128, 128) 73792       dropout_64[0][0]                 
__________________________________________________________________________________________________
batch_normalization_143 (BatchN (None, 64, 128, 128) 512         conv2d_150[0][0]                 
__________________________________________________________________________________________________
conv2d_151 (Conv2D)             (None, 64, 128, 128) 36928       batch_normalization_143[0][0]    
__________________________________________________________________________________________________
batch_normalization_144 (BatchN (None, 64, 128, 128) 512         conv2d_151[0][0]                 
__________________________________________________________________________________________________
conv2d_152 (Conv2D)             (None, 1, 128, 128)  65          batch_normalization_144[0][0]    
==================================================================================================
Total params: 31,034,497
Trainable params: 31,032,545
Non-trainable params: 1,952
__________________________________________________________________________________________________
In [0]:
#@title
from IPython.display import SVG
from keras.utils.vis_utils import model_to_dot
SVG(model_to_dot(m).create(prog='dot', format='svg'))
Out[0]:
[SVG output: the U-Net computation graph, from input_8 through the encoder (conv2d_134 … conv2d_143 with max-pooling and dropout), the decoder (conv2d_transpose_29 … conv2d_152), and skip connections through the concatenate layers.]

First Experiment

  • BatchNormalization
  • No Dropout
  • Simple Augmentation
  • Dice Coefficient as Loss
  • Normalization to the Mean and Std of Each Slice
  • Without Histogram Equalization

[Figure]

Second Experiment

  • BatchNormalization
  • Dropout 0.1
  • Simple Augmentation
  • Dice Coefficient as Loss
  • Normalization to the Mean and Std of Each Slice
  • Without Histogram Equalization

[Figure]

Third Experiment

  • BatchNormalization
  • Dropout 0.2
  • Simple Augmentation
  • Dice Coefficient as Loss
  • Normalization to the Mean and Std of the Brain Area
  • Without Histogram Equalization

[Figure]

Fourth Experiment

  • BatchNormalization
  • Dropout 0.1
  • Simple Augmentation + Elastic Deformation
  • Dice Coefficient as Loss
  • Normalization to the Mean and Std of the Brain Area
  • Histogram Equalization

Training is in progress

Fifth Experiment

  • BatchNormalization
  • Dropout 0.1
  • Simple Augmentation + Elastic Deformation
  • Dice Coefficient + Cross-Entropy + KL Divergence as Loss (see the sketch below)
  • Normalization to the Mean and Std of the Brain Area
  • Histogram Equalization

Training is in progress
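A hedged sketch of such a combined loss, reusing dice_coef_loss defined earlier; the weights w1–w3 are illustrative, not the values actually used:

from keras import backend as K

def combined_loss(y_true, y_pred, w1=1.0, w2=1.0, w3=0.1):
    dice = dice_coef_loss(y_true, y_pred)              # negated Dice from above
    bce = K.mean(K.binary_crossentropy(y_true, y_pred))
    # KL divergence between the (clipped) true and predicted masks
    eps = K.epsilon()
    p = K.clip(y_true, eps, 1.0)
    q = K.clip(y_pred, eps, 1.0)
    kl = K.mean(p * K.log(p / q))
    return w1 * dice + w2 * bce + w3 * kl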

Sixth Experiment

  • L2 Regularization
  • LeakyReLU Activation (see the block sketch below)
  • Dropout 0.2
  • Simple Augmentation + Elastic Deformation
  • Dice Coefficient + Cross-Entropy as Loss
  • Normalization to the Mean and Std of the Brain Area
  • Without Histogram Equalization
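A hedged sketch of the modified convolution block implied by this setup (L2 weight decay plus LeakyReLU, replacing Conv + BatchNorm + ReLU); weight_decay and alpha are illustrative values:

from keras.layers import Conv2D, LeakyReLU
from keras.regularizers import l2

def conv_block(x, filters, weight_decay=1e-4, alpha=0.01):
    # 3x3 convolution with L2 regularization, followed by LeakyReLU
    x = Conv2D(filters, (3, 3), padding='same',
               kernel_regularizer=l2(weight_decay))(x)
    x = LeakyReLU(alpha=alpha)(x)
    return x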

[Figure]

In [0]:
#@title
m = unet_model()
m.summary()
W0715 18:30:35.425029 139952028260224 deprecation.py:506] From <ipython-input-57-a5ed0ee20bc2>:24: calling weighted_cross_entropy_with_logits (from tensorflow.python.ops.nn_impl) with targets is deprecated and will be removed in a future version.
Instructions for updating:
targets is deprecated, use labels instead
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_9 (InputLayer)            (None, 1, 128, 128)  0                                            
__________________________________________________________________________________________________
conv2d_153 (Conv2D)             (None, 64, 128, 128) 640         input_9[0][0]                    
__________________________________________________________________________________________________
leaky_re_lu_1 (LeakyReLU)       (None, 64, 128, 128) 0           conv2d_153[0][0]                 
__________________________________________________________________________________________________
conv2d_154 (Conv2D)             (None, 64, 128, 128) 36928       leaky_re_lu_1[0][0]              
__________________________________________________________________________________________________
leaky_re_lu_2 (LeakyReLU)       (None, 64, 128, 128) 0           conv2d_154[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_33 (MaxPooling2D) (None, 64, 64, 64)   0           leaky_re_lu_2[0][0]              
__________________________________________________________________________________________________
dropout_65 (Dropout)            (None, 64, 64, 64)   0           max_pooling2d_33[0][0]           
__________________________________________________________________________________________________
conv2d_155 (Conv2D)             (None, 128, 64, 64)  73856       dropout_65[0][0]                 
__________________________________________________________________________________________________
leaky_re_lu_3 (LeakyReLU)       (None, 128, 64, 64)  0           conv2d_155[0][0]                 
__________________________________________________________________________________________________
conv2d_156 (Conv2D)             (None, 128, 64, 64)  147584      leaky_re_lu_3[0][0]              
__________________________________________________________________________________________________
leaky_re_lu_4 (LeakyReLU)       (None, 128, 64, 64)  0           conv2d_156[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_34 (MaxPooling2D) (None, 128, 32, 32)  0           leaky_re_lu_4[0][0]              
__________________________________________________________________________________________________
dropout_66 (Dropout)            (None, 128, 32, 32)  0           max_pooling2d_34[0][0]           
__________________________________________________________________________________________________
conv2d_157 (Conv2D)             (None, 256, 32, 32)  295168      dropout_66[0][0]                 
__________________________________________________________________________________________________
leaky_re_lu_5 (LeakyReLU)       (None, 256, 32, 32)  0           conv2d_157[0][0]                 
__________________________________________________________________________________________________
conv2d_158 (Conv2D)             (None, 256, 32, 32)  590080      leaky_re_lu_5[0][0]              
__________________________________________________________________________________________________
leaky_re_lu_6 (LeakyReLU)       (None, 256, 32, 32)  0           conv2d_158[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_35 (MaxPooling2D) (None, 256, 16, 16)  0           leaky_re_lu_6[0][0]              
__________________________________________________________________________________________________
dropout_67 (Dropout)            (None, 256, 16, 16)  0           max_pooling2d_35[0][0]           
__________________________________________________________________________________________________
conv2d_159 (Conv2D)             (None, 512, 16, 16)  1180160     dropout_67[0][0]                 
__________________________________________________________________________________________________
leaky_re_lu_7 (LeakyReLU)       (None, 512, 16, 16)  0           conv2d_159[0][0]                 
__________________________________________________________________________________________________
conv2d_160 (Conv2D)             (None, 512, 16, 16)  2359808     leaky_re_lu_7[0][0]              
__________________________________________________________________________________________________
leaky_re_lu_8 (LeakyReLU)       (None, 512, 16, 16)  0           conv2d_160[0][0]                 
__________________________________________________________________________________________________
max_pooling2d_36 (MaxPooling2D) (None, 512, 8, 8)    0           leaky_re_lu_8[0][0]              
__________________________________________________________________________________________________
dropout_68 (Dropout)            (None, 512, 8, 8)    0           max_pooling2d_36[0][0]           
__________________________________________________________________________________________________
conv2d_161 (Conv2D)             (None, 1024, 8, 8)   4719616     dropout_68[0][0]                 
__________________________________________________________________________________________________
leaky_re_lu_9 (LeakyReLU)       (None, 1024, 8, 8)   0           conv2d_161[0][0]                 
__________________________________________________________________________________________________
conv2d_162 (Conv2D)             (None, 1024, 8, 8)   9438208     leaky_re_lu_9[0][0]              
__________________________________________________________________________________________________
leaky_re_lu_10 (LeakyReLU)      (None, 1024, 8, 8)   0           conv2d_162[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_33 (Conv2DTran (None, 512, 16, 16)  2097664     leaky_re_lu_10[0][0]             
__________________________________________________________________________________________________
concatenate_33 (Concatenate)    (None, 1024, 16, 16) 0           conv2d_transpose_33[0][0]        
                                                                 conv2d_160[0][0]                 
__________________________________________________________________________________________________
conv2d_163 (Conv2D)             (None, 512, 16, 16)  4719104     concatenate_33[0][0]             
__________________________________________________________________________________________________
leaky_re_lu_11 (LeakyReLU)      (None, 512, 16, 16)  0           conv2d_163[0][0]                 
__________________________________________________________________________________________________
conv2d_164 (Conv2D)             (None, 512, 16, 16)  2359808     leaky_re_lu_11[0][0]             
__________________________________________________________________________________________________
leaky_re_lu_12 (LeakyReLU)      (None, 512, 16, 16)  0           conv2d_164[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_34 (Conv2DTran (None, 256, 32, 32)  524544      leaky_re_lu_12[0][0]             
__________________________________________________________________________________________________
concatenate_34 (Concatenate)    (None, 512, 32, 32)  0           conv2d_transpose_34[0][0]        
                                                                 conv2d_158[0][0]                 
__________________________________________________________________________________________________
conv2d_165 (Conv2D)             (None, 256, 32, 32)  1179904     concatenate_34[0][0]             
__________________________________________________________________________________________________
leaky_re_lu_13 (LeakyReLU)      (None, 256, 32, 32)  0           conv2d_165[0][0]                 
__________________________________________________________________________________________________
conv2d_166 (Conv2D)             (None, 256, 32, 32)  590080      leaky_re_lu_13[0][0]             
__________________________________________________________________________________________________
leaky_re_lu_14 (LeakyReLU)      (None, 256, 32, 32)  0           conv2d_166[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_35 (Conv2DTran (None, 128, 64, 64)  131200      leaky_re_lu_14[0][0]             
__________________________________________________________________________________________________
concatenate_35 (Concatenate)    (None, 256, 64, 64)  0           conv2d_transpose_35[0][0]        
                                                                 conv2d_156[0][0]                 
__________________________________________________________________________________________________
conv2d_167 (Conv2D)             (None, 128, 64, 64)  295040      concatenate_35[0][0]             
__________________________________________________________________________________________________
leaky_re_lu_15 (LeakyReLU)      (None, 128, 64, 64)  0           conv2d_167[0][0]                 
__________________________________________________________________________________________________
conv2d_168 (Conv2D)             (None, 128, 64, 64)  147584      leaky_re_lu_15[0][0]             
__________________________________________________________________________________________________
leaky_re_lu_16 (LeakyReLU)      (None, 128, 64, 64)  0           conv2d_168[0][0]                 
__________________________________________________________________________________________________
conv2d_transpose_36 (Conv2DTran (None, 64, 128, 128) 32832       leaky_re_lu_16[0][0]             
__________________________________________________________________________________________________
concatenate_36 (Concatenate)    (None, 128, 128, 128 0           conv2d_transpose_36[0][0]        
                                                                 conv2d_154[0][0]                 
__________________________________________________________________________________________________
conv2d_169 (Conv2D)             (None, 64, 128, 128) 73792       concatenate_36[0][0]             
__________________________________________________________________________________________________
leaky_re_lu_17 (LeakyReLU)      (None, 64, 128, 128) 0           conv2d_169[0][0]                 
__________________________________________________________________________________________________
conv2d_170 (Conv2D)             (None, 64, 128, 128) 36928       leaky_re_lu_17[0][0]             
__________________________________________________________________________________________________
leaky_re_lu_18 (LeakyReLU)      (None, 64, 128, 128) 0           conv2d_170[0][0]                 
__________________________________________________________________________________________________
conv2d_171 (Conv2D)             (None, 1, 128, 128)  65          leaky_re_lu_18[0][0]             
==================================================================================================
Total params: 31,030,593
Trainable params: 31,030,593
Non-trainable params: 0
__________________________________________________________________________________________________
In [0]:
#@title
from IPython.display import SVG
from keras.utils.vis_utils import model_to_dot
SVG(model_to_dot(m).create(prog='dot', format='svg'))
Out[0]:
[SVG output: the modified U-Net computation graph (conv2d_153 … conv2d_171), with LeakyReLU activations in place of BatchNormalization and the same encoder–decoder skip-connection structure.]

Segmentation Result of Network

[Figure: segmentation result of the network]