Studies on Category Prediction of Ovarian Cancers Based on Magnetic Resonance Images
Mao, Yunfei (2020-01-29)
The publication is subject to copyright regulations. The work may be read and printed for personal use. Commercial use is prohibited.
open access
The permanent address of the publication is:
https://urn.fi/URN:NBN:fi-fe2020062645947
Abstract
Ovarian cancer is a gynecological malignancy with a low rate of early diagnosis and high mortality. Ovarian epithelial cancer (OEC) is the most common subtype of ovarian cancer. Pathologically, OEC is divided into two subtypes, Type I and Type II, which differ in their biological characteristics and treatment response. It is therefore important to categorize these two groups of patients accurately and to provide a reference for clinicians when designing treatment plans.
In current magnetic resonance (MR) examinations, the diagnoses given by radiologists are largely based on individual judgment and are not sufficiently accurate. Because of this low accuracy and the risk posed by Type II OEC, most patients undergo fine-needle aspiration, which may harm the patient. There is therefore a need for an MR-image-based method of OEC subtype classification.
This thesis proposes an automatic diagnosis system for ovarian cancer based on a combination of deep learning and radiomics. The method uses four sequences commonly applied in ovarian cancer diagnosis, sagittal fat-suppressed T2WI (Sag-fs-T2WI), coronal T2WI (Cor-T2WI), axial T1WI (Axi-T1WI), and the apparent diffusion coefficient map (ADC), to establish a multi-sequence diagnostic model. The system starts with segmentation of the ovarian tumor, then extracts radiomic features from the lesion region together with network features. The selected features are used to build models that predict the malignancy of ovarian tumors, the OEC subtype, and the survival outcome.
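To make the data flow concrete, the skeleton below traces a patient's four sequences through segmentation and feature extraction; the helper functions are illustrative stubs standing in for the thesis components, not the actual implementation.

```python
# Skeleton of the multi-sequence pipeline; stubs only show the data flow.
import numpy as np

SEQUENCES = ["Sag-fs-T2WI", "Cor-T2WI", "Axi-T1WI", "ADC"]

def segment_tumor(volume):            # stub for Bi-atten-ResUnet (see below)
    return volume > volume.mean()     # placeholder lesion mask

def extract_features(volume, mask):   # stub for radiomic + network features
    lesion = volume[mask]
    return np.array([lesion.mean(), lesion.std(), mask.sum()])

def diagnose(patient):
    """patient: dict mapping sequence name -> MR volume (numpy array)."""
    feats = np.concatenate([
        extract_features(patient[s], segment_tumor(patient[s]))
        for s in SEQUENCES])
    return feats  # in the thesis: feature selection + SVM prediction heads

patient = {s: np.random.rand(32, 128, 128) for s in SEQUENCES}
print(diagnose(patient).shape)        # (12,): 3 stub features x 4 sequences
```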
Bi-atten-ResUnet is proposed in this thesis as the segmentation model. The network is built on U-Net, adding residual blocks and non-local attention modules while preserving the classic encoder/decoder architecture. The encoder is reconstructed from a pretrained ResNet to exploit transfer learning, and bi-non-local attention modules are added at each level of the decoder. These techniques improve segmentation performance: the model achieves Dice coefficients of 0.918, 0.905, 0.831, and 0.820 on the four MR sequences, respectively.
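The abstract does not specify the full architecture, but a minimal PyTorch sketch of the described design, a pretrained ResNet encoder feeding a U-Net-style decoder with a non-local attention block at each level, might look as follows; the ResNet depth, channel widths, and exact attention placement are assumptions made for illustration:

```python
# Minimal sketch, assuming a ResNet-34 encoder and one non-local attention
# block per decoder level; not the thesis implementation.
import torch
import torch.nn as nn
import torchvision.models as models

class NonLocalBlock(nn.Module):
    """Self-attention over all spatial positions (non-local neural networks)."""
    def __init__(self, channels):
        super().__init__()
        inter = max(channels // 2, 1)
        self.theta = nn.Conv2d(channels, inter, 1)
        self.phi = nn.Conv2d(channels, inter, 1)
        self.g = nn.Conv2d(channels, inter, 1)
        self.out = nn.Conv2d(inter, channels, 1)

    def forward(self, x):
        b, _, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)  # (B, HW, C')
        k = self.phi(x).flatten(2)                    # (B, C', HW)
        v = self.g(x).flatten(2).transpose(1, 2)      # (B, HW, C')
        attn = torch.softmax(q @ k, dim=-1)           # pairwise affinities
        y = (attn @ v).transpose(1, 2).reshape(b, -1, h, w)
        return x + self.out(y)                        # residual connection

class DecoderBlock(nn.Module):
    """Upsample, concatenate the skip connection, attend, then convolve."""
    def __init__(self, in_ch, skip_ch, out_ch):
        super().__init__()
        self.up = nn.ConvTranspose2d(in_ch, out_ch, 2, stride=2)
        self.attn = NonLocalBlock(out_ch + skip_ch)
        self.conv = nn.Sequential(
            nn.Conv2d(out_ch + skip_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True))

    def forward(self, x, skip):
        x = torch.cat([self.up(x), skip], dim=1)
        return self.conv(self.attn(x))

class AttenResUNet(nn.Module):
    def __init__(self, num_classes=1):
        super().__init__()
        r = models.resnet34(weights=models.ResNet34_Weights.IMAGENET1K_V1)
        self.stem = nn.Sequential(r.conv1, r.bn1, r.relu)  # 64 ch, /2
        self.pool = r.maxpool
        self.enc1, self.enc2 = r.layer1, r.layer2          # 64 ch /4, 128 ch /8
        self.enc3, self.enc4 = r.layer3, r.layer4          # 256 ch /16, 512 ch /32
        self.dec3 = DecoderBlock(512, 256, 256)
        self.dec2 = DecoderBlock(256, 128, 128)
        self.dec1 = DecoderBlock(128, 64, 64)
        self.head = nn.Sequential(
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(64, num_classes, 1))

    def forward(self, x):
        e1 = self.enc1(self.pool(self.stem(x)))  # /4
        e2 = self.enc2(e1)                       # /8
        e3 = self.enc3(e2)                       # /16
        e4 = self.enc4(e3)                       # /32
        d = self.dec3(e4, e3)
        d = self.dec2(d, e2)
        d = self.dec1(d, e1)
        return self.head(d)                      # per-pixel tumor logits

net = AttenResUNet()
print(net(torch.randn(1, 3, 224, 224)).shape)    # torch.Size([1, 1, 224, 224])
```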
After segmentation, the thesis proposes a diagnostic model with three steps: quantitative feature extraction, feature selection, and construction of prediction models. First, radiomic features and network features are extracted. The iterative sparse representation (ISR) method is then adopted for feature selection to reduce redundancy and correlation. The selected features are used to build the predictive model, with a support vector machine (SVM) as the classifier.
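The abstract names ISR as the selector; the scikit-learn sketch below substitutes a generic L1-penalized sparse selector in its place, so it shows the shape of the select-then-classify stage rather than the exact ISR algorithm, and runs on synthetic stand-in features:

```python
# Sketch of the select-then-classify stage; an L1-penalized linear model
# stands in for ISR, and the data are synthetic stand-ins.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 300))            # radiomic + network features
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # stand-in labels

model = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectFromModel(            # sparse selection (ISR stand-in)
        LogisticRegression(penalty="l1", solver="liblinear", C=0.1))),
    ("svm", SVC(kernel="rbf")),            # final classifier, as in the thesis
])
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC on synthetic data: {auc:.3f}")
```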
The model achieves an AUC of 0.967 in distinguishing benign from malignant ovarian tumors. In discriminating Type I from Type II OEC, it yields an AUC of 0.823. In survival prediction, patients assigned to the high-risk group are more likely to have a poor prognosis, with a hazard ratio of 4.169.
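As a hedged illustration of the survival readout, the sketch below fits a Cox proportional-hazards model with the lifelines package on synthetic data; the exponentiated coefficient of the risk-group indicator is the kind of hazard ratio reported above. Data and variable names are invented for illustration.

```python
# Illustrative only: synthetic survival data, hypothetical variable names.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
high_risk = rng.integers(0, 2, size=n)      # 1 = model's high-risk group
# Simulate shorter survival times for the high-risk group.
time = rng.exponential(scale=np.where(high_risk == 1, 12.0, 48.0))
event = (rng.random(n) < 0.7).astype(int)   # 1 = event observed, 0 = censored

df = pd.DataFrame({"time": time, "event": event, "high_risk": high_risk})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "p"]])      # exp(coef) = hazard ratio
```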