Convolutional Neural Network for Reconstruction of 7T-like Images from 3T MRI Using Appearance and Anatomical Features
Advanced 7 Tesla (7T) Magnetic Resonance Imaging (MRI) scanners provide images with higher anatomical resolution than 3T MRI scanners, thus facilitating early diagnosis of brain diseases. However, 7T MRI scanners are far less accessible than 3T scanners. This motivates us to reconstruct 7T-like images from 3T MRI. We propose a deep Convolutional Neural Network (CNN) architecture that uses appearance (intensity) and anatomical (brain-tissue label) features as input to non-linearly map 3T MRI to 7T MRI. In the training step, we feed the CNN both the appearance and anatomical features of each 3T patch, and it outputs the intensity of the center voxel in the corresponding 7T patch. In the testing step, we apply the trained CNN to map each input 3T patch to a 7T-like image patch. Performance is evaluated on 15 subjects, each with both 3T and 7T MR images. Both visual and numerical results show that our method outperforms the comparison methods.
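The patch-based training setup described above can be sketched as follows. This is a minimal illustration of how one training pair might be built from co-registered 3T intensity, 3T tissue-label, and 7T intensity volumes; the patch size, two-channel layout, and helper name are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def make_training_pair(t3_vol, label_vol, t7_vol, center, half=4):
    """Hypothetical helper: build one CNN training pair.

    Input: a 3T patch with an appearance (intensity) channel and an
    anatomical (tissue-label) channel. Target: the intensity of the
    center voxel in the corresponding 7T patch.
    """
    x, y, z = center
    sl = (slice(x - half, x + half + 1),
          slice(y - half, y + half + 1),
          slice(z - half, z + half + 1))
    appearance = t3_vol[sl]                          # intensity feature
    anatomy = label_vol[sl]                          # tissue-label feature
    patch = np.stack([appearance, anatomy], axis=0)  # 2-channel 3D input
    target = t7_vol[x, y, z]                         # 7T center voxel
    return patch, target

# Toy volumes standing in for co-registered 3T/7T scans.
rng = np.random.default_rng(0)
t3 = rng.random((32, 32, 32))
labels = rng.integers(0, 4, size=(32, 32, 32)).astype(float)
t7 = rng.random((32, 32, 32))

patch, target = make_training_pair(t3, labels, t7, center=(16, 16, 16))
print(patch.shape)   # (2, 9, 9, 9): channels x 9x9x9 patch
```

During training, many such (patch, target) pairs would be sampled across the brain; at test time, the trained CNN predicts the 7T-like intensity voxel by voxel from overlapping 3T patches.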
Keywords: Training image · Anatomical feature · Sparse representation · Convolutional neural network · Deep convolutional neural network