Investigating Sex Related Phenotype Changes in Knockout Mice by Applying Deep Learning to X-Ray Images

Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1248)


We train a convolutional neural network (CNN) to classify the sex of mice from X-ray images, in order to develop a tool that can be used not only for quality control of high-throughput image data but also to identify sex-related phenotype alterations. The method achieved 98% accuracy, with recall and precision both at 0.98, and identified the chest and pelvis as the areas most relevant for sex classification. We identified four knockout lines (Duoxa2, Tmem189, Dusp3 and Il10rb) potentially affected by sex-related phenotype changes.
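The reported figures follow the standard definitions of accuracy, precision and recall for a binary classifier. As an illustrative sketch (the labels and predictions below are toy values, not the paper's data; 1 = male, 0 = female is an assumed encoding):

```python
# Illustrative metric computation for a binary sex classifier.
# Toy data only; label encoding (1 = male, 0 = female) is an assumption.
def binary_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
acc, prec, rec = binary_metrics(y_true, y_pred)
print(acc, prec, rec)  # 0.75 0.75 0.75
```

In the paper's setting, high precision and recall on both sexes is what makes misclassified images useful flags: a confident model disagreeing with the recorded sex is more likely to indicate a data-entry error or a genuine phenotype change than a model error.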

This study demonstrates that CNNs can be trained for the purpose of image quality control and can aid the discovery of novel genotype–phenotype associations. In addition to facilitating quality control, the method presented (1) allows the creation of a tool that will help phenotypers flag images of mice that should be inspected in more detail, (2) has highlighted areas of the mouse that are of particular interest in sex-related phenotype changes, and (3) has the potential to identify genes that may be causing sex-related phenotype changes and/or are involved in sexual dimorphism.
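The localization of relevant body regions (chest, pelvis) rests on class activation maps (CAMs): for a CNN whose final convolutional features are globally average-pooled before a linear classifier, the CAM for a class is the classifier-weighted sum of the feature maps. A minimal sketch with made-up shapes and values, not the paper's actual network:

```python
import numpy as np

def class_activation_map(features, class_weights):
    """CAM for one class.

    features: (C, H, W) final-layer conv feature maps.
    class_weights: (C,) linear-classifier weights for the target class.
    """
    # Weighted sum over the channel axis -> (H, W) heat map.
    cam = np.tensordot(class_weights, features, axes=([0], [0]))
    cam -= cam.min()
    if cam.max() > 0:
        cam /= cam.max()  # normalise to [0, 1] for overlay on the X-ray
    return cam

# Toy example: 3 feature maps of size 4x4; one channel carries a
# localized activation blob and dominates the classifier weights.
feats = np.zeros((3, 4, 4))
feats[0, 1:3, 1:3] = 1.0
weights = np.array([2.0, 0.1, 0.1])
cam = class_activation_map(feats, weights)
print(cam.shape)             # (4, 4)
print(cam[1, 1], cam[0, 0])  # 1.0 0.0
```

Upsampled to the input resolution and overlaid on the radiograph, such a map is what lets the authors read off which anatomical regions (chest and pelvis) drive the sex decision.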


Keywords: Phenotype · Knockout · CNN · Classification · Genotype



Our thanks to Professor Jesús Ruberte of UAB-Barcelona and Dr David Lafont of the Wellcome Trust Sanger Institute for insightful discussions about the physiological relevance of the class activation maps (CAMs). This work was supported by the United States National Institutes of Health (NIH) grant U54 HG006370.



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. European Molecular Biology Laboratory, European Bioinformatics Institute, Hinxton, Cambridge, UK
