
Virtual Fluorescence Translation for Biological Tissue by Conditional Generative Adversarial Network


Abstract

Fluorescence labeling and imaging provide an opportunity to observe the structure of biological tissues and play a crucial role in histopathology. However, labeling and imaging biological tissues still face several challenges, such as time-consuming tissue preparation steps, expensive reagents, and signal bias caused by photobleaching. To overcome these limitations, we present a deep-learning-based method for fluorescence translation of tissue sections, achieved with a conditional generative adversarial network (cGAN). Experimental results on mouse kidney tissues demonstrate that the proposed method can predict other types of fluorescence images from a single raw fluorescence image, and can implement virtual multi-label fluorescent staining by merging the generated fluorescence images. Moreover, the proposed method effectively reduces the time-consuming and laborious preparation involved in imaging, saving both cost and time.
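The full text details the exact architecture and training settings. As a rough illustration of the kind of conditional GAN the abstract describes, the sketch below shows a pix2pix-style setup for translating one fluorescence channel into another; the layer sizes, loss weights, and L1 term are common pix2pix defaults assumed here for illustration, not the authors' reported configuration.

```python
# Minimal sketch of a pix2pix-style conditional GAN for fluorescence-to-fluorescence
# translation. Network sizes, loss weights, and layer counts are illustrative
# assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn

def conv_block(c_in, c_out, down=True):
    """One encoder (strided conv) or decoder (transposed conv) block."""
    conv = nn.Conv2d(c_in, c_out, 4, stride=2, padding=1) if down \
        else nn.ConvTranspose2d(c_in, c_out, 4, stride=2, padding=1)
    return nn.Sequential(conv, nn.BatchNorm2d(c_out), nn.ReLU(inplace=True))

class Generator(nn.Module):
    """Encoder-decoder that maps one fluorescence channel to another."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(conv_block(1, 64), conv_block(64, 128), conv_block(128, 256))
        self.dec = nn.Sequential(conv_block(256, 128, down=False),
                                 conv_block(128, 64, down=False),
                                 nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1),
                                 nn.Tanh())
    def forward(self, x):
        return self.dec(self.enc(x))

class Discriminator(nn.Module):
    """PatchGAN-style critic conditioned on the source image (2 stacked channels)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(conv_block(2, 64), conv_block(64, 128),
                                 nn.Conv2d(128, 1, 4, padding=1))  # per-patch logits
    def forward(self, src, tgt):
        return self.net(torch.cat([src, tgt], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
adv_loss, l1_loss = nn.BCEWithLogitsLoss(), nn.L1Loss()
lambda_l1 = 100.0  # weight of the pixel-wise term, a common pix2pix default

def train_step(src, tgt):
    """src: raw fluorescence image; tgt: target fluorescence label (N,1,H,W in [-1,1])."""
    fake = G(src)
    # Discriminator: real (src, tgt) pairs -> 1, generated pairs -> 0.
    d_real, d_fake = D(src, tgt), D(src, fake.detach())
    loss_d = adv_loss(d_real, torch.ones_like(d_real)) + \
             adv_loss(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: fool the discriminator while staying close to the target in L1.
    d_fake = D(src, fake)
    loss_g = adv_loss(d_fake, torch.ones_like(d_fake)) + lambda_l1 * l1_loss(fake, tgt)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()
```

In this framing, virtual multi-label staining as described in the abstract amounts to running one trained generator per target stain on the same raw input and merging the generated single-channel outputs into a composite color image.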


Data Availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request. The code is available within the supplementary information.

Abbreviations

CNN: Convolutional neural network
DL: Deep learning
GAN: Generative adversarial network
H&E: Hematoxylin and eosin
DAPI: 4′,6-Diamidino-2-phenylindole
WGA: Wheat germ agglutinin
cGAN: Conditional generative adversarial network
MS-SSIM: Multi-scale structural similarity
CLAHE: Contrast limited adaptive histogram equalization
SSIM: Structural similarity
PSNR: Peak signal-to-noise ratio
MAE: Mean absolute error
NA: Numerical aperture
CMOS: Complementary metal oxide semiconductor
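Several of the abbreviations above (SSIM, PSNR, MAE) name the image-quality metrics used to compare generated and ground-truth fluorescence images. As a hedged sketch, they can be computed with standard scientific-Python tools as shown below; the helper name `evaluate_pair` and the assumption that images are normalized to [0, 1] are illustrative, and MS-SSIM (which typically needs a dedicated package such as pytorch_msssim) is omitted.

```python
# Sketch of the scalar image-quality metrics listed above (SSIM, PSNR, MAE),
# comparing a generated fluorescence image against its ground-truth counterpart.
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def evaluate_pair(generated: np.ndarray, target: np.ndarray) -> dict:
    """Both inputs are single-channel float arrays scaled to [0, 1]."""
    return {
        "SSIM": structural_similarity(target, generated, data_range=1.0),
        "PSNR": peak_signal_noise_ratio(target, generated, data_range=1.0),
        "MAE": float(np.mean(np.abs(target - generated))),
    }
```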


Acknowledgements

The authors are grateful to Mengyang Lu (Academy for Engineering and Technology, Fudan University, China) for providing experimental support. This work was supported in part by the National Natural Science Foundation of China (61871263, 12274092, and 12034005), in part by the Explorer Program of Shanghai (21TS1400200), in part by the Natural Science Foundation of Shanghai (21ZR1405200), and in part by the Medical Engineering Fund of Fudan University (YG2022-6).

Author information

Authors and Affiliations

Authors

Contributions

XL and BL performed the analysis and drafted the paper. BL and CL supported the gathering of study data and the design of the analysis. XL and DT made critical revisions and approved the final version.

Corresponding author

Correspondence to Dean Ta.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.

Ethics approval

Not applicable.

Consent to participate

Not applicable.

Consent to publication

Not applicable.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (RAR 10729 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Liu, X., Li, B., Liu, C. et al. Virtual Fluorescence Translation for Biological Tissue by Conditional Generative Adversarial Network. Phenomics 3, 408–420 (2023). https://doi.org/10.1007/s43657-023-00094-1


  • Received:

  • Revised:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s43657-023-00094-1

Keywords
