Abstract
Using a deep neural network, we demonstrate a digital staining technique, which we term PhaseStain, to transform quantitative phase images (QPI) of label-free tissue sections into images that are equivalent to brightfield microscopy images of the same samples after histological staining. Using pairs of image data (QPI and the corresponding brightfield images, acquired after staining), we train a generative adversarial network and demonstrate the effectiveness of this virtual-staining approach on sections of human skin, kidney, and liver tissue, matching the brightfield microscopy images of the same samples stained with Hematoxylin and Eosin, Jones' stain, and Masson's trichrome stain, respectively. This digital-staining framework may further strengthen various uses of label-free QPI techniques in pathology applications and in biomedical research in general, by eliminating the need for histological staining, reducing sample-preparation costs, and saving time. Our results provide a powerful example of the unique opportunities created by data-driven image transformations enabled by deep learning.
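The abstract describes training a generative adversarial network on paired QPI/brightfield images; the paper text here does not include code. The following is a minimal sketch of such a paired-image (conditional GAN) training step, assuming PyTorch. The tiny generator and discriminator, the `train_step` function, and the L1 weighting are illustrative placeholders, not the authors' PhaseStain architecture or loss.

```python
# Minimal sketch of paired-image GAN training, assuming PyTorch.
# The networks below are illustrative placeholders, not the PhaseStain model.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Maps a 1-channel quantitative phase image to a 3-channel RGB image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, phase):
        return self.net(phase)

class TinyDiscriminator(nn.Module):
    """Scores (phase, RGB) pairs as real (stained) or generated."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, padding=1),
        )

    def forward(self, phase, rgb):
        return self.net(torch.cat([phase, rgb], dim=1))

def train_step(gen, disc, opt_g, opt_d, phase, stained, l1_weight=100.0):
    """One adversarial update on a batch of (phase image, stained image) pairs."""
    bce = nn.BCEWithLogitsLoss()
    l1 = nn.L1Loss()

    # Discriminator: real pairs -> 1, generated pairs -> 0.
    fake = gen(phase).detach()
    d_real = disc(phase, stained)
    d_fake = disc(phase, fake)
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator: fool the discriminator while staying close to the true stain.
    fake = gen(phase)
    d_fake = disc(phase, fake)
    loss_g = bce(d_fake, torch.ones_like(d_fake)) + l1_weight * l1(fake, stained)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()
```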
Funding
The Ozcan Research Group at UCLA acknowledges the support of the NSF Engineering Research Center (ERC, PATHS-UP), the Army Research Office (ARO; W911NF-13-1-0419 and W911NF-13-1-0197), the ARO Life Sciences Division, the National Science Foundation (NSF) CBET Division Biophotonics Program, the NSF Emerging Frontiers in Research and Innovation (EFRI) Award, the NSF INSPIRE Award, the NSF Partnerships for Innovation: Building Innovation Capacity (PFI:BIC) Program, the National Institutes of Health (NIH, R21EB023115), the Howard Hughes Medical Institute (HHMI), the Vodafone Americas Foundation, the Mary Kay Foundation, and the Steven & Alexandra Cohen Foundation.