Working Group

AI in Diagnostics (KI in der Diagnostik)


Goals of the working group:

Image analysis is heavily used to quantify phenotypes of interest to biologists, especially in high-throughput experiments. Recent advances in automated imaging techniques and image analysis allow many treatment conditions to be tested in a single day, thus enabling the systematic evaluation of particular morphologies of cells and tissue samples.
A major challenge in many modern biological laboratories is obtaining information-rich morphological measurements of cells and tissues at high throughput. Herein, the term morphology will be used to refer to the full spectrum of biological phenotypes that can be observed and distinguished in images, including not only metrics of shape but also intensities, staining patterns, and spatial relationships.
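As a minimal sketch of what such a morphological measurement can look like (not the working group's actual pipeline; the feature set and the synthetic image are illustrative assumptions), the following computes a small feature vector of shape and intensity descriptors for one segmented cell, using only numpy:

```python
import numpy as np

def morphological_features(mask, intensity):
    """Compute a small feature vector for one segmented cell.

    mask:      boolean array marking the cell's pixels
    intensity: grayscale image of the same shape
    """
    ys, xs = np.nonzero(mask)
    centroid = (ys.mean(), xs.mean())
    # Mean distance of pixels from the centroid -- a crude shape descriptor
    spread = np.sqrt(((ys - centroid[0]) ** 2 + (xs - centroid[1]) ** 2).mean())
    return {
        "area": int(mask.sum()),
        "mean_intensity": float(intensity[mask].mean()),
        "max_intensity": float(intensity[mask].max()),
        "radial_spread": float(spread),
    }

# Synthetic example: a bright disc on a dark background
yy, xx = np.mgrid[:64, :64]
mask = (yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2
img = np.where(mask, 200.0, 10.0)

features = morphological_features(mask, img)
print(features)
```

In practice, libraries such as CellProfiler or scikit-image provide far richer feature sets (texture, granularity, neighborhood relationships), but the principle is the same: each cell is reduced to a vector of numbers that can be compared across treatments.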
A further development is currently underway: images are also being used as unbiased sources of quantitative information about cell or tissue state in an approach known as image-based profiling. In image-based profiling, a rich collection of measurements called a morphological profile is created. The morphological profile can be seen as a fingerprint of the cell or tissue under consideration. With image-based profiling, various treatment conditions can be compared to identify biologically relevant similarities for clustering samples or identifying matches. The general profiling strategy is described in [1].
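The comparison step can be sketched as follows: per-cell feature vectors are aggregated into one profile per sample, and profiles are then compared by a distance measure. This is a toy illustration with random data, not the strategy of [1] in full; the aggregation by median and the Euclidean distance are simplifying assumptions:

```python
import numpy as np

def morphological_profile(per_cell_features):
    """Aggregate per-cell feature vectors (cells x features) into a single
    profile for the sample -- here simply the per-feature median."""
    return np.median(per_cell_features, axis=0)

def profile_distance(p, q):
    """Euclidean distance between two profiles; smaller means more similar."""
    return float(np.linalg.norm(p - q))

rng = np.random.default_rng(0)
# Hypothetical data: three samples with 100 cells and 8 features each.
# Treatments A and B shift the phenotype in the same way; the control does not.
control   = rng.normal(0.0, 1.0, size=(100, 8))
treated_a = rng.normal(1.0, 1.0, size=(100, 8))
treated_b = rng.normal(1.0, 1.0, size=(100, 8))

profiles = {name: morphological_profile(cells) for name, cells in
            [("control", control), ("A", treated_a), ("B", treated_b)]}

dist_ab = profile_distance(profiles["A"], profiles["B"])
dist_ac = profile_distance(profiles["A"], profiles["control"])
print(dist_ab < dist_ac)  # A and B should match each other, not the control
```

On this basis, samples can be clustered or matched: treatments whose profiles lie close together are candidates for sharing a mechanism of action.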
Deep learning is revolutionizing computer vision across many domains. Deep learning eliminates the need for feature engineering, because it operates on the raw pixels of each image and simultaneously accumulates useful abstractions while ignoring irrelevant information. High-content imaging is a natural fit for deep learning because it can quickly produce thousands to millions of images for training deep networks. Recent open-source frameworks, such as TensorFlow (Google), Keras, and Caffe, provide convenient and effective means to adopt powerful deep learning architectures, such as convolutional neural networks, into medical and biomedical imaging research. With a wide range of practical applications emerging in various industries, we expect that such frameworks will become increasingly stable and robust, and thereby fit for clinical applications. The first clinical deep learning application, a medical imaging platform for healthcare, was approved by the FDA in 2017 (as reported by Forbes). This medical imaging platform will be used to help doctors diagnose heart problems.
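To make the "raw pixels to abstractions" idea concrete, the following implements the core operation of a convolutional neural network layer (convolution, nonlinearity, pooling) in plain numpy rather than one of the frameworks above; the edge-detecting kernel and the synthetic image are illustrative assumptions — in a real network the kernel weights would be learned from training images:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation -- the core operation of a
    convolutional layer, shown here without any framework."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Nonlinearity: keep positive responses, zero out the rest."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Downsample by taking the maximum in each size x size block."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

# A hand-crafted vertical-edge detector applied to a synthetic image
image = np.zeros((8, 8))
image[:, 4:] = 1.0                      # right half bright
kernel = np.array([[-1.0, 1.0],
                   [-1.0, 1.0]])        # responds to dark-to-bright steps
feature_map = max_pool(relu(conv2d(image, kernel)))
print(feature_map.shape)                # (3, 3)
```

Stacking many such layers, with learned kernels, is what lets a network build up from edges to textures to whole-cell or whole-lesion abstractions; frameworks like TensorFlow and Keras provide this same operation as a single optimized layer.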
Major challenges for fully capitalizing on the power of deep learning and applying it to medical images are the development of user-friendly workflows, satisfying heavy computational requirements, evaluation of diagnostic accuracy, and approval for clinical use.

Activities of the working group
The working group will organize introductory workshops and networking events. Topics include:
- High-content image analysis of cells and tissues at high throughput
- Medical and biomedical image analysis with deep learning, in particular:
  - Development of user-friendly deep learning workflows
  - Cloud computing (i.e., satisfying the heavy computational requirements of deep learning)
  - Evaluation of diagnostic accuracy
  - Approval for clinical use
- Image-based profiling
- High-content image analysis for different imaging techniques, such as:
  - PET/CT imaging
  - imaging flow cytometry


Publications in the context of the working group Image Analysis:

[1] J. Caicedo, …, H. Hennig, …, A.E. Carpenter. Data analysis strategies for image-based cell profiling. Nature Methods 14, 849 (2017)

[2] H. Hennig, P. Rees, T. Blasi, L. Kamentsky, J. Hung, D. Dao, A.E. Carpenter, and A. Filby. An open-source solution for advanced imaging flow cytometry data analysis using machine learning. Methods 112, 201 (2017)

[3] T. Blasi, H. Hennig, H.D. Summers, F.J. Theis, D. Davies, A. Filby, A.E. Carpenter, P. Rees. Label-free cell cycle analysis for high-throughput imaging flow cytometry. Nature Communications 7, 10256 (2016)