Defining Artificial Intelligence, Deep Learning, and Machine Learning in Diagnostic Imaging

Understanding the meanings of AI terms and their roles in diagnostic imaging.

Read this blog in Spanish.

If you attended a diagnostic imaging conference or read an industry publication in the past year, you likely heard the term “artificial intelligence”. If you’re unclear about its meaning – and its applications in diagnostic imaging – you’re not alone. In this blog, we’ll define artificial intelligence, deep learning, and machine learning – and explain their differences.

Let’s start our definitions with artificial intelligence (AI), which is both the theory and development of computer systems that can perform tasks that normally require human intelligence. As a research field, it investigates how to construct specific algorithms so they will behave in a way that can be deemed intelligent. We use artificial intelligence almost daily for numerous tasks. (1) Amazon suggests related products for you to buy and movies for you to watch next. Credit card companies use it to detect and alert you to unusual changes in your buying behavior that might indicate fraud.

Learn about AI applications at the Carestream stand at ECR 2018!


Carestream is building AI applications for diagnostic imaging.

It starts with an algorithm

The building block for these and other artificial intelligence applications is an algorithm. An algorithm is “a set of computing instructions designed to perform a specific task.” (2)  

The most basic ones automate a human activity; for example, when we ask our smartphones for a weather report. (3) Almost everything that has a computer inside it uses algorithms in one way or another. You are using simple image-processing algorithms when you crop or resize a photo. Facebook uses a proprietary and complex algorithm to determine which posts will appear at the top of your news feed.

Algorithms are not magic and they’re not executed in a black box. They are the result of detailed coding created by people that tells software exactly what to do – step by step.
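To make the “step by step” point concrete, here is a minimal sketch of an algorithm like the photo-crop example above – a toy illustration, not any particular product’s code:

```python
# A toy crop "algorithm": explicit, step-by-step instructions,
# applied to an image represented as a 2-D list of pixel values.
def crop(image, top, left, height, width):
    cropped = []
    for row in image[top:top + height]:         # step 1: keep the wanted rows
        cropped.append(row[left:left + width])  # step 2: keep the wanted columns
    return cropped

photo = [
    [0, 1, 2, 3],
    [4, 5, 6, 7],
    [8, 9, 10, 11],
]
print(crop(photo, 1, 1, 2, 2))  # -> [[5, 6], [9, 10]]
```

Every step is spelled out by a human programmer; the software does exactly what it is told, nothing more.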

So how do we make the leap from Siri serving up a list of nearby pizzerias when asked, to self-driving vehicles with powerful predictive capabilities? By teaching the machine reasoning, prioritization, problem solving, and deduction – which brings us to explanations of machine learning and deep learning.

Machine learning – interpreting data sets

Machine learning and deep learning are subsets and applications of AI. Both explore the study and construction of algorithms that can learn from data and make predictions based on it.

Let’s start with defining machine learning. According to Upwork, this “branch of computer science has to do with building algorithms that are guided by data. Rather than relying on human programmers to provide explicit instructions, machine learning algorithms use training sets of real-world data to infer models that are more accurate and sophisticated than humans could devise on their own.” (4) In other words, machine learning applications can learn. They get refined with experience, not unlike the human concept of learning.

For example, a simple algorithm can be programmed to recognize a photo of a Labrador Retriever on your phone as a “dog”. But it would not recognize a beagle as a “dog”. To do so, a computer programmer would need to define all the characteristics of every dog breed upfront, providing the necessary step-by-step instructions.

In contrast, machine learning uses complex algorithms to learn on its own. In the example of our Labrador, a computer programmer creates a data set of general characteristics of dogs. From this data set, the algorithms make their own inferences about other images – identifying that a Chihuahua is as much a dog as a Great Dane, and that a Siamese cat is not a canine.
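The idea of inferring rules from examples can be sketched in a few lines. This is a toy nearest-neighbour classifier with invented feature values (the features and numbers are hypothetical, chosen only to illustrate learning from data rather than hand-coded breed rules):

```python
import math

# Toy "training set" of (feature vector, label) pairs. The features
# (muzzle-to-head length ratio, claw retractability 0/1) and values
# are invented for this sketch, not real measurements.
training = [
    ((0.50, 0.0), "dog"),  # Labrador Retriever
    ((0.60, 0.0), "dog"),  # Great Dane
    ((0.30, 0.0), "dog"),  # Chihuahua
    ((0.20, 1.0), "cat"),  # Siamese
    ((0.25, 1.0), "cat"),  # Maine Coon
]

def predict(features):
    """Label an unseen example by its nearest training example (1-NN)."""
    nearest = min(training, key=lambda ex: math.dist(ex[0], features))
    return nearest[1]

print(predict((0.45, 0.0)))  # a beagle-like example -> dog
print(predict((0.22, 1.0)))  # a cat-like example    -> cat
```

No one wrote a rule for beagles; the label is inferred from the examples. Real machine learning replaces this toy distance measure with models trained on large data sets, but the principle is the same.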

Gaming technology gives boost to machine learning

The earliest applications of machine learning were introduced in the 1980s. However, the complex computations required more computing power than was readily available at the time. This changed around 2000 with the introduction of graphics processing units (GPUs). The same technology that renders virtual 3D fantasy worlds in games brought the needed computing power to machine learning. At the same time, inexpensive access to storage and the commoditization of hardware fueled the development of machine learning applications. Today, machine learning technology is accelerating at a rate beyond Moore’s Law, with algorithms and models doubling in capability every six months, according to Health Data Management. (5)

Deep learning – no training required


Learn about AI applications in diagnostic imaging at ECR 2018. Schedule your demo today!

Deep learning is machine learning on steroids. This greatly advanced form of machine learning explores the use of artificial neural networks – a family of algorithms inspired by the structure and function of the human brain.

Evolving from machine learning to deep learning – and from analysis to interpretation – is a very big leap. Why? Because the human brain has more than 85 billion neurons. Although artificial neural networks have increased in size, they typically contain only between a few thousand and a few million neurons. (6)

However, progress is being made. A new generation of computer chips, known as neuromorphic processors, is being designed to run brain-simulation code more efficiently. Systems such as IBM’s Watson cognitive computing platform use high-level simulations of human neurological processes to carry out an ever-growing range of tasks without being specifically taught how to do them. The possibilities led Google’s Director of Engineering, Ray Kurzweil, to predict that machines will have the capacity to be smarter than humans by 2029. (7)

Current AI applications in diagnostic imaging

Now that we understand the meanings of artificial intelligence, deep learning, and machine learning, let’s explore their applications in diagnostic imaging. There are two main areas being developed. One is quantification; the other is interpretation.

Here is an example of quantification currently supported by Carestream. Our partner, Zebra Medical Vision, created an algorithm that can calculate the density of a bone captured with CT imaging. The algorithm runs any time it detects an image of a bone – even when bone density is not the concern that brought the patient to see a physician. Additional algorithms analyze the pixels and bytes of data contained within the image to detect the distinct patterns associated with bone density.

The outcome of the algorithmic analysis is a metric. This is an example of the use of AI for quantification. The resulting number can be compared with a threshold metric to determine whether the patient is at risk of fracture. If the number is below the threshold, a doctor can prescribe a regular intake of calcium or other preventative measures. Other machine learning applications currently supported by Carestream include detection of high coronary calcium levels, fatty liver tissue, emphysema, and brain bleeds – and the list is growing.
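The quantification-then-threshold step can be sketched very simply. The threshold value and the metric scale below are invented for illustration – they are not Zebra Medical Vision’s actual algorithm or any clinical guideline:

```python
# Hypothetical illustration of AI-for-quantification: the imaging
# algorithm reduces its analysis to a single number, which is then
# compared with a threshold. The threshold here is invented for this
# sketch (loosely styled after a T-score), not a clinical value.
FRACTURE_RISK_THRESHOLD = -2.5

def assess_fracture_risk(bone_density_metric):
    """Flag the patient for follow-up if the metric falls below threshold."""
    if bone_density_metric < FRACTURE_RISK_THRESHOLD:
        return "at risk: consider preventative measures"
    return "within normal range"

print(assess_fracture_risk(-3.1))  # below threshold -> at risk
print(assess_fracture_risk(-1.0))  # above threshold -> within normal range
```

The hard part, of course, is producing the metric from the image in the first place; the comparison itself is trivial, which is why quantification is the nearer-term application.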

Diagnostic images are a treasure trove for AI learning

Evolving to this level of preventative care requires not only expertise in imaging analytics to develop the algorithms, but also access to huge libraries of images. The algorithms need to analyze many cases in order to become more knowledgeable and more accurate.

Carestream can play a key role in this evolution. We manage hundreds of billions of images in our medical imaging repositories in the cloud and at national and regional healthcare data centers; and the volume grows daily. These images can be raw material for clinical leaders to develop, test, and validate new algorithms.

Research teams and start-up companies across the world are producing new algorithms to cover more body parts and pathologies. It won’t take long before radiologists are equipped with thousands of predictive algorithms to automatically detect the patterns of the most common diseases. This application of advanced data analysis holds the exciting prospect of predicting which patients are most at risk for clinical events that require early intervention. It is an important advancement for delivering preventative care. Carestream is prepared to support these new investments with our modular and standards-based Clinical Collaboration Platform that is open to different integrations.

Will artificial intelligence replace radiologists in the near future?

Will deep learning put an end to the profession in 2029? It doesn’t take an algorithm to realize the answer is no.

Dr. Eliot Siegel, Professor and Vice Chair of Research Informatics at the University of Maryland School of Medicine, Department of Diagnostic Radiology, states that “while neural networks have been successful with very small (e.g. 220 x 240 pixel) images, they have not been applied to the far more complex images on a radiograph, much less a volumetric CT or MRI study.”

In his blog on 5 reasons why the future of radiologists is secure, he writes that “No one is anywhere close to having general success applying current techniques to medical images. In order to create a system to make radiology observations, one would need to combine thousands of algorithms developed over the past 25 years. Even these would only cover a tiny fraction of the diseases and diagnoses made by radiologists.”

Role of AI in radiology will grow

Predicting the future is risky business. However, we can say unequivocally that AI healthcare applications will continue to evolve. Research published by MarketsandMarkets projects that the healthcare artificial intelligence market will grow to more than $7.9 billion by 2022. (8) Further proof could be seen at RSNA17, which featured a Machine Learning Community and a Machine Learning Showcase for the first time in 2017.

Whether you call it artificial intelligence, big data, smart learning or some other term, you can be sure that algorithms will play an increasing role in quantification of diagnostic images. I hope this blog has given you a better understanding of the terms artificial intelligence, deep learning, and machine learning.  #AI #bigdata #ECR18 #ECR2018

Want to learn more about the meanings of artificial intelligence, deep learning, and machine learning?

Watch the video presentation from RSNA’s Machine Learning Showcase on “Creating Value in Enterprise Imaging Platforms with Analytics & AI”.

And read our blogs on:

Big Data in Radiology: Is The Future of Imaging a Number?

Will Radiologists Be Replaced by Computers?

Dario Arfelli is Global Health IT Marketing Manager at Carestream Health.


References

1. Beebom: 10 Examples of Artificial Intelligence You’re Using in Daily Life. https://beebom.com/examples-of-artificial-intelligence/

2. TechTerms: https://techterms.com/definition/algorithm

3. Mitrefinch: https://advancesystemsinc.com/ai-products-making-humans-former-employees/

4. Upwork: https://www.upwork.com/hiring/data/neural-networks-demystified/

5. Health Data Management: https://www.healthdatamanagement.com/news/lack-of-access-to-health-data-said-to-limit-potential-of-machine-learning

6. Upwork: https://www.upwork.com/hiring/data/neural-networks-demystified/

7. Fox News: https://www.foxnews.com/tech/2017/03/16/ray-kurzweil-predicts-computers-will-be-as-smart-as-humans-in-12-years.html

8. MarketsandMarkets: https://www.marketsandmarkets.com/PressReleases/artificial-intelligence-healthcare.asp
