
Artificial Intelligence and Radiology Have Rumors of the Radiologist's Demise Been Greatly Exaggerated?

Artificial intelligence is a rapidly evolving computerized technology affecting multiple aspects of our lives. It is predicted that artificial intelligence will lead to a fundamental change in the practice of many professional fields, including medicine. One of the most significant advances in artificial intelligence involves digital imaging and image recognition. Consequently, radiologists, who work in the most digitalized field of medicine, need to be familiar with this rapidly progressing technology. “Artificial intelligence,” “machine learning,” and “deep learning” are terms that tend to be used interchangeably when discussing advanced computer algorithms, but each has a different meaning. The objectives of this article are to demystify these terms for radiologists and to establish a basic understanding of this topic for the reader. We also discuss the impact that artificial intelligence might have on the field of radiology in the foreseeable future. Although artificial intelligence is unlikely to replace radiologists any time soon (if ever), we explore how this technology could be beneficial to radiologists.

Introduction

Artificial intelligence is a rapidly growing technical field positioned at the intersection of statistics and computer science. Artificial intelligence is currently used in many industries and has multiple applications in health care. Recently, there has been increased interest in applying artificial intelligence to medical imaging for a more accurate diagnosis of diseases. Consequently, radiology could become the first medical specialty significantly affected by this rapidly developing field.

Artificial intelligence technology has existed for more than 50 years and has become increasingly sophisticated. The British mathematician Alan Turing, one of the founders of modern computer science and artificial intelligence, proposed that a machine could be judged intelligent if it could broadly mimic the intelligence of humans, a criterion that later became popularized as the “Turing test.” The present revolution in data science started in early 2013 with the advent of IBM’s Watson supercomputer, which has immense computing power and the ability to analyze images with astonishing speed and accuracy. The advances in this field have been partially attributed to the wide availability of computer graphics processing units, which have made parallel processing faster, cheaper, and more powerful, allowing for major improvements in image recognition. In 2015, IBM purchased Merge Healthcare, giving its supercomputers access to a vast amount of existing medical records data for training to improve their ability to read imaging studies and marking the entrance of large corporations into the realm of automated image interpretation.

The terms “artificial intelligence,” “machine learning,” and “deep learning” have different meanings but are often used interchangeably. The purpose of this article is to define and clarify these fundamental terms for radiologists and to discuss the effects that artificial intelligence could have on the radiology profession in the near future. We also discuss how this technology may soon be beneficial to radiologists.

Artificial Intelligence: Basic Terms and Principles

Before the advent of artificial intelligence, traditional computer programs relied on written lines of code to achieve a specific task. The computer did not “think” but simply performed the task as it was programmed to do. In recent years, advanced algorithms have allowed computers to make decisions autonomously. These computers are not explicitly instructed on the paths to use when performing specific tasks but rather rely on mathematical and statistical models to direct their decision-making to arrive at optimal solutions to problems. Artificial intelligence is the broadest way to consider this advanced computer intelligence. In 1956, at the Dartmouth Artificial Intelligence Conference, this technology was described as follows: “Every aspect of learning or any other feature of intelligence can, in principle, be so precisely described, that a machine can be made to simulate it.”

Machine learning is one subfield of artificial intelligence. In 1959, Arthur Samuel, a pioneer in artificial intelligence research, defined machine learning as the “field of study that gives computers the ability to learn without being explicitly programmed.” By using statistical learning techniques, computers can automatically discover patterns in input data. Unlike software programs that require specific instructions to complete a task, with machine learning, the computer system develops the ability to recognize patterns independently and make predictions. Machine learning is now being applied in multiple everyday applications, including data security, financial trading, marketing personalization, fraud detection, product recommendations, online searches, speech recognition, translation between languages, and smart cars.
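This distinction between explicit programming and learning from data can be made concrete with a toy sketch (hypothetical and not from the article; the function names and measurements are invented for illustration). Instead of being given a hard-coded rule, the program estimates a decision threshold from labeled examples:

```python
# Minimal illustration: instead of hard-coding a rule, the program
# estimates one from labeled examples. Here we "learn" a threshold
# that separates two groups of measurements.

def learn_threshold(values, labels):
    """Pick the midpoint between the two class means as a decision boundary."""
    group0 = [v for v, y in zip(values, labels) if y == 0]
    group1 = [v for v, y in zip(values, labels) if y == 1]
    return (sum(group0) / len(group0) + sum(group1) / len(group1)) / 2

def predict(threshold, value):
    """Apply the learned decision rule to a new measurement."""
    return 1 if value >= threshold else 0

# Training data (made-up numbers): class 0 vs. class 1 measurements
sizes = [1.0, 1.2, 0.8, 3.9, 4.2, 4.0]
labels = [0, 0, 0, 1, 1, 1]

t = learn_threshold(sizes, labels)
print(predict(t, 0.9), predict(t, 4.5))  # → 0 1
```

The rule itself (the threshold) is never written by the programmer; it emerges from the data, which is the essence of Samuel's definition.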


Figure 1, Supervised machine learning—training. Input geometric shapes are manually labeled by the operator and pass through the algorithm to “teach” the computer the desired output sorting scheme.

Figure 2, Supervised machine learning—execution. After training is completed, the algorithm is able to perform the sorting by itself on a dataset on which it had previously been trained, using the same labeling that was used during training.
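The supervised workflow of Figures 1 and 2 can be sketched in a few lines of Python (a hypothetical illustration, not from the article): shapes are reduced to a simple feature (vertex count), the operator supplies the labels, and the trained model then sorts new shapes using those same labels:

```python
# Hypothetical sketch of supervised learning: operator-labeled examples
# are used to build a simple nearest-centroid model.

def train(examples):
    """examples: list of (vertex_count, label). Returns label -> mean feature."""
    sums, counts = {}, {}
    for x, label in examples:
        sums[label] = sums.get(label, 0) + x
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def classify(model, vertex_count):
    """Assign the label whose mean feature is closest to the input."""
    return min(model, key=lambda label: abs(model[label] - vertex_count))

# Operator-labeled training set ("teaching" phase, Figure 1)
training = [(3, "triangle"), (4, "square"), (3, "triangle"), (4, "square")]
model = train(training)

# Execution phase (Figure 2): the model sorts shapes using the operator's labels
print(classify(model, 3), classify(model, 4))  # → triangle square
```

Note that the output categories ("triangle", "square") come from the operator, which is what makes this learning supervised.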


Figure 3, Unsupervised machine learning—training. The computer recognizes by itself the differences between the input data, with no manual labeling done by the operator. The computer sorts the input shapes into output groups using its own labeling method.

Figure 4, Unsupervised machine learning—execution. After training is completed, the algorithm is able to perform the sorting by itself, even on a new dataset on which it had not previously been trained, using its own labeling method.


Figure 5, Deep learning—neural network. Each node in the diagram represents a processing unit. Nodes that perform similar operations are organized into layers. Information is processed sequentially from lower-level layers to higher-level layers, with increasing levels of abstraction. The end result is recognition of the input geometric shape as belonging to a unique shape category.
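The layered processing described in Figure 5 can be illustrated with a tiny hand-built network (a hypothetical sketch, not from the article; a trained network would learn these weights rather than have them set by hand). Each layer transforms the previous layer's output, so later layers represent increasingly abstract combinations of the raw input:

```python
# Hypothetical two-layer network computing XOR, a pattern no single
# node can represent on its own: the hidden layer extracts intermediate
# abstractions, and the output layer combines them.

def step(x):
    """Simple activation: the node "fires" if its input is positive."""
    return 1 if x > 0 else 0

def forward(x1, x2):
    # Hidden layer: two nodes detecting intermediate features
    h1 = step(x1 + x2 - 0.5)    # "at least one input active" (OR)
    h2 = step(x1 + x2 - 1.5)    # "both inputs active" (AND)
    # Output layer: combines the hidden layer's abstractions
    return step(h1 - h2 - 0.5)  # OR but not AND → XOR

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", forward(a, b))  # → 0, 1, 1, 0
```

Deep networks used for image recognition follow the same principle at vastly larger scale: early layers detect edges, intermediate layers detect shapes and textures, and final layers recognize whole categories.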


Will Artificial Intelligence Technology Replace Radiologists in the Foreseeable Future?


How Artificial Intelligence Will Affect Radiological Practice


Conclusion


References

  • 1. Turing A.M.: Computing machinery and intelligence. Mind 1950; 59: pp. 433-460.

  • 2. Watson to gain ability to “see” with planned $1B acquisition of Merge Healthcare. August 6, 2015; Available at http://www.merge.com/News/Article.aspx?ItemID=660

  • 3. Dartmouth Artificial Intelligence (AI) conference. Available at http://www.livinginternet.com/i/ii_ai.htm

  • 4. Samuel A.L.: Some studies in machine learning using the game of checkers. IBM J Res Dev 1959; 3: pp. 535-554.

  • 5. Dheeba J., Singh N., Selvi S.: Computer-aided detection of breast cancer on mammograms: a swarm intelligence optimized wavelet neural network approach. J Biomed Inform 2014; 49: pp. 45-52.

  • 6. Summers R.M., Beaulieu C.F., Pusanik L.M., et. al.: Automated polyp detector for CT colonography: feasibility study. Radiology 2000; 216: pp. 284-290.

  • 7. Chen S., Suzuki K., MacMahon H.: A computer-aided diagnostic scheme for lung nodule detection in chest radiographs by means of two-stage nodule-enhancement with support vector classification. Med Phys 2011; 38: pp. 1844-1858.

  • 8. El-Sayed A., Mohsen H.M., Revett K., et. al.: Computer-aided diagnosis of human brain tumor through MRI: a survey and a new algorithm. Expert Syst Appl 2014; 41: pp. 5526-5545.

  • 9. Gothwal H., Kedawat S., Kumar R.: Cardiac arrhythmias detection in an ECG beat signal using fast Fourier transform and artificial neural network. J Biomed Sci Eng 2011; 4: pp. 289-296.

  • 10. Erickson B.J., Korfiatis P., Akkus Z., et. al.: Machine learning for medical imaging. Radiographics 2017; 37: pp. 505-515.

  • 11. Le Cun Y., Jackel L.D., Boser B., et. al.: Handwritten digit recognition: applications of neural net chips and automatic learning. Neurocomputing 1990; 68: pp. 303-318.

  • 12. Lee H., Hansung L., Yunsu C., et. al.: Face image retrieval using sparse representation classifier with Gabor-LBP histogram. Lect Notes Comput Sci 2011; 6513: pp. 273-280.

  • 13. Metz C.: Google’s AI wins fifth and final game against Go genius Lee Sedol. Wired; March 15, 2016; Available at https://www.wired.com/2016/03/googles-ai-wins-fifth-final-game-go-genius-lee-sedol/

  • 14. The Economist: Artificial intelligence: the impact on jobs. Will smarter machines cause mass unemployment? June 25, 2016.

  • 15. Obermeyer Z., Emanuel E.J.: Predicting the future—big data, machine learning, and clinical medicine. N Engl J Med 2016; 375: pp. 1216-1219.

  • 16. Chockley K., Emanuel E.: The end of radiology? Three threats to the future practice of radiology. J Am Coll Radiol 2016; 13: pp. 1415-1420.

  • 17. Lakhani P., Sundaram B.: Deep learning at chest radiography: automated classification of pulmonary tuberculosis by using convolutional neural networks. Radiology 2017; 284: pp. 526-529.

  • 18. Hwan B.M., Hoon L.J., Heon Y.D., et. al.: Erroneous computer electrocardiogram interpretation of atrial fibrillation and its clinical consequence. Clin Cardiol 2012; 35: pp. 348-353.

  • 19. Bogun F., Anh D., Kalahasty G., et. al.: Misdiagnosis of atrial fibrillation and its clinical consequences. Am J Med 2004; 117: pp. 636-642.

  • 20. Anh D., Krishnan S., Bogun F.: Accuracy of electrocardiogram interpretation by cardiologists in the setting of incorrect computer analysis. J Electrocardiol 2006; 39: pp. 343-345.

  • 21. Davidenko J.M., Snyder L.S.: Causes of errors in the electrocardiographic diagnosis of atrial fibrillation by physicians. J Electrocardiol 2007; 40: pp. 450-456.

  • 22. Guglin M.E., Thatai D.: Common errors in computer electrocardiogram interpretation. Int J Cardiol 2006; 106: pp. 232-237.

  • 23. Roopa C.K., Harish B.S.: A survey on various machine learning approaches for ECG analysis. Int J Comput Appl 2017; 163: pp. 25-33.

  • 24. Lindfors K.K., McGahan M.C., Rosenquist C.J., et. al.: Computer-aided detection of breast cancer: a cost-effectiveness study. Radiology 2006; 239: pp. 710-717.

  • 25. Fenton J.J., Taplin S.H., Carney P.A., et. al.: Influence of computer-aided detection on performance of screening mammography. N Engl J Med 2007; 356: pp. 1399-1409.

  • 26. Rao V.M., Levin D.C., Parker L., et. al.: How widely is computer-aided detection used in screening and diagnostic mammography?. J Am Coll Radiol 2010; 7: pp. 802-805.

  • 27. Fenton J.J., Abraham L., Taplin S.H., et. al.: Effectiveness of computer-aided detection in community mammography practice. J Natl Cancer Inst 2011; 103: pp. 1152-1161.

  • 28. Siegal E.: Peering into the future through the looking glass of artificial intelligence. Presented at; the 2016 Society for Imaging Informatics in Medicine (SIIM) Conference; Available at https://c.ymcdn.com/sites/siim.org/resource/resmgr/siim2016/presentation/SIIM16_ClosingGS_Siegel.pdf

  • 29. Azavedo E., Zackrisson S., Mejare I., et. al.: Is single reading with computer-aided detection (CAD) as good as double reading in mammography screening? A systematic review. BMC Med Imaging 2012; 12: pp. 22.

  • 30. Gromet M.: Comparison of computer-aided detection to double reading of screening mammograms: review of 231,221 mammograms. AJR Am J Roentgenol 2008; 190: pp. 854-859.

  • 31. Castelvecchi D.: Can we open the black box of AI?. Nature 2016; 538: pp. 20-23.

  • 32. Parasuraman R.: Humans and automation: use, misuse, disuse, abuse. Hum Factors 1997; 39: pp. 230-253.

  • 33. Gillies R.J., Kinahan P.E., Hricak H.: Radiomics: images are more than pictures, they are data. Radiology 2016; 278: pp. 563-577.

  • 34. Lim M.C., Tan C.H., Cai J., et. al.: CT volumetry of the liver: where does it stand in clinical practice?. Clin Radiol 2014; 69: pp. 887-895.

This post is licensed under CC BY 4.0 by the author.