The impact of the introduction of Artificial Intelligence to Ireland

T. Anderson, WC. Torreggiani

Department of Radiology, Tallaght University Hospital, Dublin 24

 

Artificial intelligence (AI) is a branch of computer science dealing with the capability of a machine to imitate intelligent human behaviour. It has become a rapidly evolving part of the technology industry.

Diagnostic radiology, a computer-based service, is unsurprisingly at the forefront of the discussion of the use of AI in medicine. There are differing schools of thought regarding its use; namely, will AI eventually replace the radiologist? Or will it never be fully capable of replacing radiology as a specialty, instead serving as an aid whereby a human's input will always be required?

The current challenge for AI is to expand on the machine’s ability for “deep learning”. This concerns the potential for a computer programme to imitate the brain and learn by experience. To allow this to occur, the programme must be exposed to large amounts of existing data so that it may learn how to make its own diagnoses by example. Its downfall, however, lies in its inability to explain itself once a decision is made, because it weighs multifactorial, multi-layered and individually weighted connections to reach a final conclusion1. It has been shown that doctors will not blindly accept a machine’s decision without the transparency of knowing how the diagnosis was reached2.

It is proposed that AI will expedite diagnoses based on quick image interpretation and, furthermore, could prove to be cheaper for the Health Service. The question then arises: who is liable when software makes an error? To err is human, and as Garland pointed out in 19493, radiologists too are prone to human errors. A review from the UK published earlier this year4 highlighted that almost a third of medicolegal claims over the last 11 years targeted radiology as the primary specialty at fault5. A recent article regarding robotics in healthcare demonstrated the current lack of a liability framework where AI is concerned. The authors highlighted that although regulations will be needed, it would be important not to allow these rules to be so stringent as to impede technological developments6.

If, for example, a software flaw is inherent in an image diagnostics programme, one might assume that negligence liability will lie with the software developer. The courts, however, have seemed reluctant to attribute liability to the developers of healthcare applications of AI. Instead, the final decision, and hence the liability, has been deemed to rest with the healthcare practitioner7.

In Ireland, arguably the most important medical negligence case to date was that of Dunne vs. National Maternity Hospital in 19898. This landmark Supreme Court judgment laid out certain fundamental principles pertaining to medical negligence law in this country, including but not limited to the concept that: “the true test for establishing negligence in diagnosis or treatment on the part of a medical practitioner is whether he has been proved to be guilty of such failure as no medical practitioner of equal specialist status would be guilty of it [when] acting with ordinary care”. This recognises that even when exercising the “ordinary level of care”, doctors can get things wrong without being negligent per se. The case also recognises that there can reasonably be an “honest difference of opinions between doctors as to which is the better of two ways of treating a patient” and that choosing treatment A over treatment B does not necessarily provide grounds for negligence.

Currently in diagnostic radiology, a clinical query is asked of the interpreting radiologist. Once the study is acquired, the onus is on the radiologist to report on the entire image, not solely on the organ system to which the clinical query pertains.

The question arises: will AI’s high false positive rate require the radiologist to comment on every single highlighted aspect of a scan, whether they think it pertinent or not? From a medicolegal point of view, it might be prudent to state why they have ignored an alert deemed by the computer to be of significance.

In the absence of AI, should a radiologist omit to detect an early cancerous lesion, it might be reasonable to infer that human error led to a missed diagnosis. This would be especially true if they could show that another radiologist “of equal status could be guilty of such failure when acting with ordinary care”.

Conversely, it is proposed that if a radiologist using AI chooses to ignore a lesion highlighted by the software, and that lesion later turns out to be cancerous, the radiologist may not be deemed negligent if, at the time, they acknowledged the lesion in their official radiology report but, as in Dunne vs. National Maternity Hospital, expressed an honest difference of opinion with the software. Should they omit to do so, it is suggested that they will potentially face legal ramifications when a diagnosis is missed. We propose that in order to avoid liability, the radiologist will have to comment on each false positive alert, thus massively increasing their workload.

Based on the literature, there is no doubt that few foresee the imminent replacement of radiologists by AI. The common thought is that radiologists will remain a crucial cog in the diagnostic process of image-based medicine, with AI acting as a “cognitive companion”. It will likely improve patient outcomes and eventually save money in the process.

However, if each alert by a diagnostics programme must be addressed individually and remarked upon, it could mean an increased workload for the radiologist. This would be especially true in the technology’s infancy and, as such, would refute the idea that AI will replace radiology as a specialty.

Conflict of Interest
The authors of this manuscript declare no conflicts of interest in writing this piece. The authors state that this work has not received any funding.

Correspondence:
Dr. Toni Anderson
E-mail: [email protected]

References
1. Shortliffe EH. Computer-based medical consultations: MYCIN. New York, NY: American Elsevier; 1976.
2. Teach RL, Shortliffe EH. An analysis of physician attitudes regarding computer-based clinical consultation systems. Comput Biomed Res.
3. Garland LH. On the scientific evaluation of diagnostic procedures. Radiology 1949; 52: pp 309-328.
4. Hulson O. Litigation claims in relation to radiology: what can we learn? Clinical Radiology. 2018 Jul 3. pii: S0009-9260(18)30218-6. doi: 10.1016/j.crad.2018.05.025 [Epub ahead of print]
5. Charles SC. Coping with a medical malpractice suit. West J Med, 174 (1) (2001), pp. 55-58.
6. Cresswell K, Cunningham-Burley S, Sheikh A. Health Care Robotics: Qualitative Exploration of Key Challenges and Future Directions. J Med Internet Res. 2018 Jul 4;20(7):e10410. doi: 10.2196/10410.
7. Randolph A. Miller & Sarah M. Miller, Legal and Regulatory Issues Related to the Use of Clinical Software in Health Care Delivery, in Clinical Decision Support: The Road Ahead 423, 426 (Robert A. Greenes ed., 2007).
8. Dunne vs. National Maternity Hospital [1989] IR 91 (SC)
