Human-AI Teaming for the Next Generation Smart City Healthcare Systems

Written by M. Abdur Rahman, M. Shamim Hossain, Ahmad J. Showail, and Nabil A. Alrajeh

We have been witnessing impressive advances in healthcare provisioning in smart cities. Several technologies have contributed to this progress, including the Internet of Medical Things (IoMT), medical big data, edge learning, and 6G. With Artificial Intelligence (AI) capability at the edge, IoMT nodes such as CT scanners can now perform diagnosis at hospital edge nodes with very high accuracy and share the results with authorized medical personnel almost in real time. The massive volume of medical big data generated by IoMT devices each day is becoming unmanageable for humans. AI has therefore contributed superior forecasting and prediction, emergency health operations and response, prevention of infection spread, highly accurate medical diagnosis and treatment, and drug research capabilities.
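To make the edge-inference idea above concrete, the following minimal Python sketch shows one way an IoMT edge node might run a local diagnosis and notify authorized clinicians in near real time. The model, study identifiers, and notification channel are illustrative assumptions, not part of any specific deployment.

# Minimal sketch of edge-side diagnosis on an IoMT node (illustrative only).
# The classifier below is a stand-in; a real deployment would load a trained model.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class EdgeDiagnosis:
    study_id: str          # hypothetical identifier of the CT study
    finding: str           # label produced at the edge
    confidence: float      # model confidence in [0, 1]
    produced_at: str       # UTC timestamp for auditability

def run_edge_model(pixel_data: list) -> tuple:
    """Stand-in for an on-device model; returns (label, confidence)."""
    score = sum(pixel_data) / (255.0 * max(len(pixel_data), 1))
    return ("suspicious", round(score, 3)) if score > 0.5 else ("normal", round(1 - score, 3))

def diagnose_and_share(study_id: str, pixel_data: list, authorized_staff: list) -> EdgeDiagnosis:
    label, conf = run_edge_model(pixel_data)
    result = EdgeDiagnosis(study_id, label, conf,
                           datetime.now(timezone.utc).isoformat())
    for clinician in authorized_staff:   # near-real-time sharing, sketched here as a print
        print(f"notify {clinician}: {result}")
    return result

diagnose_and_share("ct-0001", [200, 190, 230], ["dr.lee@hospital.example"])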

Figure 1. Potential AI-assisted clinical workflow.

One area in which AI excels is demystifying complex patterns in medical images. The speed and accuracy with which AI extracts quantifiable digital evidence from medical images complement medical doctors during their decision-making processes [1]. Moreover, AI can aggregate multiple types of diagnosis results available in diverse electronic health and medical records, such as genomics, radiological images, and pathological, physiological, and psychological data.
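As a rough illustration of the multimodal aggregation described above, the Python sketch below bundles several record types into one patient view. The record types follow the list above, while the field names, helper method, and example values are invented for the example and do not represent a standardized clinical schema.

# Illustrative multimodal patient record aggregation (not a clinical schema).
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class MultimodalRecord:
    patient_id: str
    genomics: Dict[str, str] = field(default_factory=dict)       # e.g. variant -> annotation
    radiology: List[str] = field(default_factory=list)           # image study identifiers
    pathology: List[str] = field(default_factory=list)           # report identifiers
    physiology: Dict[str, float] = field(default_factory=dict)   # vitals and lab values
    psychology: List[str] = field(default_factory=list)          # assessment notes

    def modalities_present(self) -> List[str]:
        """List the record types that actually contain data for this patient."""
        return [name for name, value in (("genomics", self.genomics),
                                         ("radiology", self.radiology),
                                         ("pathology", self.pathology),
                                         ("physiology", self.physiology),
                                         ("psychology", self.psychology)) if value]

record = MultimodalRecord("p-42", radiology=["ct-0001"], physiology={"spo2": 0.96})
print(record.modalities_present())   # ['radiology', 'physiology']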

As shown in Fig. 1, AI has contributed to four high-level clinical processes (a minimal pipeline sketch in code follows this list):

a) detection and isolation,
b) clinical judgment and interpretation,
c) observation and treatment, and
d) continuous monitoring of objects of interest.
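The sketch below chains these four processes for a single case, assuming a simple staged pipeline. The stage names follow Fig. 1; the handler functions and their outputs are purely illustrative placeholders.

# Illustrative four-stage clinical pipeline following Fig. 1 (placeholder logic only).
from typing import Callable, Dict, List

def detect_and_isolate(case: Dict) -> Dict:
    case["flagged_regions"] = ["region-1"]           # stand-in for AI-detected findings
    return case

def judge_and_interpret(case: Dict) -> Dict:
    case["interpretation"] = "suspected malignancy"  # stand-in for clinical judgment support
    return case

def observe_and_treat(case: Dict) -> Dict:
    case["treatment_plan"] = "biopsy then review"    # stand-in for a proposed plan
    return case

def monitor(case: Dict) -> Dict:
    case["follow_up"] = "re-image in 30 days"        # stand-in for continuous monitoring
    return case

PIPELINE: List[Callable[[Dict], Dict]] = [
    detect_and_isolate, judge_and_interpret, observe_and_treat, monitor,
]

case = {"case_id": "c-7"}
for stage in PIPELINE:
    case = stage(case)
print(case)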

For example, AI algorithms can localize and annotate suspected regions in medical images to reduce observational oversights and assist in initial screening. AI algorithms can help with early detection of lung cancer, incidental findings of asymptomatic brain abnormalities, robust screening mammography interpretation, prostate lesion detection in cancer imaging, and macular edema in fundus images. Once a finding is detected, AI algorithms can, for example in the case of cancer tumor cells, help provide clinical interpretation of intra- and inter-tumor heterogeneity, abnormality, growth rate, and variability. AI algorithms can also help classify abnormalities as benign or malignant, categorize tumors into cancer stages, and associate tumor features with genomic data. Finally, capturing such a large number of discriminative tumor features would allow the AI monitoring process to spatially locate the physical spread of cancerous tumor cells, as well as temporally track changes in tumor cell characteristics as part of prognosis. AI can support the clinical workflow with interventions at different stages of medical care. For example, in the case of oncological treatment, AI can assist human subject matter experts during radiological diagnosis of a mass lesion, histopathologic diagnosis, molecular genotyping, and determination of clinical outcome.
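As a hedged illustration of the classification and temporal tracking just described, the Python sketch below follows a tumor finding across two time points. The threshold, growth metric, and labels are invented for the example and carry no clinical meaning.

# Illustrative tumor tracking across time points (thresholds and labels are arbitrary).
from dataclasses import dataclass

@dataclass
class TumorFinding:
    study_date: str
    diameter_mm: float
    malignancy_score: float   # placeholder score in [0, 1] from an upstream classifier

def classify(finding: TumorFinding) -> str:
    """Toy benign/malignant call based on an arbitrary score threshold."""
    return "malignant" if finding.malignancy_score >= 0.5 else "benign"

def growth_rate(baseline: TumorFinding, follow_up: TumorFinding) -> float:
    """Relative change in diameter between two studies."""
    return (follow_up.diameter_mm - baseline.diameter_mm) / baseline.diameter_mm

baseline = TumorFinding("2021-01-10", diameter_mm=8.0, malignancy_score=0.62)
follow_up = TumorFinding("2021-03-10", diameter_mm=9.2, malignancy_score=0.71)

print(classify(baseline), classify(follow_up))            # malignant malignant
print(f"growth: {growth_rate(baseline, follow_up):.0%}")  # growth: 15%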

Despite their superior computing and medical data processing power, AI algorithms have not been widely accepted in existing smart city healthcare scenarios, because AI has not yet been able to support human-like interaction and context-aware responses to human queries. The new generation of AI models, however, has moved past its “black box” era and entered a new dimension of social acceptability. By incorporating explainability into the AI ecosystem, AI researchers have been trying to combine the power of human intelligence with the superiority of machine intelligence.

Human-AI teaming for smart city healthcare applications poses several challenges that need to be addressed. Healthcare domains are governed by human ethics, fairness, patient safety and security, financial regulations and insurance policies, privacy of medical data, and auditing support in case a medical doctor makes a mistake. By adopting features such as ethics, absence of bias, trustworthiness, explainability, safety guarantees, data privacy, system security, and auditability, AI is becoming more sophisticated, and an increasing amount of decision-making is being performed via ‘human-AI teaming’ without compromising performance. In the medical environment, every healthcare service provider understands the context of his or her patients and uses diagnosis results and medical data points as evidence for the future treatment plan. In some cases, the intelligence of the medical doctor needs to be complemented, or even overruled, by AI cyber elements. To bring trust to the AI system, clinicians must be convinced through semantic insight into, and evidence of, the underlying AI processes [2]. Making Explainable AI (XAI) interpretable by healthcare stakeholders offers them a degree of qualitative, functional understanding, or what has been called semantic interpretation. Although some progress has been made toward the advancement of XAI, allowing healthcare stakeholders to understand XAI’s decision-making process remains a novel challenge [3].

What makes XAI appealing is that it can answer queries such as “Would you trust the AI that advises invasive surgery based on the tumor image classification?” by presenting various types of evidence and diagnostic data points. Figure 2 shows a high-level scenario in which AI works with human actors as a team. We assume that the next generation of AI will have multimodal features embedded to support human-like ethics, avoid bias in datasets and inference, provide trust to its stakeholders, preserve the privacy and security of the underlying dataset and model, allow debugging and auditing of the process, and ensure the safety of the AI model. Human queries are answered by the AI engine with the appropriate evidence. Steps in the treatment plan are laid out much as when one doctor works with another and they share their mutual experience in deciding a treatment plan, thereby allowing doctors to work with AI counterparts side by side.

Figure 2 shows a human doctor-XAI interaction in which the doctor reviews a melanoma skin cancer report generated by the XAI. After receiving the query from the doctor, the XAI replies with the final inference from the skin cancer model. The human doctor can then question different aspects of the AI decision by sending queries to the engine until convinced of the results. For example, the XAI engine replies with a marker over the cancerous area that was used in the decision-making process. Because of a possible co-morbidity with COVID-19 infection, the doctor asks about the diagnosis results available from the X-ray and CT scan images. After returning a positive result, the XAI model presents four pieces of evidence that were used by the algorithm itself. First, it marked the suspected region of infection. Second, the algorithm did not find any similarity between this infection and existing pneumonia. Third, it found 97.5% similarity with the COVID-19 symptom database it was matched against. Finally, it presented the progression of the same patient’s infection on days 3, 7, and 9, which plays a key role in the COVID-19 pathogen’s spreading pattern.
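The following is a minimal sketch of the query-and-evidence exchange just described, assuming a toy XAI engine that returns an inference together with the evidence items behind it. The evidence types, similarity figure, and query strings mirror the scenario above but are otherwise invented for illustration.

# Illustrative doctor-to-XAI exchange: each answer carries the evidence behind it.
from dataclasses import dataclass
from typing import List

@dataclass
class Evidence:
    kind: str        # e.g. "region_marker", "similarity", "progression"
    detail: str

@dataclass
class XaiAnswer:
    inference: str
    evidence: List[Evidence]

def answer_query(query: str) -> XaiAnswer:
    """Toy XAI engine keyed on the Fig. 2 scenario; a real engine would query its models."""
    if "covid" in query.lower():
        return XaiAnswer(
            inference="COVID-19 positive",
            evidence=[
                Evidence("region_marker", "suspected infection region marked on CT"),
                Evidence("differential", "no similarity found with existing pneumonia"),
                Evidence("similarity", "97.5% match against COVID-19 symptom database"),
                Evidence("progression", "lesion progression shown for days 3, 7, and 9"),
            ],
        )
    return XaiAnswer("melanoma suspected",
                     [Evidence("region_marker", "marker over the cancerous skin area")])

for q in ["Show the melanoma inference", "Is there COVID-19 co-morbidity?"]:
    ans = answer_query(q)
    print(q, "->", ans.inference, f"({len(ans.evidence)} evidence items)")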

We envision a democratized XAI adopted by smart cities, where physical patients and IoMT devices will have digital twins managed by the XAI. Future smart cities will leverage XAI to offer innovative medical breakthroughs in telehealthcare, doctor-on-demand services, remote medical diagnosis, and in-home personalized healthcare support, opening the door for ubiquitous human-XAI teaming in healthcare.

Figure 2. A scenario of the next generation smart city healthcare system, where AI will work with human actors as a team.

References

  1. W. L. Bi et al., “Artificial intelligence in cancer imaging: Clinical challenges and applications,” CA. Cancer J. Clin., vol. 69, no. 2, pp. 127–157, 2019, doi: 10.3322/caac.21552.
  2. M. A. Rahman, M. S. Hossain, N. Alrajeh, and N. Guizani, “B5G and Explainable Deep Learning Assisted Healthcare Vertical at the Edge: COVID-19 Perspective,” IEEE Netw., 2020.
  3. J. B. Lamy, B. Sekar, G. Guezennec, J. Bouaud, and B. Séroussi, “Explainable artificial intelligence for breast cancer: A visual case-based reasoning approach,” Artif. Intell. Med., vol. 94, pp. 42–53, 2019.

This article was edited by Wei Zhang

For a downloadable copy of the June 2021 eNewsletter, which includes this article, please visit the IEEE Smart Cities Resource Center.

Dr. Md Abdur Rahman is an Associate Professor in the Department of Cyber Security and Forensic Computing, College of Computing and Cyber Sciences, University of Prince Muqrin (UPM), Madinah, Saudi Arabia. Dr. Rahman holds an honorary external research fellowship at King’s College London (KCL), UK. His research interests include blockchain and off-chain solutions, digital twin systems, explainable AI for smart cities, cyber-physical multimedia systems, IoT, and 5G security. He received the Best Researcher Award from UPM in 2018, 2019, and 2020. He has authored more than 125 publications and has one US patent granted, with several pending. Dr. Rahman has received more than 19 million SAR in research grants from KACST, KSA, and other international funding bodies. Dr. Rahman has received three best paper awards from ACM and IEEE. He is a member of ACM and a senior member of IEEE.
M. Shamim Hossain is currently a Professor with the Department of Software Engineering, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia. He is also an adjunct professor with the School of Electrical Engineering and Computer Science, University of Ottawa, ON, Canada. He is the chair of the IEEE Special Interest Group on Artificial Intelligence (AI) for Health within the IEEE ComSoc eHealth Technical Committee. Currently, he is the Co-Chair of the 1st IEEE GLOBECOM 2021 Workshop on Edge-AI and IoT for Connected Health. He is on the editorial boards of IEEE Transactions on Multimedia, IEEE Multimedia, IEEE Network, IEEE Wireless Communications, IEEE Access, the Journal of Network and Computer Applications, and the International Journal of Multimedia Tools and Applications. He is a senior member of both IEEE and ACM, and an IEEE ComSoc Distinguished Lecturer (DL).
Dr. Nabil A. Alrajeh obtained his Ph.D. in biomedical informatics engineering from Vanderbilt University, USA. Currently, Dr. Alrajeh is a professor of Health Informatics at King Saud University and the Rector of Prince Mugrin Bin Abdulaziz University. Dr. Alrajeh worked as a senior advisor for the Ministry of Higher Education, where his role was to implement development programs covering educational affairs, strategic planning, and research and innovation. Dr. Alrajeh is a board member of several private universities in Saudi Arabia.
Dr. Ahmad Showail is an assistant professor of computer science in the College of Computer and Cyber Sciences at the University of Prince Mugrin, Madinah, Saudi Arabia. He is also an assistant professor of computer engineering in the College of Computer Science and Engineering at Taibah University. Dr. Ahmad holds a BSc (Honors) in computer engineering from King Fahd University of Petroleum and Minerals, and an MSc and PhD in computer science from King Abdullah University of Science and Technology (KAUST). While at KAUST, he received the Academic Excellence Award (top 5%). Before joining KAUST, he worked for several years as a system engineer with SABIC. Dr. Ahmad has published several papers in top journals and magazines and has filed a US patent. His research focuses on wireless networks, smart cities, and the Internet of Things. He was a visiting scholar at the University of Oxford and Texas A&M University.
