1. Explainable artificial intelligence (XAI): closing the gap between image analysis and navigation in complex invasive diagnostic procedures
This literature review discusses the challenges of accurately diagnosing bladder cancer through cystoscopy, highlighting false negatives and false positives as risks of the procedure. The authors propose that XAI-enabled, robot-assisted cystoscopes could mitigate these risks and provide a more accurate diagnosis, and they suggest that cystoscopy is a good starting point for automation that could serve as a model for other procedures. Additionally, delegating cystoscopy to a specialized nurse could free up urologists' time: the output of an automated diagnostic cystoscopy would be a short video that the urologist could review at a more convenient time.
2. Applications of Explainable Artificial Intelligence in Diagnosis and Surgery
Artificial intelligence (AI) has shown great promise in medicine in recent years, but its black-box nature makes clinical application challenging due to a lack of explainability. To overcome this limitation, researchers have explored explainable artificial intelligence (XAI) techniques, which provide both the model's decision and an explanation of how it was reached, making the AI more transparent and interpretable. In this literature review, the authors survey recent trends in XAI for medical diagnosis and surgery, searching several databases for articles published between 2019 and 2021 and extracting and analyzing relevant information from the studies that met their selection criteria. The review includes an experimental showcase on breast cancer diagnosis that illustrates how XAI can be applied in practice. The authors also summarize the XAI methods used in medical applications, describe the challenges researchers encountered, and discuss future research directions. The survey indicates that medical XAI is a promising research direction, and the review aims to serve as a reference for medical experts and AI scientists designing medical XAI applications.
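To make the "decision plus explanation" idea concrete, here is a minimal sketch in the spirit of the breast cancer showcase described above: a black-box classifier produces predictions, and a post-hoc explanation (permutation feature importance) indicates which inputs the model relies on. This is an illustrative example using scikit-learn's bundled breast cancer dataset, not the paper's actual pipeline or method.

```python
# Sketch: pair a black-box model's decisions with a post-hoc explanation.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# The "decision": a class prediction for each test sample.
preds = model.predict(X_test)

# The "explanation": how much each feature contributes to performance,
# measured by the score drop when that feature is shuffled.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)
top = sorted(zip(data.feature_names, result.importances_mean),
             key=lambda t: -t[1])[:5]
for name, score in top:
    print(f"{name}: {score:.3f}")
```

Permutation importance is model-agnostic, which is one reason post-hoc explanations of this kind are popular for black-box models; many of the surveyed methods follow the same decision-then-explanation pattern.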
3. Explainable artificial intelligence in skin cancer recognition: A systematic review
Deep neural networks (DNNs) are increasingly popular in medical applications because of their ability to solve complex problems. However, their decision-making is essentially a black-box process, which makes it difficult for physicians to judge the reliability of the decisions; explainable artificial intelligence (XAI) has been suggested as a solution. In this study, the authors investigate how XAI is used for skin cancer detection, including the development of new DNNs, commonly used visualizations, and evaluations with dermatologists or dermatopathologists. They searched various databases for peer-reviewed studies published between January 2017 and October 2021 using specific search terms, and found that while XAI is commonly applied during the development of DNNs for skin cancer detection, there is a lack of systematic and rigorous evaluation of its usefulness in this setting.
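Among the visualizations such reviews survey, one of the simplest is occlusion sensitivity: slide an occluding patch over the image and record how much the classifier's confidence drops at each position. The sketch below illustrates the mechanism with a toy scoring function standing in for a trained DNN; it is not from the reviewed studies.

```python
# Sketch: occlusion sensitivity, a simple saliency-style visualization.
import numpy as np

def score(image):
    # Toy stand-in for a classifier's confidence: responds only to the
    # bright patch in the centre of the image.
    return image[8:16, 8:16].mean()

image = np.zeros((24, 24))
image[8:16, 8:16] = 1.0          # a bright "lesion" in the centre
baseline = score(image)

# Slide an occluding patch over the image; the confidence drop at each
# position indicates how important that region is to the decision.
patch, stride = 4, 4
heatmap = np.zeros((24 // stride, 24 // stride))
for i in range(0, 24, stride):
    for j in range(0, 24, stride):
        occluded = image.copy()
        occluded[i:i+patch, j:j+patch] = 0.0
        heatmap[i // stride, j // stride] = baseline - score(occluded)

print(heatmap.round(2))  # peaks over the central region the score uses
```

Gradient-based methods (saliency maps, Grad-CAM and relatives) serve the same purpose more efficiently on real DNNs, but occlusion makes the underlying question explicit: which pixels, if removed, change the decision?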
4. Explainable artificial intelligence for precision medicine in acute myeloid leukemia
This article discusses the limitations of using artificial intelligence (AI) for personalized treatment based on drug screening and whole-exome sequencing (WES) experiments, limitations that stem from the "black box" nature of AI decision-making, and introduces explainable AI (XAI) as a way to make AI results more understandable to humans. It presents a new XAI method, multi-dimensional module optimization (MOM), which associates drug screening with genetic events to provide an interpretable and robust therapeutic strategy for acute myeloid leukemia (AML) patients, and reports the method's success in predicting AML patients' response to several drugs based on FLT3, CBFβ-MYH11, and NRAS status. The article emphasizes the potential of XAI to help healthcare providers and drug regulators better understand AI-driven medical decisions.
5. Machine learning in postgenomic biology and personalized medicine
In recent years, artificial intelligence in the form of machine learning has been revolutionizing biology, the biomedical sciences, and gene-based agricultural technology. The massive datasets generated by rapid, deep gene sequencing and by protein and other molecular structure determination require analysis capabilities distinctly different from classical statistical methods; at the same time, these datasets enable novel data-intensive machine learning algorithms to tackle biological problems that until recently relied on computationally expensive mechanistic, model-based approaches. This review provides a bird's-eye view of the applications of machine learning in post-genomic biology and attempts to indicate, as far as possible, the areas of research poised for further impact, including the importance of explainable artificial intelligence (XAI) in human health. Further contributions of machine learning are expected to transform medicine, public health, and agricultural technology, and to provide invaluable gene-based guidance for managing complex environments in this age of global warming.
6. Deep Learning in Neuroimaging: Overcoming Challenges With Emerging Approaches