A recent study published in Radiology: Imaging Cancer found that most women are comfortable with AI being used to support radiologists in interpreting mammograms. The survey found that 71% of women said they are fine with AI playing a supporting role in the review process. However, trust in AI alone remains low: fewer than 5% of women said they would be comfortable with AI being the sole reviewer of their mammogram results. This suggests that while women welcome innovation, they still value human expertise, especially when it comes to something as sensitive as breast cancer, which can be life-altering.

What Is Mammography?
Mammography is a specialized medical imaging test that uses low-dose X-rays to detect early signs of breast cancer, even before symptoms appear. It can help identify tumors, calcifications, and other abnormalities in breast tissue. Mammograms are a vital part of routine breast cancer screening, especially for women over the age of 40 or those with a family history of the disease. During a mammogram, the breast is compressed between two plates to spread the tissue and obtain clear images. Radiologists then analyze these images, looking for any suspicious signs that may indicate cancer.

How Does AI Play a Role in Mammography?
With advances in technology, artificial intelligence (AI) is increasingly being used to assist radiologists in reviewing mammogram results. AI algorithms are trained to identify patterns and flag abnormalities, often spotting subtle changes that could be missed by the human eye. In AI-assisted mammography, the computer does not replace the radiologist but acts as a second reader, offering another layer of review. This can increase accuracy, reduce false positives and false negatives, and streamline the screening process. Most importantly, AI can quickly process thousands of images, making it a valuable tool in busy healthcare settings.
How Was the Study Conducted?
The study surveyed 518 women at UT Southwestern Medical Center over seven months in 2023. The results showed that women with higher education levels and more awareness of AI were twice as likely to accept AI in their screenings. However, the study found that Black and Hispanic women expressed greater concern about data privacy and potential AI bias, and these groups were less likely to trust AI. These findings highlight the importance of addressing equity and transparency in AI development.

There have been previous cases in which AI has been biased against African Americans. Robert Williams, a Detroit resident, was arrested in front of his children and held in detention overnight after a false positive from an AI facial recognition system; he later learned that faulty AI had identified him as a suspect. Michael Oliver, another Detroit resident, and Nijeer Parks of New Jersey experienced the same: both were victims of false positives in AI facial recognition systems. Instances like these help explain why women of color are apprehensive about AI being used in their diagnoses, given its perceived bias.

Meanwhile, women with a family history of breast cancer showed a higher degree of trust when their results were normal, possibly due to relief and reassurance. But women who had previously experienced abnormal results were more cautious, especially when AI and radiologists offered conflicting assessments.

Dr. Basak Dogan, a co-author of the study and director of breast imaging research at UT Southwestern, emphasized the importance of gaining patient trust. “If patients are hesitant or skeptical about AI’s role in their care, this could impact screening adherence and, consequently, overall health care outcomes,” she said. Patients need to feel confident not just in the technology but also in how it is being used.