
I did a bunch of research essays on medical uses of AI/ML and I'm not terrified; in fact, healthcare is probably the single most significant application of these technologies. One of the most cited uses is expert analysis of medical imaging, especially breast cancer imaging. There is a lot of context to unpack around breast cancer imaging, or more succinctly put, controversial drama! The fact is there is a statistically high rate of false positives in breast cancer diagnoses made by human doctors. This led to a major policy shift toward screening women less often, depending on their age, because so many women underwent breast surgery that turned out to be based on a false positive. The old saying that to make an omelet one must break a few eggs sometimes gets invoked here, and it's a terrible euphemism.

AI has proven to be good at reading medical images, and in the case of breast cancer seems to outperform humans. Human readers have a monotonous job reviewing image after image, and they would rather be safe than sorry later, so of course their false positive rates are high. The machines never get tired, they never get biased (this is a bone of contention), and they never stop. Ultimately a human doctor still has to review the images; the machine simply flags when the doctor may be too aggressive in a diagnosis, or possibly missing something, and any disparity gets escalated. The outcomes from early studies are encouraging, but these studies take years and are very expensive. One of the biggest problems is that the technology proficiency of medical staff is low, so we are now in a situation where software engineers are cross-training to the level of a nurse, or even a doctor in rare cases.


> AI has proven to be better at looking at medical image, and in the case of breast cancer seems to out perform humans

FWIW, https://pmc.ncbi.nlm.nih.gov/articles/PMC11073588/ from 2024 Apr 4 ("Revolutionizing Breast Cancer Detection With Artificial Intelligence (AI) in Radiology and Radiation Oncology: A Systematic Review") says:

"Presently, when a pre-selection threshold is established (without the radiologist's involvement), the performance of AI and a radiologist is roughly comparable. However, this threshold may result in the AI missing certain cancers.

To clarify, both the radiologist and the AI system may overlook an equal number of cases in a breast cancer screening population, albeit different ones. Whether this poses a significant problem hinges on the type of breast cancer detected and missed by both parties. Further assessment is imperative to ascertain the long-term implications"

and concludes

"Given the limitations in the literature currently regarding all studies being retrospective, it has not been fully clear whether this system can be beneficial to breast radiologists in a real-time setting. This can only be evaluated by performing a prospective study and seeing in what situations the system works optimally. To truly gauge the system's effectiveness in real-time clinical practice, prospective studies are necessary to address current limitations stemming from retrospective data."


One very important part your comment doesn't mention: a real human being has to actually take images for the AI to analyze.

The amount of training a radiation technologist (the person who makes you put your body in uncomfortable positions when you break something) needs is significant. My partner has made a career of it; the amount of schooling and clinical hours required is non-trivial, and from what I understand harder than becoming a nurse.

They need to know as much about bones as orthopedic surgeons, understand how radiation works and how the entire imaging tech stack works, and also have the soft skills to guide injured/ill patients through difficult maneuvers (often in the midst of medical trauma).

The part where a doctor looks at images is really just a very small part of the entire "product." The radiologists who say "there's a broken arm" are never in the room, never see the patient, never have context. It's something that, frankly, an AI can do much more consistently and accurately at this point.



