
> Sorry, what?

I did not say the doctor would need to trust the answer. Only that they should be required to ask.




What if they ask another doctor instead?

What if the case is so straightforward (you've seen thousands of these, hundreds a year, the entire system is built around them) that you know the diagnosis in less than the blink of an eye?

What if it's emergent, and you have no time to think, like a major hemorrhage? Not only is it obvious, but you must act now, right now?

What if there is a highly studied, routinized process (e.g. cardiac arrest) where you're managing a team going through the diagnostic procedure and treatment, which, over decades, have become a carefully interleaved dance performed at staccato pace, and, again, there is no time to consult an LLM?

What if? What if? What if?

Are you so certain?


Ask an LLM unless the time it would take to ask would risk significant harm.

That seems to cover all your cases?


> That seems to cover all your cases?

It doesn't. LLMs are trained on literature. Women and people of color are severely underrepresented in medical literature.

E.g., patients of color are rarely selected for clinical trials.


We should require software engineers to do the same. So much garbage code I've reviewed that would have easily been resolved had the SE just "asked an LLM".

Maybe we can legislate this into existence as well?


The problem is that it is a short trip from a requirement like that to a mandatory inquiry into anyone who ignores the answer.

Probably most of the time when they ignore it, they'd be right. After all, neither the NN nor the doctor is right 100% of the time.

But is that other case a lawsuit risk? And we've developed into a very risk-averse society.



