I'm working on a PhD concerning LLMs and somatic medicine, as an MD, and I must admit that my perspective is the complete opposite.
Medical care, at the end of the day, has nothing to do with having a license. It's about making the correct diagnosis in order to administer the correct treatment. Reality does not care who (or what) made the diagnosis, or how the antibiotic you take was prescribed. You either have the right diagnosis or you do not. The antibiotic helps or it does not.
Doing this in practice is costly and complicated, which is why society has doctors. But the only thing that actually matters is making the correct decision. And when you actually test LLMs (in particular o3/GPT-5, and probably Gemini 2.5), they are SUPERIOR to individual doctors at medical decision-making, at least on benchmarks. That does not mean they are superior to an entire medical system, or to a skillful attending in a particular specialty, but it does seem to imply that they are far from a bad source of medical information. Just as LLMs are good at writing boilerplate code, they are good at boilerplate medical decisions, and the fact is that there is so much medical boilerplate that this skill alone makes them superior to most human doctors. There was one study that compared LLM-assisted doctors (I think the model was o3) against the LLM alone and doctors alone on a set of cases, and the unassisted LLM did BETTER than the doctors, assisted or not.
And so all this medicolegal pearl-clutching about how LLMs should not provide medical advice is entirely unfounded when you look at the actual evidence. In fact, the evidence seems to suggest that you should ignore the doctor and listen to ChatGPT instead.
And frankly, as a doctor, it really grinds my gears when anyone implies that medical decisions should be a protected domain for our benefit. The point of medicine is not to employ doctors. The point of medicine is to cure patients, by whatever means best serves them. If LLMs take our jobs because they do a better job than we do, that is a good thing. It is an especially good thing if it is the general, widely available LLM that does so, and not some expensively licensed "HippocraticGPT-certified" model. Can you imagine anything more frustrating, as a poor kid in the boonies of Bangladesh trying to understand why your mother is sick, than being told "As a language model I cannot dispense medical advice, please consult your nearest healthcare professional"?
Medical success is not measured in employment, profits, or legal responsibilities. It is measured in reduced mortality. The means by which it is achieved are irrelevant.
Of course, mental health is a little different, and much more nebulous overall. But from the perspective of someone on the somatic front, overregulation of LLMs is unnecessary, and in fact unethical. On average, an LLM will dispense better medical advice than an average person with access to Google, which is what it was competing with in the first place. It is an insult to personal liberty and to the Hippocratic oath to argue that this should be taken away because of some medicolegal BS.