Comment by akkad33
Couldn't this backfire if they put LLMs on safety-critical data? Or even if someone asks LLMs for medical advice and dies?
You already shouldn't be using LLMs for either of those things. Doing so is tremendously foolish, given how stupid and unreliable the models are.
I guess the point is that doing so is already unsafe?