It looks like this allows individual States to truly enable whatever-it-is within their own borders.
Still, I don't like it: If a pharmacist wants to use a tool and they validate the result and take legal responsibility for it being accurate, that's different and fine.
But this looks more like enabling states to have robo-pharmacists, and then where does the legal responsibility and liability lie?
I'd expect software that issues prescriptions to be a class II or class III medical device and that any or all of the manufacturer, the institution using it, and any human operator in the loop (if there is one) would be liable for errors, depending on the exact situation.
Surely the liability will lie with the insurance company, like with regular doctors in the US. And if there is too much malpractice, the state can revoke the license, like a regular doctor.
I would love to know what lobbyist created this bill and who they were lobbying on behalf of.
There's literally no way this was a creation of a senator or representative pulled out of their own thoughts about a hypothetical future. Someone's business model depends on this.
Could this be due to Amazon lobbying? They are trying to capture the pharmacy market; if they can leverage their assets to roll out an AI "pharmacist," that could be extremely profitable.
And would one really expect the approval process to be the barrier an AI removes? My doctor uses tele-visits and will prescribe medications after merely a few message exchanges. I don't know what problem AI would solve here.
I, for one, look forward to being prescribed drugs that do not exist, in dosages that cannot fit into the human bloodstream, to treat the Morgellons that the computer agrees is real and that I am afflicted with.