I would say there’s a failure in the body responsible for hiring and paying people to answer emergency calls. The only reason there is a shortage is that they are underpaying employees. So yes, but AI, like everywhere else it’s been implemented, will fall short of what’s needed and will ultimately cost more financially, except that in this case, lives could also be lost.
There’s without a doubt a problem, but AI isn’t the solution.
Unless it literally is. Do you know that it won’t be? What examples do you have to base that assertion on?
I’m not going to argue with you. AI blows. There are articles out there about companies hiring people back after going to AI. It really is a snake oil product that corporations have gobbled up. It’s got its use cases as a tool, but not as a human replacement, especially in matters of life and death.
You can look up and research some articles if you want, or don’t. Clearly your opinion on the matter is not popular, and that could be some hive mind, or it could be because everyone else sees the problems that you don’t.
Putting a system in place that can’t actually think at all and having it try to comprehend what is or is not an emergency is, to me, a terrible idea, and doomed to fail. Take that as you will; I won’t be following up with anything else. You can have the final word if you want, because I just can’t be bothered to care.
Block and move on :)
This is for sure me sometimes. I’ll work something out over 10 minutes and decide that I don’t want to deal with any follow up or that the way I typed it wasn’t clear enough and I don’t want to fix it.
As much as I would like to block and move on sometimes, I also believe that silence is complicity, and when I feel something said is wrong that others will read, I have an obligation to say something. I’m definitely not always right, but in some matters it’s more perspective, and in others it’s based on fact. This conversation has run its course for me.