I wonder where that “human accuracy” statistic is coming from. Plenty of people don’t know how to read and interpret data, much less use Excel in the first place. There’s a difference between 1/4 of the workforce not being able to complete a task and a specialized AI not being able to complete it. Additionally, this is how you get into the “KPI as a goal rather than a proxy” issue. AI will never understand context that isn’t directly provided in the workbook. If you introduced a new drink at your restaurant in 2020, the AI will tell you the new drink caused a 100% decrease in foot traffic, since there’s no line item for “global pandemic.” I’m not saying AI will never get there, but people using this version of AI instead of actual analysis don’t care about the facts; they just want an answer, and they want that answer to be cheap.
As I’ve said many times, though not in this topic: AI is a tool to be used, and using it is a skill that needs to be learned.
For your pandemic example, that’s context you would need to provide to the AI yourself. The joke about “prompt engineer” soon being a real job actually has merit, in that you want people who know how to use their tools well. It’s a process of constant iteration: refining the instruction set you give the AI until it produces the results you want/need. A rough sketch of what that looks like is below.
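To make that concrete, here’s a minimal sketch of what “providing the context yourself” might look like, assuming the OpenAI Python client; the model name, the system prompt wording, and the sample numbers are all illustrative, not a prescription.

```python
# Minimal sketch: supply the business context the spreadsheet itself lacks,
# so the model doesn't blame the new drink for a pandemic-sized drop.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

context = (
    "Business context not present in the workbook: "
    "a new drink was introduced in March 2020, and the COVID-19 pandemic "
    "forced dine-in closures from March 2020 through mid-2021."
)

# Illustrative numbers, standing in for whatever summary you pull from the workbook.
data_summary = "Monthly foot traffic: 2019 average 4,100 visits; 2020 average 300 visits."

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    messages=[
        {
            "role": "system",
            "content": "You are a data analyst. Only draw conclusions "
                       "supported by the data and the stated context.",
        },
        {
            "role": "user",
            "content": f"{context}\n\n{data_summary}\n\nWhat drove the change in foot traffic?",
        },
    ],
)

print(response.choices[0].message.content)
```

The specific API doesn’t matter; the point is that the iteration happens in those context and instruction strings, and that’s the skill people are half-joking about when they say “prompt engineer.”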