As AI enters the workplace, staff won't always trust the decisions the software makes. It is a catch-22. If the software's decision matches their own choice, the AI adds little value. If the AI reaches a different conclusion, users rarely accept it at face value, even when it is the better result. In other words, users of AI want to know how a decision was made, not just proof that the result was superior.
This is an area called explainable AI. Within Smart Tendering, TNX produces both the tendering strategy and a human-readable explanation. The explanation states which strategy is being followed, the expected outcome, and whether the system is exploring a new strategy or exploiting the best known one. The business value of explaining smart tendering to dispatchers is that, left as a black box, they may sabotage it, overrule it, or advocate for its removal out of distrust or misunderstanding.
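To make the explore/exploit idea concrete, here is a minimal sketch of how a strategy choice could be paired with a human-readable explanation. This is an illustrative epsilon-greedy example, not TNX's actual implementation; the strategy names, the `expected_savings` input, and the explanation wording are all hypothetical.

```python
import random

def choose_strategy(expected_savings, epsilon=0.1, rng=random):
    """Pick a tendering strategy and return it with a plain-language explanation.

    expected_savings: dict mapping strategy name -> estimated saving
    (as a fraction of spend). Hypothetical data for illustration only.
    """
    best = max(expected_savings, key=expected_savings.get)
    others = [s for s in expected_savings if s != best]
    if others and rng.random() < epsilon:
        # Explore: occasionally try a strategy other than the current best,
        # to keep learning about the market.
        choice = rng.choice(others)
        mode = "exploring a new strategy"
    else:
        # Exploit: follow the strategy with the best expected outcome.
        choice = best
        mode = "exploiting the best known strategy"
    explanation = (
        f"Following the '{choice}' strategy ({mode}). "
        f"Expected saving: {expected_savings[choice]:.1%} of spend."
    )
    return choice, explanation
```

A dispatcher-facing explanation like the string above answers the "how was this decided?" question directly, which is the trust-building step the text describes.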
Explainable AI accelerates change management when TNX is first adopted, and helps coach dispatchers to better understand the market dynamics they experience each day.