Why Explainable AI Must Be on the CEO's Agenda
Many IT projects have been reported to fail after millions of dollars were spent, or to never return the expected value. Today, we are investing in AI. The industry is forecast to grow 12.3% to $156.5 billion this year, according to IDC.[1] However, most CEOs and board directors are not prepared to understand and control the technology they are investing in, because the resulting systems are unexplainable.
It is only a matter of time before these investments turn into the kind of IT failures we have seen in the past. Forbes is sounding the same warning: CEOs and board members should treat unexplainable AI as a risk. Cindy Gordon sees the development of trusted systems, systems that cannot harm humans or society, as the CEO's primary responsibility, and asks, "Is the relatively new field of explainable AI the panacea?"[2]
A good start has been made by the US Department of Commerce. In August 2020, its National Institute of Standards and Technology (NIST) published a draft that identifies four principles of explainable AI.[3] According to the draft, a system must (a short illustrative sketch follows the list):
- provide an explanation: the system delivers accompanying evidence or reason(s) for all outputs
- be meaningful: the system provides explanations that are understandable to individual users
- have explanation accuracy: the explanation correctly reflects the system's process for generating the output
- have knowledge limits: the system only operates under conditions for which it was designed or when the system reaches a sufficient confidence in its output
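To make these principles concrete, here is a minimal Python sketch of a hypothetical credit check. The function name, thresholds, and designed income range are illustrative assumptions, not part of the NIST draft. Every output carries its evidence, the explanation is the decision logic itself, and the system declines to answer outside its knowledge limits.

```python
# Hypothetical sketch (not from the NIST draft): every output carries evidence,
# the explanation mirrors the actual decision path, and the system refuses to
# decide outside the conditions it was designed for.

from dataclasses import dataclass

@dataclass
class Decision:
    output: str          # the system's answer
    explanation: list    # accompanying evidence or reasons for that answer

TRAINED_INCOME_RANGE = (20_000, 200_000)   # assumed design envelope

def assess_credit(income: float, debt_ratio: float) -> Decision:
    """Toy credit assessment whose explanation is the decision logic itself."""
    # Knowledge limits: refuse inputs outside the designed operating conditions.
    if not (TRAINED_INCOME_RANGE[0] <= income <= TRAINED_INCOME_RANGE[1]):
        return Decision("no decision",
                        [f"income {income} is outside the designed range {TRAINED_INCOME_RANGE}"])

    reasons = []
    if debt_ratio > 0.45:
        reasons.append(f"debt ratio {debt_ratio:.2f} exceeds the 0.45 threshold")
        return Decision("declined", reasons)

    reasons.append(f"debt ratio {debt_ratio:.2f} is at or below the 0.45 threshold")
    reasons.append(f"income {income} is within the designed range")
    return Decision("approved", reasons)

print(assess_credit(income=55_000, debt_ratio=0.30))
print(assess_credit(income=500_000, debt_ratio=0.30))   # triggers the knowledge limit
```

The point is not the toy rules but the shape of the interface: the answer and its reasons travel together, and "no decision" is a legitimate output.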
But that alone is not convincing enough for the CEO. Since the audit systems needed to interface between AI and humans, and to provide those meaningful explanations, are not readily available, the CEO will need to invest in creating these extra features. What arguments could justify that investment, other than compliance with the DoC principles?
The answer lies in the fact that boards and CEOs that invest in AI systems without investing in explainability take an unacceptably high risk. I present at least four reasons here: unexplainable AI systems are not trusted by employees, are more difficult to improve over time, do not know their own limitations (for example, under COVID-19 disruptions), and will not meet expected government regulations. In other words, unexplainable AI systems have a lower ROI than explainable AI systems.
Explainability should therefore be the top priority for organizations that want to invest in AI. It is better to start with a small system that explains itself and can be readily improved than with a big AI investment that needs additional investments to explain itself. How?
Methods that have been used in AI systems for a very long time, such as decision tables and business rules, combined with the exploratory power of machine learning algorithms, offer a way to start small, create understandable systems that explain themselves, and stay in control.
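As a rough illustration of that combination, the sketch below uses a machine-learning decision tree only to explore hypothetical historical data and suggest candidate thresholds; the system that actually runs is an explicit, human-reviewed decision table that can cite the exact rule behind every outcome. The data, feature names, and thresholds are invented for illustration.

```python
# Minimal sketch of "explore with ML, decide with explicit rules".
# The dataset, feature names, and thresholds are hypothetical; scikit-learn's
# DecisionTreeClassifier is used only to *suggest* candidate rules, which a
# human then reviews and encodes as a self-explaining decision table.

from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical historical data: [income_k, debt_ratio] -> approved (1) / declined (0)
X = [[25, 0.60], [40, 0.50], [55, 0.30], [80, 0.20], [30, 0.55], [95, 0.10]]
y = [0, 0, 1, 1, 0, 1]

# Step 1: let the ML algorithm explore the data and propose a small, readable tree.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["income_k", "debt_ratio"]))

# Step 2: a human turns the reviewed splits into an explicit decision table.
# Each row is (condition, outcome); the running system uses only this table,
# so every decision can cite the exact row that produced it.
DECISION_TABLE = [
    (lambda r: r["debt_ratio"] > 0.45, "declined: debt ratio above 0.45"),
    (lambda r: r["income_k"] < 35,     "declined: income below 35k"),
    (lambda r: True,                   "approved: within reviewed thresholds"),
]

def decide(request):
    for condition, outcome in DECISION_TABLE:
        if condition(request):
            return outcome

print(decide({"income_k": 60, "debt_ratio": 0.30}))
```

The design choice is deliberate: the ML step can be rerun and reviewed whenever the data shifts, while the decision table stays small enough for business owners to read, challenge, and audit.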
The bottom line is that we can't just use AI instead of understanding — we must understand first and then add AI.
Check out my book AIX: Artificial Intelligence needs eXplanation for illustrated examples and more details.
References
[1] "AI market leaders revealed by IDC," infotechlead, Aug. 8, 2020, https://infotechlead.com/artificial-intelligence/ai-market-leaders-revealed-by-idc-62362
[2] Cindy Gordon, "Why Explainable AI Must Be Grounded In Board Director's Risk Management Practices," Forbes, Aug. 31, 2020, https://www.forbes.com/sites/cindygordon/2020/08/31/why-explainable-ai-must-be-grounded-in-board-directors-risk-management-practices/#1eec3b845479
[3] P. Jonathon Phillips, et al., Four Principles of Explainable Artificial Intelligence, Draft NISTIR 8312, National Institute of Standards and Technology, Aug. 2020, https://doi.org/10.6028/NIST.IR.8312-draft