The capabilities of convolutional neural networks, and indeed of artificial intelligence and machine learning methods more broadly, for exploring data across many fields have been documented extensively in the literature. One common obstacle to adopting AI/ML solutions, however, is trust. Decision makers are rightfully hesitant to act solely because "the computer said so," even when the computer reports high confidence that it is correct. There is obvious value in a system that can explain why it made a given prediction and back that explanation with specific evidence. Basic models such as regression or nearest neighbors can support such explanations but have significant limitations in real-world applications, while more capable models such as neural networks are far too complex to interpret directly. We have developed a prototype system that combines convolutional neural networks with semantic representations of reasonableness. Our logic mirrors how humans justify conclusions: we decompose objects into smaller parts that we trust a neural network to identify. Leveraging a suite of machine learning algorithms, the tool provides not merely an output "conclusion" but a supporting chain of evidence that humans can use to better understand that conclusion and to probe potential weaknesses in the AI/ML components (whether due to insufficient training data, adversarial attempts to corrupt the system, or other causes). We have applied this system to object detection and semantic segmentation of images. This paper provides an in-depth overview of the prototype and presents exemplar results.
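The part-based justification described above can be illustrated with a minimal sketch. All names, part lists, and thresholds below are illustrative assumptions, not the authors' implementation: in the prototype the part evidence would come from CNN detectors, whereas here those detector outputs are stubbed with precomputed confidence scores so the evidence-aggregation logic stands alone.

```python
# Hypothetical part-detector outputs for one candidate "bicycle" region:
# each entry is (part_name, detector_confidence). In practice these would
# be produced by a CNN; here they are hard-coded for illustration.
part_detections = [
    ("wheel", 0.94),
    ("wheel", 0.91),
    ("frame", 0.88),
    ("handlebar", 0.73),
]

# An assumed semantic model of the object: which parts count as evidence,
# and how many of each are expected.
expected_parts = {"wheel": 2, "frame": 1, "handlebar": 1}
CONFIDENCE_THRESHOLD = 0.7  # assumed cutoff for trusting a part detection


def explain_conclusion(detections, expected, threshold):
    """Return (supported, evidence, missing): whether the whole-object
    conclusion is supported, the human-readable evidence chain, and any
    expected parts that were not detected with sufficient confidence."""
    trusted = [(name, conf) for name, conf in detections if conf >= threshold]
    counts = {}
    for name, _ in trusted:
        counts[name] = counts.get(name, 0) + 1
    # The conclusion is supported only if every expected part is accounted for;
    # anything missing is reported so a human can probe the weakness.
    missing = {p: n for p, n in expected.items() if counts.get(p, 0) < n}
    evidence = [f"{name} detected with confidence {conf:.2f}"
                for name, conf in trusted]
    return (not missing, evidence, missing)


supported, evidence, missing = explain_conclusion(
    part_detections, expected_parts, CONFIDENCE_THRESHOLD
)
```

The point of the sketch is the output shape: rather than a bare label, the caller receives the evidence chain itself, and an empty `missing` dictionary (or a populated one) tells the analyst exactly which part-level detections do or do not back the conclusion.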