Machine Teaching for Explainable AI: Proof of Concept
Master thesis
Date: 2022-06-21
Abstract
In today’s society, AI and machine learning are becoming increasingly relevant, and with them the field of Explainable AI. The research project “Machine Teaching for Explainable AI” aims to explain black-box AIs to humans using suitable examples. In this thesis, we present a proof of concept of the basic system described in the project proposal. We aim to explain a Convolutional Neural Network trained on a boolean relation over bitmaps containing letters. The thesis introduces a simple representational language to bridge the gap between the Convolutional Neural Network and a human. We examine how to measure the complexity perceived by humans and how to mimic their reasoning on this task.
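The abstract's setup, a network trained on a boolean relation over letter bitmaps, can be sketched as a small data-generation example. The bitmap size (5x5), the two-letter alphabet, and the "same letter" relation below are all illustrative assumptions; the thesis does not specify them here.

```python
import numpy as np

# Hypothetical 5x5 bitmaps for two letters; the actual bitmap size
# and alphabet used in the thesis may differ.
LETTERS = {
    "T": np.array([[1, 1, 1, 1, 1],
                   [0, 0, 1, 0, 0],
                   [0, 0, 1, 0, 0],
                   [0, 0, 1, 0, 0],
                   [0, 0, 1, 0, 0]]),
    "L": np.array([[1, 0, 0, 0, 0],
                   [1, 0, 0, 0, 0],
                   [1, 0, 0, 0, 0],
                   [1, 0, 0, 0, 0],
                   [1, 1, 1, 1, 1]]),
}

def make_pair(a: str, b: str):
    """Stack two letter bitmaps into one 2-channel input, labeled by an
    example boolean relation: 'same letter' (an assumed stand-in for
    whichever relation the thesis actually uses)."""
    x = np.stack([LETTERS[a], LETTERS[b]])  # shape (2, 5, 5), CNN-ready
    y = int(a == b)                          # 1 if relation holds, else 0
    return x, y
```

A CNN would then be trained on such `(x, y)` pairs, and the explanation task is to describe its decisions in the representational language the thesis introduces.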