An Interactive Visualization for Feature Localization in Deep Neural Networks.

Zurowietz, Martin and Nattkemper, Tim W. (2020) An Interactive Visualization for Feature Localization in Deep Neural Networks. Frontiers in Artificial Intelligence, 3 (49). DOI 10.3389/frai.2020.00049 (Open Access).

Text: frai-03-00049.pdf - Published Version (3267 KB). Available under License Creative Commons: Attribution 4.0.
Other (Text): Image_1_An Interactive Visualization for Feature Localization in Deep Neural Networks.PDF - Supplemental Material (2593 KB). Available under License Creative Commons: Attribution 4.0.


Abstract

Deep artificial neural networks have become the go-to method for many machine learning tasks. In the field of computer vision, deep convolutional neural networks achieve state-of-the-art performance for tasks such as classification, object detection, or instance segmentation. As deep neural networks grow more complex, their inner workings become increasingly opaque, rendering them a “black box” whose decision-making process is no longer comprehensible. In recent years, various methods have been presented that attempt to peek inside the black box and visualize the inner workings of deep neural networks, with a focus on deep convolutional neural networks for computer vision. These methods can serve as a toolbox to facilitate the design and inspection of neural networks for computer vision and the interpretation of a network's decision-making process. Here, we present the new tool Interactive Feature Localization in Deep neural networks (IFeaLiD), which provides a novel visualization approach to convolutional neural network layers. The tool interprets neural network layers as multivariate feature maps and visualizes the similarity between the feature vectors of individual pixels of an input image in a heat map display. The similarity display can reveal how the input image is perceived by different layers of the network and how the perception of one particular image region compares to the perception of the remaining image. IFeaLiD runs interactively in a web browser and can process even high-resolution feature maps in real time by using GPU acceleration with WebGL 2. We present examples from four computer vision datasets with feature maps from different layers of a pre-trained ResNet101. IFeaLiD is open source and available online at https://ifealid.cebitec.uni-bielefeld.de.
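
The heat map display described in the abstract can be illustrated with a small sketch. The following Python snippet is not part of the IFeaLiD code base (which runs in the browser on the GPU via WebGL 2); it is a minimal example assuming a cosine similarity measure (the measure actually used by IFeaLiD may differ) and an activation tensor of shape (channels, height, width). The function name similarity_heatmap and the random example data are hypothetical placeholders.

import numpy as np

def similarity_heatmap(features, query_yx):
    # features: array of shape (C, H, W), e.g. one convolutional layer's activations.
    # query_yx: (row, column) of the selected pixel in the feature map.
    # Returns an (H, W) array in [0, 1] suitable for display as a heat map.
    c, h, w = features.shape
    flat = features.reshape(c, h * w)                # (C, H*W) feature vectors, one per pixel
    norms = np.linalg.norm(flat, axis=0) + 1e-12     # avoid division by zero
    unit = flat / norms                              # unit-length feature vectors
    y, x = query_yx
    query = unit[:, y * w + x]                       # unit vector of the query pixel
    sim = query @ unit                               # cosine similarity to every pixel, in [-1, 1]
    return ((sim + 1.0) / 2.0).reshape(h, w)         # rescale to [0, 1] for a heat map

if __name__ == "__main__":
    # Random activations standing in for a real ResNet101 layer.
    feats = np.random.rand(256, 32, 32).astype(np.float32)
    heat = similarity_heatmap(feats, query_yx=(10, 20))
    print(heat.shape, float(heat.min()), float(heat.max()))

Selecting a different query pixel recomputes the map; this is the interaction that IFeaLiD performs in real time, even for high-resolution feature maps, by offloading the computation to the GPU.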

Document Type: Article
Keywords: explainable deep learning, deep neural network visualization, visual analytics, interactive visualization, web application, computer vision, machine learning
Refereed: Yes
Open Access Journal?: Yes
DOI: 10.3389/frai.2020.00049
ISSN: 2624-8212
Projects: JPIO-MiningImpact
Date Deposited: 07 Jan 2021 13:40
Last Modified: 08 Jan 2021 11:14
URI: http://oceanrep.geomar.de/id/eprint/51444
