A visual long-short-term memory based integrated CNN model for fabric defect image classification

Document Type

Article

Date of Original Version

3-7-2020

Abstract

Fabric defect classification is traditionally performed by human visual inspection, which is inefficient and labor-intensive, so intelligent, automated methods for this task have become an active research topic. With the growing diversity of fabric defects, effective methods that classify defects with higher accuracy are needed to help ensure the quality of fabric products. Because fabric defects are often inconspicuous against the textured background and many defect classes are easily confused with one another, a visual long-short-term memory (VLSTM) based integrated CNN model is proposed in this paper. Inspired by human visual perception and visual memory mechanisms, three categories of features are extracted: visual perception (VP) information extracted by stacked convolutional auto-encoders (SCAE), visual short-term memory (VSTM) information characterized by a shallow convolutional neural network (CNN), and visual long-term memory (VLTM) information characterized by non-local neural networks. Experimental results on three fabric defect datasets show that the proposed model achieves results competitive with current state-of-the-art methods for fabric defect classification.
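
To make the three-branch design concrete, the following is a minimal PyTorch sketch of the kind of architecture the abstract describes: an SCAE-style encoder for the VP branch, a shallow CNN for the VSTM branch, and a non-local block for the VLTM branch, fused into one classifier. The layer counts, channel widths, fusion by concatenation, and the embedded-Gaussian form of the non-local block are illustrative assumptions; the paper's exact configuration and training procedure are not reproduced here.

```python
# Hedged sketch of the three-branch idea from the abstract (PyTorch).
# All sizes and the fusion scheme below are assumptions for illustration.
import torch
import torch.nn as nn


class NonLocalBlock(nn.Module):
    """Embedded-Gaussian non-local block (Wang et al., 2018), used here as a
    stand-in for the VLTM branch's long-range dependency modeling."""
    def __init__(self, channels):
        super().__init__()
        self.inter = channels // 2
        self.theta = nn.Conv2d(channels, self.inter, 1)
        self.phi = nn.Conv2d(channels, self.inter, 1)
        self.g = nn.Conv2d(channels, self.inter, 1)
        self.out = nn.Conv2d(self.inter, channels, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        theta = self.theta(x).flatten(2).transpose(1, 2)  # (b, hw, inter)
        phi = self.phi(x).flatten(2)                      # (b, inter, hw)
        g = self.g(x).flatten(2).transpose(1, 2)          # (b, hw, inter)
        attn = torch.softmax(theta @ phi, dim=-1)         # pairwise affinities
        y = (attn @ g).transpose(1, 2).reshape(b, self.inter, h, w)
        return x + self.out(y)                            # residual connection


class VLSTMClassifier(nn.Module):
    """Three branches: VP from an SCAE-style encoder, VSTM from a shallow
    CNN, VLTM from a non-local block; fused by concatenation (assumed)."""
    def __init__(self, num_classes):
        super().__init__()
        # VP branch: encoder half of a convolutional auto-encoder
        # (in the paper this would be pre-trained with a reconstruction loss).
        self.vp = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        # VSTM branch: shallow CNN capturing local texture statistics.
        self.vstm = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        # VLTM branch: shallow stem followed by a non-local block.
        self.vltm = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, stride=2, padding=1), nn.ReLU(),
            NonLocalBlock(32),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(32 + 32 + 32, num_classes)

    def forward(self, x):
        feats = [self.pool(branch(x)).flatten(1)
                 for branch in (self.vp, self.vstm, self.vltm)]
        return self.fc(torch.cat(feats, dim=1))  # fused classification head


# Usage on a batch of grayscale fabric patches:
model = VLSTMClassifier(num_classes=5)
logits = model(torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 5])
```

The concatenate-then-classify fusion is one simple choice; the key point the sketch illustrates is that local (VSTM) and global, long-range (VLTM) evidence complement the reconstruction-oriented VP features when defects blend into the textured background.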

Publication Title

Neurocomputing

Volume

380
