Learning Block Group Sparse Representation Combined with Convolutional Neural Networks for RGB-D Object Recognition

Year:    2014

Journal of Fiber Bioengineering and Informatics, Vol. 7 (2014), Iss. 4 : pp. 603–613

Abstract

RGB-D (Red, Green and Blue-Depth) cameras are novel sensing systems that can improve image recognition by providing high-quality color and depth information in computer vision. In this paper we propose a model that learns feature representations by combining Convolutional Neural Networks (CNN) and Block Group Sparse Coding (BGSC). First, a CNN is used to extract low-level features directly from raw RGB-D images with an unsupervised algorithm. Then, BGSC is used to obtain a higher-level feature representation for classification by incorporating both the group structure of the low-level features and the block structure of the dictionary in the subsequent learning process. Experimental results show that, with a linear predictive classifier, the CNN-BGSC approach achieves higher accuracy on a household RGB-D object dataset than Convolutional and Recursive Neural Networks (CNN-RNN), Group Sparse Coding (GSC), and Sparse Representation based Classification (SRC).
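The pipeline sketched in the abstract (CNN low-level features, then Block Group Sparse Coding over a block-structured dictionary, then a linear classifier) can be illustrated with a short example. This is a minimal sketch only: the paper's exact CNN architecture, dictionary-learning procedure, group/block definitions, and parameter values are not given in the abstract, so the feature matrix, dictionary, partitions, and the ISTA-style proximal-gradient solver below are hypothetical stand-ins rather than the authors' implementation.

    import numpy as np

    def bgsc_encode(X, D, groups, blocks, lam=0.1, n_iter=200):
        """Block-group sparse coding by proximal gradient (ISTA-style sketch).

        X      : (d, n) low-level feature matrix, one column per sample
        D      : (d, k) dictionary whose atoms are partitioned into blocks
        groups : list of index arrays partitioning the columns of X (sample groups)
        blocks : list of index arrays partitioning the atoms of D (dictionary blocks)
        lam    : weight of the block-group l2 penalty
        Returns Z : (k, n) sparse codes minimizing
                    0.5*||X - D Z||_F^2 + lam * sum_{b,g} ||Z[b, g]||_F
        """
        k, n = D.shape[1], X.shape[1]
        Z = np.zeros((k, n))
        L = np.linalg.norm(D, 2) ** 2      # Lipschitz constant of the data-fit gradient
        t = 1.0 / L
        for _ in range(n_iter):
            Z = Z - t * (D.T @ (D @ Z - X))        # gradient step on the data-fit term
            # proximal step: group soft-thresholding on every (block, group) slice
            for b in blocks:
                for g in groups:
                    S = Z[np.ix_(b, g)]
                    norm = np.linalg.norm(S)
                    scale = max(0.0, 1.0 - lam * t / norm) if norm > 0 else 0.0
                    Z[np.ix_(b, g)] = scale * S
        return Z

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        d, k, n = 64, 128, 40                      # feature dim, atoms, samples
        X = rng.standard_normal((d, n))            # stand-in for CNN low-level features
        D = rng.standard_normal((d, k))
        D /= np.linalg.norm(D, axis=0)             # unit-norm dictionary atoms
        groups = np.array_split(np.arange(n), 4)   # hypothetical sample groups
        blocks = np.array_split(np.arange(k), 8)   # hypothetical dictionary blocks
        Z = bgsc_encode(X, D, groups, blocks, lam=0.2)
        active = sum(np.linalg.norm(Z[np.ix_(b, g)]) > 1e-8
                     for b in blocks for g in groups)
        print("active (block, group) slices:", active)

In such a sketch the resulting codes Z would then be fed to a linear classifier; the block-group penalty zeroes out entire (block, group) slices, which is the structured sparsity the abstract attributes to BGSC.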


Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.3993/jfbi12201413


Published online:    2014-01


Copyright:    © Global Science Press

Pages:    11

Keywords:    RGB-D
