Person Identification Using Top-View Image with Depth Information

Daichi Kouno, Kazutaka Shimada, Tsutomu Endo
Copyright: © 2013 | Volume: 1 | Issue: 2 | Pages: 13
ISSN: 2166-7160 | EISSN: 2166-7179 | EISBN13: 9781466633179 | DOI: 10.4018/ijsi.2013040106
Cite Article

MLA

Kouno, Daichi, et al. "Person Identification Using Top-View Image with Depth Information." IJSI, vol. 1, no. 2, 2013, pp. 67-79. https://doi.org/10.4018/ijsi.2013040106

APA

Kouno, D., Shimada, K., & Endo, T. (2013). Person Identification Using Top-View Image with Depth Information. International Journal of Software Innovation (IJSI), 1(2), 67-79. https://doi.org/10.4018/ijsi.2013040106

Chicago

Kouno, Daichi, Kazutaka Shimada, and Tsutomu Endo. "Person Identification Using Top-View Image with Depth Information." International Journal of Software Innovation (IJSI) 1, no. 2 (2013): 67-79. https://doi.org/10.4018/ijsi.2013040106

Abstract

In this paper, the authors describe a novel image-based person identification task. Traditional face-based person identification methods have low tolerance for occluded situations, such as people overlapping in an image. The authors instead focus on images from an overhead camera and utilize depth information for the identification task. Compared with popular RGB cameras, depth information lets them capture a person's area precisely and provides richer cues for identification. They apply four features extracted from depth images to the identification method: (1) estimated body height, (2) estimated body dimensions, (3) estimated body size, and (4) a depth histogram. In the experiment, the authors evaluated two situations: (a) standing in front of a door and (b) touching a doorknob. The identification accuracy rates for the two situations are 94.4% and 91.4%, respectively, demonstrating the high accuracy of the proposed method.
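The abstract names the four depth-derived features but does not spell out how they are computed. The sketch below is a minimal, hypothetical illustration in Python, not the authors' implementation: it assumes the top-view depth image is a NumPy array of camera-to-surface distances in millimeters, a known camera-to-floor distance, and a simple threshold-based person segmentation. The function name extract_features and all parameters are invented for illustration.

```python
import numpy as np

def extract_features(depth, floor_depth_mm, person_thresh_mm=300, n_bins=20):
    """Illustrative extraction of the four abstract-level features
    from a top-view depth image.

    depth          -- 2D array, camera-to-surface distance per pixel (mm)
    floor_depth_mm -- known camera-to-floor distance (mm)
    Pixels more than person_thresh_mm above the floor are treated as the
    person's region (a simple stand-in for the paper's segmentation).
    """
    # Segment the person: anything sufficiently closer to the camera
    # than the floor plane.
    mask = depth < (floor_depth_mm - person_thresh_mm)
    person = depth[mask]
    if person.size == 0:
        return None  # no person visible in this frame

    # (1) Estimated body height: floor depth minus the closest point
    # (the top of the head, as seen from above).
    height_mm = floor_depth_mm - person.min()

    # (2) Estimated body dimensions: bounding box of the region (pixels).
    rows, cols = np.nonzero(mask)
    dims = (rows.max() - rows.min() + 1, cols.max() - cols.min() + 1)

    # (3) Estimated body size: area of the segmented region (pixel count).
    size = int(mask.sum())

    # (4) Depth histogram: normalized distribution of heights above
    # the floor over the person's pixels.
    heights = floor_depth_mm - person
    hist, _ = np.histogram(heights, bins=n_bins, range=(0, floor_depth_mm))
    hist = hist / hist.sum()

    return {"height_mm": height_mm, "dims": dims, "size": size, "hist": hist}
```

A classifier, for example nearest neighbor over these feature vectors, could then match an observed person against enrolled individuals; the paper's actual segmentation, feature definitions, and classifier may differ from this sketch.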
