Comparative study of visual human state classification; An application for a walker robot

Sajjad Taghvaei, Yasuhisa Hirata, Kazuhiro Kosuge

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Citations (Scopus)

Abstract

Image data of the upper body from a depth sensor are used to estimate the state of a human, focusing on incidents that may occur while using a walker. Several falling cases, along with sitting and normal walking, are considered in this study. Two main features, namely the centroid and the principal component analysis (PCA) values of the upper body, are used to classify the data. The non-walking states are detected either by using a Gaussian mixture model (GMM) of the PCA features or by training a continuous hidden Markov model (CHMM) with the centroid data. The CHMM is also used to detect the type of falling. The state estimation results are used to control the motion of a passive-type walker referred to as RT Walker. Falling prevention and sitting/standing assistance are achieved using both methods. The performance of the two methods is discussed and compared from different aspects.
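Although the abstract does not give an implementation, the two classification schemes it describes can be sketched as follows. The snippet below is a minimal illustration only, assuming scikit-learn for the PCA/GMM and hmmlearn for the continuous HMM; all function names, thresholds, and the data layout (per-frame upper-body point clouds) are hypothetical and not taken from the paper.

# Minimal sketch of the classification pipeline described in the abstract.
# Library choices (scikit-learn, hmmlearn), function names, and thresholds
# are assumptions for illustration; the paper does not specify an implementation.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture
from hmmlearn.hmm import GaussianHMM

# upper_body_frames: (n_frames, n_points, 3) upper-body point clouds
# extracted from the depth sensor (hypothetical pre-processing output).
def extract_features(upper_body_frames):
    centroids = upper_body_frames.mean(axis=1)  # per-frame centroid (x, y, z)
    pca_vals = np.array([PCA(n_components=3).fit(frame).explained_variance_
                         for frame in upper_body_frames])  # principal-component values
    return centroids, pca_vals

# Method 1: Gaussian mixture model over the PCA features of normal walking.
# Frames whose log-likelihood falls below a threshold are flagged as non-walking.
def fit_walking_gmm(pca_walking, n_components=3):
    return GaussianMixture(n_components=n_components).fit(pca_walking)

def is_non_walking(gmm, pca_frame, log_lik_threshold=-10.0):
    return gmm.score_samples(pca_frame.reshape(1, -1))[0] < log_lik_threshold

# Method 2: continuous HMMs trained on centroid trajectories, one model per
# state (walking, sitting, each falling type); the most likely model wins.
def train_chmms(sequences_by_state, n_hidden=4):
    models = {}
    for state, seqs in sequences_by_state.items():
        X = np.vstack(seqs)
        lengths = [len(s) for s in seqs]
        models[state] = GaussianHMM(n_components=n_hidden).fit(X, lengths)
    return models

def classify_sequence(models, centroid_seq):
    return max(models, key=lambda s: models[s].score(centroid_seq))

In this reading, the GMM acts as an anomaly detector on the PCA features, while the per-state CHMMs additionally distinguish the type of falling, which matches the roles the abstract assigns to the two methods.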

Original language: English
Title of host publication: 2012 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics, BioRob 2012
Pages: 1843-1849
Number of pages: 7
DOIs
Publication status: Published - 2012
Event: 2012 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics, BioRob 2012 - Rome, Italy
Duration: 2012 Jun 24 - 2012 Jun 27

Publication series

Name: Proceedings of the IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics
ISSN (Print): 2155-1774

Conference

Conference: 2012 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics, BioRob 2012
Country/Territory: Italy
City: Rome
Period: 12/6/24 - 12/6/27
