Pose-based 3D human motion analysis using Extreme Learning Machine

Arif Budiman, Mohamad Ivan Fanany

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

8 Citations (Scopus)

Abstract

In pose-based 3D human motion analysis, the main problem is how to classify multi-class activities from primitive action (pose) inputs efficiently, in terms of both accuracy and processing time. The task is difficult because a pose is not unique: the same pose can occur in different activity classes. In this paper, we evaluate the effectiveness of the Extreme Learning Machine (ELM) for 3D human motion analysis based on pose clusters. ELM has a reputation as an eager classifier with fast training and testing times, but its classification accuracy remains low even when the number of hidden nodes is increased and more training data are added. To achieve better accuracy, we pursue a feature selection method that reduces the dimensionality of the time-sequenced pose cluster training data. We propose using the frequency of pose occurrence. This method is similar to bag of words: a sparse histogram vector of pose occurrence counts serves as the feature representation of the training data (a bag of poses). With the bag of poses as the feature selection, ELM performance can be improved without adding network complexity (more hidden nodes or more training data).
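As context for the method the abstract describes, the sketch below illustrates the two pieces in plain NumPy: a bag-of-poses histogram over per-frame pose-cluster labels, and a minimal single-hidden-layer ELM whose random hidden weights are fixed and whose output weights are solved by least squares. This is an illustrative reconstruction, not the paper's code; the names (`bag_of_poses`, `ELM`), the tanh activation, the normalization, and the toy data are assumptions.

```python
import numpy as np

def bag_of_poses(pose_sequence, n_clusters):
    """Histogram of pose-cluster occurrences for one activity sequence.

    `pose_sequence` holds per-frame cluster labels in 0..n_clusters-1.
    The normalized histogram discards temporal order, analogous to a
    bag-of-words vector of word counts.
    """
    hist = np.bincount(pose_sequence, minlength=n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

class ELM:
    """Basic single-hidden-layer ELM: random input weights, analytic output weights."""

    def __init__(self, n_hidden, rng=None):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(rng)

    def fit(self, X, y):
        y = np.asarray(y)
        n_classes = y.max() + 1
        # Random, fixed hidden-layer weights and biases (never trained).
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)      # hidden-layer activations
        T = np.eye(n_classes)[y]              # one-hot class targets
        self.beta = np.linalg.pinv(H) @ T     # least-squares output weights
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)

# Hypothetical data: 4 sequences of per-frame pose-cluster labels, 5 clusters.
rng = np.random.default_rng(0)
seqs = [rng.integers(0, 5, size=60) for _ in range(4)]
labels = np.array([0, 0, 1, 1])  # two activity classes
X = np.stack([bag_of_poses(s, n_clusters=5) for s in seqs])
clf = ELM(n_hidden=20, rng=0).fit(X, labels)
print(clf.predict(X))
```

Note that each sequence maps to a fixed-length vector of size `n_clusters` regardless of its duration, which is what lets ELM accuracy improve here without growing the hidden layer or the training set.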

Original language: English
Title of host publication: 2013 IEEE 2nd Global Conference on Consumer Electronics, GCCE 2013
Pages: 3-7
Number of pages: 5
DOIs
Publication status: Published - 2013
Event: 2013 IEEE 2nd Global Conference on Consumer Electronics, GCCE 2013 - Tokyo, Japan
Duration: 1 Oct 2013 - 4 Oct 2013

Publication series

Name: 2013 IEEE 2nd Global Conference on Consumer Electronics, GCCE 2013

Conference

Conference: 2013 IEEE 2nd Global Conference on Consumer Electronics, GCCE 2013
Country/Territory: Japan
City: Tokyo
Period: 1/10/13 - 4/10/13
