Show simple item record

dc.contributor.author  Zhang, Yinlong
dc.contributor.author  Liang, Wei
dc.contributor.author  He, Hongsheng
dc.contributor.author  Tan, Jindong
dc.date.accessioned  2018-12-05T04:25:02Z
dc.date.available  2018-12-05T04:25:02Z
dc.date.issued  2018-11
dc.identifier.citation  Y. Zhang, W. Liang, H. He and J. Tan, "Wearable Heading Estimation for Motion Tracking in Health Care by Adaptive Fusion of Visual–Inertial Measurements," in IEEE Journal of Biomedical and Health Informatics, vol. 22, no. 6, pp. 1732-1743, Nov. 2018  en_US
dc.identifier.issn  2168-2194
dc.identifier.other  WOS:000447833100004
dc.identifier.uri  https://doi.org/10.1109/JBHI.2018.2795006
dc.identifier.uri  http://hdl.handle.net/10057/15681
dc.description  Click on the DOI link to access the article (may not be free).  en_US
dc.description.abstract  The demand for health informatics is growing rapidly in aging societies. Wearable sensors make it convenient and effective to monitor the daily activities of senior people in free-living environments. Among the primary health-care sensing modalities, wearable visual-inertial tracking of human motion shows particular promise. In this paper, we present a novel wearable heading estimation strategy to track the movements of human limbs. It adaptively fuses inertial measurements with visual features subject to locality constraints. Body movements are classified into two types: general motion (both rotation and translation) and degenerate motion (rotation only). A specific number of feature correspondences between camera frames is adaptively chosen to satisfy both a feature-descriptor similarity constraint and a locality constraint. The selected feature correspondences and inertial quaternions are used to calculate an initial pose, followed by a coarse-to-fine procedure that iteratively removes visual outliers. Finally, the heading is optimized using the remaining correct feature matches. The proposed method has been thoroughly evaluated in straight-line, rotatory, and ambulatory movement scenarios. Because the system is lightweight and requires little computational power, it enables effective and unobtrusive human motion monitoring, especially for senior citizens in long-term rehabilitation.  en_US
dc.description.sponsorship  National Science Foundation of China under Contracts 61233007, 61673371, 61305114, and 71661147005, and in part by the Youth Innovation Promotion Association, CAS, under Grant 2015157.  en_US
dc.language.iso  en_US  en_US
dc.publisher  IEEE  en_US
dc.relation.ispartofseries  IEEE Journal of Biomedical and Health Informatics;v.22:no.6
dc.subject  Health informatics  en_US
dc.subject  Ubiquitous sensing  en_US
dc.subject  Human motion tracking  en_US
dc.subject  Wearable heading estimation  en_US
dc.subject  Inertial sensing  en_US
dc.subject  Monocular camera capturing  en_US
dc.subject  Multi-sensor fusion  en_US
dc.title  Wearable heading estimation for motion tracking in health care by adaptive fusion of visual-inertial measurements  en_US
dc.type  Article  en_US
dc.rights.holder  © 2018, IEEE  en_US
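The abstract describes a pipeline of adaptive correspondence selection followed by coarse-to-fine outlier rejection around an inertial estimate. The sketch below is a minimal, hypothetical illustration of those two steps only, not the paper's actual implementation: the function names, thresholds, and the simplification of heading to a single scalar angle are all assumptions made for illustration.

```python
import numpy as np

def select_correspondences(desc_dist, pix_dist, sim_thresh=0.7, loc_thresh=30.0, k=20):
    """Hypothetical selector: keep up to k feature matches that satisfy both
    a descriptor-similarity constraint and a locality (pixel-distance) constraint,
    as the abstract describes."""
    mask = (desc_dist < sim_thresh) & (pix_dist < loc_thresh)
    idx = np.flatnonzero(mask)
    # Prefer the most similar matches first.
    return idx[np.argsort(desc_dist[idx])][:k]

def refine_heading(visual_headings, inertial_heading, n_iters=3, sigma=3.0):
    """Coarse-to-fine sketch: start from the inertial heading estimate and
    iteratively discard visual heading measurements (degrees) that deviate
    too far, then re-estimate from the surviving inliers."""
    est = float(inertial_heading)
    inliers = np.asarray(visual_headings, dtype=float)
    for _ in range(n_iters):
        resid = np.abs(inliers - est)
        keep = resid < sigma * (resid.std() + 1e-9)
        if not keep.any():
            break
        inliers = inliers[keep]
        est = inliers.mean()  # refine the estimate from the remaining inliers
    return est
```

In this toy version the "fusion" is one-dimensional and the rejection rule is a simple sigma test; the paper operates on full quaternions and camera poses, which this sketch does not attempt to reproduce.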


Files in this item

Files  Size  Format  View

There are no files associated with this item.

This item appears in the following Collection(s)
