Wearable heading estimation for motion tracking in health care by adaptive fusion of visual-inertial measurements
Date
2018-11
Author
Zhang, Yinlong
Liang, Wei
He, Hongsheng
Tan, Jindong
Citation
Y. Zhang, W. Liang, H. He and J. Tan, "Wearable Heading Estimation for Motion Tracking in Health Care by Adaptive Fusion of Visual–Inertial Measurements," in IEEE Journal of Biomedical and Health Informatics, vol. 22, no. 6, pp. 1732-1743, Nov. 2018
Abstract
The increasing demand for health informatics is a far-reaching trend in aging societies. Wearable sensors make it convenient and effective to monitor the daily activities of senior people in free-living environments. Among the primary health-care sensing modalities, wearable visual-inertial human motion tracking shows growing promise. In this paper, we present a novel wearable heading estimation strategy to track the movements of human limbs. It adaptively fuses inertial measurements with visual features under locality constraints. Body movements are classified into two types: general motion (both rotation and translation) and degenerate motion (rotation only). A specific number of feature correspondences between camera frames is adaptively chosen to satisfy both a feature descriptor similarity constraint and a locality constraint. The selected feature correspondences and inertial quaternions are used to compute an initial pose, followed by a coarse-to-fine procedure that iteratively removes visual outliers. Finally, the heading is optimized using the correct feature matches. The proposed method has been thoroughly evaluated in straight-line, rotatory, and ambulatory movement scenarios. Because the system is lightweight and requires few computational resources, it enables effective and unobtrusive human motion monitoring, especially for senior citizens in long-term rehabilitation.
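To make the two constraints on feature correspondences concrete, the following is a minimal Python sketch, not the authors' implementation: it keeps a match only when the descriptor distance is small (similarity constraint) and the matched keypoint stays near its predicted location (locality constraint), then fuses a visual yaw estimate with an inertial one. All function names, thresholds, and the fusion weight are illustrative assumptions.

```python
import numpy as np

def select_correspondences(desc_a, desc_b, pts_a, pts_b,
                           sim_thresh=0.25, loc_radius=20.0):
    """Return index pairs (i, j) of matches passing both constraints.
    Thresholds are assumed tuning values, not taken from the paper."""
    matches = []
    for i, d in enumerate(desc_a):
        # Descriptor similarity: nearest neighbour by Euclidean distance.
        dists = np.linalg.norm(desc_b - d, axis=1)
        j = int(np.argmin(dists))
        if dists[j] > sim_thresh:
            continue  # fails the similarity constraint
        # Locality: the match must lie within a radius of the previous
        # location (a stand-in for an inertially predicted position).
        if np.linalg.norm(pts_b[j] - pts_a[i]) > loc_radius:
            continue  # fails the locality constraint
        matches.append((i, j))
    return matches

def fuse_heading(yaw_visual, yaw_inertial, w_visual=0.7):
    """Weighted fusion of visual and inertial yaw (radians), with the
    angular difference wrapped to (-pi, pi]. The weight is assumed."""
    diff = np.arctan2(np.sin(yaw_visual - yaw_inertial),
                      np.cos(yaw_visual - yaw_inertial))
    return yaw_inertial + w_visual * diff

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    desc_a = rng.random((50, 32))
    desc_b = desc_a + rng.normal(0, 0.01, desc_a.shape)  # near-duplicates
    pts_a = rng.random((50, 2)) * 640
    pts_b = pts_a + rng.normal(0, 2.0, pts_a.shape)      # small motion
    kept = select_correspondences(desc_a, desc_b, pts_a, pts_b)
    print(len(kept), "matches kept")
    print("fused yaw:", fuse_heading(0.10, 0.05))
```

The paper additionally refines the initial pose through coarse-to-fine outlier removal before the final heading optimization; the simple weighted fusion above only stands in for that last step under the stated assumptions.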
Description
Click on the DOI link to access the article (may not be free).