Wearable heading estimation for motion tracking in health care by adaptive fusion of visual-inertial measurements
Abstract
The increasing demand for health informatics is a far-reaching trend in the aging society. Wearable sensors make it possible to monitor the daily activities of senior people in free-living environments, conveniently and effectively. Among the primary health-care sensing modalities, wearable visual-inertial tracking of human motion shows increasingly promising potential. In this paper, we present a novel wearable heading estimation strategy for tracking the movements of human limbs. It adaptively fuses inertial measurements with visual features under locality constraints. Body movements are classified into two types: general motion (consisting of both rotation and translation) and degenerate motion (consisting of rotation only). A suitable number of feature correspondences between camera frames is adaptively chosen to satisfy both the feature-descriptor similarity constraint and the locality constraint. The selected feature correspondences and inertial quaternions are used to compute the initial pose, after which a coarse-to-fine procedure iteratively removes visual outliers. Finally, the heading is optimized using the remaining correct feature matches. The proposed method has been thoroughly evaluated in straight-line, rotatory, and ambulatory movement scenarios. Because the system is lightweight and requires little computation, it enables effective and unobtrusive human motion monitoring, especially for senior citizens undergoing long-term rehabilitation.
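To make the pipeline concrete, the Python sketch below illustrates the main steps described above for the rotation-only (degenerate-motion) case: correspondence selection under the descriptor-similarity and locality constraints, pose initialization from the inertial quaternion, and coarse-to-fine removal of visual outliers. This is a minimal illustration under stated assumptions, not the paper's implementation; the function names, thresholds, ratio test, and the Kabsch-based rotation re-fit are all choices made here for clarity. Handling general motion, which also involves translation, would require an epipolar formulation instead.

```python
# A minimal sketch (not the authors' implementation) of a locality-constrained
# visual-inertial heading estimator. All names, thresholds, and the
# Kabsch-based solver are illustrative assumptions.
import numpy as np

def quat_to_rot(q):
    """Rotation matrix from a unit quaternion in (w, x, y, z) order."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def select_matches(desc_a, desc_b, pts_a, pts_b, ratio=0.7, loc_radius=40.0):
    """Keep correspondences satisfying both constraints from the abstract:
    descriptor similarity (Lowe-style ratio test) and locality (matched
    keypoints must stay spatially close between consecutive frames,
    assuming small inter-frame motion). Thresholds are hypothetical."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        j, k = np.argsort(dists)[:2]            # best and second-best match
        if (dists[j] < ratio * dists[k] and     # similarity constraint
                np.linalg.norm(pts_a[i] - pts_b[j]) < loc_radius):  # locality
            matches.append((i, j))
    return matches

def estimate_heading(bearings_a, bearings_b, q_imu, n_iters=3):
    """Initialize the pose from the inertial quaternion, then iteratively
    remove visual outliers coarse-to-fine and re-fit the rotation on the
    surviving matches (rotation-only / degenerate-motion case)."""
    R = quat_to_rot(q_imu)                      # initial pose from the IMU
    inliers = np.arange(len(bearings_a))
    for thresh in np.linspace(0.2, 0.05, n_iters):   # coarse -> fine
        resid = np.linalg.norm(bearings_b - bearings_a @ R.T, axis=1)
        inliers = np.where(resid < thresh)[0]   # drop visual outliers
        if len(inliers) < 3:                    # too few points to re-fit
            break
        a, b = bearings_a[inliers], bearings_b[inliers]
        H = a.T @ b                             # Kabsch / Procrustes re-fit
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    yaw = np.arctan2(R[1, 0], R[0, 0])          # heading angle from rotation
    return yaw, inliers
```

Shrinking the residual threshold across iterations mirrors the coarse-to-fine idea: early passes tolerate larger errors so the inertially initialized pose can gather enough inliers, while later passes keep only the most consistent matches for the final heading.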