Show simple item record

dc.contributor.author: He, Hongsheng
dc.contributor.author: Li, Yan
dc.contributor.author: Tan, Jindong
dc.identifier.citation: He, H., Li, Y. & Tan, J. Auton Robot (2018) 42: 615
dc.description: Click on the DOI link to access the article (may not be free).
dc.description.abstract: This paper proposes a method to measure the motion of a moving rigid body using a hybrid visual-inertial sensor. The rotational velocity of the moving object is computed from visual optical flow by solving a depth-independent bilinear constraint, and the translational velocity is estimated by solving a dynamics constraint that relates scene depth to translational motion. By fusing inertial measurements, the scale of the translational velocity can be estimated, which is otherwise unrecoverable from monocular optical flow. An iterative refinement scheme is introduced to handle observation noise and outliers, and an extended Kalman filter is applied for motion tracking. The performance of the proposed method is evaluated in simulation studies and practical experiments, and the results demonstrate its accuracy and robustness.
dc.description.sponsorship: NSFC Grant 61305114.
dc.relation.ispartofseries: Autonomous Robots; v.42: no.3
dc.subject: Motion measurement
dc.subject: Dynamic scene analysis
dc.subject: Visual-inertial perception
dc.subject: Smart camera
dc.subject: Wearable robotics
dc.title: Relative motion estimation using visual-inertial optical flow
dc.rights.holder: Copyright © 2017, Springer Nature
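The abstract notes that rotational velocity can be recovered from optical flow through a depth-independent constraint. The paper's actual bilinear constraint handles general motion; as a simplified, hypothetical illustration of the depth-independent idea only, the sketch below recovers an angular velocity from synthetic optical flow under the assumption of pure rotation, where the standard motion-field equations depend on the angular velocity but not on scene depth. The function names and test values are illustrative, not taken from the paper.

```python
import numpy as np

def rot_flow_matrix(x, y):
    # Depth-independent rotational part of the motion-field equations
    # for normalized image coordinates (focal length = 1):
    #   u =  x*y*wx - (1 + x^2)*wy + y*wz
    #   v = (1 + y^2)*wx - x*y*wy - x*wz
    return np.array([
        [x * y,     -(1.0 + x * x),  y],
        [1.0 + y * y,  -x * y,      -x],
    ])

def estimate_omega(points, flows):
    # Stack the per-point 2x3 blocks into an overdetermined linear
    # system A @ omega = flow and solve it by least squares.
    A = np.vstack([rot_flow_matrix(x, y) for x, y in points])
    b = np.hstack(flows)
    omega, *_ = np.linalg.lstsq(A, b, rcond=None)
    return omega

# Synthetic check: generate noiseless flow from a known angular
# velocity and recover it from the flow alone.
rng = np.random.default_rng(0)
points = rng.uniform(-0.5, 0.5, size=(20, 2))
omega_true = np.array([0.02, -0.01, 0.03])
flows = [rot_flow_matrix(x, y) @ omega_true for x, y in points]
omega_hat = estimate_omega(points, flows)
```

With noisy flow, the least-squares solve still applies, which is where an iterative refinement scheme of the kind the abstract mentions would reject outlier flow vectors before re-solving.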

Files in this item


There are no files associated with this item.
