Department of Electrical
and Computer Engineering,
The University of Texas at Austin,
Austin, TX 78712 USA
kurtjc@gmail.com
bevans@ece.utexas.edu
This paper focuses on camera-gyroscope calibration and synchronization for cellphones, which estimates (rolling shutter) camera intrinsic parameters, gyroscope bias, and the relative orientation and timestamp delay between the camera and gyroscope. The contributions are (1) a multiple-view coplanarity constraint on camera rotation in a rolling shutter camera model, and (2) an online camera-gyroscope calibration and synchronization method based on an implicit extended Kalman filter using the proposed coplanarity constraint.
Why is this contribution significant (What impact will it have)?
The gyroscope plays a key role in estimating 3D camera rotation for various vision applications on cellphones, including video stabilization and feature tracking. Successful fusion of gyroscope and camera data requires that the camera, the gyroscope, and their relative pose be calibrated. The proposed method estimates the needed calibration and synchronization parameters online under all kinds of camera motion. This especially helps everyday users run apps that directly use the camera and gyroscope, without any prior knowledge about the devices or any special calibration hardware.
What are the three papers in the published literature most closely related to this paper?
[1] A. Karpenko, D. Jacobs, J. Baek, and M. Levoy, "Digital video stabilization and rolling shutter correction using gyroscopes," Stanford University, Tech. Rep., Mar. 2011.
[2] M. Hwangbo, J.-S. Kim, and T. Kanade, "Gyro-aided feature tracking for a moving camera: fusion, auto-calibration and GPU implementation," Intl. Journal Robotics Research, vol. 30, no. 14, pp. 1755-1774, 2011.
[3] L. Kneip, R. Siegwart, and M. Pollefeys, "Finding the exact rotation between two images independently of the translation," in Proc. European Conf. Computer Vision, Oct. 2012.
What is distinctive/new about the current paper relative to these previously published works?
[1] proposes to stabilize videos using gyroscopes, which directly motivates our work in this paper.
[1] and [2] propose to estimate the needed parameters offline by quickly shaking the camera while pointing it at a far-away object. The parameters are estimated from the homography transformation between matched feature points under a pure-rotation assumption. Our paper differs from [1] and [2] in that (a) our method estimates the parameters online while video is being captured, and (b) it does not assume pure rotation and works with all kinds of camera motion.
[3] proposes a multiple-view coplanarity constraint on camera rotations for global shutter cameras. Our paper generalizes this constraint to rolling shutter cameras (necessary for cellphone cameras) and applies it as an implicit measurement in an extended Kalman filter to estimate the parameters online.
In Matlab, I used SIFT matching with VLFeat. You can find it in our released code (the 'sift_track' function) for a 2012 paper that we had written:
SIFT features are robust but somewhat slow. You could also try BRISK or ORB; both are available in OpenCV. If you need a Matlab wrapper for OpenCV, check this repo:
https://github.com/kyamagu/mexopencv
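For illustration, the core of binary-descriptor matching (as used by BRISK and ORB) is a brute-force Hamming-distance search with a ratio test. In practice OpenCV's BFMatcher with the Hamming norm does this for you; the sketch below in plain Python only shows the underlying computation, and all function and variable names are hypothetical:

```python
# Minimal sketch of brute-force matching for binary descriptors
# (e.g., BRISK/ORB), for illustration only; use OpenCV's BFMatcher
# in real code. Descriptors are byte strings of equal length.

def hamming(d1, d2):
    """Hamming distance between two equal-length byte descriptors."""
    return sum(bin(a ^ b).count("1") for a, b in zip(d1, d2))

def match_descriptors(desc1, desc2, ratio=0.8):
    """For each descriptor in desc1, find its nearest neighbor in desc2.
    A match is kept only if it passes the ratio test against the
    second-nearest neighbor (assumes len(desc2) >= 2).
    Returns a list of (index1, index2, distance) tuples."""
    matches = []
    for i, d1 in enumerate(desc1):
        dists = sorted((hamming(d1, d2), j) for j, d2 in enumerate(desc2))
        best, second = dists[0], dists[1]
        if best[0] < ratio * second[0]:
            matches.append((i, best[1], best[0]))
    return matches

# Toy example: each 2-byte descriptor matches its slightly corrupted copy.
desc1 = [b"\x0f\x00", b"\x00\x0f"]
desc2 = [b"\x0e\x00", b"\x00\x0d", b"\xff\xff"]
print(match_descriptors(desc1, desc2))  # [(0, 0, 1), (1, 1, 1)]
```

The ratio test discards ambiguous matches whose best distance is not clearly smaller than the runner-up, which is the standard way to suppress false correspondences before feeding matches into a geometric model.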
Last Updated 03/28/17.