IEEE Transactions on Image Processing , vol. 23, no. 12, pp. 5070-5081, Dec. 2014.

Online Camera-Gyroscope Auto-Calibration for Cellphones

Chao Jia and Brian L. Evans

Department of Electrical and Computer Engineering, The University of Texas at Austin, Austin, TX 78712 USA
kurtjc@gmail.com - bevans@ece.utexas.edu

Paper Draft - Software - Project Site

Video Examples: Running - Panning

Abstract

The gyroscope plays a key role in estimating 3D camera rotation for various vision applications on cellphones, including video stabilization and feature tracking. Successful fusion of gyroscope and camera data requires that the camera, the gyroscope, and their relative pose be calibrated. Moreover, the timestamps of gyroscope readings and video frames are usually not well synchronized. Previous work performed camera-gyroscope calibration and synchronization offline, after the entire video sequence had been captured, and placed restrictions on the camera motion, which is too restrictive for everyday users running apps that directly use the gyroscope. In this paper, we propose an online method that estimates all of the necessary parameters while a user is capturing video. Our contributions are
  1. simultaneous online camera self-calibration and camera-gyroscope calibration based on an implicit extended Kalman filter, and
  2. generalization of the multiple-view coplanarity constraint on camera rotation in a rolling shutter camera model for cellphones.
The proposed method estimates the needed calibration and synchronization parameters online under general camera motion, and can be embedded in gyro-aided applications such as video stabilization and feature tracking. Both Monte Carlo simulations and cellphone experiments show that the proposed online calibration and synchronization method converges quickly to the ground-truth values.


Contributions and Significance

What is the contribution of this paper to the image processing community (a couple of sentences)?

This paper focuses on camera-gyroscope calibration and synchronization for cellphones, which estimates the (rolling shutter) camera intrinsic parameters, the gyroscope bias, and the relative orientation and timestamp delay between the camera and gyroscope. The contributions are (1) a multiple-view coplanarity constraint on camera rotation in a rolling shutter camera model, and (2) an online camera-gyroscope calibration and synchronization method based on an implicit extended Kalman filter using the proposed coplanarity constraint.
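An implicit extended Kalman filter differs from the standard EKF in that the measurement is a constraint h(x, z) = 0 rather than a prediction z = h(x). As a minimal sketch (not the paper's actual filter; the state, constraint, and all values here are hypothetical), a single implicit-measurement update in NumPy looks like:

```python
import numpy as np

def iekf_update(x, P, h, H, R):
    """One implicit-EKF update for a constraint h(x, z) = 0.

    x : (n,) state estimate (generic placeholder state)
    P : (n, n) state covariance
    h : (m,) constraint residual evaluated at the current (x, z)
    H : (m, n) Jacobian dh/dx at the current estimate
    R : (m, m) covariance of the constraint noise
    """
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x - K @ h                    # drive the residual toward zero
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Toy usage: a linear constraint h = A x - b with near-zero noise,
# so the updated state should (almost) satisfy A x = b.
A = np.array([[1.0, 2.0]])
b = np.array([3.0])
x = np.zeros(2)
P = np.eye(2)
x, P = iekf_update(x, P, A @ x - b, A, 1e-9 * np.eye(1))
```

In the paper's setting, the residual plays the role of the coplanarity constraint evaluated at the current calibration estimate, and the Jacobian is taken with respect to the calibration and synchronization parameters.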

Why is this contribution significant (What impact will it have)?

The gyroscope plays a key role in estimating 3D camera rotation for various vision applications on cellphones, including video stabilization and feature tracking. Successful fusion of gyroscope and camera data requires that the camera, the gyroscope, and their relative pose be calibrated. The proposed method estimates the needed calibration and synchronization parameters online under general camera motion. This can especially help everyday users run apps that directly use the camera and gyroscope without any prior knowledge about the devices or any special calibration hardware.

What are the three papers in the published literature most closely related to this paper?

[1] A. Karpenko, D. Jacobs, J. Baek, and M. Levoy, "Digital video stabilization and rolling shutter correction using gyroscopes," Stanford University, Tech. Rep., Mar. 2011.

[2] M. Hwangbo, J.-S. Kim, and T. Kanade, "Gyro-aided feature tracking for a moving camera: fusion, auto-calibration and GPU implementation," Intl. Journal Robotics Research, vol. 30, no. 14, pp. 1755-1774, 2011.

[3] L. Kneip, R. Siegwart, and M. Pollefeys, "Finding the exact rotation between two images independently of the translation," Proc. European Conf. Computer Vision, Oct. 2012.

What is distinctive/new about the current paper relative to these previously published works?

[1] proposes to stabilize videos using gyroscopes, which directly motivates our work in this paper.

[1] and [2] propose to estimate the needed parameters offline by quickly shaking the camera while pointing it at a far-away object. The parameters are estimated from the homography transformations of matched feature points under the assumption of pure rotation. Our paper differs from [1] and [2] in that (a) we estimate the parameters online while video is being captured, and (b) our method does not assume pure rotation and works with all kinds of camera motion.
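The pure-rotation assumption matters because only then do image points map between frames by a depth-independent homography, H = K R K⁻¹. A small hedged sketch (the intrinsics and rotation below are made-up values, not from the paper):

```python
import numpy as np

# Intrinsics of a hypothetical cellphone camera (focal 700 px, center (320, 240)).
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Small pure rotation about the y-axis (no translation).
theta = np.deg2rad(2.0)
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])

# Under pure rotation, pixels map by the depth-independent homography
# H = K R K^{-1}, which is what [1] and [2] fit to matched features.
H = K @ R @ np.linalg.inv(K)

# Check with an arbitrary scene point: its pixel in the rotated view
# equals the homography applied to its pixel in the first view.
X = np.array([0.5, -0.2, 4.0])
x1 = K @ X
x1 = x1 / x1[2]          # pixel in frame 1
x2 = K @ (R @ X)
x2 = x2 / x2[2]          # pixel of the same point in the rotated view
```

With nonzero translation this mapping becomes depth-dependent, which is why the homography-based approach breaks down under general motion.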

[3] proposes a multiple-view coplanarity constraint on camera rotations for global shutter cameras. Our paper generalizes this constraint to rolling shutter cameras (necessary for cellphone cameras) and applies it as an implicit measurement in an extended Kalman filter to estimate the parameters online.
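The global-shutter coplanarity constraint of [3] can be sketched numerically: for matched unit bearing vectors f1_i, f2_i and the correct relative rotation R, every vector (R f1_i) × f2_i is orthogonal to the translation, so these vectors are coplanar regardless of the (unknown) translation. A synthetic NumPy check, with made-up poses and points:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-view geometry: points in camera 2 satisfy X2 = R X1 + t.
angle = np.deg2rad(5.0)
R = np.array([[np.cos(angle), -np.sin(angle), 0.0],
              [np.sin(angle),  np.cos(angle), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([0.3, -0.1, 0.05])

# Random scene points in front of camera 1.
X1 = rng.uniform([-1, -1, 4], [1, 1, 8], size=(10, 3))
X2 = X1 @ R.T + t                       # same points in camera-2 coordinates

f1 = X1 / np.linalg.norm(X1, axis=1, keepdims=True)   # bearings, view 1
f2 = X2 / np.linalg.norm(X2, axis=1, keepdims=True)   # bearings, view 2

# With the correct R, each n_i = (R f1_i) x f2_i is orthogonal to t,
# so the stacked normals are coplanar and the matrix has numerical rank 2.
n = np.cross(f1 @ R.T, f2)
```

This makes the constraint a function of rotation alone, which is what allows the filter to use gyroscope-derived rotations without estimating translation. The paper's generalization additionally accounts for the per-row exposure times of a rolling shutter.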

Other Questions

How did you obtain the match_idx, match_x1, and match_x2 data used to calibrate the camera?

In Matlab I used SIFT matching with VLFeat. You can find it (the 'sift_track' function) in the code we released for our 2012 paper:

"Probabilistic 3-D Motion Estimation for Rolling Shutter Video Rectification from Visual and Inertial Measurements"

SIFT features are robust but somewhat slow. You could also try BRISK or ORB, both of which are available in OpenCV. If you need a Matlab wrapper, please check this repo:

https://github.com/kyamagu/mexopencv


COPYRIGHT NOTICE: All the documents on this server have been submitted by their authors to scholarly journals or conferences as indicated, for the purpose of non-commercial dissemination of scientific work. The manuscripts are put on-line to facilitate this purpose. These manuscripts are copyrighted by the authors or the journals in which they were published. You may copy a manuscript for scholarly, non-commercial purposes, such as research or instruction, provided that you agree to respect these copyrights.


Last Updated 06/24/15.