
Virtual Global Shutters for CMOS Cameras

This project ran from January 2009 through December 2014 and was funded by CENIIT.

See also the LCMM project, which is a continuation of this one.

People

The principal investigator for this project was Per-Erik Forssén.
The project also employed the PhD student Erik Ringaby (now graduated).

Check out our tutorial Computer Vision on Rolling Shutter Cameras at CVPR 2012.

Background

Most cellphones and camcorders sold today are equipped with CMOS sensors. Compared to conventional CCD sensors, a CMOS sensor is cheaper to manufacture and easier to integrate with on-chip processing. By design, CMOS sensors make use of what is known as a rolling shutter. In a rolling shutter camera, each detector element gathers light right up until the time of readout, and readout happens sequentially, starting in the uppermost row and ending in the bottom row. The more conventional CCD sensors, on the other hand, use a global shutter, i.e. all pixels are reset simultaneously and gather light during the same time interval. The use of a rolling shutter allows a longer exposure time for a given frame rate, resulting in reduced noise. The downside is that, since pixels are acquired at different points in time, camera and target motions cause geometric distortions in the acquired images.
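The row-sequential readout described above can be sketched as a simple timing model. Here, all numeric values (frame rate, readout time, image height, motion speed) are illustrative assumptions, not figures from the project:

```python
import numpy as np

# Illustrative (assumed) timing parameters:
readout_time = 0.02   # seconds to scan all rows of one frame
num_rows = 480        # image height in pixels

# In a rolling shutter, row r is read out at its own time within
# the frame, spread evenly over the readout interval.
row_times = np.arange(num_rows) * readout_time / num_rows

# A camera panning so the image content moves at `omega` pixels/second
# shifts each row horizontally by omega * t(r), which skews vertical
# lines into slanted ones.
omega = 500.0  # horizontal image motion in pixels/s (assumed)
skew = omega * row_times

# Total horizontal displacement between top and bottom rows,
# i.e. the visible slant: approximately omega * readout_time.
print(skew[-1] - skew[0])
```

With a global shutter, `row_times` would be constant and `skew` would vanish, which is exactly the effect the project simulates in software.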

Web illustrations of the rolling shutter effect:

  • An article discussing the pros and cons of CMOS and CCD can be found here.
  • A video clip showing distortions from a rolling shutter camera.
  • Another video clip, this one from an airborne camera.

Introductory video

For a more in-depth description of our algorithm, please have a look at our CVPR talk on videolectures.net.

Another video, where a more recent version of the algorithm is showcased, can be found here.

Industrial motives

Due to a need in the automotive industry for accurate and compact sensors for airbag deployment, we today have a variety of micro-electro-mechanical systems (MEMS) that allow accurate short-range inertial measurements. These sensors are small and relatively low-cost. By attaching MEMS inertial sensors (INS), such as gyros and accelerometers, to the camera, the camera ego-motion can be accurately estimated at the time-scale of a frame exposure. This allows each row of pixels to be registered as a bundle of rays emanating from a point in 3D. We can then simulate the effect of a global shutter by intersecting the pixel rays with a virtual sensor plane. The industrial partner, Saab Bofors Dynamics, has previous experience with integrating cameras and accurate gyros. The company C3 Technologies AB, which delivers 3D maps for e.g. hitta.se, is a spin-off from Saab Bofors Dynamics. The data-acquisition system currently used by C3 Technologies for mapping relies on very high quality cameras and gyros, and it would be of great interest to see what could be accomplished in automated 3D mapping using inexpensive MEMS sensors and CMOS cameras instead.
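The ray-bundle idea can be sketched in a few lines: each image row is assigned its own camera rotation (in practice integrated from gyro measurements), its pixels are back-projected to rays, and the rays are re-projected under a common reference rotation, which simulates a global shutter. This is a minimal sketch assuming a pure-rotation camera model; the intrinsics `K` and the rotation values below are hypothetical, not the project's actual calibration:

```python
import numpy as np

def rotation_x(angle):
    """Rotation about the camera x-axis (a hypothetical pitch motion)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

# Assumed pinhole intrinsics: focal length f, principal point (cx, cy).
f, cx, cy = 500.0, 320.0, 240.0
K = np.array([[f, 0.0, cx],
              [0.0, f, cy],
              [0.0, 0.0, 1.0]])
K_inv = np.linalg.inv(K)

def rectify_pixel(u, v, R_row, R_ref=np.eye(3)):
    """Re-project pixel (u, v), captured under world-to-camera rotation
    R_row, onto a virtual global-shutter image with rotation R_ref.

    The pixel is back-projected to a ray, rotated into the reference
    orientation, and intersected with the virtual image plane."""
    ray = K_inv @ np.array([u, v, 1.0])   # ray in the row's camera frame
    ray_ref = R_ref @ R_row.T @ ray       # same ray in the reference frame
    p = K @ ray_ref                       # intersect with virtual plane
    return p[:2] / p[2]

# Example: a pixel in a row captured while the camera had pitched
# 0.5 degrees relative to the reference orientation.
R_row = rotation_x(np.deg2rad(0.5))
u_new, v_new = rectify_pixel(320.0, 400.0, R_row)
```

Applying this per row, with `R_row` interpolated from gyro readings at each row's readout time, is the essence of the virtual global shutter.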

Research Context

The project will be carried out at the Computer Vision Laboratory (CVL). CVL has a long and strong background in signal processing and computer vision. We have previous experience of integrating cameras and inertial sensors (INS) from the EU 6th framework programme Markerless real-time Tracking for Augmented Reality Image Synthesis (MATRIS), where CVL and the Automatic Control division, both at the Department of Electrical Engineering, were partners. CVL is also a partner in the current 7th framework programme Dynamic Interactive Perception-action LEarning in Cognitive Systems (DIPLECS), where INS is also used as a tool for ego-motion estimation of ground vehicles.

Project Perspectives

During a three-year period, we will develop techniques that allow CMOS rolling-shutter cameras to be used in real-time systems, both for conventional imaging, where rolling-shutter distortion needs to be removed, and for 3D structure recovery.

The project is related to the former CENIIT project Contents Associative Indexing and Retrieval of Image Sequences (CAIRIS). That project also dealt with computer vision on moving imagery, and its web page mentions plans to use bundle adjustment techniques, which will most likely be of use in the proposed project as well.

The scope of the project has been developed in cooperation with the industrial partner, Saab Bofors Dynamics in Linköping. They serve as a discussion partner, and have previous experience with integration of accelerometers, gyros, and cameras.

Datasets

During the project we have produced datasets that allow controlled evaluation of algorithm performance.

  • Rolling shutter rectification dataset [webpage]
  • Rolling shutter bundle adjustment dataset [webpage]
  • Video Stacking Dataset [webpage]

Publications

  1. Ringaby, Erik, "Optical Flow Computation on CUDA", Proceedings of SSBA 2009, IAPR, 2009, 81-84.

  2. Ringaby, Erik and Ahlberg, Jörgen and Forssén, Per-Erik and Wadströmer, Niclas, "Co-alignment of Aerial Push-Broom Strips using Trajectory Smoothness Constraints", Proceedings of SSBA 2010, IAPR, 2010, 63-66.

  3. Forssén, Per-Erik and Ringaby, Erik, "Rectifying rolling shutter video from hand-held devices", Proceedings of CVPR 2010, IEEE Computer Society, 2010. IEEE Xplore. [PDF]

  4. Ringaby, Erik and Ahlberg, Jörgen and Wadströmer, Niclas and Forssén, Per-Erik, "Co-aligning aerial hyperspectral push-broom strips for change detection", Proceedings of SPIE Security+Defence 2010, September 2010, SPIE Vol. 7835 [URL]

  5. Ringaby, Erik and Forssén, Per-Erik, "Rectifying Rolling Shutter Video from Hand-Held Devices", Technical Report, SSBA11, March 2011

  6. Hanning, Gustav, "Video Stabilization and Rolling Shutter Correction using Inertial Measurement Sensors", LiU Master Thesis, June 2011

  7. Forslöw, Nicklas, "Estimation and Adaptive Smoothing of Camera Orientations for Video Stabilization and Rolling Shutter Correction", LiU Master Thesis, June 2011

  8. Ringaby, Erik and Forssén, Per-Erik, "Efficient Video Rectification and Stabilisation for Cell-Phones", International Journal of Computer Vision, 2012, vol 96, no 3, pp 335-352, Online June 2011. [URL]

  9. Ringaby, Erik and Forssén, Per-Erik, "Scan Rectification for Structured Light Range Sensors with Rolling Shutters", International Conference on Computer Vision, November 2011. [URL]

  10. Hanning, Gustav and Forslöw, Nicklas and Forssén, Per-Erik and Ringaby, Erik and Törnqvist, David and Callmer, Jonas, "Stabilizing Cell Phone Video using Inertial Measurement Sensors", 2nd International Workshop on Mobile Vision, November 2011 [URL] Best paper award.

  11. Hedborg, Johan and Ringaby, Erik and Forssén, Per-Erik and Felsberg, Michael, "Structure and Motion Estimation from Rolling Shutter Video", 2nd International Workshop on Mobile Vision, November 2011 [PDF]

  12. Ringaby, Erik and Forssén, Per-Erik, "Scan Rectification for Structured Light Range Sensors with Rolling Shutters", Technical Report, SSBA12, March 2012 [URL]

  13. Ringaby, Erik, "Geometric Computer Vision for Rolling-shutter and Push-broom Sensors", LiU Licentiate Thesis, June 2012 [PDF]

  14. Hedborg, Johan and Forssén, Per-Erik and Felsberg, Michael and Ringaby, Erik, "Rolling Shutter Bundle Adjustment", Proceedings of CVPR12, IEEE Computer Society, IEEE Xplore. June 2012 [PDF]

  15. Hedborg, Johan and Forssén, Per-Erik and Felsberg, Michael and Ringaby, Erik, "Bundle Adjustment for Rolling Shutter Video", Technical Report, SSBA13, March 2013 [URL]

  16. Ringaby, Erik and Forssén, Per-Erik, "Scattered Data Interpolation for Remote Sensing Applications", Technical Report, SSBA13, March 2013 [URL]

  17. Ringaby, Erik and Forssén, Per-Erik, "Sharp Night Shots on a Hand-held Smartphone", Technical Report, SSBA14, March 2014 [URL] Winner of the SSBA Industry Prize.

  18. Ringaby, Erik and Forssén, Per-Erik, "A Virtual Tripod for Hand-held Video Stacking on Smartphones", Proceedings of ICCP 2014, IEEE Computer Society, IEEE Xplore. May 2014 [PDF]

  19. Ringaby, Erik and Friman, Ola, and Forssén, Per-Erik, and Opsahl, Thomas, and Haavardsholm, Trym Vegard and Kåsen, Ingebjørg, "Anisotropic Scattered Data Interpolation for Pushbroom Image Rectification", IEEE Transactions on Image Processing, Vol. 23, No. 5, Pages 2302-2314. May 2014 [PDF]

  20. Ringaby, Erik, "Geometric Models for Rolling-shutter and Push-broom Sensors", LiU Dissertation No. 1615, September 2014. [PDF]



Last updated: 2015-02-19