
Sphere-Based RGB-D Calibration Repository

RFAI - Laboratoire d'Informatique Fondamentale et Appliquée de Tours (EA 6300) - France

Repository description

The Sphere RGB-D Calibration benchmark (version 2019_03) is organized as follows.

The goal of this benchmark is to evaluate sphere-based RGB-D calibration methods on a common dataset. We propose several metrics to evaluate the calibration results, as well as the results obtained with three algorithms.

```mermaid
graph LR
    A[Input<br>RGB-D image pairs<br>RGB intrinsic parameters, optional] --> B[RGB-D Calibration]
    B --> C[Output<br>Extrinsic parameters<br>Depth intrinsic parameters, optional]
```

The following sections detail the capture environment, the repository architecture, and the evaluation metrics. For more details, refer to the dataset paper [1]:


To download a dataset, click on its name in the table below:

| Dataset | Color Camera | Depth Camera | # RGB-D Pairs | Support | Capture | Known Parameters |
|---|---|---|---|---|---|---|
| Random | Hololens n°1 | Structure Sensor | 72 | Single Sphere | Stationary | Intrinsic RGB, Intrinsic Depth |
| Equidistant | Hololens n°2 | Structure Sensor | 27 | Single Sphere | Stationary | Intrinsic RGB, Intrinsic Depth |
| Double | Realsense SR300 | Realsense SR300 | 40 | Double Sphere | Hand-held | Intrinsic RGB, Intrinsic Depth, Extrinsic RGB-D |
| Synthetic | 1280x960 | - | 20 | Single Sphere | - | All (Absolute Values) |

The source code to perform an RGB-D calibration with these files is not available.

Capture environment

These datasets were captured in the following environment:

Dataset tree

In this dataset, images are stored as .PNG, point clouds as .PLY, and configuration files as .YML.

A dataset is organized the same way whether it contains real or synthetic data. Linked data share the same file name. The RGB-D input data, as captured by the RGB-D camera pair, are stored in the RGB, POINTS and DEPTH folders. The Processed folder contains pre-computed data, such as the segmented ellipses and point clouds. The OUTPUT folder contains calibration results from several algorithms.

Note that for some datasets, some folders are not provided, either because the data was not available or because it does not make sense (e.g. the EllipseImg folder for synthetic data).

Figure 2: Overview of a dataset organization, and their respective metrics (in light gray) for evaluation of the RGB-D Calibration Process.

Synthetic data

In order to precisely evaluate the influence of noise on the calibration result (especially the extrinsic parameters), a synthetic dataset with known values has been constructed from a GroundTruth scene. We applied three types of noise, as shown in Figure 4.

1 - Gaussian noise on the RGB ellipse points
2 - Noise on the RGB camera intrinsic parameters (each value is multiplied by a scalar)
3 - Noise on the depth data, as a shift of the whole point cloud (which displaces the point cloud centroid)

Since we apply Gaussian noise, we performed multiple iterations to study the distribution of the resulting values.

Figure 3: Overview of a synthetic dataset generation

In the synthetic data, only the depth noise (noise n°3) is modified between evaluations, increasing its amount by 0.4 each time. The noise values are provided in /noise.yml. All steps are performed a hundred times to obtain reliable results. Because of the amount of synthetic data, the ground-truth scene (/GroundTruth), as well as the results (/results), are provided in separate folders.
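As an illustration, the three noise types described above could be sketched as follows. This is plain Python with illustrative names, not the code actually used to generate the dataset:

```python
import random

def noise_ellipse_points(points, sigma):
    """Noise 1: add zero-mean Gaussian noise to 2-D RGB ellipse points."""
    return [(x + random.gauss(0.0, sigma), y + random.gauss(0.0, sigma))
            for x, y in points]

def noise_intrinsics(K, scalar):
    """Noise 2: multiply each value of the 3x3 intrinsic matrix by a scalar."""
    return [[v * scalar for v in row] for row in K]

def noise_depth_shift(cloud, shift):
    """Noise 3: shift the whole point cloud, displacing its centroid."""
    dx, dy, dz = shift
    return [(x + dx, y + dy, z + dz) for x, y, z in cloud]
```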

How to use a dataset

To perform a calibration, you can use either the camera frames (/root) or the ellipse and sphere detections (/Processed). The calibration results have to be stored in the /OUTPUT folder.
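For example, since linked data share the same file name across folders, the RGB images and point clouds of one capture can be paired by their stem. A minimal sketch, where `list_rgbd_pairs` is a hypothetical helper (the folder and extension names follow the dataset tree described above):

```python
from pathlib import Path

def _by_stem(folder, ext):
    """Index the files of a folder by stem, matching extension case-insensitively."""
    return {p.stem: p for p in Path(folder).iterdir() if p.suffix.lower() == ext}

def list_rgbd_pairs(dataset_root):
    """Return (rgb_image, point_cloud) file pairs that share the same stem."""
    root = Path(dataset_root)
    rgb = _by_stem(root / "RGB", ".png")
    pts = _by_stem(root / "POINTS", ".ply")
    return [(rgb[s], pts[s]) for s in sorted(rgb) if s in pts]
```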

To read our files, we recommend using the OpenCV library.

How to : Read an Ellipse File

We recommend the use of the cv::RotatedRect type as output of this file.
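When OpenCV is not at hand, the same fields can be pulled out with a small stdlib parser. Note that the key names below (`center_x`, `center_y`, `width`, `height`, `angle`) are assumptions about how the file maps onto cv::RotatedRect; check them against an actual ellipse file:

```python
import re

def read_ellipse_file(path):
    """Read a (hypothetically laid out) ellipse .yml into cv::RotatedRect-like
    fields: center (x, y), size (width, height) and angle in degrees.
    The key names are assumptions, not the dataset's documented format."""
    text = open(path).read()
    def grab(key):
        m = re.search(key + r"\s*:\s*(-?\d+(?:\.\d+)?)", text)
        return float(m.group(1)) if m else None
    return {"center": (grab("center_x"), grab("center_y")),
            "size": (grab("width"), grab("height")),
            "angle": grab("angle")}
```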

How to : Read a Sphere File

How to : Read a config.yml File

How to : Read a Calibration.yml File
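Once read, the extrinsic parameters are typically used as a rotation R and translation t mapping points from the depth camera frame into the RGB camera frame (a common convention; verify the direction used by each algorithm). A sketch of applying them:

```python
def apply_extrinsics(R, t, p):
    """Map a 3-D point p from the depth frame to the RGB frame: p' = R * p + t.
    R is a 3x3 nested list; t and p are 3-element sequences."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))
```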

How to : Read a detectionEvaluation.csv File

This file evaluates the ellipse and sphere detections against the manually segmented ground truth.
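Such a CSV can be summarized with the standard csv module; the column name `error` below is a placeholder, so swap in one of the file's actual headers:

```python
import csv

def mean_column(path, column="error"):
    """Average one numeric column of a detection-evaluation CSV.
    The default column name is a placeholder, not the file's real header."""
    with open(path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f)]
    return sum(values) / len(values)
```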

How to : Read a noise.yml File (synthetic data only)

How to : Read a CalibrationResults file (synthetic data only)

Currently evaluated algorithms


[1] D. J. T. Boas, S. Poltaretskyi, J.-Y. Ramel, J. Chaoui, J. Berhouet, and M. Slimane, "A Benchmark Dataset for RGB-D Sphere-Based Calibration" - PDF
[2] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision. Cambridge University Press, ISBN:0521540518, second ed., 2004
[3] A. Staranowicz, G. R. Brown, F. Morbidi, and G. L. Mariottini, “Easy-to-Use and Accurate Calibration of RGBD Cameras from Spheres,” in Image and Video Technology, vol. 8333, pp. 265–278, Springer Berlin Heidelberg, 2014
[4] D. J. T. Boas, S. Poltaretskyi, J.-Y. Ramel, J. Chaoui, J. Berhouet, and M. Slimane, “Relative pose improvement of sphere based rgb-d calibration,” Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, vol. 4, pp. 91–98, 2019

Downloading and condition of use


Technical questions about the database, the format, or problems obtaining the data should be directed to the database editors:

Lab./Dep. Informatique de Tours - PolytechTours
64, Av. Jean Portalis
Tel: +33