Camera Calibration Toolbox for Matlab


First calibration example - Corner extraction, calibration, additional tools

This section takes you through a complete calibration example based on a total of 20 (and 25) images of a planar checkerboard.
This example lets you learn how to use all the features of the toolbox: loading calibration images, extracting image corners, running the main calibration engine, displaying the results, controlling accuracies, adding and suppressing images, undistorting images, exporting calibration data to different formats... This example is highly recommended for someone who is just starting to use the toolbox.

  • Download the calibration images all at once: calib_example.zip (4461Kb zipped), or one by one, and store the 20 images in a separate folder named calib_example.


  • From within matlab, go to the example folder calib_example containing the images.

  • Reading the images:

    Click on the Image names button in the Camera calibration tool window. Enter the basename of the calibration images (Image) and the image format (tif).
    All the images (the 20 of them) are then loaded in memory (through the command Read images that is automatically executed) in the variables I_1, I_2, ..., I_20. The number of images is stored in the variable n_ima (=20 here).
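
    As a quick sanity check (optional, plain Matlab commands, nothing specific to the toolbox), you can inspect these variables at the prompt:

        n_ima          % number of calibration images, should be 20 here
        size(I_1)      % pixel dimensions of the first calibration image
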
    The matlab window should look like this:



    The complete set of images is also shown in thumbnail format (these images can always be regenerated by running mosaic):



    If the OUT OF MEMORY error message occurred during image reading, that means that your computer does not have enough RAM to hold the entire set of images in local memory. This can easily happen if you are running the toolbox on a laptop with 128MB of RAM or less, for example. In this case, you can directly switch to the memory efficient version of the toolbox by running calib_gui and selecting the memory efficient mode of operation. The remaining steps of calibration (grid corner extraction and calibration) are exactly the same. Note that in memory efficient mode, the thumbnail image is not displayed since the calibration images are not loaded all at once.


  • Extract the grid corners:

    Click on the Extract grid corners button in the Camera calibration tool window.



    Press "enter" (with an empty argument) to select all the images(otherwise, you would enter a list of image indices like[2 5 8 1012] to extract corners of a subset of images). Then, select thedefault window size of the corner finder:wintx=winty=5 bypressing "enter" with empty arguments to the wintx andwinty question. This leads to a effective window of size 11x11pixels.



    The corner extraction engine includes an automatic mechanism for counting the number of squares in the grid. This tool is especially convenient when working with a large number of images since the user does not have to manually enter the number of squares in both x and y directions of the pattern. On some very rare occasions however, this code may not predict the right number of squares. This would typically happen when calibrating lenses with extreme distortions. At this point in the corner extraction procedure, the program gives the user the option to disable the automatic square counting code. In that special mode, the user would be prompted for the square count for every image. In this present example, it is perfectly appropriate to keep working in the default mode (i.e. with automatic square counting activated), and therefore, simply press "enter" with an empty argument. (NOTE: it is generally recommended to first use the corner extraction code in this default mode, and then, if need be, re-process the few images with "problems".)



    The first calibration image is then shown on Figure 2:



    Click on the four extreme corners on the rectangular checkerboard pattern. The clicking locations are shown on the four following figures (WARNING: try to click accurately on the four corners, at most 5 pixels away from the corners. Otherwise some of the corners might be missed by the detector).

    Ordering rule for clicking: The first clicked point is selected to be associated to the origin point of the reference frame attached to the grid. The other three points of the rectangular grid can be clicked in any order. This first-click rule is especially important if you need to calibrate multiple cameras externally (i.e. compute the relative positions of several cameras in space). When dealing with multiple cameras, the same grid pattern reference frame needs to be consistently selected for the different camera images (i.e. grid points need to correspond across the different camera views). For example, it is a requirement for running the stereo calibration toolbox stereo_gui.m (try help stereo_gui and visit the fifth calibration example page for more information).



    The boundary of the calibration grid is then shown on Figure 2:



    Enter the sizes dX and dY in X and Y of each square in the grid (in this case, dX=dY=30mm = default values):



    Note that you could have just pressed "enter" with an empty argument to select the default values. The program automatically counts the number of squares in both dimensions, and shows the predicted grid corners in the absence of distortion:




    If the predicted corners are close to the real image corners, then the following step may be skipped (if there is not much image distortion). This is the case in the present image: the predicted corners are close enough to the real image corners. Therefore, it is not necessary to "help" the software detect the image corners by entering a guess for the radial distortion coefficient. Press "enter", and the corners are automatically extracted using those positions as initial guesses.



    The image corners are then automatically extracted, and displayed on figure 3 (the blue squares around the corner points show the limits of the corner finder window):



    The corners are extracted to an accuracy of about 0.1 pixel.
    Follow the same procedure for the 2nd, 3rd, ... , 14th images. For example, here are the detected corners of images 2, 3, 4, 5, 6 and 7:



    Observe that the square dimensions dX and dY are always kept at their original values (30mm).
    Sometimes, the predicted corners are not quite close enough to the real image corners to allow for effective corner extraction. In that case, it is necessary to refine the predicted corners by entering a guess for the lens distortion coefficient. This situation occurs at image 15. On that image, the predicted corners are:



    Observe that some of the predicted corners within the grid are far enough from the real grid corners to result in wrong extractions. The cause: image distortion. In order to help the system make a better guess of the corner locations, the user is free to manually input a guess for the first order lens distortion coefficient kc (to be precise, it is the first entry of the full distortion coefficient vector kc described at this page). In order to input a guess for the lens distortion coefficient, enter a non-empty string to the question Need of an initial guess for distortion? (for example 1). Then enter a distortion coefficient of kc=-0.3 (in practice, this number is typically between -1 and 1).
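
    As an aside, the role of this single coefficient can be illustrated in a few lines of Matlab. This is a sketch only (the variable names xn and xd are hypothetical, not toolbox variables): it applies a first-order radial distortion, i.e. the first term of the toolbox's distortion model, to a set of normalized coordinates.

        % Sketch: first-order radial distortion applied to normalized
        % coordinates xn (2xN matrix); xn and xd are made-up names.
        kc1 = -0.3;                                  % guess entered at the prompt
        r2  = sum(xn.^2, 1);                         % squared radius of each point
        xd  = xn .* (ones(2,1) * (1 + kc1*r2));      % radially distorted coordinates
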



    According to this distortion, the new predicted corner locations are:



    If the new predicted corners are close enough to the real image corners (this is the case here), input any non-empty string (such as 1) to the question Satisfied with distortion?. The subpixel corner locations are then computed using the new predicted locations (with image distortion) as initial guesses:



    If we had not been satisfied, we would have entered an empty string to the question Satisfied with distortion? (by directly pressing "enter"), and then tried a new distortion coefficient kc. You may repeat this process as many times as you want until satisfied with the prediction (side note: the distortion values used at that stage are only used to help corner extraction and do not affect the next main calibration step at all. In other words, these values are neither used as final distortion coefficients, nor used as initial guesses for the true distortion coefficients estimated through the calibration optimization stage).

    The final detected corners are shown on Figure 3:



    Repeat the same procedure on the remaining 5 images (16 to 20). On these images however, do not use the predicted distortion option, even if the extracted corners are not quite right. In the next steps, we will correct them (in this example, we could have skipped this option for image 15 as well, but it was quite useful for illustration).

    After corner extraction, the matlab data file calib_data.mat is automatically generated. This file contains all the information gathered throughout the corner extraction stage (image coordinates, corresponding 3D grid coordinates, grid sizes, ...). This file is only needed in case of emergency, for example when matlab is abruptly terminated before saving results. Loading this file would prevent you from having to click on the images again.

    During your own calibrations, when there is a large amount of distortion in the image, the program may not be able to automatically count the number of squares in the grid. In that case, the number of squares in both X and Y directions has to be entered manually. This should not occur in this present example.

    Another problem may arise when performing your own calibrations. If the lens distortions are really too severe (for fisheye lenses for example), the simple guiding tool based on a single distortion coefficient kc may not be sufficient to provide good enough initial guesses for the corner locations. For those few difficult cases, a script program is included in the toolbox that allows for a completely manual corner extraction (i.e. one click per corner). The script file is called manual_corner_extraction.m (in memory efficient mode, you should use manual_corner_extraction_no_read.m instead) and should be executed AFTER the traditional corner extraction code (the script relies on data that were computed by the traditional corner extraction code - square count, grid size, order of points, ... - even if the corners themselves were wrongly detected). Obviously, this method for corner extraction could be extremely time consuming when applied to a lot of images. It is therefore recommended to use it as a last resort when everything else has failed. Most users should never have to worry about this, and it will not happen in this present calibration example.

  • Main Calibration step:

    After corner extraction, click on the button Calibration of the Camera calibration tool to run the main camera calibration procedure.
    Calibration is done in two steps: first initialization, and then nonlinear optimization.
    The initialization step computes a closed-form solution for the calibration parameters not including any lens distortion (program name: init_calib_param.m).
    The non-linear optimization step minimizes the total reprojection error (in the least squares sense) over all the calibration parameters (9 DOF for the intrinsics: focal, principal point, distortion coefficients, and 6*20 DOF for the extrinsics => 129 parameters). For a complete description of the calibration parameters, click on that link. The optimization is done by iterative gradient descent with an explicit (closed-form) computation of the Jacobian matrix (program name: go_calib_optim.m).



    The Calibration parameters are stored in a number of variables. For a complete description of them, visit this page. Notice that the skew coefficient alpha_c and the 6th order radial distortion coefficient (the last entry of kc) have not been estimated (this is the default mode). Therefore, the angle between the x and y pixel axes is 90 degrees. In most practical situations, this is a very good assumption. However, later on, a way of introducing the skew coefficient alpha_c in the optimization will be presented.

    Observe that only 11 gradient descent iterations are required in order to reach the minimum. This means only 11 evaluations of the reprojection function + Jacobian computation and inversion. The reason for that fast convergence is the quality of the initial guess for the parameters computed by the initialization procedure.
    For now, ignore the recommendation of the system to reduce the distortion model. The reprojection error is still too large to make a judgement on the complexity of the model. This is mainly because some of the grid corners were not very precisely extracted for a number of images.
    Click on Reproject on images in the Camera calibration tool to show the reprojections of the grids onto the original images. These projections are computed based on the current intrinsic and extrinsic parameters. Input an empty string (just press "enter") to the question Number(s) of image(s) to show ([] = all images) to indicate that you want to show all the images:



    The following figures show the first four images with the detected corners (red crosses) and the reprojected grid corners (circles).





    The reprojection error is also shown in the form of color-coded crosses:



    In order to exit the error analysis tool, right-click anywhere on the figure (you will understand later the use of this option).
    Click on Show Extrinsic in the Camera calibration tool. The extrinsic parameters (relative positions of the grids with respect to the camera) are then shown in the form of a 3D plot:



    On this figure, the frame (Oc,Xc,Yc,Zc) is the camera reference frame. The red pyramid corresponds to the effective field of view of the camera defined by the image plane. To switch from a "camera-centered" view to a "world-centered" view, just click on the Switch to world-centered view button located at the bottom-left corner of the figure.



    On this new figure, every camera position and orientation is represented by a green pyramid. Another click on the Switch to camera-centered view button turns the figure back to the "camera-centered" plot.

    Looking back at the error plot, notice that the reprojection error is very large across a large number of figures. The reason for that is that we have not done a very careful job at extracting the corners on some highly distorted images (a better job could have been done by using the predicted distortion option). Nevertheless, we can correct for that now by recomputing the image corners on all images automatically. Here is the way it is going to be done: press on the Recomp. corners button in the main Camera calibration tool and select once again a corner finder window size of wintx = winty = 5 (the default values):



    To the question Number(s) of image(s) to process ([] = all images), press "enter" with an empty argument to recompute the corners on all the images. Then enter the mode of extraction: the automatic mode (auto) uses the reprojected grid as initial guess locations for the corners, while the manual mode lets the user extract the corners manually (the traditional corner extraction method). In the present case, the reprojected grid points are very close to the actual image corners. Therefore, we select the automatic mode: press "enter" with an empty string. The corners on all images are then recomputed. Your matlab window should look like:



    Then run another calibration optimization by clicking on Calibration:



    Observe that only six iterations were necessary for convergence, and no initialization step was performed (the optimization started from the previous calibration result). The two values 0.12668 and 0.12604 are the standard deviations of the reprojection error (in pixels) in the x and y directions respectively. Observe that the uncertainties on the calibration parameters are also estimated. The numerical values are approximately three times the standard deviations.
    After optimization, click on Save to save the calibration results (intrinsic and extrinsic) in the matlab file Calib_Results.mat.



    For a complete description of the calibration parameters, click on that link.
    Once again, click on Reproject on images to reproject the grids onto the original calibration images. The first four images look like:



    Click on Analyse error to view the new reprojection error (observe that the error is much smaller than before):



    After right-clicking on the error figure (to exit the error-analysis tool), click on Show Extrinsic to show the new 3D positions of the grids with respect to the camera:



    A simple click on the Switch to world-centered view button changes the figure to this:



    The tool Analyse error allows you to inspect which points correspond to large errors. Click on Analyse error and click on the figure region that is shown here (upper-right figure corner):



    After clicking, the following information appears in the main Matlab window:



    This means that the corresponding point is on image 18, at the grid coordinate (0,0) in the calibration grid (at the origin of the pattern). The following image shows a close up of that point on the calibration image (before that, exit the error inspection tool by clicking on the right mouse button anywhere within the figure):



    The error inspection tool is very useful in cases where the corners have been badly extracted on one or several images. In such a case, the user can recompute the corners of the specific images using a different window size (larger or smaller).

    For example, let us recompute the image corners using a window size of wintx=winty=9 for all 20 images except for image 20 (use wintx=winty=5), images 5, 7, 8 and 19 (use wintx=winty=7), and image 18 (use wintx=winty=8). The extraction of the corners should be performed with three calls of Recomp. corners. At the first call of Recomp. corners, select wintx=winty=9, choose to process images 1, 2, 3, 4, 6, 9, 10, 11, 12, 13, 14, 15, 16 and 17, and select the automatic mode (the reprojections are already very close to the actual image corners):



    At the second call of Recomp. corners, select wintx=winty=8, choose to process image 18 and select once again the automatic mode:



    At the third call of Recomp. corners, select wintx=winty=7, choose to process images 5, 7, 8 and 19 and select once again the automatic mode:



    Re-calibrate by clicking on Calibration:



    Observe that the reprojection error (0.11689, 0.11500) is slightly smaller than the previous one. In addition, observe that the uncertainties on the calibration parameters are also smaller. Inspect the error by clicking on Analyse error:



    Let us look at the previous point of interest on image 18, at the grid coordinate (0,0) in the calibration grid. For that, click on Reproject on images and select to show image 18 only (of course, before that, you must exit the error inspection tool by right-clicking within the window):



    A close view at the point of interest (on image 18) shows a smaller reprojection error:



    Click once again on Save to save the calibration results (intrinsic and extrinsic) in the matlab file Calib_Results.mat.



    Observe that the previous calibration result file was copied under Calib_Results_old0.mat (just in case you want to use it later on).

    Download now the five additional images Image21.tif, Image22.tif, Image23.tif, Image24.tif and Image25.tif and re-calibrate the camera using the complete set of 25 images without recomputing everything from scratch.
    After saving the five additional images in the current directory, click on Read images to read the complete new set of images:



    To show a thumbnail image of all calibration images, run mosaic (if you are running in memory efficient mode, run mosaic_no_read instead).



    Click on Extract grid corners to extract the corners on the five new images, with default window sizes wintx=winty=5:



    And go on with the traditional corner extraction on the five images. Afterwards, run another optimization by clicking on Calibration:



    Next, recompute the image corners of the four last images using different window sizes. Use wintx=winty=9 for images 22 and 24, use wintx=winty=8 for image 23, and use wintx=winty=6 for image 25. Follow the same procedure as previously presented (three calls of Recomp. corners should be enough). After recomputation, run Calibration once again:



    Click once again on Save to save the calibration results (intrinsic and extrinsic) in the matlab file Calib_Results.mat.



    As an exercise, recalibrate based on all images, except images 16, 18, 19, 24 and 25 (i.e. calibrate on a new set of 20 images).
    Click on Add/Suppress images.



    Enter the list of images to suppress ([16 18 19 24 25]):



    Click on Calibration to recalibrate:



    It is up to the user to use the function Add/Suppress images to activate or de-activate images. In effect, this function simply updates the binary vector active_images, setting zeros for inactive images and ones for active images.
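
    Since active_images is just a variable in the workspace, the same effect can be obtained directly at the prompt. A minimal sketch, based on the description above:

        % Deactivate images 16, 18, 19, 24 and 25; all other entries stay 1.
        active_images([16 18 19 24 25]) = 0;
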
    Next, load the old calibration results previously saved in Calib_Results.mat by clicking on Load:



    The setup is now back to what it was before suppressing the 5 images 16, 18, 19, 24 and 25. Let us now run a calibration including the skew factor alpha_c describing the angle between the x and y pixel axes. For that, set the variable est_alpha to one (at the matlab prompt). As a further exercise, let us fit the radial distortion model up to the 6th order (up to now, it was up to the 4th order, with tangential distortion). For that, set the last entry of the vector est_dist to one:
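
    At the matlab prompt, this amounts to the following two assignments (a sketch, assuming the 5-element est_dist vector used throughout this example):

        est_alpha   = 1;   % estimate the skew coefficient alpha_c
        est_dist(5) = 1;   % estimate the 6th order radial term (last entry of kc)
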



    Then, run a new calibration by clicking on Calibration:



    Observe that after optimization, the skew coefficient is very close to zero (alpha_c = 0.00042). This leads to an angle between the x and y pixel axes very close to 90 degrees (89.976 degrees). This justifies the previous assumption of rectangular pixels (alpha_c = 0). In addition, notice that the uncertainty on the 6th order radial distortion coefficient is very large (the uncertainty is much larger than the absolute value of the coefficient). It is therefore preferable to disable its estimation: set the last entry of est_dist to zero:
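
    At the prompt (a sketch; the comment is only an illustrative consistency check of the numbers quoted above):

        est_dist(5) = 0;   % stop estimating the 6th order radial distortion term
        % side check: 90 - atan(0.00042)*180/pi is about 89.976 degrees,
        % consistent with the pixel-axis angle reported above
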



    Then, run calibration once again by clicking on Calibration:



    Judging the result of calibration satisfactory, let us save the current calibration parameters by clicking on Save:



    In order to make a decision on the appropriate distortion model to use, it is sometimes very useful to visualize the effect of distortions on the pixel image, and the importance of the radial component versus the tangential component of distortion. For this purpose, run the script visualize_distortions at the matlab prompt (this function is not yet linked to any button in the GUI window). The three following images are then produced:







    The first figure shows the impact of the complete distortion model (radial + tangential) on each pixel of the image. Each arrow represents the effective displacement of a pixel induced by the lens distortion. Observe that points at the corners of the image are displaced by as much as 25 pixels. The second figure shows the impact of the tangential component of distortion. On this plot, the maximum induced displacement is 0.14 pixel (at the upper left corner of the image). Finally, the third figure shows the impact of the radial component of distortion. This plot is very similar to the full distortion plot, showing that the tangential component could very well be discarded in the complete distortion model. On the three figures, the cross indicates the center of the image, and the circle the location of the principal point.

    Now, just as an exercise (not really recommended in practice), let us run an optimization without the lens distortion model (by enforcing kc = [0;0;0;0;0]) and without the aspect ratio (by enforcing both components of fc to be equal). For that, set the binary variables est_dist to [0;0;0;0;0] and est_aspect_ratio to 0 at the matlab prompt:
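
    At the matlab prompt (a sketch of the two settings described above):

        est_dist         = [0;0;0;0;0];   % do not estimate any distortion coefficient
        est_aspect_ratio = 0;             % enforce fc(1) = fc(2)
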



    Then, run a new optimization by clicking on Calibration:



    As expected, the distortion coefficient vector kc is now zero, and both components of the focal vector are equal (fc(1)=fc(2)). In practice, this model for calibration is not recommended: for one thing, it makes little sense to estimate skew without aspect ratio. In general, unless required by a specific targeted application, it is recommended to always estimate the aspect ratio in the model (it is the 'easy part'). Regarding the distortion model, people often run the optimization over a subset of the distortion coefficients. For example, setting est_dist to [1;0;0;0;0] keeps estimating the first distortion coefficient kc(1) while enforcing the others to zero. This model is also known as the second order symmetric radial distortion model. It is a very viable model, especially when using low distortion optical systems (expensive lenses), or when only a few images are used for calibration. Another very common distortion model is the 4th order symmetric radial distortion with no tangential component (est_dist = [1;1;0;0;0]). This model, used by Zhang, is justified by the fact that most lenses currently manufactured do not have imperfections in centering (for more information, visit this page). This model could have very well been used in this present example, recalling from the previous three figures that the tangential component of the distortion model is significantly smaller than the radial component.
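
    For reference, the two reduced models mentioned above would be selected at the prompt as follows (a sketch, again assuming the 5-element est_dist vector):

        est_dist = [1;0;0;0;0];   % 2nd order symmetric radial distortion only
        est_dist = [1;1;0;0;0];   % 4th order radial distortion, no tangential component
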

    Finally, let us run a calibration rejecting the aspect ratio fc(2)/fc(1), the principal point cc, the distortion coefficients kc, and the skew coefficient alpha_c from the optimization estimation. For that purpose, set the four binary variables est_aspect_ratio, center_optim, est_dist and est_alpha to the following values:
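
    A sketch of the corresponding settings (the exact values appear in the screenshot; these are the natural choices given the description above):

        est_aspect_ratio = 0;             % do not estimate the aspect ratio fc(2)/fc(1)
        center_optim     = 0;             % do not estimate the principal point cc
        est_dist         = [0;0;0;0;0];   % do not estimate the distortion coefficients kc
        est_alpha        = 0;             % do not estimate the skew coefficient alpha_c
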



    Generally, if the principal point is not estimated, the best guess for its location is the center of the image:
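
    One way to do this by hand (a sketch only, assuming the toolbox convention where pixel (0,0) is the upper-left corner and nx, ny hold the image width and height in pixels):

        cc = [(nx-1)/2 ; (ny-1)/2];   % place the principal point at the image center
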



    Then, run a new optimization by clicking on Calibration:



    Observe that the principal point cc is still at the center of the image after optimization (since center_optim=0).

    Next, load the old calibration results previously saved in Calib_Results.mat by clicking on Load:





    Additional functions included in the calibration toolbox:

  • Computation of extrinsic parameters only: Download an additional image of the same calibration grid: Image_ext.tif.



    Notice that this image was not used in the main calibration procedure. The goal of this exercise is to compute the extrinsic parameters attached to this image given the intrinsic camera parameters previously computed.
    Click on Comp. Extrinsic in the Camera calibration tool, successively enter the image name without extension (Image_ext) and the image type (tif), and extract the grid corners (following the same procedure as previously presented - remember: the first clicked point is the origin of the pattern reference frame). The extrinsic parameters (3D location of the grid in the camera reference frame) are then computed. The main matlab window should look like:



    The extrinsic parameters are encoded in the form of a rotation matrix (Rc_ext) and a translation vector (Tc_ext). The rotation vector omc_ext is related to the rotation matrix (Rc_ext) through the Rodrigues formula: Rc_ext = rodrigues(omc_ext).
    Let us give the exact definition of the extrinsic parameters:
    Let P be a point in space with coordinate vector XX = [X;Y;Z] in the grid reference frame (O,X,Y,Z) shown on the following figure:



    Let XXc = [Xc;Yc;Zc] be the coordinate vector of P in the camera reference frame (Oc,Xc,Yc,Zc).
    Then XX and XXc are related to each other through the following rigid motion equation:

    XXc = Rc_ext * XX + Tc_ext


    In addition to the rigid motion transformation parameters, the coordinates of the grid points in the grid reference frame are also stored in the matrix X_ext. Observe that the variables Rc_ext, Tc_ext, omc_ext and X_ext are not automatically saved into any matlab file.
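
    A short sketch tying these variables together at the prompt (rodrigues is part of the toolbox; X_ext is assumed here to be the 3xN matrix of grid-frame coordinates mentioned above):

        Rc_ext = rodrigues(omc_ext);                               % 3x3 rotation matrix from the rotation vector
        XXc = Rc_ext * X_ext + Tc_ext * ones(1, size(X_ext,2));    % grid points expressed in the camera frame
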

  • Undistort images: This function helps you generate the undistorted version of one or multiple images given pre-computed intrinsic camera parameters.
    As an exercise, let us undistort Image20.tif.
    Click on Undistort image in the Camera calibration tool.



    Enter 1 to select an individual image, and successively enter the image name without extension (Image20) and the image type (tif). The main matlab window should look like this:



    The initial image is stored in the matrix I, and displayed in figure 2:



    The undistorted image is stored in the matrix I2, and displayed in figure 3:



    The new undistorted image (I2) is also saved on disk under Image20_rect.tif.

    Let us now undistort the complete set of calibration images. Click on Undistort image, and enter an empty argument to the first question. All the calibration images are then undistorted and saved onto disk under Image_rect1.tif, Image_rect2.tif, ..., Image_rect25.tif:



  • Export calibration data to other formats (Willson-Heikkilä and Zhang): This function lets you export the calibration data (extracted image corners + associated 3D world coordinates) to Willson-Heikkilä or Zhang formats. This may be useful for comparison purposes (if you want to run other people's calibration engines on the same data). This function may be used just after the corner extraction phase. Click on Export calib data in the main toolbox window.



    Enter 0 to select the data format used by Willson and Heikkilä, and enter the basename of the data files (shot). The calibration data of each image is then saved to individual files shot1, shot2, ..., shot25:



    Let us now export the data under Zhang's format. Click on Export calib data, and enter 1 to select that new export format. Then enter two file basenames: one for the 3D rig coordinates (Model) and one for the image coordinates (data). The program then creates a set of text files (Model1.txt, data1.txt, ..., Model25.txt, data25.txt) that can be read by Zhang's code. After export, your matlab window should look like:




Back to main calibration page
