Calibrate the R2A RGB Camera

Here is a walkthrough of how to calibrate the camera on your ROCK R2A.

Here's a quick video tutorial on some of the elements below.




Table of Contents

  1. Introduction
  2. Camera Calibration
  3. Camera Image Alignment Parameters
  4. Verifying Camera Calibration
  5. Storing the Computed Values

1.  Introduction

The R2A Payload consists of a LiDAR, an INS, and a camera. The camera must be aligned with the Inertial Measurement Unit (IMU) of the INS in order to generate an accurately colorized point cloud from the LiDAR data. The INS data is also used to generate a georeferenced trajectory.

The camera alignment process is often referred to as "calibrating" and is done in your PCPainter software. Your R2A unit arrives tested and calibrated from the factory. Over time, with prolonged and regular use, or after a bump, drop, or crash, your unit may benefit from additional alignment. Typically, if your unit is cared for properly, the default values will remain true for the life of your R2A. Camera calibration is accomplished by adjusting certain camera parameters to align the images with the point cloud.

CAUTION: Please be sure to store your original factory camera calibration values (AKA the Boresight dataset) somewhere safe in case you need to revert to them. Failure to retain this information may result in losing your factory settings and needing additional (paid) support to recover them.

2.  Camera Calibration

Camera calibration is performed on a scan of an area containing a known structure such as a pole, building, or car. Before scanning the calibration pattern, the following convergence maneuvers must be performed to ensure good observability of the IMU heading:

  1. Fly straight forward, for a duration of at least 5 seconds, at a speed of at least 5 m/sec
  2. Left and right turns in motion, at least 90 degrees each
  3. Scan calibration pattern (2-3 figure eights)
  4. Left and right turns in motion, at least 90 degrees each
  5. Fly straight forward, for a duration of at least 5 seconds, at a speed of at least 5 m/sec

Below is an example trajectory of a proper camera calibration flight path.

Boresight Flight 1

3.  Camera Image Alignment Parameters

All cameras have the following misalignment parameters that are available in PCPainter:

  • Angular Offsets
    • Yaw offset between the IMU and the Camera
    • Pitch offset between the IMU and the Camera
    • Roll offset between the IMU and the Camera
  • Camera Calibration
    • Focal Length
    • Pixel flatness
    • DistortNum
    • DistortDen
    • DeltaX
    • DeltaY
    • VignetteNum
    • VignetteDen
    • Red
    • Green
    • Blue
    • Saturation


Note: Most camera calibration parameters do not need to be changed from their default values. Parameters that are rarely changed will not be covered in this walkthrough.

Some cameras may have additional calibration parameters, such as distortion density, that are specific to that camera model.

Each alignment parameter has its own unique effect on image positions in relation to a structure.

When all alignment parameters are correct, the image fits the size of the structure properly, and all the edges of the image align with the edges of the structure.
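To build intuition for how the angular offsets and focal length move the projected image, the relationship can be sketched with a simple pinhole-camera model. This is a hedged illustration only: the axis conventions, function names, and numbers below are assumptions for the sketch, not PCPainter's actual implementation.

```python
import math

def rotation_matrix(yaw_deg, pitch_deg, roll_deg):
    """Compose yaw (Z), pitch (Y), and roll (X) offsets into one rotation matrix."""
    y, p, r = (math.radians(a) for a in (yaw_deg, pitch_deg, roll_deg))
    cy, sy = math.cos(y), math.sin(y)
    cp, sp = math.cos(p), math.sin(p)
    cr, sr = math.cos(r), math.sin(r)
    # Z * Y * X composition
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp, cp * sr, cp * cr],
    ]

def apply(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return tuple(sum(R[i][j] * v[j] for j in range(3)) for i in range(3))

def project(point_cam, focal_len_px):
    """Pinhole projection; camera frame here: x right, y down, z forward (optical axis)."""
    x, y, z = point_cam
    return (focal_len_px * x / z, focal_len_px * y / z)

# An illustrative LiDAR point: 10 m along the optical axis, 1 m right, 2 m down.
point = (1.0, 2.0, 10.0)

aligned = project(apply(rotation_matrix(0.0, 0.0, 0.0), point), 1000.0)
yaw_error = project(apply(rotation_matrix(2.0, 0.0, 0.0), point), 1000.0)

print(aligned)    # no boresight error
print(yaw_error)  # a 2-degree yaw error (about the optical axis) spins the image in-plane
```

Even a two-degree error moves this point by several pixels, which is why small boresight offsets produce the visible color "spill" shown in the comparison images that follow.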

In this walkthrough, we will examine the effect of angular offsets and key camera calibration parameters on the image below. The dark rectangle is the position and orientation of the camera in flight that was projected onto the point cloud. The white lines are provided as a frame of reference in describing the effect of changing offsets on the position of the image.


When a structure like a house is used for camera calibration, the most important areas to note are the apexes and edges of the roof.

Correct Pitch Offset

Correct Pitch Offset

Incorrect Pitch Offset

Incorrect Pitch Offset

The pitch offset affects the displacement of the image in the side-to-side direction. In the misaligned image, the edges of the building begin to take on the color of the grass beneath it; these edges are circled in red. As a result, the coloring of the building spills into the grass in the areas circled in blue.

Correct Roll Offset

Correct Roll Offset

Incorrect Roll Offset

Incorrect Roll Offset

The roll offset of the camera affects the movement of the image in the forward and backward direction. This is evident from the shift in colorization in the images. With an incorrect roll offset, the house takes on the color of the grass at the near edge and the right edge of the roof from this point of view. In addition, the colorization of the far edge and left edge of the roof spills into the grass.

The yaw offset will cause the image to rotate. Below is an extreme example of yaw misalignment.

Yaw Misalignment

It is evident that the image is extremely rotated in the counterclockwise direction. This example shows the effect of yaw offset on the image projection over the cloud. Such an extreme inaccuracy in yaw offset is quite unlikely, and users will rarely encounter a situation like this. To see the effect of smaller yaw offset inaccuracies, we will use more closely zoomed images focused on the apexes of the roof.

Below is the effect of yaw offset on the largest apex of the roof which ran horizontally in the previous images.

Correct Yaw Offset

Correct Yaw Offset

Incorrect Yaw Offset

Incorrect Yaw Offset

It takes a much smaller yaw offset error to affect the colorization at the apex of the roof. When using a building for camera calibration, the apexes of the roof are a good place to check yaw offset accuracy. In this case, the yaw offset is correct when the dark lines on each side of the roof run straight, since it is known that this roofline is straight.

The last major parameter for camera calibration is focal length. The focal length determines the size of the projected image. Below is a comparison of images with correct and incorrect focal lengths.

Correct Focal Length

Correct Focal Length

Incorrect Focal Length

Incorrect Focal Length

An incorrect focal length changes the size of the projected image. Here, the edges of the roof have taken on the color of the grass, indicating that the projected image is too small.
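The size effect follows directly from the pinhole model: projected size scales linearly with focal length. The numbers below are illustrative assumptions, not R2A specifications:

```python
def projected_width_px(focal_len_px, object_width_m, range_m):
    """Pinhole model: projected size is proportional to focal length."""
    return focal_len_px * object_width_m / range_m

roof_width = 12.0   # assumed roof width in meters
flight_range = 40.0 # assumed camera-to-roof range in meters

print(projected_width_px(1000.0, roof_width, flight_range))  # correct focal length -> 300.0 px
print(projected_width_px(950.0, roof_width, flight_range))   # focal length too short -> 285.0 px
```

A focal length that is only 5% too short shrinks the projected roof by 15 pixels in this sketch, enough for the roof edges to pick up the surrounding grass color.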

4.  Verifying Camera Calibration

Once you are satisfied with the position and orientation of the image projected on the structure, there are a few ways to verify your calibration values.

One way to verify camera calibration is to inspect a powerline, if one is present. Look for any distortion of the image or any incorrect colorization of the pole and the ground around the pole; these are key indicators of misalignment. Below is an image detailing the difference between a well-calibrated powerline and one that is misaligned.

Correct Alignment

Correct Alignment

Incorrect Alignment

Incorrect Alignment

In this example, the pitch offset is misaligned. As a result, the colorization of the powerline was projected over the pole instead of the powerline itself.

Another way to check your calibration is to examine cars captured within the point cloud. The windshields of cars do not reflect the LiDAR, so you can check whether the windshield lacks colorization and whether any points around the car are incorrectly colored. A comparison of correct and incorrect colorization of cars is shown below.


Correct Boresighting


Incorrect Boresighting

The incorrectly calibrated point cloud has an incorrect roll offset: the windshield of the car is colorized, as indicated by the red circle.

Finally, you can review camera calibration outcomes against the lines of a road. Here we examine the double yellow lines on a road. First, right-click on a rectangle representing the camera position and orientation that will project an image on the road, and select "switch to camera here" as shown below.

Road Projection Switch to Camera View Here

Place the cursor between the lines of the road. In this example, we will place our cursor between the yellow lines as shown below.

Cursor on Yellow Lines

Zoom out from the point cloud without moving the cursor and then select a new camera orientation that will project an image on the road as shown.

Zoom Out Camera View Here

After selecting the new image, zoom in to where your cursor remains. If the cursor is still on the same portion of the road where you intended to leave it, that is a good indicator of accurate camera calibration. Here we see that the cursor is still directly between the two yellow lines, just where we left it.

Zoom In Check Camera View

5.  Storing the Computed Values

Saving your newly computed camera calibration values in the PCPainter Project file allows you to load these values onto your R2A as the new standard using the web interface. They will then be recorded in the camera image files on the device when you are capturing data, and when you create a new PCPainter Project, these values will be loaded there automatically.

To store the desired values:

1. Open the web interface

2. Go to Settings -> Camera

3. Click “Read from PCPainter project”

4. Browse for the PCPainter Project file and select it

The computed values will now be extracted from the file and stored in the LiDAR Payload's non-volatile memory.