Colorized 360 SLAM image alignment

When creating an RGB colorized dataset, my pictures are not aligning properly. Parking space lines are broken and misaligned, surfaces have the wrong colors, the side of a building will look like grass, etc. I have calibrated the camera several times but I'm still having these issues. Is there something I am doing wrong? Is there a way to fix alignment in post-processing?

I don't have answers, but I'm curious whether this was ever solved. I aligned my camera once and it didn't take; I haven't tried again since.

My camera is calibrated now, but it took a few tries. I still get some strange colors on some surfaces, but I think that is just going to happen in some instances; I hope this will improve as the AI gets better. I found a nice three-story building with lots of windows with white trim to do the alignments, and I focused on the corners of the windows. Choose 20 point sets spread across the view: top, bottom, left, and right. Use Extrinsic, not Auto Extrinsic; that gave me the best results. The process can be a little confusing: align 20 sets with the front image and 20 sets with the back image, then save the calibration and click Next. There is a bug, so it won't show you the visualization until you exit. Then check the visualization; if it doesn't look good, do it again. You'll get it, and when you do it makes a big difference.
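For anyone curious what the calibration step is actually doing with those 20 point sets: it is essentially solving a perspective-n-point problem to recover the camera's extrinsic pose (rotation and translation) relative to the scanner, which is why well-spread correspondences like window corners help so much. The sketch below is not the vendor's tool, just a minimal illustration using OpenCV's `solvePnP` on made-up data; the intrinsics `K`, the 3D points, and the noise model are all hypothetical.

```python
# Minimal sketch (not the vendor's workflow): estimating a camera-to-scanner
# extrinsic from picked correspondences, the same idea as aligning ~20 point
# sets on window corners. All data here is synthetic and for illustration only.
import numpy as np
import cv2

# Hypothetical pinhole intrinsics for one camera head.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume lens distortion is already corrected

# "Ground truth" extrinsic, used only to fabricate example data.
rvec_true = np.array([0.05, -0.02, 0.01])
tvec_true = np.array([0.10, -0.05, 0.02])

# 20 well-spread 3D points from the point cloud (e.g. window corners
# near the top, bottom, left, and right of the field of view).
rng = np.random.default_rng(0)
pts_3d = rng.uniform([-5, -3, 4], [5, 3, 12], size=(20, 3))

# Their clicked pixel locations in the camera image, with some click noise.
pts_2d, _ = cv2.projectPoints(pts_3d, rvec_true, tvec_true, K, dist)
pts_2d = pts_2d.reshape(-1, 2) + rng.normal(0.0, 0.5, (20, 2))

# Solve for the extrinsic (rotation + translation) from the correspondences.
ok, rvec, tvec = cv2.solvePnP(pts_3d, pts_2d, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)

# Check quality the same way you would eyeball the visualization:
# reproject the 3D points and look at the pixel error.
reproj, _ = cv2.projectPoints(pts_3d, rvec, tvec, K, dist)
err = np.linalg.norm(reproj.reshape(-1, 2) - pts_2d, axis=1)
print("mean reprojection error (px):", err.mean())
```

Large reprojection errors are what show up later as broken parking lines or grass-colored walls, which is why redoing the alignment until the visualization looks clean pays off.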


Do you have to do this each time you use the SLAM doc with the RGB camera?