Datasets:
Camera parameters and depth data format
Hi, thanks so much for creating this dataset! I'm currently working on satellite tracking, and this is super helpful. I'd just like to check what camera parameters are assumed for this data.
- For the RGB images, is it safe to assume a monocular pinhole camera with a focal length of 1024/(2·tan(25°))? From what has been published: 1024×1024 pixels and a 50° FOV.
- For the depth images: are these absolute depths? Planar depth or range (perspective) depth? Are they captured by visual sensors (e.g., a stereo camera), or are they ground truth measured directly in the simulator? If from a stereo camera, it would be very helpful to have those camera parameters as well!
- By the way, when I downloaded the dataset, the depth folder appears to contain only .npz files instead of .png files. This doesn't match the description, so I'm not sure whether something went wrong.
Thanks so much again for doing this. Appreciate your reply in advance :)
Hi~ 😀
Thanks for your careful questions and for pointing out the error. I will fix the depth description.
According to our current data collection code, the RGB images can be treated as coming from an ideal monocular pinhole camera. The camera resolution is 1024×1024 with a 50° FOV, so fx = fy = (1024/2)/tan(25°) = 1097.99 px, and cx = cy = 512.
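The intrinsics above follow directly from the resolution and FOV; a quick sketch of the computation (the helper function name is just for illustration):

```python
import math

def pinhole_intrinsics(width: int, height: int, fov_deg: float):
    """Intrinsics for an ideal pinhole camera with square pixels.

    fx = fy = (width / 2) / tan(FOV / 2); principal point at the image center.
    """
    f = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    return f, f, width / 2.0, height / 2.0

fx, fy, cx, cy = pinhole_intrinsics(1024, 1024, 50.0)
print(round(fx, 2), cx)  # 1097.99 512.0
```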
For depth, the data are not produced by a stereo camera. We directly request AirSim’s DepthPerspective output from the simulator, so this should be treated as simulator ground-truth metric depth rather than stereo-estimated depth. In the current pipeline, the returned float depth is clipped at 10000 m, converted to millimeters, and stored on disk.
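The clip-then-convert storage step described above might look like the following sketch. Note the archive key name "depth" and the float32 dtype are assumptions for illustration, not confirmed details of the actual pipeline:

```python
import numpy as np

def save_depth_npz(depth_m: np.ndarray, path: str, max_depth_m: float = 10000.0):
    # Clip the simulator's float depth (meters) at 10000 m, convert to
    # millimeters, and store as a compressed .npz archive.
    # The key name "depth" is an assumption here.
    depth_mm = np.clip(depth_m, 0.0, max_depth_m) * 1000.0
    np.savez_compressed(path, depth=depth_mm.astype(np.float32))
```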
So in short:
- RGB: monocular pinhole camera
- Intrinsics: fx = fy = 1097.99, cx = cy = 512
- Depth: absolute metric depth
- Depth type: DepthPerspective (not DepthPlanar)
- Source: simulator ground truth, not stereo reconstruction
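Since the stored depth is DepthPerspective (per-pixel range along the viewing ray), anyone who needs planar (z-buffer) depth can convert it with the intrinsics above. A minimal sketch, assuming the standard pinhole ray model (this helper is not part of the toolkit):

```python
import numpy as np

def perspective_to_planar(depth_persp: np.ndarray, fx: float, fy: float,
                          cx: float, cy: float) -> np.ndarray:
    """Convert per-pixel range (DepthPerspective) to planar (z-buffer) depth.

    A pixel's range equals its planar depth times the length of the unit-z
    ray ((u - cx)/fx, (v - cy)/fy, 1), so dividing by that length recovers z.
    """
    h, w = depth_persp.shape
    x = ((np.arange(w) - cx) / fx)[None, :]
    y = ((np.arange(h) - cy) / fy)[:, None]
    ray_len = np.sqrt(x**2 + y**2 + 1.0)
    return depth_persp / ray_len
```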
About the file format: the current code saves depth as compressed .npz files, not .png, and I will fix the README description.
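To read a downloaded .npz back into meters, something like this should work. The key name inside the archive is not confirmed here, so this sketch simply reads the first stored array; check `arr.files` on your own download:

```python
import numpy as np

def load_depth_m(path: str) -> np.ndarray:
    # Open the compressed archive and take the first stored array
    # (assumed to be millimeter depth), then convert back to meters.
    arr = np.load(path)
    depth_mm = arr[arr.files[0]]
    return depth_mm.astype(np.float64) / 1000.0
```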
You can visualize the raw data with the following script and find more details in the code:
# https://github.com/wuaodi/SpaceSense-Bench
python SpaceSense-Toolkit/visualize/raw_data_web_visualizer.py --raw-data data_example
Thanks so much for the reply and clarifications!