Empir3D: Multi-Dimensional Point Cloud Quality Assessment

Center for Embodied Autonomy and Robotics (CEAR)
University at Buffalo

Abstract

Advancements in sensors, algorithms, and compute hardware have made real-time 3D perception feasible. Current methods for comparing and evaluating the quality of a 3D model, such as the Chamfer, Hausdorff, and Earth-mover's distances, are uni-dimensional and have limitations: they cannot capture coverage or local variations in density and error, and they are significantly affected by outliers. In this paper, we propose an evaluation framework for point clouds (Empir3D) that consists of four metrics: resolution (Qr) to quantify the ability to distinguish between individual parts of the point cloud, accuracy (Qa) to measure registration error, coverage (Qc) to evaluate the portion of missing data, and artifact-score (Qt) to characterize the presence of artifacts. Through detailed analysis, we demonstrate the complementary nature of these dimensions and the improvement they provide over the uni-dimensional measures highlighted above. Further, we demonstrate the utility of Empir3D by comparing our metrics with the uni-dimensional metrics on two 3D perception applications (SLAM and point cloud completion). We believe that Empir3D advances our ability to reason about differences between point clouds and helps better debug 3D perception applications by providing a richer evaluation of their performance. Our implementation of Empir3D, custom real-world datasets, evaluation on learning methods, and detailed documentation on how to integrate the pipeline will be made available upon publication.
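For reference, the uni-dimensional baselines the abstract contrasts against (the Chamfer distance Dc and Hausdorff distance Dh) can be sketched in a few lines. This is a minimal NumPy/SciPy illustration of the standard definitions, not the paper's implementation:

```python
import numpy as np
from scipy.spatial import cKDTree

def chamfer_distance(p, q):
    """Symmetric Chamfer distance (Dc): mean nearest-neighbor
    distance from p to q plus the mean from q to p."""
    d_pq, _ = cKDTree(q).query(p)  # nearest neighbor in q for each point of p
    d_qp, _ = cKDTree(p).query(q)  # and vice versa
    return d_pq.mean() + d_qp.mean()

def hausdorff_distance(p, q):
    """Symmetric Hausdorff distance (Dh): the worst-case
    nearest-neighbor distance between the two clouds."""
    d_pq, _ = cKDTree(q).query(p)
    d_qp, _ = cKDTree(p).query(q)
    return max(d_pq.max(), d_qp.max())
```

Both collapse all spatial structure into a single scalar, which is why they cannot separate resolution, accuracy, coverage, and artifacts the way the four Empir3D metrics do.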

Results

Evaluation on the Warehouse dataset (simulated); (a), (b), (c), (d) show the ground-truth point cloud and reconstructions from LeGO-LOAM, FAST-LIO2, and SHINE. (d) exhibits the highest Qr, Qa, and Qc, as shown in the zoomed-in view, but contains artifacts (lower Qt than (c)). Empir3D accurately quantifies all aspects of quality, while Dc and Dh identify (c) and (d) as the most similar.
Evaluation on point cloud completion: a lamp from the MVP dataset reconstructed using PCN, TOPNET, and ECG. ECG generates the highest-quality point clouds, which the Empir3D metrics confirm, while Dc and Dh favor PCN despite its artifacts and poor reconstruction.
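The outlier sensitivity that lets Dc and Dh favor a reconstruction with artifacts can be seen in a toy example (synthetic data, not from MVP): a single stray artifact point inflates the Hausdorff distance by orders of magnitude even when the rest of the reconstruction is faithful.

```python
import numpy as np
from scipy.spatial import cKDTree

def hausdorff(p, q):
    """Symmetric Hausdorff distance between two point clouds."""
    return max(cKDTree(q).query(p)[0].max(), cKDTree(p).query(q)[0].max())

rng = np.random.default_rng(0)
clean = rng.uniform(0, 1, size=(1000, 3))           # reference cloud in a unit cube
recon = clean + rng.normal(0, 0.005, clean.shape)   # faithful reconstruction

h_clean = hausdorff(clean, recon)
# One stray artifact point far from the surface dominates the metric:
recon_outlier = np.vstack([recon, [[10.0, 10.0, 10.0]]])
h_outlier = hausdorff(clean, recon_outlier)
assert h_outlier > 10 * h_clean
```

A per-dimension score such as Qt can flag the artifact separately instead of letting it swamp a single scalar.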

Compute Performance

Comparative analysis of memory, CPU utilization, and computation time across different region sizes (r) and processor core counts. Memory utilization remains relatively constant across core counts, whereas CPU utilization increases with more cores, especially for smaller region sizes, before dropping off once regions become too large to benefit from multi-threading. Computation time decreases significantly as region size increases up to a point, beyond which the benefits plateau, as shown in benchmarks of Empir3D on 1, 2, 4, and 8 cores against the baseline single-threaded implementations of Dc and Dh.
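The region-based parallelism described above can be sketched as follows. This is a minimal illustration, not the benchmarked implementation: the region size r, the cell-hashing scheme, and the per-region statistic are assumptions, and a thread pool stands in for the multi-core setup.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def split_into_regions(points, r):
    """Bucket an Nx3 cloud into axis-aligned cubic cells of side r."""
    keys = np.floor(points / r).astype(int)
    cells = {}
    for key, pt in zip(map(tuple, keys), points):
        cells.setdefault(key, []).append(pt)
    return [np.asarray(v) for v in cells.values()]

def evaluate_regions(points, r, per_region_fn, workers=4):
    """Apply a per-region quality statistic to each cell in parallel."""
    regions = split_into_regions(points, r)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(per_region_fn, regions))
```

The plateau in the benchmarks follows from this structure: larger r means fewer cells, so beyond a point there are not enough regions left to keep all workers busy.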

Real-Time Change Detection

Davis Dataset

We introduce Davis, an indoor dataset for evaluating point cloud quality assessment algorithms. First, we capture ground-truth poses using a robotic total station [47], essentially a theodolite with an integrated distance meter that measures distances and angles. This enables extremely precise, millimeter-level pose estimation and is widely used in infrastructure and geospatial surveying [48], [49]. Second, we capture LiDAR scans at the exact locations where the poses are measured: the LiDAR is mounted on a custom tripod equipped with a nadir-pointed laser that projects a cross-hair onto carefully placed markers on the ground to align the scans. Over 400 poses and corresponding scans were captured for the Davis dataset with this method. Finally, the scans are stitched into a dense point cloud of the environment, with the recorded poses providing the initial alignment and ICP used for fine registration.
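The stitching step can be sketched as follows. This is a minimal NumPy illustration assuming each recorded pose is a 4x4 homogeneous sensor-to-world transform; the ICP fine-registration pass that refines each pose is omitted:

```python
import numpy as np

def transform_scan(scan, pose):
    """Apply a 4x4 homogeneous pose (sensor -> world) to an Nx3 scan."""
    homo = np.hstack([scan, np.ones((len(scan), 1))])
    return (homo @ pose.T)[:, :3]

def stitch(scans, poses):
    """Concatenate scans after moving each into the world frame.
    The total-station poses give the initial alignment; a fine
    registration step (e.g. ICP) would refine each pose first."""
    return np.vstack([transform_scan(s, T) for s, T in zip(scans, poses)])
```

With millimeter-level initial poses, ICP only needs to correct small residual misalignments, which keeps it well inside its convergence basin.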

BibTeX

@inproceedings{turkar2024e3d,
  title={Empir3D: Multi-Dimensional Point Cloud Quality Assessment},
  author={Turkar, Yash and Meshram, Pranay and Aluckal, Christo and Adhivarahan, Charuvahan and Dantu, Karthik},
  year={2024}
}