Non-local Scan Consolidation for 3D Urban Scene (Patent pending)

ACM Transactions on Graphics (Proceedings of SIGGRAPH 2010)

Qian Zheng1    Andrei Sharf1    Guowei Wan2,1    Yangyan Li1   Niloy J. Mitra3    Baoquan Chen1    Daniel Cohen-Or4  
1SIAT, Chinese Academy of Sciences, China
2National University of Defense Technology, China
3Indian Institute of Technology Delhi, India
4Tel Aviv University, Israel

     


(a) ground truth image    (b) point cloud    (c) consolidated point cloud

Figure 1: Consolidating a LIDAR-captured 3D urban scene containing noise and missing regions.

Abstract


Recent advances in scanning technologies, in particular devices that extract depth through active sensing, allow fast scanning of urban scenes. Such rapid acquisition incurs imperfections: large regions remain missing, significant variation in sampling density is common, and the data is often corrupted with noise and outliers. However, buildings often exhibit large-scale repetitions and self-similarities. Detecting, extracting, and utilizing such large-scale repetitions provides a powerful means to consolidate the imperfect data. Our key observation is that the same geometry, scanned multiple times over recurring instances, allows the application of a simple and effective non-local filter. The multiple copies of the geometry are fused together and projected onto a base geometry defined by clustering corresponding surfaces. Denoising is applied by separating the process into off-plane and in-plane phases. We show that consolidating the recurrences provides robust denoising and allows reliable completion of missing parts. We present evaluation results of the algorithm on several LIDAR scans of buildings of varying complexity and style.
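
To make this pipeline concrete, here is a minimal sketch (in Python/NumPy, not the authors' code) of the fusion and off-plane denoising steps, assuming that repeated instances have already been detected and that rigid transforms (R, t) aligning each instance to a shared canonical frame are available; the clustering of corresponding surfaces and the weighted-median in-plane phase described in the paper are omitted.

import numpy as np

def consolidate_instances(instances, transforms):
    # instances: list of (N_i, 3) point arrays, one scanned copy per repetition.
    # transforms: list of (R, t) pairs (3x3 rotation, 3-vector translation)
    # mapping each copy into the shared canonical frame.
    # 1. Superpose all copies of the repeated geometry in one frame.
    fused = np.vstack([pts @ R.T + t for pts, (R, t) in zip(instances, transforms)])
    # 2. Fit a base plane to the fused samples: centroid plus least-variance direction.
    centroid = fused.mean(axis=0)
    _, _, vt = np.linalg.svd(fused - centroid, full_matrices=False)
    normal = vt[-1]
    # 3. Off-plane denoising: remove each sample's offset along the plane normal.
    offsets = (fused - centroid) @ normal
    return fused - np.outer(offsets, normal)

Snapping samples exactly onto a single fitted plane is a simplification made here for brevity; the actual method operates per clustered base surface and follows the off-plane step with in-plane filtering.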
     

Paper

89M PDF | 7M PDF
     

Slides

35M slides
     

Results

     

Figure 2: Input and consolidation results on two buildings of different styles, where data quality degrades progressively with height. In each case, we show the input point set, the consolidated data, and zooms of corresponding sections before and after consolidation.

     

Figure 3: Captured walls with high detail and large noise (left) are consolidated (right), while our weighted-median in-plane consolidation preserves the fine detail (bottom zooms). In the presence of high noise and large missing parts, we falsely detect additional lines.
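
For reference, a weighted median such as the one mentioned in the caption above can be computed generically as follows. This is a sketch only: the specific weights (e.g., based on local sampling density or distance to the base geometry) and the in-plane parameterization of the samples are defined in the paper, not here.

import numpy as np

def weighted_median(values, weights):
    # Lower weighted median: the smallest value whose cumulative weight
    # reaches half of the total weight.
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    order = np.argsort(values)
    values, weights = values[order], weights[order]
    cumulative = np.cumsum(weights)
    return values[np.searchsorted(cumulative, 0.5 * cumulative[-1])]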

     

Figure 4: We consolidate a large urban scene, containing cylinders as a repetitive component.

     

Figure 5: Consolidating a LIDAR-captured 3D building containing noise and missing regions. (Left) Repeated parts are detected and colored. (Right) Result of non-local filtering and consolidation of the repeated parts.

Testing Data Sets

wall-group fat-building tall-building sangtai cylinder

Data Format:
Each line that does not start with "#" contains four values: X, Y, Z, and L. X, Y, and Z are the 3D point coordinates; L is the label indicating which repetition group the point belongs to (label 0 means the point does not belong to any repetition group).
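
A minimal loader for this format might look as follows (a hypothetical helper, not part of any released code); it skips comment lines beginning with "#" and returns the coordinates together with the repetition-group labels.

import numpy as np

def load_labeled_points(path):
    # Returns an (N, 3) array of XYZ coordinates and an (N,) array of
    # repetition-group labels; label 0 means the point belongs to no group.
    points, labels = [], []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            x, y, z, label = line.split()[:4]
            points.append([float(x), float(y), float(z)])
            labels.append(int(float(label)))
    return np.asarray(points), np.asarray(labels, dtype=int)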

     

Acknowledgements

We thank Shachar Fleishman for his thoughtful comments and the anonymous reviewers for their valuable suggestions. This work was supported in part by the National Natural Science Foundation of China (60902104), the National High-tech R&D Program of China (2009AA01Z302), the CAS Visiting Professorship for Senior International Scientists, the CAS Fellowship for Young International Scientists, and the Shenzhen Science and Technology Foundation (GJ200807210013A). Niloy was partially supported by a Microsoft Outstanding Young Faculty Fellowship.

     

BibTeX

@ARTICLE{Nola2010,
  title = {Non-local Scan Consolidation for 3D Urban Scene},
  author = {Qian Zheng and Andrei Sharf and Guowei Wan and Yangyan Li and Niloy J. Mitra and Baoquan Chen and Daniel Cohen-Or},
  journal = {ACM Transactions on Graphics (Proceedings of SIGGRAPH 2010)},
  volume = {29},
  number = {4},
  pages = {Article 94},
  year = {2010}
}