Learning Part-based Templates from Large Collections of 3D Shapes

Abstract

As large repositories of 3D shapes continue to grow, understanding the data, especially encoding inter-model similarity and variations, is of central importance. For example, many data-driven approaches now rely on access to semantic segmentation information, accurate inter-model point-to-point correspondences, and deformation models that characterize the model collections. Existing approaches, however, are either supervised, requiring manual labeling, or employ super-linear matching algorithms, and are thus unsuited for analyzing large collections spanning many thousands of models. We propose an automatic algorithm that starts with an initial template model and then jointly optimizes for part segmentation, point-to-point surface correspondence, and a compact deformation model to best explain the input collection. As output, the algorithm produces a set of probabilistic part-based templates that groups the original models into clusters capturing their styles and variations. We evaluate our algorithm on several standard datasets and demonstrate its scalability by analyzing much larger collections of up to thousands of shapes.
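The core idea of alternating between segmenting each shape against the current template and updating the template's deformation parameters can be illustrated with a deliberately simplified toy sketch. This is not the paper's actual method; the function name, the representation of template parts as 3D centers, and the centroid-based "deformation" update are all assumptions made for this example:

```python
import numpy as np

def fit_template(points, part_centers, iters=10):
    """Toy alternating optimization: assign points to their nearest
    template part (segmentation), then move each part to the centroid
    of its assigned points (a crude stand-in for a deformation model)."""
    centers = part_centers.astype(float).copy()
    for _ in range(iters):
        # Segmentation step: label each point with its nearest part.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Deformation step: update each part to best explain its points.
        for k in range(len(centers)):
            mask = labels == k
            if mask.any():
                centers[k] = points[mask].mean(axis=0)
    # Final segmentation and mean fitting error under the updated template.
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    err = d[np.arange(len(points)), labels].mean()
    return labels, centers, err
```

In the actual system, the fitting error could then drive clustering: shapes that a template explains poorly would seed new templates, yielding the set of part-based templates described above.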


     Vladimir G. Kim, Wilmot Li, Niloy J. Mitra, Siddhartha Chaudhuri, Stephen DiVerdi, and Thomas Funkhouser
     SIGGRAPH 2013

Paper: high-res (12.7 MB), low-res (3.6 MB)
BibTex: bib
Notes: read

Code and Data

      →  Download code
      →  Download executables
      →  All Shape Correspondence Projects
      →  Download data
      →  Download matlab figures for correspondence benchmark (Figure 10)