Learning Hierarchical Adaptive Code Clouds for Neural 3D Shape Representation
Abstract
Neural implicit representations (NIRs) have attracted significant attention for 3D shape representation owing to their efficiency, generalizability, and flexibility compared with traditional explicit representations. Previous works usually parameterize shapes with neural feature grids or volumes, which are inefficient because their latent codes are constrained to discrete positions. Although recent advances make it possible to optimize continuous positions for the latent codes, these methods still lack the self-adaptability needed to represent diverse shapes well. In this paper, we introduce a hierarchical adaptive code cloud (HACC) model that achieves an accurate and compact implicit 3D shape representation. Specifically, we assign each latent code an adaptive influence field and a dynamic position, both optimizable during training, and propose an adaptive aggregation function that fuses the contributions of candidate latent codes with respect to each query point. These basic modules are then stacked hierarchically with gradually narrowing influence-field thresholds, which heuristically forces higher levels to focus on capturing finer structures. Together, these formulations greatly improve the distribution and effectiveness of local latent codes and reconstruct shapes from coarse to fine with high accuracy. Extensive qualitative and quantitative evaluations on both single-shape reconstruction and large-scale dataset representation demonstrate the superiority of our method over state-of-the-art approaches.
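To make the aggregation mechanism described above concrete, the following is a minimal PyTorch sketch, not the paper's implementation: the class name CodeCloudLayer, the linear distance-based weighting, and the specific radii are illustrative assumptions; the abstract only states that code positions and influence fields are optimizable and that levels use gradually narrowing influence-field thresholds.

```python
import math
import torch
import torch.nn as nn

class CodeCloudLayer(nn.Module):
    """One level of a code cloud: latent codes with trainable positions
    and per-code influence radii (all names here are illustrative)."""

    def __init__(self, num_codes: int, code_dim: int, init_radius: float):
        super().__init__()
        self.codes = nn.Parameter(0.01 * torch.randn(num_codes, code_dim))
        self.positions = nn.Parameter(2 * torch.rand(num_codes, 3) - 1)
        self.log_radius = nn.Parameter(
            torch.full((num_codes,), math.log(init_radius)))

    def forward(self, queries: torch.Tensor) -> torch.Tensor:
        # queries: (Q, 3) -> fused latent per query: (Q, code_dim)
        dist = torch.cdist(queries, self.positions)      # (Q, N)
        radius = self.log_radius.exp().unsqueeze(0)      # (1, N)
        # A code contributes only inside its influence field; its weight
        # decays with distance relative to the learned radius.
        w = torch.relu(1.0 - dist / radius)              # zero outside field
        w = w / w.sum(dim=1, keepdim=True).clamp_min(1e-8)
        return w @ self.codes                            # adaptive aggregation

# Hierarchy sketch: stacking levels with narrowing influence-field
# thresholds pushes higher levels toward finer structures (coarse-to-fine).
levels = [CodeCloudLayer(256, 64, r) for r in (0.5, 0.25, 0.1)]
queries = 2 * torch.rand(1024, 3) - 1
fused = sum(level(queries) for level in levels)          # (1024, 64)
```

The fused per-query latent would then be decoded (e.g., by an MLP) into an implicit field value; that decoding step is omitted here since the abstract does not specify it.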