PointMask: Towards Interpretable and Bias-Resilient Point Cloud Processing

Abstract

Deep classifiers tend to associate a few discriminative input variables with their objective function, which, in turn, may hurt their generalization capabilities. To address this, one can design systematic experiments and/or inspect the models via interpretability methods. In this paper, we investigate both of these strategies on deep models operating on point clouds. We propose PointMask, a model-agnostic interpretable information-bottleneck approach for attribution in point cloud models. PointMask encourages exploring the majority of variation factors in the input space while gradually converging to a general solution. More specifically, PointMask introduces a regularization term that minimizes the mutual information between the input and the latent features used to mask out irrelevant variables. We show that coupling a PointMask layer with an arbitrary model can discern the points in the input space that contribute the most to the prediction score, thereby leading to interpretability. Through designed bias experiments, we also show that, thanks to its gradual masking feature, our proposed method is effective in handling data bias.
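The idea of a masking layer that attenuates irrelevant points while a regularizer keeps the mask sparse can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the scoring weights `w`, the sigmoid gating, and the mean-mask penalty (used here as a simple stand-in for the mutual-information regularization term the abstract describes) are all assumptions for the sake of the sketch.

```python
import numpy as np

def pointmask_forward(points, w, temperature=1.0):
    """Toy PointMask-style masking layer (illustrative sketch only).

    points: (N, 3) point cloud; w: (3,) weights of a stand-in linear scorer
    (the real method learns a scorer jointly with the downstream model).
    Returns the masked points, the per-point soft mask, and a sparsity
    penalty standing in for the mutual-information regularizer.
    """
    scores = points @ w                                   # per-point relevance score
    mask = 1.0 / (1.0 + np.exp(-scores / temperature))    # soft mask in (0, 1)
    masked = points * mask[:, None]                       # attenuate low-scoring points
    penalty = mask.mean()                                 # small mean mask => most points masked out
    return masked, mask, penalty
```

Because the mask is soft and differentiable, points are suppressed gradually during training rather than hard-dropped, which is the property the abstract credits for bias resilience; a hard top-k selection would lose that gradient signal.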

Related Resources




An Experimental Evaluation of Transparent User Interface Tools and Information

The central research issue addressed by this paper is how we can…



A Design Environment for the Rapid Specification and Fabrication of Printable Robots

In this work, we have developed a design environment to allow casual…



Neural Implicit Style-Net: synthesizing shapes in a preferred style exploiting self supervision

We introduce a novel approach to disentangle style from content in the…



Safe Self-Supervised Learning of Insertion Skills in the Real World using Tactile and Visual Sensing

Exploring how to safely train robots to perform industrial…
