Publication | IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2021

BRepNet

A topological message passing system for solid models

This paper introduces a new approach to building neural networks that understand solid models. It details a convolution scheme which takes full advantage of the topological information in the solid, and a benchmark segmentation dataset generated from designs submitted to the Autodesk Online Gallery by users of the Fusion 360 CAD application.


Abstract

BRepNet: A topological message passing system for solid models

Joseph G. Lambourne, Karl D.D. Willis, Pradeep Kumar Jayaraman, Aditya Sanghi, Peter Meltzer, Hooman Shayani

IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2021

Boundary representation (B-rep) models are the standard way 3D shapes are described in Computer-Aided Design (CAD) applications. They combine lightweight parametric curves and surfaces with topological information which connects the geometric entities to describe manifolds. In this paper we introduce BRepNet, a neural network architecture designed to operate directly on B-rep data structures, avoiding the need to approximate the model as meshes or point clouds. BRepNet defines convolutional kernels with respect to oriented coedges in the data structure. In the neighborhood of each coedge, a small collection of faces, edges and coedges can be identified, and patterns in the feature vectors from these entities detected by specific learnable parameters. In addition, to encourage further deep learning research with B-reps, we publish the Fusion 360 Gallery segmentation dataset: a collection of over 35,000 B-rep models annotated with information about the modeling operations which created each face. We demonstrate that BRepNet can segment these models with higher accuracy than methods working on meshes and point clouds.
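To make the coedge-centric convolution concrete, below is a minimal PyTorch sketch of one such message-passing step. This is not the authors' implementation: the slot counts, tensor names, and index layout are assumptions made for illustration. It shows only the general pattern the abstract describes, where each coedge gathers feature vectors from a fixed set of topologically nearby faces, edges, and coedges, and shared learnable weights act as the convolutional kernel over that neighborhood.

```python
# Hypothetical sketch of a BRepNet-style coedge convolution.
# The real kernels are defined by topological walks in the paper;
# here the neighborhoods are supplied as precomputed index tensors.
import torch
import torch.nn as nn


class CoedgeConv(nn.Module):
    """One message-passing step centered on oriented coedges.

    For every coedge, features from a fixed pattern of nearby entities
    (e.g. the coedge itself, its mate, and the next/previous coedges in
    the loop, plus the faces and edges they border) are concatenated
    and passed through a shared MLP, so the learnable weights play the
    role of a convolutional kernel over the B-rep topology.
    """

    def __init__(self, face_dim, edge_dim, coedge_dim, out_dim,
                 n_coedge_slots=4, n_face_slots=2, n_edge_slots=2):
        super().__init__()
        in_dim = (n_coedge_slots * coedge_dim
                  + n_face_slots * face_dim
                  + n_edge_slots * edge_dim)
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, out_dim),
            nn.ReLU(),
            nn.Linear(out_dim, out_dim),
        )

    def forward(self, face_feat, edge_feat, coedge_feat,
                coedge_idx, face_idx, edge_idx):
        # Each *_idx tensor has shape [num_coedges, n_slots] and lists
        # the entities reached from each coedge by fixed topological
        # walks. Indexing then flattening yields one long feature
        # vector per coedge.
        gathered = torch.cat(
            [
                coedge_feat[coedge_idx].flatten(1),  # nearby coedges
                face_feat[face_idx].flatten(1),      # their faces
                edge_feat[edge_idx].flatten(1),      # their edges
            ],
            dim=1,
        )
        return self.mlp(gathered)  # new per-coedge features
```

Because the walk pattern is fixed across the whole model, the same weights are applied at every coedge, which is what makes the operation convolution-like even though the underlying structure is a topology graph rather than a pixel grid.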
