Conference on Neural Information Processing Systems 2023

CAD-LLM

Large Language Model for CAD Generation

FIGURE 1: Example of a CAD sketch. The left is the incomplete prefix sketch; the right is the full sketch.

TABLE 1: Performance comparison of different models. The input prefix for all three models is randomly selected between 20% and 80% of the full sketch. Here ChatGPT is fine-tuned on a 15% portion of the data. For all metrics, higher is better. The best results are shown in bold. More results are in the supplementary.

Abstract

Parametric Computer-Aided Design (CAD) is the dominant paradigm for modern mechanical design. Training generative models to reason about and generate parametric CAD can dramatically speed up design workflows. Pre-trained foundation models have shown great success in natural language processing and computer vision. The cross-domain knowledge embedded in these models holds significant potential for understanding geometry and performing complex reasoning about design. In this work, we develop generative models for CAD by leveraging pre-trained language models and apply them to manipulate engineering sketches. Our results demonstrate that models pre-trained on natural language can be fine-tuned on engineering sketches and achieve remarkable performance in various CAD generation scenarios.
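To make the setup concrete, the sketch below illustrates one plausible way to treat sketch completion as language modeling: serialize sketch primitives as text and prompt a pre-trained causal LM with a partial sketch. This is an assumption-laden example, not the paper's pipeline; the model choice (`gpt2`), the serialization format, and the `serialize_sketch` helper are all hypothetical.

```python
# Minimal sketch, assuming primitives are serialized as plain text and a
# Hugging Face causal LM is used for completion. Not the paper's actual
# method or data format.
from transformers import AutoModelForCausalLM, AutoTokenizer

def serialize_sketch(primitives):
    """Flatten (type, params) primitives into a token-friendly string,
    e.g. [("line", (0, 0, 4, 0))] -> "<line> 0 0 4 0"."""
    return " ".join(
        f"<{kind}> " + " ".join(str(v) for v in params)
        for kind, params in primitives
    )

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Prefix: the first few primitives of an unfinished sketch.
prefix = serialize_sketch([("line", (0, 0, 4, 0)), ("line", (4, 0, 4, 4))])
inputs = tokenizer(prefix, return_tensors="pt")

# After fine-tuning on serialized sketches, the model would autocomplete the
# remaining primitives; with the off-the-shelf checkpoint this only shows the API.
completion = model.generate(**inputs, max_new_tokens=32, do_sample=True, top_p=0.9)
print(tokenizer.decode(completion[0], skip_special_tokens=True))
```

In practice, fine-tuning such a model on a corpus of serialized sketches (rather than using the off-the-shelf checkpoint) is what would let it produce valid continuations of a prefix sketch.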

