Publication

CLIP-Forge

Towards Zero-Shot Text-to-Shape Generation

We propose a zero-shot text-to-shape generation method named CLIP-Forge. Without training on any paired shape-text labels, our method generates meaningful shapes that correctly reflect common names, (sub-)categories, and semantic attributes.

Generating shapes using natural language can enable new ways of imagining and creating the things around us. While significant recent progress has been made in text-to-image generation, text-to-shape generation remains a challenging problem due to the unavailability of paired text and shape data at a large scale. We present a simple yet effective method for zero-shot text-to-shape generation that circumvents such data scarcity. Our proposed method, named CLIP-Forge, is based on a two-stage training process, which only depends on an unlabeled shape dataset and a pre-trained image-text network such as CLIP. Our method has the benefits of avoiding expensive inference time optimization, as well as the ability to generate multiple shapes for a given text. We not only demonstrate promising zero-shot generalization of the CLIP-Forge model qualitatively and quantitatively, but also provide extensive comparative evaluations to better understand its behavior.
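To make the two-stage recipe concrete, below is a minimal PyTorch sketch of the idea. It is an illustration under assumed choices, not the authors' exact architecture: a 32³ voxel representation, MLP encoder/decoder, a single affine coupling layer standing in for the stacked conditional normalizing flow, and 512-dimensional CLIP embeddings. The tensors clip_image_emb and clip_text_emb are assumed to come from a frozen CLIP model applied to renderings of the training shapes and to the text prompt, respectively.

```python
# Hedged sketch of a CLIP-Forge-style pipeline; sizes and layers are illustrative.
import math
import torch
import torch.nn as nn

LATENT, COND = 128, 512  # assumed shape-latent and CLIP-embedding sizes


class ShapeAutoencoder(nn.Module):
    """Stage 1: learn a latent space from unlabeled 32^3 voxel grids."""

    def __init__(self, vox=32):
        super().__init__()
        d = vox ** 3
        self.enc = nn.Sequential(nn.Flatten(), nn.Linear(d, 1024), nn.ReLU(),
                                 nn.Linear(1024, LATENT))
        self.dec = nn.Sequential(nn.Linear(LATENT, 1024), nn.ReLU(),
                                 nn.Linear(1024, d), nn.Sigmoid())

    def forward(self, voxels):                      # voxels: (B, 32, 32, 32)
        z = self.enc(voxels)
        return self.dec(z), z                       # reconstruction, latent


class ConditionalCoupling(nn.Module):
    """Stage 2 building block: one affine coupling layer whose scale and shift
    are predicted from half of the latent plus the conditioning vector.
    A full flow would stack several such layers; one is used here for brevity."""

    def __init__(self, hidden=256):
        super().__init__()
        self.half = LATENT // 2
        self.net = nn.Sequential(nn.Linear(self.half + COND, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (LATENT - self.half)))

    def forward(self, z, c):                        # shape latent -> base noise
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s, t = self.net(torch.cat([z1, c], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)                           # keep scales bounded
        return torch.cat([z1, z2 * torch.exp(s) + t], dim=1), s.sum(dim=1)

    def inverse(self, u, c):                        # base noise -> shape latent
        u1, u2 = u[:, :self.half], u[:, self.half:]
        s, t = self.net(torch.cat([u1, c], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)
        return torch.cat([u1, (u2 - t) * torch.exp(-s)], dim=1)


def flow_nll(coupling, z, clip_image_emb):
    """Stage 2 loss: negative log-likelihood of shape latents under a standard
    normal base, conditioned on CLIP *image* embeddings of shape renderings."""
    u, logdet = coupling(z, clip_image_emb)
    log_base = -0.5 * (u ** 2).sum(dim=1) - 0.5 * LATENT * math.log(2 * math.pi)
    return -(log_base + logdet).mean()


@torch.no_grad()
def text_to_shape(decoder, coupling, clip_text_emb, n_samples=4):
    """Zero-shot inference: substitute the CLIP *text* embedding of a prompt
    (shape (1, COND)), sample the base distribution, invert the flow, decode.
    No per-prompt optimization; each sample yields a different shape."""
    c = clip_text_emb.expand(n_samples, -1)
    u = torch.randn(n_samples, LATENT)
    z = coupling.inverse(u, c)
    return decoder(z)                               # (n_samples, 32^3) occupancies
```

In this sketch, the first stage trains only the autoencoder with a reconstruction loss on unlabeled voxel grids; the second stage freezes it (and CLIP) and fits the coupling layer with flow_nll on shape latents paired with CLIP image embeddings of the corresponding renderings. Because CLIP places image and text embeddings in a shared space, the same flow can be conditioned on a text embedding at inference time, which is what makes the generation zero-shot and lets multiple samples produce multiple shapes for one prompt.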

This paper was presented at the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2022.

The dataset for this paper is available from Autodesk AI Lab on GitHub.

Download publication

Related Resources

Publication

2022

Learning Dense Reward with Temporal Variant Self-Supervision

Rewards play an essential role in reinforcement learning for robotic…

Publication

2012

Programming and Controlling Self-Folding Robots

This paper describes a robot in the form of a self-folding sheet that…

Project

2013

Multiscale Interaction

This project investigates the properties and qualities of multiscale…

Project

2022

3D User Interfaces: Human Experience in 3D Environments

Designing user interfaces for interacting with 3D data involves a…
