Publication

Unsupervised Image to Sequence Translation with Canvas-Drawer Networks

Abstract

Kevin Frans, Chin-Yi Cheng

Encoding images as a series of high-level constructs, such as brush strokes or discrete shapes, can often be key to both human and machine understanding. In many cases, however, data is only available in pixel form. We present a method for generating images directly in a high-level domain (e.g. brush strokes), without the need for real pairwise data. Specifically, we train a "canvas" network to imitate the mapping of high-level constructs to pixels, followed by a high-level "drawing" network which is optimized through this mapping towards solving a desired image recreation or translation task. We successfully discover sequential vector representations of symbols, large sketches, and 3D objects, utilizing only pixel data. We display applications of our method in image segmentation, and present several ablation studies comparing various configurations.
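The two-stage scheme in the abstract can be sketched as follows. This is a minimal illustrative toy, not the authors' implementation: the names (CanvasNet/DrawerNet roles, render_strokes), the tiny MLP architectures, and the stroke parameterization are all assumptions, and the "renderer" is stubbed with a fixed linear map so the example is self-contained.

```python
# Hedged sketch of canvas-drawer training: (1) a differentiable "canvas"
# network learns to imitate a stroke-to-pixel renderer; (2) with the canvas
# frozen, a "drawer" network is optimized through it to reconstruct images.
import torch
import torch.nn as nn

IMG = 16 * 16      # flattened pixel canvas
STROKES = 5 * 4    # e.g. 5 strokes x (x, y, radius, intensity) -- assumed layout

def render_strokes(strokes):
    """Stand-in for a real (possibly non-differentiable) renderer.
    Here: a fixed random linear map from stroke params to pixels."""
    g = torch.Generator().manual_seed(0)
    W = torch.randn(STROKES, IMG, generator=g)
    return torch.tanh(strokes @ W)

canvas = nn.Sequential(nn.Linear(STROKES, 64), nn.ReLU(), nn.Linear(64, IMG))
drawer = nn.Sequential(nn.Linear(IMG, 64), nn.ReLU(), nn.Linear(64, STROKES))

# Phase 1: train the canvas network to imitate the renderer on random strokes.
opt = torch.optim.Adam(canvas.parameters(), lr=1e-2)
for _ in range(200):
    s = torch.randn(32, STROKES)
    loss = ((canvas(s) - render_strokes(s)) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Phase 2: freeze the canvas; optimize the drawer end-to-end so that
# rendering its predicted strokes reconstructs the input image. Note the
# "dataset" is pixels only -- no ground-truth stroke labels are used here.
for p in canvas.parameters():
    p.requires_grad_(False)
opt = torch.optim.Adam(drawer.parameters(), lr=1e-2)
imgs = render_strokes(torch.randn(64, STROKES))
start = ((canvas(drawer(imgs)) - imgs) ** 2).mean().item()
for _ in range(200):
    loss = ((canvas(drawer(imgs)) - imgs) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
end = ((canvas(drawer(imgs)) - imgs) ** 2).mean().item()
```

The key design point is that the learned canvas supplies gradients the true renderer may not, so the drawer can be trained on pixel-only data.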

Associated Researchers

Chin-Yi Cheng

Autodesk Research

Kevin Frans

Massachusetts Institute of Technology

Related Resources

Publication

2024

Task-Centric Application Switching: How and Why Knowledge Workers Switch Software Applications for a Single Task

This research studies task-centric application switching and…

Publication

2022

T-Domino: Exploring Multiple Criteria with Quality-Diversity and the Tournament Dominance Objective

A new ranking system for Multi-Criteria Exploration (MCX) that uses…

Publication

2013

The Method of Cyclic Intrepid Projections: Convergence Analysis and Numerical Experiments

The convex feasibility problem asks to find a point in the…

Publication

1993

Turbulent Wind Fields for Gaseous Phenomena

The realistic depiction of smoke, steam, mist and water reacting to a…
