Unpaired Neural Implicit Shape Translation Network

Abstract

We introduce UNIST, the first deep neural implicit model for general-purpose, unpaired shape-to-shape translation, in both 2D and 3D domains. Our model is built on autoencoding implicit fields, rather than point clouds, which represent the state of the art. Furthermore, our translation network is trained to perform the task over a latent grid representation, which combines the merits of both latent-space processing and position awareness, not only enabling drastic shape transforms but also preserving spatial features and fine local details for natural shape translations. With the same network architecture and dictated only by the input domain pairs, our model can learn both style-preserving content alteration and content-preserving style transfer. We demonstrate the generality and quality of the translation results, and compare them to well-known baselines. Code is available at
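To make the latent-grid idea concrete, here is a minimal, illustrative sketch (not the paper's implementation; all names, dimensions, and the toy linear decoder are hypothetical) of how an implicit decoder can predict occupancy from a spatial grid of latent codes: a query point bilinearly interpolates the grid, so nearby points read nearby cells, which is the "position awareness" the abstract refers to.

```python
import numpy as np

GRID = 8   # latent grid resolution per axis (2D example)
DIM = 16   # latent code dimensionality

rng = np.random.default_rng(0)
latent_grid = rng.normal(size=(GRID, GRID, DIM))  # stand-in for an encoder output
W = rng.normal(size=(DIM + 2, 1)) * 0.1           # toy linear "decoder" weights

def sample_latent(grid, xy):
    """Bilinearly interpolate the latent grid at a point xy in [0, 1]^2."""
    g = np.clip(np.asarray(xy, dtype=float) * (GRID - 1), 0, GRID - 1)
    x0, y0 = np.floor(g).astype(int)
    x1, y1 = min(x0 + 1, GRID - 1), min(y0 + 1, GRID - 1)
    tx, ty = g[0] - x0, g[1] - y0
    return ((1 - tx) * (1 - ty) * grid[x0, y0]
            + tx * (1 - ty) * grid[x1, y0]
            + (1 - tx) * ty * grid[x0, y1]
            + tx * ty * grid[x1, y1])

def occupancy(grid, xy):
    """Implicit decoder: interpolated latent code + query coords -> occupancy in (0, 1)."""
    feat = np.concatenate([sample_latent(grid, xy), np.asarray(xy, dtype=float)])
    return 1.0 / (1.0 + np.exp(-(feat @ W)[0]))  # sigmoid

val = occupancy(latent_grid, (0.3, 0.7))
print(0.0 < val < 1.0)
```

In the full model, a translator network would map one domain's latent grid to another's before decoding; the grid structure is what lets that translation preserve local spatial detail.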


