Publication

Robust Representation Learning via Perceptual Similarity Metrics

Abstract

A fundamental challenge in artificial intelligence is learning useful representations of data that yield good performance on a downstream task, without overfitting to spurious input features. Extracting such task-relevant predictive information is particularly difficult for real-world datasets. In this work, we propose Contrastive Input Morphing (CIM), a representation learning framework that learns input-space transformations of the data to mitigate the effect of irrelevant input features on downstream performance. Our method leverages a perceptual similarity metric via a triplet loss to ensure that the transformation preserves task-relevant information.

Empirically, we demonstrate the efficacy of our approach on tasks which typically suffer from the presence of spurious correlations: classification with nuisance information, out-of-distribution generalization, and preservation of subgroup accuracies. We additionally show that CIM is complementary to other mutual information-based representation learning techniques, and demonstrate that it improves the performance of variational information bottleneck (VIB) when used together.
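The recipe described in the abstract, learning an input-space transformation whose output is trained with a downstream classification loss plus a triplet loss over a perceptual similarity metric, can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the TransformNet architecture, the loss weight lam, the margin, and the stand-in perceptual distance (an L2 over features of a frozen convolution, used here in place of the paper's perceptual similarity metric) are all hypothetical.

```python
# Hedged sketch of a CIM-style training step (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransformNet(nn.Module):
    """Learned input-space transformation T(x); hypothetical architecture."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, channels, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

def perceptual_distance(a, b, feat):
    """Stand-in perceptual distance: per-sample L2 between frozen conv features."""
    return (feat(a) - feat(b)).flatten(1).pow(2).mean(dim=1)

def cim_step(x_anchor, x_pos, x_neg, y, T, classifier, feat, opt,
             margin=1.0, lam=1.0):
    """One training step: classify the morphed input T(x) and, via a triplet
    hinge on perceptual distances, keep T(x) closer to a same-class example
    (x_pos) than to a different-class example (x_neg)."""
    opt.zero_grad()
    t = T(x_anchor)
    cls_loss = F.cross_entropy(classifier(t), y)
    d_pos = perceptual_distance(t, x_pos, feat)
    d_neg = perceptual_distance(t, x_neg, feat)
    triplet = F.relu(d_pos - d_neg + margin).mean()
    loss = cls_loss + lam * triplet
    loss.backward()
    opt.step()
    return loss.item()

# Example wiring (shapes and networks are placeholders):
# T = TransformNet(); classifier = nn.Sequential(nn.Flatten(), nn.Linear(3*32*32, 10))
# feat = nn.Conv2d(3, 8, 3, padding=1).requires_grad_(False)
# opt = torch.optim.Adam(list(T.parameters()) + list(classifier.parameters()), lr=1e-3)
```

In this sketch the triplet term plays the role the abstract assigns to the perceptual similarity metric: it discourages the transformation from discarding task-relevant content while the classification loss drives the downstream objective.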

Download publication

