Publication | International Conference on Machine Learning and Applications 2022

SimCURL

Simple Contrastive User Representation Learning from Command Sequences

SimCURL learns user representations from a large corpus of unlabeled command sequences. These learned representations are then transferred to multiple downstream tasks that have only limited labels available.

This paper is an effort towards user modeling based on raw command sequences from Fusion 360. Proper encoding of commands is crucial for understanding user behavior and building intelligent software. In SimCURL, we propose a method for learning representations of these command sequences.

Download publication

Abstract

Hang Chu, Amir Khasahmadi, Karl D.D. Willis, Fraser Anderson, Yaoli Mao, Linh Tran, Justin Matejka, Jo Vermeulen

International Conference on Machine Learning and Applications 2022

User modeling is crucial to understanding user behavior and essential for improving user experience and personalized recommendations. When users interact with software, vast amounts of command sequences are generated through logging and analytics systems. These command sequences contain clues to the users’ goals and intents. However, these data modalities are highly unstructured and unlabeled, making it difficult for standard predictive systems to learn from. We propose SimCURL, a simple yet effective contrastive self-supervised deep learning framework that learns user representation from unlabeled command sequences. Our method introduces a user-session network architecture, as well as session dropout as a novel way of data augmentation. We train and evaluate our method on a real-world command sequence dataset of more than half a billion commands. Our method shows significant improvement over existing methods when the learned representation is transferred to downstream tasks such as experience and expertise classification.
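The two core ideas in the abstract, session dropout as data augmentation and contrastive learning on the resulting views, can be illustrated with a short sketch. This is not the paper's implementation: the encoder, hyperparameters, and command vocabulary below are hypothetical placeholders, and the loss shown is the standard SimCLR-style NT-Xent objective that the contrastive framing implies.

```python
import numpy as np

rng = np.random.default_rng(0)

def session_dropout(sessions, drop_prob=0.3, rng=rng):
    """Augmentation sketch: randomly drop whole sessions from a user's
    command history, keeping at least one session so a view is never empty."""
    kept = [s for s in sessions if rng.random() > drop_prob]
    return kept if kept else [sessions[rng.integers(len(sessions))]]

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style NT-Xent loss over two batches of embeddings, where
    (z1[i], z2[i]) are two augmented views of the same user."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    z = np.concatenate([z1, z2], axis=0)            # (2N, d) stacked views
    sim = z @ z.T / temperature                     # pairwise similarities
    np.fill_diagonal(sim, -np.inf)                  # exclude self-similarity
    n = len(z1)
    # each view's positive is the other view of the same user
    targets = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), targets].mean()

# Toy command history for one user (command names are illustrative only)
sessions = [["SKETCH", "EXTRUDE"], ["FILLET", "CHAMFER"], ["SKETCH", "REVOLVE"]]
view_a = session_dropout(sessions)
view_b = session_dropout(sessions)
```

In training, each view would be passed through the user-session encoder to produce the embeddings `z1` and `z2`; pulling the two views of the same user together while pushing apart views of different users is what lets the representation be learned without labels.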

Related Resources

Publication

2023

BOP-Elites: A Bayesian Optimisation Approach to Quality Diversity Search with Black-Box descriptor functions

An algorithm that efficiently tackles expensive black-box optimization…

Publication

2023

WorldSmith: Iterative and Expressive Prompting for World Building with a Generative AI

Using multi-modal generative AI to quickly and iteratively visualize…

Publication

2021

LSD-StructureNet: Modeling Levels of Structural Detail in 3D Part Hierarchies

Generative models for 3D shapes represented by hierarchies of parts…

Publication

2022

Evolving Through the Looking Glass: Learning Improved Search Spaces with Variational Autoencoders.

Nature has spent billions of years perfecting our genetic…
