Publication

XLB: A Differentiable Massively Parallel Lattice Boltzmann Library in Python

Mohammadmehdi Ataei, Hesam Salehipour

Abstract

The lattice Boltzmann method (LBM) has emerged as a prominent technique for solving fluid dynamics problems due to its algorithmic potential for computational scalability. We introduce XLB, a Python-based differentiable LBM library built on the JAX platform. The architecture of XLB is designed for accessibility, extensibility, and computational performance, enabling it to scale effectively across CPU, TPU, multi-GPU, and distributed multi-GPU or TPU systems. The library can be readily augmented with novel boundary conditions, collision models, or multi-physics simulation capabilities. XLB’s differentiability and its compatibility with the extensive JAX-based machine learning ecosystem enable it to address physics-based machine learning, optimization, and inverse problems. XLB has been successfully scaled to handle simulations with billions of cells, achieving giga-scale lattice updates per second.

XLB is released under the permissive Apache-2.0 license and is available on GitHub.
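To illustrate the kind of workflow that JAX-native differentiability enables, the sketch below writes a minimal D2Q9 BGK collide-and-stream step directly in JAX and differentiates a toy loss with respect to the relaxation parameter. This is a hedged, self-contained example under simplifying assumptions (periodic boundaries, single relaxation time); the names equilibrium, step, and loss are hypothetical stand-ins and do not reflect XLB's actual API.

    # Minimal sketch (not XLB's API): a differentiable D2Q9 BGK lattice Boltzmann step in JAX.
    import jax
    import jax.numpy as jnp

    # D2Q9 discrete velocities and weights
    velocities = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
                  (1, 1), (-1, 1), (-1, -1), (1, -1)]
    c = jnp.array(velocities, dtype=jnp.float32)       # (9, 2)
    w = jnp.array([4/9] + [1/9] * 4 + [1/36] * 4)      # (9,)

    def equilibrium(rho, u):
        # Standard second-order equilibrium with c_s^2 = 1/3
        cu = jnp.einsum("qd,xyd->xyq", c, u)
        usq = jnp.sum(u ** 2, axis=-1, keepdims=True)
        return w * rho[..., None] * (1 + 3 * cu + 4.5 * cu ** 2 - 1.5 * usq)

    def step(f, omega):
        # Macroscopic moments
        rho = jnp.sum(f, axis=-1)
        u = jnp.einsum("xyq,qd->xyd", f, c) / rho[..., None]
        # BGK collision
        f = f + omega * (equilibrium(rho, u) - f)
        # Streaming with periodic boundaries
        return jnp.stack(
            [jnp.roll(f[..., q], velocities[q], axis=(0, 1)) for q in range(9)],
            axis=-1)

    def loss(omega, f0, n_steps=10):
        # Unrolled time stepping; a toy objective on the final density field
        f = f0
        for _ in range(n_steps):
            f = step(f, omega)
        rho = jnp.sum(f, axis=-1)
        return jnp.mean((rho - 1.0) ** 2)

    nx = ny = 32
    rho0 = 1.0 + 0.01 * jax.random.normal(jax.random.PRNGKey(0), (nx, ny))
    f0 = equilibrium(rho0, jnp.zeros((nx, ny, 2)))
    print(jax.grad(loss)(1.7, f0))   # d(loss)/d(omega) via autodiff

Because the update is ordinary JAX code, the same pattern extends, in principle, to optimizing collision parameters, boundary conditions, or upstream neural-network components end to end, which is the class of physics-based machine learning and inverse problems the abstract refers to.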

Related Resources

Publication, 2024
TimeTunnel Live: Recording and Editing Character Motion in Virtual Reality
An animation authoring interface for recording and editing motion in…

Publication, 2023
Peek-At-You: An Awareness, Navigation, and View Sharing System for Remote Collaborative Content Creation
Remote work improved by collaborative features such as conversational…

Publication, 2021
RXMesh: A GPU Mesh Data Structure
We propose a new static high-performance mesh data structure for…

Publication, 2020
Memory-Based Graph Networks
Graph neural networks (GNNs) are a class of deep models that operate…
