Performing Incremental Bayesian Inference by Dynamic Model Counting
Abstract

The ability to update the structure of a Bayesian network when new data becomes available is crucial for building adaptive systems. Recent work by Sang, Beame, and Kautz (AAAI 2005) demonstrates that the well-known Davis-Putnam procedure, combined with a dynamic decomposition and caching technique, is an effective method for exact inference in Bayesian networks with high density and width. In this paper, we define dynamic model counting and extend the dynamic decomposition and caching technique to multiple runs on a series of problems with similar structure. This allows us to perform Bayesian inference incrementally as the structure of the network changes. Experimental results show that our approach yields significant improvements over previous model counting approaches on multiple challenging Bayesian network instances.
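The decomposition-and-caching idea the abstract builds on can be illustrated with a small sketch. The following Python model counter is not the authors' implementation (their work builds on the Cachet counter); it is a minimal, assumption-laden illustration of the core technique: DPLL-style branching on a CNF formula, splitting the residual formula into variable-disjoint components whose counts multiply, and caching component counts so repeated sub-formulas are counted once. Persisting such a cache across a series of structurally similar formulas is the spirit of the dynamic model counting the paper defines.

```python
def count_models(clauses, n_vars):
    """Count satisfying assignments (#SAT) of a CNF over variables 1..n_vars.
    Clauses are iterables of signed ints (e.g. [1, -2] means x1 or not-x2).
    Uses DPLL branching + component decomposition + caching."""
    cache = {}

    def assign(cls, lit):
        # Set lit true: drop satisfied clauses, strip the falsified literal.
        return [c - {-lit} for c in cls if lit not in c]

    def split(cls):
        # Partition clauses into connected components linked by shared variables.
        cls, comps = list(cls), []
        while cls:
            comp = [cls.pop()]
            vs = {abs(l) for l in comp[0]}
            grew = True
            while grew:
                grew, rest = False, []
                for c in cls:
                    if {abs(l) for l in c} & vs:
                        comp.append(c)
                        vs |= {abs(l) for l in c}
                        grew = True
                    else:
                        rest.append(c)
                cls = rest
            comps.append((frozenset(comp), vs))
        return comps

    def solve(cls):
        # Count assignments over exactly the variables occurring in cls.
        if any(not c for c in cls):
            return 0          # empty clause: unsatisfiable branch
        if not cls:
            return 1
        key = frozenset(cls)
        if key in cache:
            return key and cache[key]
        total = 1
        for comp, vs in split(cls):   # disjoint components: counts multiply
            v = next(iter(vs))
            sub = 0
            for lit in (v, -v):       # branch on both polarities of v
                reduced = assign(comp, lit)
                sub_vars = {abs(l) for c in reduced for l in c}
                # Variables that vanished are unconstrained: factor of 2 each.
                free = len(vs) - 1 - len(sub_vars)
                sub += (2 ** free) * solve(frozenset(reduced))
            total *= sub
        cache[key] = total
        return total

    cls = frozenset(frozenset(c) for c in clauses)
    used = {abs(l) for c in cls for l in c}
    return (2 ** (n_vars - len(used))) * solve(cls)
```

For example, `count_models([[1, 2]], 2)` returns 3, since three of the four assignments satisfy x1 or x2. Reducing Bayesian inference to such a counter (as in the AAAI 2005 work) encodes the network and evidence as a weighted CNF, so incremental structure changes map to incremental changes in the formula.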