Intelligence at the Intersection of Design, Making, and the Physical World
Autodesk Researchers and Residency Program Teams share their thoughts on 2026
As we kick off 2026, the most meaningful advances in artificial intelligence are not emerging from a single breakthrough or model architecture, but from convergence. Across AI research, design, manufacturing, and the built environment, boundaries between modalities, disciplines, and workflows are dissolving. We asked our Researchers and some members of the Autodesk Research Residency program what they see as possible shifts or futures in the coming year.
Multimodal Intelligence as Shared Infrastructure
Multimodal AI is transitioning from research novelty to foundational capability. While recent years brought compelling demonstrations – text-to-image generation, 2D-to-3D translation, and language-guided creation – 2026 marks a turning point toward sustained, real-world impact. As Mengliu Zhao, Machine Learning Engineering Manager, notes, “future systems will develop in two directions: lighter, compact architectures that scale on higher-quality data for known tasks, and larger, hierarchical, multi-stage architectures that scale across diverse data sources and tasks.”
What unites these approaches is a growing emphasis on cross-modal understanding. Zhao points to the human-like ability of “manually sketching on paper and imagining it as a 3D object or verbally describing a scene and visualizing it in motion” as a model for how AI systems are beginning to reason across visual, spatial, and textual information. This capability is foundational not just for creative applications, but for robotics, manufacturing, and the built environment.
From Generative Models to Embedded Design Partners
As generative AI matures, its role in design workflows is shifting from producing isolated outputs to supporting ongoing collaboration. CAD-based generative models are increasingly embedded in real production environments, where they can ingest richer context like design intent, constraints, and downstream implications. Daniele Grandi, Principal Research Scientist, says that as adoption increases, “we will see incremental improvements in the way we interact with these models,” reflecting a move toward more natural, iterative forms of human-AI interaction.
Progress in this space remains tightly coupled to data. Academic and industry researchers continue to advance open-source text-to-CAD models, often working within the limits of scarce training datasets. According to Grandi, “the publication of new design datasets and design benchmarks” is critical, not only to improve model performance, but to address the design biases that humans bring into the process. Across research domains, the quality, diversity, and structure of data remain central to responsible and effective AI.
Embodied Intelligence and Physical Grounding
Across robotics and manufacturing, a parallel shift is underway: intelligence must be grounded in the physical world. Hui Li, Senior Principal Research Scientist, highlights that data scarcity remains a central bottleneck in robotics, noting that “real-world data is slow and costly to collect, while simulation data often struggles with the notorious reality gap.”
To address this, the field is moving away from purely end-to-end learning toward hybrid approaches that combine learning with classical engineering methods. By blending “good old engineering – planning, optimization, and control – with modern learning,” Li explains, robots can “leverage structure and physics to dramatically reduce data requirements. In parallel, we’re seeing a surge of work that learns not from carefully teleoperated demonstrations, but directly from human videos.”
Manufacturing AI reflects the same emphasis on physical grounding. Tanmay Aggarwal, Founder and CEO of Lambda Function, an Autodesk Research Residency team, says that intelligent systems must be “grounded in real machines, tools, materials, and physics—not just data abstractions.” World Models allow AI to reason about cause and effect across machining strategies, machine behavior, and outcomes, while NeuroSymbolic AI integrates learned behavior with explicit domain knowledge. These approaches help make AI systems more explainable, trustworthy, and practical in production settings.
Collaboration as a Core Research Principle
Across research communities, autonomy is increasingly balanced by collaboration as a guiding design principle. Athena Moore, Manager, Community Engagement Strategy, points to growing interest in human-AI collaborative frameworks, particularly in manufacturing and animation. Understanding “human behaviors in areas like machining pre-production and biomechanics,” she notes, is essential to optimizing how people and intelligent systems interact in physical-digital workflows.
This emphasis on collaboration resonates strongly in manufacturing, where Lambda Function’s Aggarwal says that the opportunity ahead “isn’t replacing engineers or CNC programmers,” but developing AI systems that work alongside them, accelerating learning, reducing variability, and helping scale expertise. In these contexts, AI delivers the most value when it augments human capability rather than attempting to abstract it away.
Converging Creative, Engineering, and Industrial Workflows
As agentic AI systems mature, traditional boundaries between creative, engineering, and manufacturing workflows continue to blur. Tools are increasingly able to coordinate autonomously, passing intent and constraints across domains. Matthew Spremulli, Strategic Program Manager, observes that creatives now expect AI systems that “respect ownership of their data and offer simple, transparent paths to train models aligned with their values – and based on the ability to Bring Your Own Data.”
At the same time, expectations for control are rising across all disciplines. According to Spremulli, creatives will expect “richer multimodal control to shape generative systems with greater precision and fidelity to their vision. With agentic AI now enabling tools to work together autonomously, we anticipate a new era of convergence; one where boundaries between creative, engineering, and manufacturing workflows blur, unlocking novel forms of expression and production.”
Data Integration at the Scale of the Built Environment
Xin Yang, former Autodesk Research Scientist Intern, anticipates that “with the increased adoption of AI, the next critical step for the AECO industry is a commitment to holistic data sharing and vertical integration.” When stakeholders connect their data streams, Yang explains, “AI can move from being scattered tools to a unifying force.” He adds that “data-driven AI integration will be the catalyst for the next leap in safety, productivity, and quality assurance within the built world.”
From Optimization to Regeneration
Sustainability is increasingly embedded directly into design and engineering workflows. Pamela Conrad, Founder and Executive Director of Climate Positive Design, an Autodesk Research Residency team, describes how climate intelligence is becoming “a core design metric rather than a ‘nice to have.’” Research-driven technologies such as life cycle assessment and scenario modeling allow designers to measure carbon, ecosystem, and land-use impacts early, when decisions still have leverage.
When research, technology, and design align, Conrad notes, the built environment can “move beyond reducing harm to actively restoring ecological and social systems.” This reframes AI not just as an optimization tool, but as a catalyst for regenerative outcomes.
Intelligence as a Connected System
Taken together, these perspectives point to a future where intelligence is defined less by individual models and more by how systems connect across data, disciplines, physical reality, and human values. In 2026, progress will be measured not by autonomy alone, but by integration, collaboration, and real-world impact.
For Autodesk Research, this reinforces a core mission: advancing AI and computational methods that help people imagine, design, and make a better world, responsibly, collaboratively, and at scale.
Get in touch
Have we piqued your interest? Get in touch if you’d like to learn more about Autodesk Research, our projects, people, and potential collaboration opportunities.