What’s Next in Robotics Manipulation

Albert Chen

04/05/2023

Without looking (and without my approval), my 2-year-old daughter sometimes sticks her hand into the grocery bag, feels around, and successfully picks the candy bar from the bottom, staying just careful enough not to damage the produce or the egg carton inside.

For humans, the ability to feel around, instinctively recognize an object, grab it, move it, and neither drop nor break it, all while making sure nothing else in the bag is damaged, is a subconscious task: an afterthought. For robots in the logistics and fulfillment industry, it is a billion-dollar problem that we're looking to help solve. At Forcen, when we think about giving robots feelings, we think of touch. This is one of the many force-perception problems we help solve through software-enabled hardware, and the immense complexity of these problems is why we are leveraging Autodesk Research and its Outsight Network community to understand and solve these challenges together.

Robots must be able to move through space and to understand and manipulate their environment. As basic as this might sound, these are incredibly difficult, multi-disciplinary problems. Advancements in motors and actuators, cameras, image-processing algorithms, batteries, processing power, and artificial intelligence have paved an impressive path in motion and understanding. We think manipulation is the next big task.

Translating analog force-torque data into digital data that can drive decisions is an incredibly convoluted problem:

  • How do we capture the force applied to an object?
  • What does the raw signal of picking and moving an object look like? What if it starts slipping?
  • What about a swinging grocery bag – what does its raw signal look like?
  • How do we process that data and use it to adjust a robot’s arm motion?
  • How do you program a robot arm/gripper to do things ‘subconsciously’ through force-feedback?
  • How do you do this in a fraction of a second – like a reflex? (A toy version is sketched below.)

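None of these questions has a single right answer, but a toy version of the last two is easy to sketch. The snippet below (every name and threshold is illustrative, not Forcen's API) watches the ratio of tangential to normal force at a fingertip: once the tangential load eats most of the friction margin, slip is imminent and the gripper has only a few control ticks left to respond.

```python
import numpy as np

MU_EST = 0.4       # assumed friction coefficient between finger and object
SLIP_MARGIN = 0.8  # react when tangential load reaches 80% of the limit

def slipping(f_sensor):
    """Flag incipient slip from a single fingertip force reading.

    f_sensor : 3-vector, with f_sensor[2] the normal (grip) component
               and f_sensor[0:2] the tangential components.
    """
    f_n = max(abs(f_sensor[2]), 1e-6)       # avoid division by zero
    f_t = np.linalg.norm(f_sensor[0:2])
    # Coulomb friction holds the object while f_t <= mu * f_n; flag slip
    # once the tangential load consumes most of that margin.
    return f_t > SLIP_MARGIN * MU_EST * f_n
```

A real reflex would run a check like this at the sensor's full sample rate on an edge processor, so the grip command can update within a millisecond or two instead of waiting on the robot's main controller.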
This first-of-its-kind smart force-torque sensor automatically compensates for inertial changes and is agnostic to the robot arm and its positioning.
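Forcen hasn't published how that compensation works, but the textbook form of inertial compensation is simple to state: subtract the gravitational and inertial load of everything mounted beyond the sensor from the raw reading, leaving only the external contact forces. A minimal sketch under that assumption (variable names are mine; the payload mass and center of mass would come from a calibration routine):

```python
import numpy as np

GRAVITY_W = np.array([0.0, 0.0, -9.81])  # gravity in the world frame (m/s^2)

def compensate(f_raw, t_raw, R_sw, a_w, m, c):
    """Subtract the payload's gravitational and inertial load from a raw
    force-torque reading, leaving the external contact wrench.

    f_raw, t_raw : raw force (N) and torque (N*m) in the sensor frame
    R_sw         : 3x3 rotation taking world-frame vectors into the sensor frame
    a_w          : linear acceleration of the payload in the world frame (m/s^2)
    m, c         : payload mass (kg) and center of mass in the sensor frame (m)
    """
    # Weight plus inertial reaction of the payload, in the sensor frame.
    # Sign conventions vary by sensor; this assumes the raw reading
    # includes m*(g - a) from the payload.
    f_load = R_sw @ (m * (GRAVITY_W - a_w))
    t_load = np.cross(c, f_load)  # the load acts at the payload's center of mass
    return f_raw - f_load, t_raw - t_load
```

Being agnostic to the arm and its positioning then presumably reduces to keeping R_sw and a_w current, which any arm can supply from its own kinematics.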

Reducing the integration complexity of hardware to enable more intelligent software is a necessary condition for industry-wide adoption. The best example of this is the significant growth that machine vision experienced over the last two decades, the direct result of an industry-wide effort to accelerate the development of CMOS/CCD sensors, microprocessors, and software. As a result, developers and integrators can now add machine vision and ML/AI to their projects far more quickly than they could just a few years ago.

It is in this context that Forcen seeks to increase the adoption of subconscious force reflexes in robots without adding developmental complexity. Our approach emphasizes modularity and adaptability to drive customer adoption. Ultimately, users must be able to implement the reflexes easily and focus their development on how a robot should react rather than on how the sensing and reflexes work.

We have already built an elegant and scalable transducer hardware platform. Now we are refining our edge-computing platform, software, and sensor-fusion technology. The first goal is to provide force feedback to the robotic actuator (i.e., robotic graspers) locally, allowing the graspers to adjust grip strength automatically, preventing a package from slipping without damaging it.
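Forcen hasn't published this controller, but the core idea fits in a few lines: when slip is detected (for example by the fingertip test sketched earlier), squeeze harder; when the hold is secure, relax slowly toward the gentlest grip that still holds, capped by a ceiling set by package fragility. A minimal sketch under those assumptions, with made-up force limits:

```python
F_GRIP_MIN = 2.0   # N, enough to hold the lightest expected package
F_GRIP_MAX = 15.0  # N, ceiling chosen so packaging isn't crushed
STEP_UP = 0.5      # N per control tick when slip is detected
STEP_DOWN = 0.05   # N per control tick when the hold is secure

def update_grip(f_cmd, slip_detected):
    """One tick of a local grip-force reflex (illustrative, not Forcen's API)."""
    if slip_detected:
        f_cmd += STEP_UP    # tighten quickly to arrest the slip
    else:
        f_cmd -= STEP_DOWN  # relax slowly toward the gentlest secure grip
    return min(max(f_cmd, F_GRIP_MIN), F_GRIP_MAX)
```

Because the loop touches only the gripper and its fingertip sensing, it can run on an edge processor at the actuator itself; the arm's main controller never has to be in the loop, which is what makes reflex-speed response plausible.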

We are still a few years away from fine-tuned dexterous sensing and reflexes in robotics. But we believe that giving robots the ability to feel and manipulate the world around them through scalable, plug-and-play sensing technology will leave them better equipped to work alongside humans. Eventually, they'll even be able to pick the candy bar from the bottom of the grocery bag without looking. Just like my daughter.

Albert Chen is the CTO at Forcen, a member of the Autodesk Outsight Network.

Get in touch

Have we piqued your interest? Get in touch if you'd like to learn more about Autodesk Research, our projects, people, and potential collaboration opportunities.
