Despite the incredible capabilities (speed and repeatability) of our hardware today, many robot manipulators are deliberately programmed to avoid dynamics – moving slowly enough that they can adhere to quasi-static assumptions about the world. In contrast, people frequently (and subconsciously) make use of dynamic phenomena to manipulate everyday objects – from unfurling blankets to tossing trash – to improve efficiency and extend physical reach. These abilities are made possible by an intuition of physics, a cornerstone of intelligence. How do we impart the same to robots?
Modeling the complex dynamics of the unstructured world is challenging. However, by enabling robots to learn perception-action feedback loops directly from raw sensory data, we show that it is possible to relax the need for accurate physics models, thereby allowing robots to (i) acquire dynamic skills for complex objects, (ii) adapt to new scenarios using visual feedback, and (iii) use their dynamic interactions to improve their understanding of the world. Learning from data allows us to change the way we think about dynamics – from avoiding it to embracing it – simplifying a number of classically challenging problems and leading to new robot capabilities.