This photo of a Long-tailed macaque foraging naturally at Admiralty Park was used in a paper about designing robots that can better find stuff among clutter!
Thanks to Marc Killpack for including this in the awesome paper! So share your photos of Singapore's amazing wildlife doing their wild thing. You never know how they can contribute to science!
Abstract
Clutter creates challenges for robot manipulation, including a lack of non-contact trajectories and reduced visibility for line-of-sight sensors. We demonstrate that robots can use whole-arm tactile sensing to perceive clutter and maneuver within it, while keeping contact forces low. We first present our approach to manipulation, which emphasizes the benefits of making contact across the entire manipulator and assumes the manipulator has low-stiffness actuation and tactile sensing across its entire surface. We then present a novel controller that exploits these assumptions. The controller only requires haptic sensing, handles multiple contacts, and does not need an explicit model of the environment prior to contact. It uses model predictive control with a time horizon of length one and a linear quasi-static mechanical model. In our experiments, the controller enabled a real robot and a simulated robot to reach goal locations in a variety of environments, including artificial foliage, a cinder block, and randomly generated clutter, while keeping contact forces low. While reaching, the robots performed maneuvers that included bending objects, compressing objects, sliding objects, and pivoting around objects. In simulation, whole-arm tactile sensing also outperformed per-link force–torque sensing in moderate clutter, with the relative benefits increasing with the amount of clutter.
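The abstract mentions model predictive control with a time horizon of length one and a linear quasi-static mechanical model. As a rough illustration of that idea (not the paper's actual controller, which is formulated as a quadratic program over a full arm model), here is a toy one-step sketch: take a small step toward the goal, predict contact-force changes linearly from estimated contact stiffnesses, and shrink the step if any predicted force would exceed a limit. All names, the stiffness-scaling heuristic, and the interfaces here are my own assumptions for illustration.

```python
import numpy as np

def one_step_controller(J_ee, contacts, x_err, f_max, step=0.02, reg=1e-3):
    """One iteration of a greedy (horizon-1) quasi-static controller sketch.

    J_ee:     end-effector Jacobian (rows map joint motion to task motion)
    contacts: list of (J_c, k_c, f) = contact Jacobian row, stiffness
              estimate, and currently sensed contact force magnitude
    x_err:    vector from end effector to the goal
    f_max:    contact-force limit
    Returns a small joint-space step delta_q.
    """
    # Desired small end-effector motion toward the goal.
    dx = step * x_err / (np.linalg.norm(x_err) + 1e-9)
    # Damped least-squares step solving J_ee @ dq ~= dx.
    A = J_ee.T @ J_ee + reg * np.eye(J_ee.shape[1])
    dq = np.linalg.solve(A, J_ee.T @ dx)
    # Linear quasi-static force prediction at each contact: df = k_c * (J_c @ dq).
    scale = 1.0
    for J_c, k_c, f in contacts:
        df = k_c * float(J_c @ dq)
        if df > 0 and f + df > f_max:
            # Shrink the step so the predicted force stays at the limit.
            scale = min(scale, (f_max - f) / df)
    return max(scale, 0.0) * dq
```

In this toy version the "model predictive" part is just the one-step linear force prediction; the real controller in the paper optimizes over all contacts and joints simultaneously, using only haptic sensing and no prior environment model.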
Read more in Advait Jain, Marc D. Killpack, Aaron Edsinger and Charles C. Kemp, "Reaching in clutter with whole-arm tactile sensing", The International Journal of Robotics Research 32(4): 458–482. DOI: 10.1177/0278364912471865