WALK-MAN is a humanoid robot recently developed by Istituto Italiano di Tecnologia (Italian Institute of Technology) and the University of Pisa in Italy, within the European-funded project WALK-MAN (www.walk-man.eu). The project is a four-year research programme which started in October 2013 and aims to develop a humanoid robot for disaster-response operations. WALK-MAN is the acronym of “Whole Body Adaptive Locomotion and Manipulation”, underlining its main research goal: enhancing the capabilities of existing humanoid robots so that they can operate in emergency situations, assisting or replacing humans in damaged civil sites, including buildings such as factories, offices and houses. In such scenarios, the WALK-MAN robot will demonstrate human-like locomotion, balance and manipulation capabilities. To reach these targets, the WALK-MAN design principles and implementation relied on high-performance actuation systems, a compliant body and soft underactuated hand designs, taking advantage of recent developments in mechanical design, actuation and materials. The first prototype of the WALK-MAN robot will participate in the DARPA Robotics Challenge Finals in June 2015, and it will be further developed in both hardware and software in order to validate the project results in realistic scenarios, in consultation with civil defence bodies. The technologies developed within the WALK-MAN project also have a wide range of other applications, including industrial manufacturing, co-worker robots, and inspection and maintenance robots in dangerous workspaces, and may be provided to others on request.
The prototype WALK-MAN platform is an adult-size humanoid with a height of 1.85 m, an arm span of 2 m, and a weight of 118 kg.
The robot is fully power-autonomous, electrically powered by a 2 kWh battery unit; its body has 33 degrees of freedom (DOF), actuated by high-power electric motors all equipped with intrinsic elasticity, which gives the robot superior physical interaction capabilities.
The robot perception system includes torque sensing, end-effector force/torque (F/T) sensors, and a head module equipped with a stereo vision system and a rotating 3D laser scanner, the posture of which is controlled by a 2-DOF neck chain. Extra RGB-D and colour cameras mounted at fixed orientations provide additional coverage of the locomotion and manipulation space.
IMU sensors at the head and the pelvis provide the necessary inertial/orientation sensing of the body and head frames.
Protective soft covers mounted along the body will permit the robot to withstand impacts, including those occurring during falls.
The software interface of the robot is based on the YARP middleware (www.yarp.it).
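As a minimal sketch of what interfacing with a YARP-based robot looks like, the snippet below opens a buffered input port and connects it to a robot-side output port using YARP's standard Python bindings. The port name "/walkman/state:o" is a hypothetical example, not a documented WALK-MAN port; a running yarpserver and the YARP runtime are assumed.

```python
import yarp

# Initialise the YARP network (requires a reachable yarpserver).
yarp.Network.init()

# Open a buffered input port for Bottle messages.
port = yarp.BufferedPort_Bottle()
port.open("/reader/state:i")

# Connect a (hypothetical) robot state stream to our reader port.
yarp.Network.connect("/walkman/state:o", "/reader/state:i")

# Blocking read of the next message; returns None if the port is closed.
bottle = port.read()
if bottle is not None:
    print(bottle.toString())

port.close()
yarp.Network.fini()
```

YARP decouples the data producers (robot-side ports) from consumers, so the same pattern serves sensor streaming, command sending, and RPC-style interaction across processes and machines.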