Abstract
The use of standard platforms in the field of humanoid robotics can lower the entry barrier for new research groups and accelerate research by facilitating code sharing. Numerous standard humanoid platforms exist in the size range up to 60 cm, but beyond that humanoid robots scale up quickly in weight and price, becoming less affordable and more difficult to operate, maintain and modify. The \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform is an affordable, fully open-source platform for humanoid research. At 92 cm, the robot is capable of acting in an environment built for humans, and is equipped with enough sensors, actuators and computing power to support researchers in many fields. The structure of the robot is entirely 3D printed, leading to a lightweight and visually appealing design. This paper covers the mechanical and electrical aspects of the robot, as well as the main features of the corresponding open-source ROS software. At RoboCup 2016, the platform was awarded the first International HARTING Open Source Prize.
1 Introduction
The field of humanoid robotics is enjoying increasing popularity, with many research groups having developed robotic platforms to investigate topics such as perception, manipulation and bipedal walking. The entry barrier to such research can be significant though, and access to a standard humanoid platform can allow for greater focus on research, and facilitates collaboration and code exchange.
The \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform, described in this paper, is a collaboration between researchers at the University of Bonn and \(\mathrm{igus}^{{\circledR }}\) GmbH, a leading manufacturer of polymer bearings and energy chains. The platform seeks to close the gap between small, albeit affordable, standard humanoid platforms, and larger significantly more expensive ones. We designed the platform to be as open, modular, maintainable and customisable as possible. The use of almost exclusively 3D printed plastic parts for the mechanical components of the robot is a result of this mindset, which also simplifies the manufacture of the robots. This allows individual parts to be easily modified, reprinted and replaced to extend the capabilities of the robot, shown in Fig. 1. A demonstration video of the \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform is available (Footnote 1).
2 Related Work
A number of standard humanoid robot platforms have been developed over the last decade, such as the Nao robot by Aldebaran Robotics [1]. The Nao comes with a rich set of features, such as a variety of available gaits, a programming SDK, and human-machine interaction modules. The robot however has a limited scope of use, as it is only 58 cm tall. Moreover, as a proprietary product, hardware repairs and enhancements by the user are difficult. Another example is the DARwIn-OP [2], distributed by Robotis. At 45.5 cm, it is half the size of the \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform. The DARwIn-OP has the benefit of being an open platform, but its size remains a limiting factor for its range of applications.
Other significantly less widely disseminated robots include the Intel Jimmy robot, the Poppy robot from the Inria Flowers Laboratory [3], and the Jinn-Bot from Jinn-Bot Robotics and Design GmbH in Switzerland. All of these robots are at least in part 3D printed, and the first two are open source. The Jimmy robot is intended for social interactions and comes with software based on the DARwIn-OP framework. The Poppy robot is intended for non-autonomous use, and features a multi-articulated bio-inspired morphology. Jinn-Bot is built from over 90 plastic parts and 24 actuators, making for a complicated build, and is controlled by a Java application running on a smartphone mounted in its head. Larger humanoid platforms, such as the Asimo [4], HRP [5] and Atlas robots, are an order of magnitude more expensive and more complicated to operate and maintain. Such large robots are less robust because of their complex hardware structure, and require a gantry in normal use. These factors limit the use of such robots by most research groups.
3 Hardware Design
The hardware platform was designed in collaboration with \(\mathrm{igus}^{{\circledR }}\) GmbH, which engaged a design bureau to create an appealing overall aesthetic appearance. The main criteria for the design were the simplicity of manufacture, assembly, maintenance and customisation. To satisfy these criteria, a modular design approach was used. Due to the 3D printed nature of the robot, parts can be modified and replaced with great freedom. A summary of the main hardware specifications of the \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform is shown in Table 1.
3.1 Mechanical Structure
The plastic shell serves not only for outer appearance, but also as the load-bearing frame. This makes the \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform very light for its size. Despite its low weight, the robot is still very durable and resistant to deformation and bending. This is achieved by means of wall thickness modulation in the areas more susceptible to damage, in addition to strategic distribution of ribs and other strengthening components, printed directly as part of the exoskeleton. Utilising the versatile nature of 3D printing, the strengths of the plastic parts can be maximised exactly where they are needed, and not unnecessarily so in other locations. If a weak spot is identified through practical experience, as indeed happened during testing, the parts can be locally strengthened in the CAD model without significantly impacting the remaining design.
3.2 Robot Electronics
The electronics of the platform are built around an Intel i7-5500U processor, running a full 64-bit Ubuntu OS. DC power is provided via a power board, where external power and a 4-cell Lithium Polymer (LiPo) battery can be connected. The PC communicates with a Robotis CM730 subcontroller board, whose main purpose is to electrically interface the twelve MX-106 and eight MX-64 actuators, all connected on a single star topology Dynamixel bus. Due to a number of reliability and performance factors, we redesigned and rewrote the firmware of the CM730 (and CM740). This improved bus stability and error tolerance, and decreased the time required to read out servo data, while retaining compatibility with the standard Dynamixel protocol. The CM730 incorporates 3-axis gyroscope and accelerometer sensors, is connected externally to an additional 3-axis magnetometer via an \(\mathrm{I}^{2}\mathrm{C}\) interface, and also connects to an interface board that has three buttons, five LEDs and two RGB LEDs.
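The standard Dynamixel protocol mentioned above frames each command as an instruction packet with a one-byte checksum. As a minimal illustration, the following sketch builds a protocol 1.0 WRITE_DATA packet; the field layout and checksum rule come from the publicly documented protocol, not from the rewritten CM730 firmware itself:

```python
def dxl_checksum(body):
    """Dynamixel protocol 1.0 checksum: bitwise NOT of the byte sum."""
    return (~sum(body)) & 0xFF

def dxl_write_packet(servo_id, address, data):
    """Build a WRITE_DATA (0x03) instruction packet: 0xFF 0xFF header,
    then ID, length (params + 2), instruction, address, data, checksum."""
    body = [servo_id, len(data) + 3, 0x03, address] + list(data)
    return bytes([0xFF, 0xFF] + body + [dxl_checksum(body)])

# Example: write a 2-byte goal position to servo 1 at register address 30.
pkt = dxl_write_packet(1, 30, [0x00, 0x02])
```

A bus subcontroller such as the CM730 relays many such packets per control cycle, which is why shortening the read-out path has a direct effect on the achievable loop rate.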
Further available external connections to the robot include USB, HDMI, Mini DisplayPort, Gigabit Ethernet, IEEE 802.11b/g/n Wi-Fi, and Bluetooth 4.0. The \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform is nominally equipped with a single 720p Logitech C905 camera behind its right eye, fitted with a wide-angle lens. A second camera can be optionally mounted behind the left eye for stereo vision.
4 Software
The ROS middleware was chosen as the basis of the software developed for the \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform. This fosters modularity, visibility, reusability, and to some degree platform independence. The software was developed with humanoid robot soccer in mind, but the platform can be used for virtually any other application. This is possible because of the strongly modular way in which the software was written, greatly supported by the natural modularity of ROS, and the use of plugin schemes.
4.1 Vision
The camera driver used in the ROS software nominally retrieves images at 30 Hz in 24bpp BGR format at a resolution of \(640\,\times \,480\). For further processing, the captured image is converted to the HSV colour space. In our target application of soccer, the vision processing tasks include field, ball, goal, field line, centre circle and obstacle detection, as illustrated in Fig. 2 [6]. The wide-angle camera used introduces significant distortion, which must be compensated when projecting image coordinates into egocentric world coordinates. We undistort the image with a Newton-Raphson based approach (top right in Fig. 2). This method is used to populate a pair of lookup tables that allow constant time distortion and undistortion operations at runtime. Further compensation of projection errors is performed by calibrating offsets to the position and orientation of the camera frame in the head of the robot. This is essential for good projection performance (bottom row in Fig. 2), and is done using the Nelder-Mead method.
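The Newton-Raphson undistortion above can be sketched for a simple radial model. The single-coefficient polynomial below is an illustrative assumption (the actual calibrated lens model of the platform is not specified here); the point is the iterative inversion used to fill the lookup tables:

```python
def undistort_radius(r_d, k1, iters=10):
    """Invert the radial distortion r_d = r_u * (1 + k1 * r_u**2)
    for the undistorted radius r_u via Newton-Raphson iteration.
    The one-coefficient model is illustrative, not the platform's
    actual calibration."""
    r_u = r_d  # initial guess: assume no distortion
    for _ in range(iters):
        f = r_u * (1.0 + k1 * r_u * r_u) - r_d   # residual
        df = 1.0 + 3.0 * k1 * r_u * r_u          # derivative df/dr_u
        r_u -= f / df
    return r_u

# At startup, such a routine can populate lookup tables indexed by
# pixel radius, giving constant-time (un)distortion at runtime.
```

Since the iteration only runs while the tables are built, its cost is irrelevant to the 30 Hz processing loop.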
4.2 State Estimation
State estimation is a vital part of virtually any system that utilises closed-loop control. The 9-axis IMU on the microcontroller board is used to obtain the 3D orientation of the robot relative to its environment through the means of a nonlinear passive complementary filter [7]. This filter returns the full 3D estimated orientation of the robot with use of a novel way of representing orientations—the fused angles representation [8]. An immediate application of the results of the state estimation is the fall protection module, which disables the torque in order to minimise stress in all of the servos if a fall is imminent.
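The actual attitude estimator is a nonlinear passive complementary filter on the full 3D orientation [7]; as a scalar intuition for how such a filter fuses sensors, consider a single pitch angle blending integrated gyro rate (accurate short-term) with an accelerometer-derived angle (drift-free long-term). The time constant `tau` here is an illustrative parameter, not a value from the platform:

```python
def complementary_update(theta, gyro_rate, accel_theta, dt, tau=0.5):
    """One step of a scalar complementary filter: high-pass the
    integrated gyro rate, low-pass the accelerometer angle.
    All angles in radians; tau is the crossover time constant."""
    alpha = tau / (tau + dt)  # blend factor close to 1 for tau >> dt
    return alpha * (theta + gyro_rate * dt) + (1.0 - alpha) * accel_theta
```

In the 3D case the same idea operates on rotations rather than scalars, and the result is reported in the fused angles representation [8], from which the fall protection module reads the tilt.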
4.3 Actuator Control
As with most robots, motions performed by the \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform depend on the actuators' ability to track their set positions. This is influenced by many factors, including battery voltage, joint friction, inertia and load. To minimise the effects of these factors, we apply feed-forward control to the commanded servo positions. This allows the joints to move in a compliant way, reduces servo overheating and wear, increases battery life, and reduces the problems posed by impacts and disturbances [9]. The vector of desired feed-forward output torques is computed from the vectors of commanded joint positions, velocities and accelerations using the full-body inverse dynamics of the robot, with help of the Rigid Body Dynamics Library. Each servo in the robot is configured to use exclusively proportional control. Time-varying dimensionless effort values on the unit interval [0, 1] are used per joint to interpolate the current proportional gain.
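The effort-to-gain mapping can be sketched as follows. The linear interpolation and the gain bounds are illustrative assumptions; the source only states that a dimensionless effort in [0, 1] modulates the proportional gain per joint:

```python
def effective_p_gain(effort, p_min, p_max):
    """Interpolate a servo's proportional gain from a dimensionless
    effort in [0, 1]: effort = 0 gives the most compliant joint,
    effort = 1 the stiffest. Linear mapping is an assumption."""
    effort = min(max(effort, 0.0), 1.0)  # clamp to the unit interval
    return p_min + effort * (p_max - p_min)
```

Because the gain can be lowered whenever the feed-forward torque already accounts for the expected load, the proportional term only has to correct residual errors, which is what keeps the joints compliant.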
4.4 Gait Generation
The gait is formulated in three different pose spaces: joint space, abstract space, and inverse space. The joint space simply specifies all joint angles, while the inverse space specifies the Cartesian coordinates and quaternion orientations of each of the limb end effectors relative to the trunk link frame. The abstract space, however, is a representation that was specifically developed for humanoid robots in the context of walking and balancing.
The walking gait is based on an open-loop central pattern generator core, driven by a gait phase angle that increments at a rate proportional to the desired gait frequency. This open-loop gait extends the gait of our previous work [10]. Since then, a number of simultaneously operating basic feedback mechanisms have been built around the open-loop gait core to stabilise the walking [11]. The feedback in each of these mechanisms derives from the fused pitch and fused roll state estimates, and adds corrective action components to the central pattern generated waveforms in both the abstract and inverse spaces [8].
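The phase-driven core can be sketched in a few lines. The wrap-around bookkeeping follows directly from the text; the leg-lift waveform is a deliberately simplified stand-in for the actual central pattern generated trajectories:

```python
import math

def advance_gait_phase(phase, freq, dt):
    """Increment the gait phase angle at a rate proportional to the
    desired gait frequency (Hz), wrapping to (-pi, pi]."""
    phase += 2.0 * math.pi * freq * dt
    while phase > math.pi:
        phase -= 2.0 * math.pi
    return phase

def leg_lift(phase, amplitude=0.1):
    """Illustrative open-loop waveform: lift the leg during its swing
    half-phase (sin > 0), keep it grounded during support."""
    return amplitude * max(0.0, math.sin(phase))
```

The feedback mechanisms then add their corrective components on top of waveforms like this one, in the abstract and inverse spaces, rather than replacing the open-loop core.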
4.5 Motions
Often there is a need for a robot to play a particular pre-designed motion. This is the task of the motion player, which implements a nonlinear keyframe interpolator that connects robot poses and smoothly interpolates joint positions and velocities, in addition to modulating the joint efforts and support coefficients. This allows the actuator control scheme to be used meaningfully during motions with changing support conditions. To create and edit the motions, a trajectory editor was developed for the \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform. All motions can be edited in a user-friendly environment with a 3D preview of the robot poses. We have designed numerous motions, including kicking, waving, balancing and get-up motions. A still image of the kicking motion is shown in Fig. 3 along with the get-up motions of the \(\mathrm{igus}^{\circledR }\) Humanoid Open Platform, from the prone and supine positions.
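Interpolating both positions and velocities between keyframes is commonly done with cubic Hermite splines; the sketch below uses that standard form as an assumption, since the paper only states that the motion player's interpolator is nonlinear and connects positions and velocities:

```python
def hermite(p0, v0, p1, v1, t):
    """Cubic Hermite interpolation between two keyframes over a unit
    time interval: p0/p1 are joint positions, v0/v1 joint velocities,
    t in [0, 1]. Matches both positions and velocities at the ends."""
    h00 = 2.0 * t**3 - 3.0 * t**2 + 1.0   # weight of p0
    h10 = t**3 - 2.0 * t**2 + t           # weight of v0
    h01 = -2.0 * t**3 + 3.0 * t**2        # weight of p1
    h11 = t**3 - t**2                     # weight of v1
    return h00 * p0 + h10 * v0 + h01 * p1 + h11 * v1
```

In a motion player, the joint efforts and support coefficients can be interpolated alongside the positions, so that the actuator control scheme receives consistent commands throughout a motion with changing support.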
5 Reception
To date we have built seven \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platforms, and have demonstrated them at the RoboCup and various industrial trade fairs. Amongst others, this includes demonstrations at Hannover Messe in Germany, and at the International Robot Exhibition in Tokyo, where the robots had the opportunity to show their interactive side (see Fig. 4). Demonstrations ranged from expressive and engaging looking, waving and idling motions, to visitor face tracking and hand shaking. The robots have been observed to spark interest and produce emotional responses in the audience.
Despite the recent design and creation of the platform, research groups have already taken inspiration from it, or even directly used the open-source hardware or software. A good example of this is the Humanoids Engineering and Intelligent Robotics team at Marquette University with their MU-L8 robot [12]. In their design they combined both an aluminium frame from the NimbRo-OP and 3D printed parts similar to those of the \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform, as well as using ROS-based control software inspired by our own. A Japanese robotics business owner, Tomio Sugiura, started printing parts of the \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform on an FDM-type 3D printer with great success. Naturally, the platform also inspired other humanoid soccer teams, such as the WF Wolves [13], to improve upon their own robots. The NimbRo-OP, which was a prototype for the \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform, has been successfully used in human-robot interaction research at the University of Hamburg [14]. We recently sold a set of printed parts to the University of Newcastle in Australia and await results of their work.
In 2015, the robot was awarded the first RoboCup Design Award, based on criteria such as performance, simplicity and ease of use. At RoboCup 2016, the platform also won the first International HARTING Open Source Prize, and was a fundamental part of the winning TeenSize soccer team. These achievements confirm that the robot is welcomed and appreciated by the community.
6 Conclusions
Together with \(\mathrm{igus}^{{\circledR }}\) GmbH, we have worked for three years to create and improve upon an open platform that is affordable, versatile and easy to use. The \(\mathrm{igus}^{{\circledR }}\) Humanoid Open Platform provides users with a rich set of features, while still leaving room for modifications and customisation. We have released the hardware in the form of print-ready 3D CAD files (Footnote 2), and uploaded the software to GitHub (Footnote 3). We hope that it will benefit other research groups, and encourage them to publish their results as contributions to the open-source community.
Notes
- 1.
- 2.
- 3. Software: https://github.com/AIS-Bonn/humanoid_op_ros.
References
Gouaillier, D., Hugel, V., Blazevic, P., Kilner, C., Monceaux, J., Lafourcade, P., Marnier, B., Serre, J., Maisonnier, B.: Mechatronic design of NAO humanoid. In: International Conference on Robotics and Automation (2009)
Ha, I., Tamura, Y., Asama, H., Han, J., Hong, D.: Development of open humanoid platform DARwIn-OP. In: SICE Annual Conference (2011)
Lapeyre, M., Rouanet, P., Grizou, J., Nguyen, S., Depraetre, F., Le Falher, A., Oudeyer, P.-Y.: Poppy project: open-source fabrication of 3D printed humanoid robot for science, education and art. In: Digital Intelligence 2014, September 2014
Hirai, K., Hirose, M., Haikawa, Y., Takenaka, T.: The development of Honda humanoid robot. In: International Conference on Robotics and Automation (1998)
Kaneko, K., Kanehiro, F., Morisawa, M., Miura, K., Nakaoka, S., Kajita, S.: Cybernetic human HRP-4C. In: Proceedings of 9th IEEE-RAS International Conference on Humanoid Robotics, Humanoids, pp. 7–14 (2009)
Farazi, H., Allgeuer, P., Behnke, S.: A monocular vision system for playing soccer in low color information environments. In: Proceedings of 10th Workshop on Humanoid Soccer Robots, International Conference on Humanoid Robots, Seoul, Korea (2015)
Allgeuer, P., Behnke, S.: Robust sensor fusion for biped robot attitude estimation. In: Proceedings of 14th International Conference on Humanoid Robotics (2014)
Allgeuer, P., Behnke, S.: Fused angles: a representation of body orientation for balance. In: International Conference on Intelligent Robots and Systems, IROS (2015)
Schwarz, M., Behnke, S.: Compliant robot behavior using servo actuator models identified by iterative learning control. In: Behnke, S., Veloso, M., Visser, A., Xiong, R. (eds.) RoboCup 2013. LNCS (LNAI), vol. 8371, pp. 207–218. Springer, Heidelberg (2014). https://doi.org/10.1007/978-3-662-44468-9_19
Missura, M., Behnke, S.: Self-stable omnidirectional walking with compliant joints. In: 8th Workshop on Humanoid Soccer Robots, Humanoids (2013)
Allgeuer, P., Behnke, S.: Omnidirectional bipedal walking with direct fused angle feedback mechanisms. In: Proceedings of 16th IEEE-RAS International Conference on Humanoid Robots, Humanoids, Cancún, Mexico (2016)
Stroud, A., Morris, M., Carey, K., Williams, J.C., Randolph, C., Williams, A.B.: MU-L8: the design architecture and 3D printing of a Teen-Sized humanoid soccer robot. In: 8th Workshop on Humanoid Soccer Robots, Humanoids (2013)
Tasch, C., Luceiro, D., Maciel, E.H., Berwanger, F., Xia, M., Stiddien, F., Martins, L.T., Wilke, L., Dalla Rosa, O.K., Henriques, R.V.B.: WF Wolves and Taura Bots Teen Size (2015)
Barros, P., Parisi, G.I., Jirak, D., Wermter, S.: Real-time gesture recognition using a humanoid robot with a deep neural architecture. In: Proceedings of 14th IEEE-RAS International Conference on Humanoid Robotics, Humanoids (2014)
Acknowledgements
We acknowledge the contributions of \(\mathrm{igus}^{{\circledR }}\) GmbH to the project, in particular the management of Martin Raak towards the robot design and manufacture. This work was partially funded by grant BE 2556/10 of the German Research Foundation (DFG).
© 2017 Springer International Publishing AG
Cite this paper
Allgeuer, P., Ficht, G., Farazi, H., Schreiber, M., Behnke, S. (2017). First International HARTING Open Source Prize Winner: The igus Humanoid Open Platform. In: Behnke, S., Sheh, R., Sarıel, S., Lee, D. (eds) RoboCup 2016: Robot World Cup XX. RoboCup 2016. Lecture Notes in Computer Science(), vol 9776. Springer, Cham. https://doi.org/10.1007/978-3-319-68792-6_52
Print ISBN: 978-3-319-68791-9
Online ISBN: 978-3-319-68792-6