
Robotic Hands

Meka hands and Barrett hand (center)
RBO Hand2
Allegro Hand

For the Meka arms and the Barrett WAM arm we have three hands for different uses. In addition, we have an Allegro hand and, most recently, the soft hands used in the SoMa project, among them the Pisa/IIT SoftHand and the RBO Hand 2, which we manufacture in our lab (for more information, see the Soft Hand Building Lab below).

Soft Hand Building Lab

Vacuum chamber and pump
Silicone components and utensils

To build the RBO Hand 2 and other soft robotic parts, we have a workshop with tools for silicone molding and for the assembly of the hands.

We also have the infrastructure to design and build our custom PneumaticBox for the pneumatic control of the hands.
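The PneumaticBox regulates the air pressure in each of the hand's actuators. The page does not describe its control scheme, so the following is only a minimal sketch of one common approach: a bang-bang valve controller with a deadband, where the function name, units (kPa), and thresholds are illustrative assumptions.

```python
def valve_command(p_measured, p_target, deadband):
    """Hypothetical bang-bang valve logic for one pneumatic actuator.

    Inflate when pressure is below the target band, vent when above it,
    and hold (close both valves) inside the band. Pressures in kPa.
    """
    if p_measured < p_target - deadband:
        return "inflate"
    if p_measured > p_target + deadband:
        return "vent"
    return "hold"

# Example: actuator at 180 kPa, target 200 kPa, +/- 5 kPa band
cmd = valve_command(180.0, 200.0, 5.0)  # -> "inflate"
```

The deadband prevents the valves from chattering around the setpoint; a real controller would add timing and safety limits.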

A tutorial on building the RBO Hand can be found here: PneuFlex tutorial



Barrett WAM Arm


Our mobile manipulator is built on a mobile base, a modified XR4000, on which different manipulators can be mounted. The image on the right shows a Barrett WAM.



Panda Arms


Our two Panda arms are used in various research projects.

Meka T2


Our Meka T2 humanoid torso is equipped with two arms (A2) and can be mounted on the mobile platform.

Puma 560


We have two Puma 560 robot arms for teaching. Students learn to implement joint-space and operational-space controllers; visual servoing and motion planning are also part of the Puma lectures.
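A joint-space controller of the kind implemented in these lectures can be sketched as a simple PD law. This is a generic textbook example, not the course code; the gains and the 6-DOF configuration are illustrative assumptions.

```python
import numpy as np

def pd_joint_controller(q, q_dot, q_des, kp, kd):
    """PD control law in joint space: tau = Kp*(q_des - q) - Kd*q_dot.

    q, q_dot, q_des: joint positions, velocities, and targets (rad, rad/s).
    kp, kd: proportional and derivative gains (scalars or per-joint arrays).
    Returns the commanded joint torques.
    """
    q = np.asarray(q, dtype=float)
    return kp * (np.asarray(q_des, dtype=float) - q) - kd * np.asarray(q_dot, dtype=float)

# Example: a 6-DOF arm at rest, 0.5 rad away from the target on every joint,
# gets a pure restoring torque of Kp * 0.5 = 5.0 Nm per joint.
tau = pd_joint_controller(q=[0.0] * 6, q_dot=[0.0] * 6, q_des=[0.5] * 6, kp=10.0, kd=1.0)
```

A real Puma controller would add gravity compensation on top of this PD term.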

12 iRobots for Teaching


Each robot is equipped with a Hokuyo laser scanner, a USB webcam, and a netbook. The iRobot platform has a differential drive and a front bumper; the base is connected to the netbook via USB. The netbook runs Linux, and we use ROS to control the robot.
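The differential drive mentioned above follows the standard unicycle model: the average wheel speed moves the robot forward, the speed difference turns it. A minimal Euler-integration sketch (the wheel-base value below is an arbitrary example, not the iRobot's actual geometry):

```python
import math

def diff_drive_step(x, y, theta, v_left, v_right, wheel_base, dt):
    """Integrate a differential-drive pose for one time step.

    v_left, v_right: wheel speeds (m/s); wheel_base: wheel separation (m).
    Forward speed is the mean of the wheel speeds; the angular rate is
    their difference divided by the wheel base.
    """
    v = (v_left + v_right) / 2.0
    omega = (v_right - v_left) / wheel_base
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Example: equal wheel speeds drive the robot straight along x.
pose = diff_drive_step(0.0, 0.0, 0.0, v_left=0.2, v_right=0.2, wheel_base=0.3, dt=1.0)
# -> (0.2, 0.0, 0.0)
```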

CyberGlove Systems - CyberGlove II + III


To provide intuitive control over our manipulators and actuators, we use the CyberGlove II and III made by CyberGlove Systems to track the movement of the hand and of individual fingers.

RGB-D Sensors


The Kinect sensor projects a known infrared pattern onto the scene. This point pattern is invisible to the human eye. An infrared camera records the pattern and estimates depth from the shift of the infrared points. A color image is then used to assign each point its color value, so the output of the Kinect sensor is a colored 3D point cloud of the scene.
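The depth estimate from the pattern shift is plain stereo triangulation: depth Z = f * b / d, where f is the focal length in pixels, b the baseline between projector and camera, and d the observed disparity in pixels. A one-line sketch (the numeric values are rough, illustrative figures for a Kinect-class sensor, not calibrated parameters):

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Triangulate depth from the shift of the projected IR points: Z = f*b/d."""
    return focal_length_px * baseline_m / disparity_px

# Example: with an assumed ~580 px focal length and ~7.5 cm baseline,
# a 58 px disparity corresponds to 0.75 m depth.
z = depth_from_disparity(58.0, 580.0, 0.075)
```

Larger disparities mean closer points, which is why depth resolution degrades with distance.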


Motion Capturing


In order to be able to analyze movements, we maintain our own motion capturing laboratory. This allows us not only to map possible paths of movement, but also perform detailed analysis of, for example, grasping strategies.



IBM System Cluster e1350/iDataPlex

Nodes: IBM System x iDataPlex dx360 M2 and dx360 M3 servers
Interconnects: BNT RackSwitch G8000F 48-port GbE Switch Bundle

The cluster has 132 nodes, each equipped with two Quadcore CPUs and Hyperthreading, providing in total 2272 slots for parallel computation.

Deep Learning Computers


Two computers, each equipped with an NVIDIA GeForce GTX TITAN X GPU.

One high-performance computer, "Deep Thought", equipped with 4x GeForce GTX 1080 Ti GPUs.


