TU Berlin

Robotic Hands

Meka hands and Barrett hand (center)
RBO Hand2
PISA/IIT SoftHand
Allegro Hand

For the Meka arms and the Barrett WAM arm we have three hands for different uses. In addition, we have an Allegro hand and, most recently, the SoMa soft hands: the PISA/IIT SoftHand and the RBO Hand2, which we manufacture in our own lab (for more information, see the Soft Hand Building Lab below).

Soft Hand Building Lab

Vacuum chamber and pump
Silicone components and utensils

To build the RBO Hand 2 and other soft robotic parts, we maintain a workshop with tools for silicone molding and for assembling the hands.

We also have the infrastructure to design and build our custom PneumaticBox for the pneumatic control of the hands.

A tutorial on building the RBO Hand can be found here: PneuFlex tutorial

Barrett WAM Arm


Our mobile manipulator consists of a mobile base, a modified XR4000, on which different manipulators can be mounted. The image on the right shows it with a Barrett WAM arm.

Meka T2


Our Meka T2 humanoid torso comprises two A2 arms and can be mounted on the mobile platform.

Puma 560


We have two Puma 560 robot arms for teaching. Students learn how to implement joint-space and operational-space controllers; visual servoing and motion planning are also part of the Puma lectures.
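To give a flavor of the joint-space control exercises, here is a minimal sketch of a PD control law on a single joint with unit inertia. The gains, time step, and setpoint are illustrative assumptions, not values from the actual course material:

```python
import numpy as np

def pd_joint_controller(q, qd, q_des, kp=50.0, kd=5.0):
    """Joint-space PD law: tau = Kp * (q_des - q) - Kd * qd (assumed gains)."""
    q = np.asarray(q, dtype=float)
    qd = np.asarray(qd, dtype=float)
    q_des = np.asarray(q_des, dtype=float)
    return kp * (q_des - q) - kd * qd

# Simulate a single joint with unit inertia driven toward a 1.0 rad target,
# using semi-implicit Euler integration.
q, qd, dt = 0.0, 0.0, 0.001
for _ in range(5000):
    tau = pd_joint_controller([q], [qd], [1.0])[0]
    qd += tau * dt   # unit inertia: qdd = tau
    q += qd * dt
```

With these gains the response is underdamped but settles well within the simulated 5 seconds; in the lectures the same idea is extended to all six joints with the full manipulator dynamics.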

12 iRobots for Teaching


Each is equipped with a Hokuyo laser scanner, a USB webcam, and a netbook. The iRobot platform has a differential drive and a front bumper; the base is connected to the netbook via USB. The netbook runs Linux, and we use ROS to control the robot.
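A differential drive converts the two wheel speeds into a forward and an angular velocity. A minimal kinematics sketch of one integration step (the 0.33 m wheel base is an assumed value for illustration, not the platform's actual spec):

```python
import math

def diff_drive_step(x, y, theta, v_left, v_right, wheel_base, dt):
    """Integrate one time step of a differential-drive robot's planar pose."""
    v = (v_left + v_right) / 2.0             # forward velocity [m/s]
    omega = (v_right - v_left) / wheel_base  # angular velocity [rad/s]
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Equal wheel speeds drive the robot straight ahead
x, y, th = diff_drive_step(0.0, 0.0, 0.0, 0.2, 0.2, 0.33, 1.0)
```

In the teaching setup, ROS takes care of turning velocity commands into wheel speeds; this sketch only shows the underlying kinematics.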

CyberGlove Systems - CyberGlove II + III


In order to provide intuitive control over our manipulators and actuators, we use the CyberGlove II and III made by CyberGlove Systems to track movement of the hand and individual fingers.

RGB-D Sensors


The Kinect sensor projects a known infrared point pattern onto the scene. This pattern is invisible to the human eye. An infrared camera records the pattern and estimates depth from the shift of the infrared points. A color image assigns each point its correct color value, so the output of the Kinect sensor is a colored 3D point cloud of the scene.
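The depth estimate follows from stereo triangulation: the depth Z is the focal length times the projector-camera baseline, divided by the observed point shift (disparity). A minimal sketch, where the ~7.5 cm baseline and ~580 px focal length are rough Kinect-v1-like assumptions rather than calibrated parameters:

```python
def depth_from_disparity(disparity_px, baseline_m=0.075, focal_px=580.0):
    """Triangulated depth Z = f * b / d (assumed, uncalibrated parameters)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A shift of 10 pixels corresponds to a depth of about 4.35 m
z = depth_from_disparity(10.0)
```

Because depth is inversely proportional to disparity, the depth resolution degrades with distance, which is why such sensors work best at close range.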


Motion Capturing


In order to analyze movements, we maintain our own motion capturing laboratory. This allows us not only to record movement trajectories, but also to perform detailed analyses of, for example, grasping strategies.

Cluster


IBM System Cluster e1350/iDataPlex
Nodes: IBM System x iDataPlex dx360 M2 and dx360 M3 servers
Interconnect: BNT RackSwitch G8000F 48-port GbE switch bundle

The cluster has 132 nodes, each equipped with two quad-core CPUs with Hyper-Threading, providing a total of 2272 slots for parallel computation.

Deep Learning Computers


Two computers, each equipped with an NVIDIA GeForce GTX TITAN X GPU.

One high-performance computer, "Deep Thought", equipped with 4x GeForce GTX 1080 Ti GPUs.

Kicker

