
In brief

  • Hank the robot has flexible robotic fingers inspired by the human hand
  • The fingers are controlled by airflows that flex them and apply force

Researchers claim to have solved one of the “grand challenges” faced by robotics with the creation of a robot with flexible fingers inspired by the human hand.

British product development firm Cambridge Consultants’ Hank robot has pneumatic fingers capable of gripping and picking up small, irregularly-shaped and delicate objects with the correct amount of force.

Robotic arms in factories, warehouses and agriculture have traditionally used pincers and suction appendages to grasp items, which can prove too forceful when picking up easily crushed objects such as fruit.

Hank the robot picks up a kiwi fruit (Photo: Cambridge Consultants)

Hank’s individually-controlled silicone fingers contain a sensory system and are powered by airflows for flexing and applying appropriate pressure.

The touch sensors allow the robot to grasp and close its fingers around the object as a human hand would, applying increased force if it notices the object slipping. Once worn out or damaged, the fingers can simply be replaced.
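As a rough illustration of that kind of slip-compensating grip loop, the sketch below (in Python) starts with a light grip and only adds pneumatic pressure while slip is still detected. The class and method names are hypothetical stand-ins, not Cambridge Consultants' actual interface.

```python
# Hypothetical sketch of a slip-aware grip loop. The finger API
# (set_pressure, read_slip) is an illustrative assumption.

class PneumaticFinger:
    """Stand-in for one individually controlled silicone finger."""

    def __init__(self, max_pressure_kpa: float = 60.0):
        self.pressure_kpa = 0.0
        self.max_pressure_kpa = max_pressure_kpa

    def set_pressure(self, kpa: float) -> None:
        # Clamp to a safe range so delicate objects are not crushed.
        self.pressure_kpa = max(0.0, min(kpa, self.max_pressure_kpa))

    def read_slip(self) -> float:
        # In the real robot this would come from the touch/slip sensor;
        # here it is a placeholder that reports no slip.
        return 0.0


def grip_with_slip_compensation(finger: PneumaticFinger,
                                base_pressure_kpa: float = 15.0,
                                step_kpa: float = 2.0,
                                slip_threshold: float = 0.1,
                                max_steps: int = 20) -> float:
    """Close with a light grip, then add pressure only while slip is detected."""
    finger.set_pressure(base_pressure_kpa)
    for _ in range(max_steps):
        if finger.read_slip() < slip_threshold:
            break  # object is stable; stop increasing force
        finger.set_pressure(finger.pressure_kpa + step_kpa)
    return finger.pressure_kpa


# Example usage: grip an object and report the final pressure used.
finger = PneumaticFinger()
print(grip_with_slip_compensation(finger))
```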


Bruce Ackman, logistics commercial lead at Cambridge Consultants, said the logistics industry had been slow to adopt automation, with many businesses still requiring humans to pick and pack items in warehouses.

“Hank’s world-leading sensory system is a game changer for the logistics industry, making actions such as robotic bin picking and end-to-end automated order fulfillment possible.

“Adding a sense of touch and slip, generated by a single, low-cost sensor, means that Hank’s fingers could bring new efficiencies to giant distribution centres.”

MIT’s robotic arm has learnt to play Jenga (Photo: MIT)

Researchers from the Massachusetts Institute of Technology taught a robotic arm to play Jenga earlier this year, using a soft-pronged gripper to assess, probe and remove blocks from the popular strategy building game.

The robot’s computer combines tactile and visual information with data from moves it has previously made to assess whether to keep attempting to remove its current block or to move on to another.

The tactile feedback is key to differentiating the robot from other machine-learning approaches, as it allowed the robot to train on around 300 block extraction attempts rather than tens of thousands.
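A simplified sketch of that keep-trying-or-move-on decision is below; the feature names, thresholds and prior-success heuristic are illustrative assumptions for this article, not MIT's published model.

```python
# Hypothetical sketch of the go/no-go decision described above,
# combining tactile, visual and prior-move data.

from dataclasses import dataclass


@dataclass
class BlockObservation:
    resistance: float           # tactile: how hard the block pushes back (0..1)
    tower_displacement_mm: float  # visual: how much the tower has shifted


def should_continue(observation: BlockObservation,
                    prior_success_rate: float,
                    displacement_limit_mm: float = 1.5,
                    resistance_limit: float = 0.6) -> bool:
    """Decide whether to keep extracting the current block or move on."""
    if observation.tower_displacement_mm > displacement_limit_mm:
        return False  # the tower is shifting; abandon this block
    if observation.resistance > resistance_limit and prior_success_rate < 0.5:
        return False  # block feels stuck and similar pushes have mostly failed
    return True


# Example: a loose block with a good track record keeps being extracted.
obs = BlockObservation(resistance=0.2, tower_displacement_mm=0.3)
print(should_continue(obs, prior_success_rate=0.8))  # True
```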

