SAN JOSE, Calif.–(BUSINESS WIRE)–Fetch
Robotics, the pioneer of on-demand automation, today announced the
results of the inaugural FetchIt! Mobile Manipulation Challenge. The
competition was designed to advance the state of technology for applying
mobile manipulators, which are autonomous mobile robots (AMRs) fitted
with robotic arms, for use in manufacturing and related applications.
The FetchIt! Challenge attracted teams from leading universities, who
were tasked with using a Fetch Mobile Manipulator robot to navigate to
stations in a work cell where they picked up items with the arm,
inserted them into a machining tool, placed the machined items into
kits, transported the finished kits to an inspection station, and then
to a drop-off location. This was the first competition to encompass the
full range of activities commonly found in manufacturing environments.
Georgia Tech was awarded first prize for successfully assembling three
kits in thirty-nine minutes, earning a prize package that included a
Fetch Mobile Manipulation Research Robot — a $100K value — along with
additional prizes from co-sponsors EandM, SCHUNK and SICK AG. More
information on the FetchIt! Challenge is available at https://opensource.fetchrobotics.com/competition.
FetchIt! Challenge participants included:
- Team Columbia: Columbia University, led by Professor Peter K. Allen, PhD, and Neil Chen
- Team DeRAILers: The Georgia Institute of Technology, led by Associate Professor Sonia Chernova, PhD, and David Kent
- Team RoboHawks: The University of Massachusetts Lowell, led by Professor Holly Yanco, PhD, Assistant Professor Reza Ahmadzadeh, PhD, and Zhao Han
- Team Fido: Independent Competitors, Thomas Butterworth and Ben Jarvhi
Autonomous mobile robots are seeing rapid adoption in multiple
applications, particularly for material movement tasks in warehouses and
factories. This has fueled growing interest in using AMRs equipped with
robotic arms, so-called mobile manipulators, for applications that
combine autonomous transport with robotic arms that can grasp and
manipulate objects. While conceptually simple, combining autonomous
mobility with robotic arm activity is extremely challenging, requiring
complex interaction among the robot’s navigation, machine vision, arm
operation, and safety systems.
The FetchIt! competition focused on teams autonomously completing
combined manipulation and navigation tasks. The goal was to assemble a
kit from six objects obtained from stations around the designated arena.
The task was designed to mimic a common machine-tending process in
manufacturing: objects were picked from bins, placed in a kit, and
transported to a drop-off location. All teams used a stock Fetch Mobile
Manipulator, the most widely deployed mobile manipulation robot, in use
at over fifty of the world’s leading academic and commercial robotics
research centers.
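As a rough illustration of how such a machine-tending loop is typically
sequenced on a ROS-based robot like the Fetch Mobile Manipulator, the
sketch below sends a navigation goal through the standard move_base
action interface and then moves the arm with MoveIt. The goal
coordinates, frame names, and the "arm" planning group name are
illustrative assumptions, not values taken from any competition code.

    #!/usr/bin/env python
    # Illustrative sketch only: sequencing base navigation and arm motion in ROS.
    # Coordinates, frames, and the "arm" planning group name are assumptions.
    import sys
    import rospy
    import actionlib
    import moveit_commander
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    def drive_to(x, y, frame="map"):
        """Send a navigation goal to move_base and block until it finishes."""
        client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
        client.wait_for_server()
        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = frame
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        goal.target_pose.pose.orientation.w = 1.0  # face along the frame's +x axis
        client.send_goal(goal)
        client.wait_for_result()
        return client.get_state()

    if __name__ == "__main__":
        moveit_commander.roscpp_initialize(sys.argv)
        rospy.init_node("kitting_demo")

        # 1) Navigate to a hypothetical pick station in the map frame.
        drive_to(1.5, 0.0)

        # 2) Move the arm to a rough pre-grasp position with MoveIt.
        arm = moveit_commander.MoveGroupCommander("arm")
        arm.set_pose_reference_frame("base_link")
        arm.set_position_target([0.6, 0.0, 0.9])  # x, y, z in meters
        arm.go(wait=True)
        arm.stop()

A complete entry would layer perception to locate parts, grasp planning,
and error recovery on top of a skeleton like this, which is where the
competition's real difficulty lay.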
“I’d like to congratulate all the teams for their accomplishments during
the course of this challenge,” said Russell Toris, Director of Robotics
at Fetch Robotics. “When setting out to create this challenge, we knew
we wanted to keep it grounded in a real-world scenario. Interacting with
machinery that is designed to be used by humans is no easy task.
Piece-picking, kitting, and countless other tasks are going to require
state-of-the-art perception, motion planning, navigation, and safety all
seamlessly working together. The teams’ performance this week indicates
that they represent some of the world’s leading experts in these skills.”
“We’re very excited to have won the FetchIt! challenge,” said Georgia
Tech’s Sonia Chernova. “It has allowed us to validate our research code
in a complex domain. We can’t wait to continue our work with our newest
Fetch robot.”
Added Holly Yanco from UMass Lowell: “Everyone from Fetch has been
helpful and very encouraging. This has been an amazing experience and
the tasks used for the competition form a great basis for our ONR MURI
research.”
Sponsors and Prizes
In addition to Fetch Robotics, sponsors included EandM, Schunk, SICK,
and The Construct.
First Place: Georgia Tech
A Fetch Mobile Manipulation Research Robot, one MRS1000 4-layer LiDAR
sensor provided by SICK and EandM, and 7,000 Schunk Bucks
Second Place: University of Massachusetts Lowell
One MRS1000 4-layer LiDAR sensor and one TiM561 LiDAR laser scanner
provided by SICK and EandM, and 5,000 Schunk Bucks
Additional Prizes: One TiM561 LiDAR laser scanner provided by SICK and
EandM for all entrants
Platform for Robotic Research
Designed to work with the Robot Operating System (ROS) for broad
usability and familiarity, the Fetch Mobile Manipulator and the
Freight Mobile Robot Base are robotics platforms used by researchers
around the world to collaborate and share research. The Fetch Mobile
Manipulator provides an affordable, fully-integrated standard platform
for mobile manipulation research that has been purposefully designed for
typical human working environments. The Fetch Research Robot is being
used by researchers and innovators at the world’s most advanced
organizations, including Arizona State University, Carnegie Mellon
University, Google, Northeastern University, OpenAI, Shenzhen University,
Softbank, Toyota Research Institute, University of Michigan, University
of North Carolina-Charlotte, University of Sydney, University of Tokyo,
Virginia Tech and more.
About Fetch Robotics
Fetch Robotics is an award-winning intralogistics automation company
headquartered in Silicon Valley. We provide innovative, on-demand
warehouse automation solutions for material handling and inventory
management by combining mobile robotics with the power of the cloud to
find, track, and move almost anything in any facility. Fetch Robotics’
solutions and services are deployed in leading distribution,
fulfillment, and manufacturing centers around the world, augmenting
workforces to drive increased efficiency and productivity. For more
information, please visit http://www.fetchrobotics.com
or follow the company on Twitter @FetchRobotics.