Aussie robot wins Amazon Robotics Challenge
The centre’s COO, Dr Sue Keay, said Team ACRV was one of eight teams that made it through to yesterday’s finals, after finishing fifth in the picking and stowing rounds.
“It was a tense few hours. Our team top-scored early, with 272 points on the final combined stowing and picking task, but we then had to wait on the results for five other teams, many of whom had outperformed us in the rounds,” said Dr Keay.
“Not bad for a robot that was only unpacked and reassembled out of suitcases a few days before the event, with at least one key component held together with cable ties.”
The Amazon Robotics Challenge is designed to fill a gap in Amazon.com’s automated warehousing processes. While Amazon is able to quickly package and ship millions of items to customers from a network of fulfilment centres, the commercial technologies to allow robots to pick items and stow them in boxes in an unstructured environment are yet to be developed.
Sixteen international teams, including two from Australia, tested their hardware and software solutions in the 2017 Challenge. Team ACRV leader Dr Juxi Leitner said the centre’s secret was an innovative Cartesian manipulator, ‘CartMan’, built from scratch. CartMan can move along three axes, like a gantry crane, with a rotating gripper that allows the robot to pick up items using either suction or a simple two-finger grip.
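One practical advantage of a gantry-style Cartesian design is that each task-space coordinate maps directly onto one axis, so no inverse-kinematics solve is needed. The sketch below illustrates that idea only; the names and fields are hypothetical and not taken from the team's actual software.

```python
from dataclasses import dataclass

@dataclass
class CartesianTarget:
    """A pick target for a hypothetical gantry-style Cartesian robot."""
    x: float    # position along the rail axis (m)
    y: float    # position along the cross-beam axis (m)
    z: float    # vertical drop (m)
    yaw: float  # wrist rotation (rad)
    tool: str   # "suction" or "gripper"

def to_axis_commands(t: CartesianTarget) -> dict:
    # On a Cartesian robot each Cartesian coordinate corresponds
    # one-to-one to a single prismatic axis (plus a revolute wrist),
    # so the "kinematics" is a direct copy of the target pose.
    return {"axis_x": t.x, "axis_y": t.y, "axis_z": t.z,
            "wrist": t.yaw, "tool": t.tool}
```

By contrast, an articulated arm must solve for joint angles that realise the same end-effector pose, which can fail or yield awkward configurations near workspace limits.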
“We were the only team with a Cartesian robot at the event. CartMan was definitely a large reason for our success,” said Leitner. “With six degrees of articulation and both a claw and suction gripper, CartMan gives us more flexibility to complete the tasks than an off-the-shelf robot can offer.
“The robot is robust and tackles the task in an innovative way and is also cost-effective. I think it would have been the lowest cost robot at the event!”
Fifteen members of the centre’s 27-strong team of researchers, sourced from QUT, The University of Adelaide and The Australian National University, were in Japan for the event. The Challenge combined object recognition, pose recognition, grasp planning, compliant manipulation, motion planning, task planning, task execution, and error detection and recovery.
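The stages listed above form a pipeline in which error detection and recovery wraps the whole chain. A minimal sketch of that control flow, with every stage passed in as a placeholder callable (all names here are illustrative, not the team's actual code):

```python
def run_pick(item, detect, estimate_pose, plan_grasp, plan_motion, execute,
             max_attempts=3):
    """Chain perception-to-execution stages, retrying on failure
    (a simple form of error detection and recovery)."""
    for _ in range(max_attempts):
        obs = detect(item)            # object recognition
        if obs is None:
            continue                  # item not found: retry
        pose = estimate_pose(obs)     # pose recognition
        grasp = plan_grasp(pose)      # grasp planning
        path = plan_motion(grasp)     # motion planning
        if execute(path):             # task execution
            return True               # success on this attempt
    return False                      # give up after max_attempts
```

A real system would add compliant manipulation during execution and a task planner choosing which item to attempt next, but the retry loop captures how a failure at any stage triggers recovery rather than aborting the run.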
“We had to create a robust vision system to cope with objects that we only got to see during the competition,” said Dr Anton Milan, University of Adelaide-based team member. “Our vision system had the perfect trade-off of training data, training time and accuracy. One feature of our system was that it worked off a very small amount of hand-annotated training data. We needed just seven images of each unseen item to be able to detect it.”
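Learning from a handful of labelled images per class is the few-shot setting. One common illustration of why so few examples can suffice, assuming a good feature extractor already exists, is a nearest-centroid classifier: average the few support features per class, then label a query by its closest centroid. This is a generic sketch of that idea, not the team's actual method.

```python
def centroid(feats):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(feats)
    return [sum(vals) / n for vals in zip(*feats)]

def train_few_shot(support):
    """support: {label: [feature vectors]}, e.g. ~7 examples per item.
    Returns one prototype (centroid) per label."""
    return {label: centroid(vs) for label, vs in support.items()}

def classify(prototypes, feat):
    """Assign feat to the label whose prototype is nearest (squared
    Euclidean distance)."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda lbl: sqdist(prototypes[lbl], feat))
```

With well-separated features, even a single-digit number of examples per item pins down a usable prototype, which is the intuition behind detecting items from only seven annotated images each.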