Aussies just kicked ass and won Amazon’s robotics challenge with “Cartman”

The Australian Centre for Robotic Vision's Team ACRV has won (read: kicked ass) Amazon's 2017 Robotics Challenge. The win came overnight at RoboCup in Nagoya, Japan, and our brilliant Australians scored themselves a cool US$80,000 for it. The two Australian teams were top of the pack of 16 teams of researchers from 10 countries.

Teams were tasked with building their own hardware and software to successfully pick and stow items in a warehouse. This has obvious applications in automation, so it makes sense that a sales and distribution company like Amazon put up the cash. Amazon can quickly package and ship millions of items to customers from its network of fulfillment centers; however, the technologies to solve automated picking in unstructured environments are yet to be developed.

Eight teams made it through to the finals, with the Australian Centre for Robotic Vision placing fifth after the picking and stowing rounds.

The Centre is headquartered at Queensland University of Technology, and its COO, Dr Sue Keay, said:

“It was a tense few hours. Our team top-scored early with 272 points on the final combined stowing and picking task, but we then had to wait on the results for five other teams, many of whom had outperformed us in the rounds, before it became clear that we had won.”

The winning robot was unpacked and reassembled out of suitcases a few days before the event, with at least one key component held together with cable ties.

The Australian Centre for Robotic Vision developed its own robot, known as “Cartman”, for the challenge, and was the only team to field a Cartesian robot at the event. Cartman can move along three axes at right angles to each other, like a gantry crane, and featured a rotating gripper that allowed the robot to pick up items using either suction or a simple two-finger grip.
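One advantage of a Cartesian (gantry) design is that motion planning becomes almost trivial: each axis is an independent linear actuator, so a target position maps directly onto axis commands with no inverse kinematics. The sketch below illustrates the idea in Python; the function and step size are purely illustrative and not from the team's actual software.

```python
import math

# Illustrative sketch of straight-line motion on a three-axis Cartesian (gantry)
# robot like Cartman. Each axis moves independently, so planning a move is just
# linear interpolation between positions -- no inverse kinematics needed.
# All names and values here are hypothetical.

def plan_cartesian_move(current, target, max_step=0.05):
    """Return waypoints from current to target (x, y, z) positions in metres,
    with no axis travelling more than max_step between consecutive waypoints."""
    longest = max(abs(t - c) for c, t in zip(current, target))
    steps = max(1, math.ceil(longest / max_step))
    return [tuple(c + (t - c) * k / steps for c, t in zip(current, target))
            for k in range(1, steps + 1)]
```

An articulated arm would instead have to solve for a set of joint angles that place its gripper at the same point, which can have multiple or no solutions.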

Some attributed the team’s win to the use of a custom-made robot. Team leader Juxi described the advantages of the design: “With six degrees of articulation and both a claw and suction gripper, Cartman gives us more flexibility to complete the tasks than most robots can offer. Cartman is robust, tackles the task in an innovative way and is also cost effective. We learnt from our experience last year, when we used an off-the-shelf robot. I think we had the lowest-cost robot at the event!”

Fifteen members of the Centre’s 27-strong team of researchers, drawn from QUT, The University of Adelaide and the Australian National University, were in Japan for the event.

Adam Tow, a PhD researcher with the Centre based at QUT, said:

“We feel brilliant, and we say thank you very much for all the support we’ve received. The competition was a lot of work but really rewarding and a lot of fun.” The team invested more than 15,000 hours in the project.

The time and effort paid off, according to Dr Chris Lehnert, a roboticist at QUT: “Everything from the robot design to the vision and grasping systems worked flawlessly in the finals. The competition was tough; so many of the improbable scenarios that we thought would never occur did occur.”

The Challenge combined object recognition, pose recognition, grasp planning, compliant manipulation, motion planning, task planning, task execution, and error detection and recovery. The robots were scored by how many items they successfully picked and stowed in a fixed amount of time.
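The article lists those stages without detail, so purely as an illustration, the control loop they imply might be sketched like this. All function names here are hypothetical stand-ins, not the team's actual code; the point is the flow from recognition through execution to error recovery.

```python
# Illustrative sketch of a pick-and-stow control loop with the stages the
# challenge combined: recognition, grasp/motion planning, execution, and
# error detection and recovery via retries. All names are hypothetical.

def pick_and_stow(items, detect, plan_grasp, execute, max_retries=2):
    """Attempt to pick and stow each item, retrying on failure."""
    stowed, failed = [], []
    for item in items:
        for attempt in range(max_retries + 1):
            pose = detect(item)            # object + pose recognition
            if pose is None:
                continue                   # detection failed; try again
            grasp = plan_grasp(pose)       # grasp + motion planning
            if execute(grasp):             # task execution succeeded
                stowed.append(item)
                break
        else:
            failed.append(item)            # retries exhausted; recover by moving on
    return stowed, failed
```

Scoring by items successfully handled in a fixed time is exactly why the recovery path matters: a robot that stalls on one hard item forfeits the points available from the rest.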

“We are world leaders in robotic vision and we’re pushing the boundaries of computer vision and machine learning to complete these tasks in an unstructured environment,” said Juxi.

Cartman’s vision system was the result of many hours of training data collection and training time, according to Dr Anton Milan:

“We had to create a robust vision system to cope with objects that we only got to see during the competition.

“Our vision system had the perfect trade-off of training data, training time and accuracy. One feature of our system was that it worked off a very small amount of hand-annotated training data. We needed just seven images of each unseen item to be able to detect it.

“It feels amazing to have accomplished this. Excellent team effort. Looking at the overall performance across all teams, we see huge advances in robotics and AI. We definitely have very exciting times ahead of us.”
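The article doesn't say how the system got so much out of seven hand-annotated images per item, but heavy data augmentation is one common way to stretch a tiny training set. Here is a toy sketch of the idea, with images as plain nested lists; a real system would use tensors and far richer transforms, and none of these names come from the team's code.

```python
# Toy sketch of data augmentation: multiplying a handful of annotated
# images into a larger training set by generating simple variants.
# Images are 2-D nested lists purely for illustration.

def augment(image):
    """Yield simple variants of a 2-D image: identity, then flips."""
    yield image
    yield [row[::-1] for row in image]    # horizontal flip
    yield image[::-1]                     # vertical flip

def build_training_set(images_per_item):
    """Expand a small per-item image set into an augmented training set."""
    return {item: [variant for img in imgs for variant in augment(img)]
            for item, imgs in images_per_item.items()}
```

Each source image here becomes three training examples; real pipelines add rotations, crops, colour jitter and synthetic backgrounds to multiply the set much further.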

University of Adelaide team member Dr Trung Pham agreed:

“One of the most important factors contributing to the team’s success was the seamless integration of world-leading robotics and vision. Our robot uses deep learning to see robustly and acts reliably due to smart design. The competition was a fantastic chance for us to truly test our state-of-the-art algorithms, as well as to open up new real-world challenges that go beyond academic research.”

For more information, head over to www.roboticvision.org

Creator of techAU, Jason has spent a dozen-plus years covering technology in Australia and around the world. Bringing a background in multimedia and a passion for technology to the job, Cartwright delivers detailed product reviews, event coverage and industry news on a daily basis.