TY - CONF
T1 - CoSTAR: Instructing collaborative robots with behavior trees and vision
T2 - 2017 IEEE International Conference on Robotics and Automation, ICRA 2017
AU - Paxton, Chris
AU - Hundt, Andrew
AU - Jonathan, Felix
AU - Guerin, Kelleher
AU - Hager, Gregory D.
N1 - Funding Information:
This work was funded by NSF Grant 1227277. Authors are from the Dept. of Computer Science, Johns Hopkins University, Baltimore, MD, USA. Email: {cpaxton, ahundt, fjonath1, kguerin2}@jhu.edu and [email protected].
Publisher Copyright:
© 2017 IEEE.
PY - 2017/7/21
Y1 - 2017/7/21
N2 - For collaborative robots to become useful, end users who are not robotics experts must be able to instruct them to perform a variety of tasks. With this goal in mind, we developed a system for end-user creation of robust task plans with a broad range of capabilities. CoSTAR: the Collaborative System for Task Automation and Recognition is our winning entry in the 2016 KUKA Innovation Award competition at the Hannover Messe trade show, which this year focused on Flexible Manufacturing. CoSTAR is unique in how it creates natural abstractions that use perception to represent the world in a way users can both understand and utilize to author capable and robust task plans. Our Behavior Tree-based task editor integrates high-level information from known object segmentation and pose estimation with spatial reasoning and robot actions to create robust task plans. We describe the cross-platform design and implementation of this system on multiple industrial robots and evaluate its suitability for a wide variety of use cases.
AB - For collaborative robots to become useful, end users who are not robotics experts must be able to instruct them to perform a variety of tasks. With this goal in mind, we developed a system for end-user creation of robust task plans with a broad range of capabilities. CoSTAR: the Collaborative System for Task Automation and Recognition is our winning entry in the 2016 KUKA Innovation Award competition at the Hannover Messe trade show, which this year focused on Flexible Manufacturing. CoSTAR is unique in how it creates natural abstractions that use perception to represent the world in a way users can both understand and utilize to author capable and robust task plans. Our Behavior Tree-based task editor integrates high-level information from known object segmentation and pose estimation with spatial reasoning and robot actions to create robust task plans. We describe the cross-platform design and implementation of this system on multiple industrial robots and evaluate its suitability for a wide variety of use cases.
UR - http://www.scopus.com/inward/record.url?scp=85027962996&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85027962996&partnerID=8YFLogxK
U2 - 10.1109/ICRA.2017.7989070
DO - 10.1109/ICRA.2017.7989070
M3 - Conference contribution
AN - SCOPUS:85027962996
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 564
EP - 571
BT - ICRA 2017 - IEEE International Conference on Robotics and Automation
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 29 May 2017 through 3 June 2017
ER -