TY - GEN
T1 - Real-time object recognition and orientation estimation using an event-based camera and CNN
AU - Ghosh, Rohan
AU - Mishra, Abhishek
AU - Orchard, Garrick
AU - Thakor, Nitish V.
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014/12/9
Y1 - 2014/12/9
N2 - Real-time visual identification and tracking of objects is a computationally intensive task, particularly in cluttered environments containing many visual distracters. In this paper we describe a real-time bio-inspired system for object tracking and identification which combines an event-based vision sensor with a convolutional neural network running on an FPGA for recognition. The event-based vision sensor detects only changes in the scene, naturally responding to moving objects and ignoring static distracters in the background. We present operation of the system on two tasks. The first is a proof of concept for a remote monitoring application in which the system tracks and distinguishes between cars, bikes, and pedestrians on a road. The second task targets grasp planning for an upper limb prosthesis and involves detecting and identifying household objects, as well as determining their orientation relative to the camera. The second task is used to quantify performance of the system, which can discriminate between 8 different objects in 2.25 ms with 99.10% accuracy and can determine object orientation to within 4.5° in an additional 2.28 ms with 97.76% accuracy.
AB - Real-time visual identification and tracking of objects is a computationally intensive task, particularly in cluttered environments containing many visual distracters. In this paper we describe a real-time bio-inspired system for object tracking and identification which combines an event-based vision sensor with a convolutional neural network running on an FPGA for recognition. The event-based vision sensor detects only changes in the scene, naturally responding to moving objects and ignoring static distracters in the background. We present operation of the system on two tasks. The first is a proof of concept for a remote monitoring application in which the system tracks and distinguishes between cars, bikes, and pedestrians on a road. The second task targets grasp planning for an upper limb prosthesis and involves detecting and identifying household objects, as well as determining their orientation relative to the camera. The second task is used to quantify performance of the system, which can discriminate between 8 different objects in 2.25 ms with 99.10% accuracy and can determine object orientation to within 4.5° in an additional 2.28 ms with 97.76% accuracy.
UR - http://www.scopus.com/inward/record.url?scp=84920528541&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84920528541&partnerID=8YFLogxK
U2 - 10.1109/BioCAS.2014.6981783
DO - 10.1109/BioCAS.2014.6981783
M3 - Conference contribution
AN - SCOPUS:84920528541
T3 - IEEE 2014 Biomedical Circuits and Systems Conference, BioCAS 2014 - Proceedings
SP - 544
EP - 547
BT - IEEE 2014 Biomedical Circuits and Systems Conference, BioCAS 2014 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 10th IEEE Biomedical Circuits and Systems Conference, BioCAS 2014
Y2 - 22 October 2014 through 24 October 2014
ER -