Distributed FPGA-based Smart Camera Architecture For Computer Vision Applications
Abstract: Smart camera networks (SCN) raise challenging issues in many fields of research, including vision processing, communication protocols, distributed algorithms, and power management. Furthermore, application logic in an SCN is not centralized but spread among the network nodes, meaning that each node must process images to extract significant features and aggregate data to understand the surrounding environment.
In this context, smart cameras first embedded a general-purpose processor (GPP) for image processing. As image resolutions have increased, GPPs have reached their limit in maintaining the real-time processing constraint. More recently, FPGA-based platforms have been studied for their massive parallelism capabilities. This paper presents our new FPGA-based smart camera platform, which supports cooperation between nodes and run-time updatable image processing. The architecture is based on a fully reconfigurable pipeline driven by a softcore.

I. INTRODUCTION

... to be manipulated in M2M and H2M. FPGAs are therefore considered as publishers of Internet resources, addressable as URIs by means of application-layer protocols such as HTTP and CoAP [5].

II. HARDWARE ARCHITECTURE

The smart camera architecture used in this paper is an FPGA-based platform developed at our institute, called DreamCam [6] (Fig. 1). This smart camera consists of 5 stacked boards. This modular architecture allows us to easily adapt to new functions. For instance, different communication boards have been designed, and the switch from USB to Giga-Ethernet is trivial.

[Fig. 1: DreamCam hardware — E2V image sensor, Cyclone III FPGA hosting the softcore and processing IPs, memory, and a Giga-Ethernet board]
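The introduction's view of each camera as a publisher of URI-addressable resources can be modeled in software. The following is a minimal, purely illustrative sketch, assuming a simple path-to-handler registry; the class, paths, and return codes are hypothetical and are not the DreamCam firmware API:

```python
# Hypothetical model of a smart camera node publishing its processing
# resources under URI paths, in the spirit of the HTTP/CoAP resource
# model mentioned in the introduction. All names are illustrative.

class CameraNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.resources = {}            # URI path -> handler callable

    def publish(self, path, handler):
        """Register a processing resource under a URI path."""
        self.resources[path] = handler

    def request(self, path, payload=None):
        """Serve a GET-like request for a published resource."""
        handler = self.resources.get(path)
        if handler is None:
            return (404, None)         # resource not published
        return (200, handler(payload))

node = CameraNode("cam0")
node.publish("/image/histogram", lambda _: [0] * 256)   # stub resource
status, body = node.request("/image/histogram")
```

In a real deployment the registry would sit behind a CoAP or HTTP server rather than a direct method call; the sketch only shows the resource-addressing idea.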
... modules called RouteMatrix (Fig. 3). The processing elements can be thresholding, convolution, histogram computation, etc., and the interconnection (RouteMatrix) can be viewed as a multiplexer controlled by the softcore. In this way, each node (smart camera) can be configured via the network and provide the expected resources. The RouteMatrix blocks realize the connections between the Elab blocks, while the Elab blocks implement basic computer vision algorithms that can be part of a specific elaboration pipeline, tunable at run time. This architecture can support multi-camera video inputs that can be processed in parallel as continuous pixel flows. In this respect, the captured flow has no associated semantics, so a user interacting with a configuration manager can select and compose the appropriate modules suited to the desired application.

[Fig. 3: Configurable processing IP — four video streams enter a grid of Elab blocks (Elab00 ... Elab23) chained through RouteMatrix stages; elaboration parameters are configured and the interconnection is reconfigured by software at run time]
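The Elab/RouteMatrix organization can be simulated in plain software to clarify the idea. The sketch below is an assumption-laden model, not the hardware design: Elab blocks become functions, and each RouteMatrix stage becomes a multiplexer index that the "softcore" changes at run time:

```python
# Software model of the reconfigurable pipeline: each RouteMatrix stage
# is a multiplexer (an index set by the softcore) selecting which Elab
# block processes the pixel flow next. Block names are illustrative.

def threshold(pixels, t=128):
    return [255 if p >= t else 0 for p in pixels]

def invert(pixels):
    return [255 - p for p in pixels]

def identity(pixels):
    return pixels

# One column of candidate Elab blocks per pipeline stage.
stages = [
    [threshold, invert, identity],   # stage 0: Elab00, Elab01, Elab02
    [invert, identity, threshold],   # stage 1: Elab10, Elab11, Elab12
]

def run_pipeline(pixels, route):
    """Apply, at each stage, the Elab block selected by route[i]."""
    for stage, sel in zip(stages, route):
        pixels = stage[sel](pixels)
    return pixels

# The "softcore" reconfigures the route at run time, no resynthesis.
out = run_pipeline([10, 200, 130], route=[0, 0])  # threshold, then invert
```

Changing `route` recomposes the elaboration chain without touching the blocks themselves, which mirrors the multiplexer-based reconfiguration described above.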
III. APPLICATION

The targeted application is a tracking system in a distributed network of smart cameras, presented in Fig. 4. Our system will be based on a particle filter, which is very common for tracking objects [7]. Moreover, Cho et al. have demonstrated in [8] that particle filters can be implemented in an FPGA device, making us confident about the feasibility of this hardware implementation. A first application (HOG) using this modular architecture has been implemented, and the smart camera is currently able to exchange information via Giga-Ethernet communication with low FPGA resource usage (around 5%). Future development will focus on wireless transceiver integration and particle filter implementation.

[Fig. 4: Distributed target application]
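For reference, a bootstrap particle filter of the kind cited above can be sketched in a few lines of software. This is a generic 1-D tracking model under assumed Gaussian motion and measurement noise, not the planned FPGA implementation:

```python
# Bootstrap particle filter sketch for 1-D target tracking:
# predict with process noise, weight by measurement likelihood,
# then resample proportionally to the weights.
import math
import random

def particle_filter_step(particles, measurement, noise=1.0):
    """One predict / weight / resample cycle (bootstrap filter)."""
    # Predict: propagate each particle with Gaussian process noise.
    particles = [p + random.gauss(0.0, noise) for p in particles]
    # Weight: unnormalized Gaussian likelihood of the measurement.
    weights = [math.exp(-((p - measurement) ** 2) / (2 * noise ** 2))
               for p in particles]
    total = sum(weights)
    if total == 0.0:                       # degenerate case: keep uniform
        weights = [1.0] * len(particles)
        total = float(len(particles))
    weights = [w / total for w in weights]
    # Resample: draw a new particle set according to the weights;
    # after resampling, weights are implicitly uniform again.
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(0.0, 100.0) for _ in range(500)]
for z in [50.0, 51.0, 52.0]:               # noisy position measurements
    particles = particle_filter_step(particles, z)
estimate = sum(particles) / len(particles)  # posterior mean position
```

A hardware port would replace the Python lists with fixed-point pipelines, but the predict/weight/resample structure is the part [8] shows to be FPGA-friendly.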
IV. CONCLUSION

In this paper, we presented an FPGA architecture offering high processing capacity with configurable processing. At this point in the project, we know that our smart cameras can autonomously communicate with each other over an Ethernet network. This first architecture does not address low-power communication in smart camera networks, which still needs to be investigated. A new physical medium is under development and will integrate an IEEE 802.15.4 transceiver supporting wireless communication.

ACKNOWLEDGMENT

This work has been sponsored by the French government research program Investissements d'avenir through the IMobS3 Laboratory of Excellence (ANR-10-LABX-16-01), by the European Union through the program Regional competitiveness and employment 2007-2013 (ERDF Auvergne region), and by the Auvergne region. This work is made in collaboration with the Networks of Embedded Systems team at CNIT (Consorzio Nazionale Interuniversitario per le Telecomunicazioni).

REFERENCES

[1] P. Pagano, C. Salvadori, S. Madeo, M. Petracca, S. Bocchino, D. Alessandrelli, A. Azzara, M. Ghibaudi, G. Pellerano, and R. Pelliccia, "A middleware of things for supporting distributed vision applications," in Proceedings of the 1st Workshop on Smart Cameras for Robotic Applications (SCaBot), 2012.
[2] B. Rinner and W. Wolf, Towards Pervasive Smart Camera Networks. Academic Press, 2009.
[3] P. Chen, P. Ahammad, C. Boyer, S.-I. Huang, L. Lin, E. Lobaton, M. Meingast, S. Oh, S. Wang, P. Yan, A. Yang, C. Yeo, L.-C. Chang, J. D. Tygar, and S. Sastry, "CITRIC: A low-bandwidth wireless camera network platform," in Distributed Smart Cameras, 2008. Second ACM/IEEE International Conference on, 2008.
[4] E. Norouznezhad, A. Bigdeli, A. Postula, and B. Lovell, "A high resolution smart camera with GigE Vision extension for surveillance applications," in Distributed Smart Cameras, 2008. Second ACM/IEEE International Conference on, 2008, pp. 1-8.
[5] Z. Shelby, K. Hartke, C. Bormann, and B. Frank, "Constrained Application Protocol (CoAP)," CoRE Working Group, Internet Engineering Task Force, October 2012.
[6] M. Birem and F. Berry, "DreamCam: A modular FPGA-based smart camera architecture," Journal of Systems Architecture, Elsevier, to appear, 2013.
[7] S. Haykin and N. de Freitas, "Special issue on sequential state estimation," Proceedings of the IEEE, vol. 92, no. 3, pp. 399-400, 2004.
[8] J. U. Cho, S.-H. Jin, X. D. Pham, J. W. Jeon, J.-E. Byun, and H. Kang, "A real-time object tracking system using a particle filter," in Intelligent Robots and Systems, 2006 IEEE/RSJ International Conference on, 2006, pp. 2822-2827.