- ISSN: 1796-2021 (Online); 2374-4367 (Print)
- Abbreviated Title: J. Commun.
- Frequency: Bimonthly
- DOI: 10.12720/jcm
- Abstracting/Indexing: Scopus, CNKI, EBSCO, DBLP, Google Scholar, etc.
- E-mail questions or comments to editor@jocm.us
- Acceptance Rate: 27%
- APC: 800 USD
- Average Days to Accept: 88 days
Published in: Volume 13, No. 1, January 2018
An Embedded Multi-Sensor Data Fusion Design for Vehicle Perception Tasks
Mokhtar Bouain1,2, Karim M. A. Ali1, Denis Berdjag1, Nizar Fakhfakh2, and Rabie Ben Atitallah2
1. Univ. Valenciennes, CNRS, UMR 8201- LAMIH, F-59313 Valenciennes, France
2. Navya Company, Paris, France
Abstract—Multi-sensor architectures are widely used to improve environment perception for intelligent vehicles, since combining multiple sensors is a natural way to handle perception tasks in a rich environment. Most research has focused on PC-based implementations of perception tasks, while customized embedded designs have received little attention. In this paper, we propose an embedded Multi-Sensor Data Fusion (MSDF) design for vehicle perception tasks using stereo camera and Light Detection and Ranging (LIDAR) sensors. A modular and scalable architecture based on the Zynq-7000 SoC was designed.
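A core step in any camera/LIDAR fusion pipeline like the one described above is registering the two sensors into a common frame, typically by projecting 3-D LIDAR points into the camera image. The sketch below illustrates this step with a pinhole camera model; it is not the paper's implementation, and the intrinsic matrix and extrinsic transform are made-up example values.

```python
import numpy as np

# Illustrative only: project 3-D LIDAR points into a camera image using a
# pinhole model, a standard registration step in camera/LIDAR fusion.
# All calibration values below are hypothetical examples.

K = np.array([[700.0,   0.0, 320.0],    # fx,  0, cx
              [  0.0, 700.0, 240.0],    #  0, fy, cy
              [  0.0,   0.0,   1.0]])   # camera intrinsic matrix

R = np.eye(3)                           # LIDAR-to-camera rotation (assumed aligned)
t = np.array([0.0, -0.1, 0.2])          # LIDAR-to-camera translation in metres

def project_lidar_to_image(points_lidar):
    """Return pixel (u, v) coordinates for LIDAR points in front of the camera."""
    pts_cam = points_lidar @ R.T + t      # transform into the camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]  # keep points with positive depth only
    uvw = pts_cam @ K.T                   # apply camera intrinsics
    return uvw[:, :2] / uvw[:, 2:3]       # perspective divide -> pixel coordinates

# Example: one LIDAR point 10 m ahead and 2 m to the right of the sensor.
pts = np.array([[2.0, 0.0, 10.0]])
print(project_lidar_to_image(pts))
```

Once projected, each image pixel (and hence each detected object region) can be associated with a LIDAR range measurement, which is what allows the two modalities to be fused.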
Index Terms—Sensor Fusion, Embedded Systems, FPGA, Intelligent Vehicles.
Cite: Mokhtar Bouain, Karim M. A. Ali, Denis Berdjag, Nizar Fakhfakh, and Rabie Ben Atitallah, "An Embedded Multi-Sensor Data Fusion Design for Vehicle Perception Tasks," Journal of Communications, vol. 13, no. 1, pp. 8-14, 2018. DOI: 10.12720/jcm.13.1.8-14.