Fast FPGA-Based Image Feature Extraction for Data Fusion in Autonomous Vehicles
DOI: https://doi.org/10.61961/injei.v1i1.3
Keywords: FPGA, SoC, Image Processing, xfOpenCV
Abstract
Computer vision plays a critical role in many applications, particularly in the domain of autonomous vehicles. High-level image processing tasks such as image classification and object tracking rely on low-level features extracted from the image data; however, integrating these compute-intensive tasks into a control loop requires that they complete as quickly as possible. This paper presents a novel FPGA-based system for fast and accurate image feature extraction, specifically designed to meet the latency constraints of data fusion in autonomous vehicles. The system computes a set of generic statistical image features, including contrast, homogeneity, and entropy, and is implemented on two Xilinx FPGA platforms: an Alveo U200 Data Center Accelerator Card and a Zynq UltraScale+ MPSoC ZCU104 Evaluation Kit. Experimental results show that the proposed system achieves high-speed, low-latency image feature extraction, making it well suited for autonomous vehicle systems that require real-time image processing. The system can also be easily extended to extract additional features for other image and data fusion applications.
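The features named in the abstract (contrast, homogeneity, entropy) are commonly defined over a grey-level co-occurrence matrix (GLCM), following the Haralick texture features cited in the references. A minimal software sketch of those standard definitions is given below; it is not the paper's hardware implementation, and the 8-level quantisation and single horizontal pixel offset are illustrative assumptions.

```python
import numpy as np

def glcm_features(img, levels=8):
    """Contrast, homogeneity and entropy from a horizontal-offset GLCM.

    Standard Haralick-style definitions; the paper's exact hardware
    formulation may differ. `img` is a 2-D uint8 greyscale array.
    """
    # Quantise grey values into `levels` bins.
    q = np.clip((img.astype(np.int64) * levels) // 256, 0, levels - 1)
    # Co-occurrence counts for pixel pairs (x, x+1) on the same row.
    a, b = q[:, :-1].ravel(), q[:, 1:].ravel()
    p = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(p, (a, b), 1.0)
    p /= p.sum()                      # normalise to a joint probability
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    homogeneity = np.sum(p / (1.0 + (i - j) ** 2))
    nz = p[p > 0]
    entropy = -np.sum(nz * np.log2(nz))
    return contrast, homogeneity, entropy
```

On an FPGA, the GLCM accumulation maps naturally onto on-chip BRAM updated in a streaming fashion, which is the approach taken by the GLCM accelerators cited in the references.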
References
Y. Kadokawa, Y. Tsurumine, and T. Matsubara, “Binarized p-network: Deep reinforcement learning of robot control from raw images on fpga,” IEEE Robotics and Automation Letters, 2021. DOI: https://doi.org/10.1109/LRA.2021.3111416
M. Ravi, A. Sewa, T. Shashidhar, and S. S. S. Sanagapati, “Fpga as a hardware accelerator for computation intensive maximum likelihood expectation maximization medical image reconstruction algorithm,” IEEE Access, 2019. DOI: https://doi.org/10.1109/ACCESS.2019.2932647
X. Jiang, “Human tracking of track and field athletes based on fpga and computer vision,” Microprocessors and Microsystems, 2021. DOI: https://doi.org/10.1016/j.micpro.2021.104020
S. Aldegheri, N. Bombieri, D. D. Bloisi, and A. Farinelli, “Data flow orb-slam for real-time performance on embedded gpu boards,” in 2019 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS). IEEE, 2019. DOI: https://doi.org/10.1109/IROS40897.2019.8967814
S. Liu, L. Liu, J. Tang, B. Yu, Y. Wang, and W. Shi, “Edge computing for autonomous driving: Opportunities and challenges,” Proceedings of the IEEE, 2019. DOI: https://doi.org/10.1109/JPROC.2019.2915983
J. Webber, A. Mehbodniya, R. Teng, A. Arafa, and A. Alwakeel, “Finger-gesture recognition for visible light communication systems using machine learning,” Applied Sciences, 2021. DOI: https://doi.org/10.3390/app112411582
S. Che, J. Li, J. W. Sheaffer, K. Skadron, and J. Lach, “Accelerating compute-intensive applications with gpus and fpgas,” in 2008 Symposium on Application Specific Processors. IEEE, 2008. DOI: https://doi.org/10.1109/SASP.2008.4570793
D. Honegger, H. Oleynikova, and M. Pollefeys, “Real time and low latency embedded computer vision hardware based on a combination of fpga and mobile cpu,” in 2014 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems. IEEE, 2014. DOI: https://doi.org/10.1109/IROS.2014.6943263
M. Bouain, K. M. Ali, D. Berdjag, N. Fakhfakh, and R. B. Atitallah, “An embedded multi-sensor data fusion design for vehicle perception tasks.” J. Commun., vol. 13, no. 1, pp. 8–14, 2018. DOI: https://doi.org/10.12720/jcm.13.1.8-14
G. Bradski, “The OpenCV Library,” Dr. Dobb’s Journal of Software Tools, 2000.
Y. Zhang, X. Yang, L. Wu, and J. H. Andrian, “A case study on approximate fpga design with an open-source image processing platform,” in 2019 IEEE Computer Society Annual Symposium on VLSI (ISVLSI). IEEE, 2019. DOI: https://doi.org/10.1109/ISVLSI.2019.00074
D. Younis and B. M. Younis, “Low cost histogram implementation for image processing using fpga,” in IOP Conference Series: Materials Science and Engineering. IOP Publishing, 2020. DOI: https://doi.org/10.1088/1757-899X/745/1/012044
D. Tsiktsiris, D. Ziouzios, and M. Dasygenis, “A portable image processing accelerator using fpga,” in 2018 7th Int. Conf. on Modern Circuits and Systems Technologies (MOCAST). IEEE, 2018. DOI: https://doi.org/10.1109/MOCAST.2018.8376566
A. Linares-Barranco, F. Perez-Pena, D. P. Moeys, F. Gomez-Rodriguez, G. Jimenez-Moreno, S.-C. Liu, and T. Delbruck, “Low latency event-based filtering and feature extraction for dynamic vision sensors in real-time fpga applications,” IEEE Access, 2019. DOI: https://doi.org/10.1109/ACCESS.2019.2941282
F. Siddiqui, S. Amiri, U. I. Minhas, T. Deng, R. Woods, K. Rafferty, and D. Crookes, “Fpga-based processor acceleration for image processing applications,” Journal of Imaging, 2019. DOI: https://doi.org/10.3390/jimaging5010016
L. Siéler, C. Tanougast, and A. Bouridane, “A scalable and embedded fpga architecture for efficient computation of grey level co-occurrence matrices and haralick textures features,” Microprocessors and Microsystems, 2010. DOI: https://doi.org/10.1016/j.micpro.2009.11.001
M. A. B. Atitallah, R. Kachouri, and H. Mnif, “A new fpga accelerator based on circular buffer unit per orientation for a fast and optimised glcm and texture feature computation,” in 2019 IEEE International Conference on Design & Test of Integrated Micro & Nano-Systems (DTS). IEEE, 2019, pp. 1–6. DOI: https://doi.org/10.1109/DTSS.2019.8915341
Xilinx, “Xilinx opencv library,” https://github.com/Xilinx/xfopencv, 2017, [Online; accessed 13-February-2023].
F. Vahid, Digital design with RTL design, VHDL, and Verilog. John Wiley & Sons, 2010.
A. K. Tripathi, S. Mukhopadhyay, and A. K. Dhara, “Performance metrics for image contrast,” in 2011 Int. Conf. on Image Information Processing. IEEE, 2011. DOI: https://doi.org/10.1109/ICIIP.2011.6108900
U. Shin, J. Park, G. Shim, F. Rameau, and I. S. Kweon, “Camera exposure control for robust robot vision with noise-aware image quality assessment,” in 2019 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS). IEEE, 2019. DOI: https://doi.org/10.1109/IROS40897.2019.8968590
R. M. Haralick, K. Shanmugam, and I. H. Dinstein, “Textural features for image classification,” IEEE Transactions on systems, man, and cybernetics, 1973. DOI: https://doi.org/10.1109/TSMC.1973.4309314
Y. Nitta, S. Tamura, H. Yugen, and H. Takase, “Zytlebot: Fpga integrated development platform for ros based autonomous mobile robot,” in 2019 Int. Conf. on Field Programmable Technology (ICFPT). IEEE, 2019. DOI: https://doi.org/10.1109/ICFPT47387.2019.00089
Xilinx, “Petalinux tools,” https://www.xilinx.com/products/design-tools/embedded-software/petalinux-sdk.html, 2022, [Online; accessed 16-February-2023].
——, “Zynq ultrascale+ mpsoc zcu104 evaluation kit,” https://www.xilinx.com/products/boards-and-kits/zcu104.html, 2022, [Online; accessed 21-February-2023].
——, “Alveo u200 data center accelerator card,” https://www.xilinx.com/products/boards-and-kits/alveo/u200.html, 2022, [Online; accessed 21-February-2023].
License
Copyright (c) 2023 Jeremias Gaia, Eugenio Orosco, Francisco Rossomando, Carlos Soria
This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License 4.0 that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.
Funding data
- Secretaría de Estado de Ciencia, Tecnología e Innovación, grant number 022-SECITI-2020