Platform for the Development of Image Processing Applications on Xilinx SoC FPGAs
DOI: https://doi.org/10.37537/rev.elektron.4.2.109.2020

Keywords: image processing, HLS, Zynq

Abstract
Image processing systems are increasingly common and well suited to edge computing. However, demands for low power consumption and high performance often rule out standard processing platforms. FPGAs are therefore a good option for developing computer vision systems, since they can exploit parallelism. Moreover, the design flow of current FPGA synthesis tools accepts high-level languages as input descriptions, in contrast to hardware description languages: high-level synthesis (HLS) automates the design process by transforming an algorithmic description into digital hardware that meets design constraints. Even so, image processing experts may find it challenging to integrate the generated hardware with the rest of the system components, e.g. capture and display interfaces. This work presents a base design for building image processing applications on Zynq SoCs, together with a usage guide and a methodology that solve this problem and enable the agile, efficient development of embedded image processing solutions.
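To illustrate the kind of input an HLS flow consumes, the sketch below shows a hypothetical per-pixel kernel (a binary threshold, not taken from the paper) written in plain C++ with synthesis pragmas of the style used by Vivado HLS. The pragmas are hints to the synthesis tool and are ignored by an ordinary C++ compiler, so the same source can be functionally verified in software before being turned into hardware.

```cpp
#include <cstdint>
#include <cstddef>

// Hypothetical HLS kernel: binary threshold over a grayscale frame.
// PIPELINE II=1 asks the tool to start a new pixel every clock cycle.
void threshold(const uint8_t *in, uint8_t *out, std::size_t n, uint8_t t) {
    for (std::size_t i = 0; i < n; ++i) {
#pragma HLS PIPELINE II=1
        out[i] = (in[i] > t) ? 255 : 0;  // pixel above threshold -> white
    }
}
```

Because the description is ordinary C++, the same function can be exercised with a software test bench before synthesis, which is one of the productivity arguments for HLS made in the abstract.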
License
The authors who publish in this journal agree to the terms established in the Attribution-NonCommercial-NoDerivatives 4.0 International license (CC BY-NC-ND 4.0).