Resource-Constrained Cellular Neural Networks for Real-Time Pedestrian Segmentation Using Embedded FPGAs

Yiyu Shi
University of Notre Dame


Pedestrian detection and segmentation is a critical function in smart cars with advanced driver assistance systems (ADAS) and in autonomous cars. While many computer vision and deep learning based schemes exist in the literature for object detection and segmentation, their adoption in ADAS or autonomous driving has been challenging due to real-time and resource-constrained embedded requirements. One popular option for image segmentation stems from cellular neural networks (CeNNs), which, however, suffer from high computational complexity. In this talk we will present a deeply compressed CeNN framework for real-time pedestrian segmentation on embedded FPGAs. In particular, two compression strategies are examined: parameter quantization and early exit. Parameter quantization rounds the numbers in CeNN templates to powers of two, so that complex and expensive multiplications can be converted to simple and cheap shift operations, which require only a minimal number of registers and logic elements (LEs). Early exit stops the iterative computation when it shows signs of convergence, yielding further speedup. Experimental results show that our approach can significantly improve resource utilization and speed up the computation, with little or no performance loss compared with state-of-the-art implementations.
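The two compression strategies above can be illustrated with a minimal software sketch. This is not the talk's actual implementation; the helper names (`quantize_pow2`, `early_exit_iterate`), the tolerance, and the toy update function are all hypothetical, chosen only to show the ideas: rounding template entries to signed powers of two (so a hardware multiply becomes a bit shift by the stored exponent) and terminating the iterative computation once successive states stop changing.

```python
import numpy as np

def quantize_pow2(template, eps=1e-12):
    """Round each nonzero template entry to the nearest signed power of two
    (hypothetical helper). Multiplying by 2**k can then be implemented in
    hardware as a shift by k instead of a full multiplier."""
    sign = np.sign(template)
    mag = np.abs(template)
    k = np.round(np.log2(np.maximum(mag, eps)))  # nearest exponent
    return np.where(mag > eps, sign * np.exp2(k), 0.0)

def early_exit_iterate(x, step, max_iters=100, tol=1e-4):
    """Run an iterative update but exit early once the state change falls
    below `tol` (a hypothetical convergence test standing in for the CeNN
    state update). Returns the final state and the iterations used."""
    for i in range(max_iters):
        x_new = step(x)
        if np.max(np.abs(x_new - x)) < tol:  # signs of convergence
            return x_new, i + 1
        x = x_new
    return x, max_iters
```

For example, a template entry of 0.3 would be quantized to 0.25 (a shift right by 2), and a contractive update such as `lambda x: 0.5 * x + 1.0` converges well before a fixed 100-iteration budget, so early exit skips the remaining iterations.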