Online Training from Streaming Data with Concept Drift on FPGAs

Esther Roorda and Steve Wilton
University of British Columbia


Abstract

In dynamic environments, the inputs to machine learning models may exhibit statistical changes over time, a phenomenon known as concept drift. Incremental training can allow machine learning models to adapt to changing conditions and maintain high accuracy by continuously updating network parameters. In the context of FPGA-based accelerators, however, online incremental learning is challenging due to resource and communication constraints, as well as the absence of labelled training data. These challenges have not been fully evaluated or addressed in existing research. In this paper, we present and evaluate strategies for performing incremental training on streaming data with concept drift on FPGA-based platforms. We first present FPGA-based implementations of existing training algorithms to demonstrate the viability of online training under concept drift and to evaluate design tradeoffs. We then propose a technique for online training without labelled data and demonstrate its potential in the context of FPGA-based hardware acceleration.