With the adoption of statistical timing analysis across the industry, there is a need to characterize every gate/cell in the digital library for delay variations (referred to as statistical characterization). Statistical characterization needs to be performed efficiently, within acceptable accuracy, as a function of several process and environment parameter variations. In this paper, we propose an approach that accounts for intra-cell process mismatch variations when characterizing a cell’s delay and output transition time (output slew) variations. A straightforward approach to this problem is to model these mismatch variations by characterizing each device’s fluctuation separately. However, the runtime complexity of such characterization grows with the number of devices in the cell, and the number of simulations required can quickly become infeasible. We analyze the properties of fluctuations in switching and non-switching devices and their impact on delay variations. Using these properties, we propose a clustering approach to characterize a cell’s delay variations due to intra-cell mismatch. The proposed approach yields up to a 12X runtime improvement while staying within acceptable accuracy of Monte Carlo simulations. We show that this approach guarantees an upper bound on the results while keeping the number of simulations per cell independent of the device count.
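
To make the complexity argument concrete, the following is a minimal, purely illustrative Python sketch. The linear delay model (simulate_delay), the sensitivity values, the mismatch sigma, and the simple switching/non-switching split are all assumptions made for illustration; they are not the paper’s actual simulator or clustering criterion.

    import random
    import statistics

    def simulate_delay(vt_shifts, sensitivities, nominal_ps=50.0):
        # Stand-in for a SPICE run: delay responds linearly to each
        # device's threshold-voltage shift (hypothetical toy model).
        return nominal_ps + sum(dv * s for dv, s in zip(vt_shifts, sensitivities))

    # Hypothetical 8-device cell: per-device delay sensitivities (ps/mV).
    # Switching devices (large values) dominate; non-switching are weak.
    SENS = [0.8, 0.7, 0.05, 0.04, 0.06, 0.03, 0.05, 0.04]
    SWITCHING = [s > 0.5 for s in SENS]
    SIGMA_VT = 5.0  # mV, assumed per-device mismatch sigma

    def sigma_naive(runs=200):
        # Straightforward approach: one Monte Carlo loop per device,
        # so simulation cost grows with the device count.
        per_device = []
        for i in range(len(SENS)):
            samples = []
            for _ in range(runs):
                shifts = [0.0] * len(SENS)
                shifts[i] = random.gauss(0.0, SIGMA_VT)
                samples.append(simulate_delay(shifts, SENS))
            per_device.append(statistics.stdev(samples))
        # Independent per-device contributions combine root-sum-square.
        return sum(v * v for v in per_device) ** 0.5

    def sigma_clustered(runs=200):
        # Clustered approach: all switching devices form one cluster and
        # all non-switching devices another, so the simulation count no
        # longer depends on the number of devices. A shared (same-sign)
        # shift across a cluster upper-bounds the per-device variation.
        sigmas = []
        for cluster in (SWITCHING, [not s for s in SWITCHING]):
            samples = []
            for _ in range(runs):
                dv = random.gauss(0.0, SIGMA_VT)
                shifts = [dv if in_cluster else 0.0 for in_cluster in cluster]
                samples.append(simulate_delay(shifts, SENS))
            sigmas.append(statistics.stdev(samples))
        return sum(v * v for v in sigmas) ** 0.5

    print(f"naive sigma     ~ {sigma_naive():.2f} ps  ({len(SENS)} MC loops)")
    print(f"clustered sigma ~ {sigma_clustered():.2f} ps  (2 MC loops, bound)")

In this toy model the naive scheme costs one Monte Carlo loop per device, while the clustered scheme costs two loops regardless of device count. Because a shared same-sign shift over-counts intra-cluster correlation, the clustered sigma never falls below the independent-mismatch estimate, mirroring the upper-bound property claimed above.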