Projects

1. Analysis of Model IP Vulnerability for Undistillable Models

We thoroughly analyze the model IP vulnerability of models deployed in machine-learning-as-a-service (MLaaS) applications. We further propose a distillation framework built around a novel skeptical student that can distill knowledge even from supposedly undistillable models.

(Accepted at NeurIPS 2021 [h5 index 245])
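
For reference, a minimal sketch of the temperature-scaled distillation objective that frameworks like this build on (the skeptical-student specifics are in the paper; the function name and hyperparameter values below are illustrative):

```python
# Minimal sketch of temperature-scaled knowledge distillation: the generic
# objective a distillation framework starts from. Names/values illustrative.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend soft teacher targets with hard labels.

    T: softmax temperature; alpha: weight on the distillation term.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # standard T^2 scaling to keep gradient magnitudes comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```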

2. A Novel Training Algorithm to Improve Model Robustness of Deep SNNs

We propose an SNN training strategy that yields improved model robustness without any significant sacrifice in clean-image classification performance, while also reducing the memory budget and requiring no extra training time.

(Accepted at ICCV 2021 [h5 index 184])
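
A minimal sketch of the surrogate-gradient machinery that direct SNN training of this kind relies on, assuming a leaky integrate-and-fire (LIF) neuron; the constants are illustrative, not the paper's settings:

```python
# Minimal sketch of a LIF neuron trained with a surrogate gradient.
# tau (leak) and threshold values are illustrative.
import torch

class SpikeFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, mem, threshold):
        ctx.save_for_backward(mem)
        ctx.threshold = threshold
        return (mem >= threshold).float()  # hard spike in the forward pass

    @staticmethod
    def backward(ctx, grad_out):
        (mem,) = ctx.saved_tensors
        # Piecewise-linear surrogate around the firing threshold.
        surrogate = torch.clamp(1.0 - torch.abs(mem - ctx.threshold), min=0.0)
        return grad_out * surrogate, None

def lif_step(x, mem, tau=0.5, threshold=1.0):
    """One timestep: leak, integrate input, spike, then soft reset."""
    mem = tau * mem + x
    spike = SpikeFn.apply(mem, threshold)
    mem = mem - spike * threshold
    return spike, mem
```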

3. Reducing Spiking Activity of SNNs via Brain-Inspired Learning

We propose an attention-guided compression scheme to yield extremely energy-efficient SNNs. In particular, we provide a two-stage training strategy that compresses SNNs into models requiring an order of magnitude less energy than iso-parameter ANNs.

(Accepted at WACV 2021)
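
A minimal sketch of the attention-guided channel-scoring idea, assuming activation-based attention as the saliency signal; the paper's two-stage schedule is more involved than this single scoring step:

```python
# Minimal sketch of attention-guided channel selection: rank channels by an
# activation-based attention score and keep the top fraction.
import torch

def channel_attention_scores(activations):
    """activations: (N, C, H, W) feature maps from one layer.

    Scores each channel by the RMS of its spatial activation map,
    a common activation-attention proxy.
    """
    return activations.pow(2).mean(dim=(0, 2, 3)).sqrt()

def keep_mask(scores, keep_ratio=0.5):
    """Binary mask selecting the highest-attention channels."""
    k = max(1, int(keep_ratio * scores.numel()))
    idx = torch.topk(scores, k).indices
    mask = torch.zeros_like(scores)
    mask[idx] = 1.0
    return mask
```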

4. Making Compressed Models Adversarially Robust

We propose a one-shot training framework to generate robust yet compressed DNN models. In particular, the proposed robust sparse learning strategy can achieve up to 20x compression with a negligible drop in both clean- and perturbed-image accuracy.

(Accepted at ASP-DAC 2021)
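
A minimal sketch of the two ingredients such robust sparse learning combines, magnitude-based weight masking and PGD adversarial examples; the paper's one-shot coupling of the two is more sophisticated, and all values below are illustrative:

```python
# Minimal sketch: a magnitude-based sparsity mask plus L-infinity PGD,
# the building blocks of joint pruning + adversarial training.
import torch
import torch.nn.functional as F

def magnitude_mask(weight, sparsity=0.95):
    """Keep the largest-magnitude (1 - sparsity) fraction of weights."""
    k = max(1, int((1.0 - sparsity) * weight.numel()))
    thresh = weight.abs().flatten().topk(k).values.min()
    return (weight.abs() >= thresh).float()

def pgd_attack(model, x, y, eps=8 / 255, alpha=2 / 255, steps=7):
    """Craft perturbed inputs with standard L-infinity PGD."""
    x_adv = x + torch.empty_like(x).uniform_(-eps, eps)
    for _ in range(steps):
        x_adv = x_adv.detach().requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv + alpha * grad.sign()       # ascend the loss
        x_adv = x + (x_adv - x).clamp(-eps, eps)  # project into eps-ball
        x_adv = x_adv.clamp(0, 1)                 # keep a valid image
    return x_adv.detach()
```

Training then minimizes the loss on `pgd_attack` outputs while multiplying each layer's weights by its `magnitude_mask`, so sparsity and robustness are learned together rather than in separate passes.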

5. Are Dense CNN Kernels Really Necessary?

A study of sparsely represented kernels, in which we propose hardware-friendly sparse models that enable low-cost data transfer from DRAM. We further extend the work with periodically repeating sparse kernels and periodic compressed sparse representation formats that further reduce data-transfer requirements.

(IEEE Trans. on Computers 2020, Allerton 2019; Best Poster at USC Research Festival 2019)
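
A minimal sketch of a periodically repeating kernel mask, assuming one small set of base patterns tiled across output channels so that only the patterns and their period need to be stored and transferred; shapes and density are illustrative:

```python
# Minimal sketch of periodic kernel sparsity: output channel c reuses base
# pattern (c mod period), so DRAM only needs the `period` base patterns.
import torch

def periodic_kernel_mask(out_ch, in_ch, k, period=4, density=0.25, seed=0):
    """Build a (out_ch, in_ch, k, k) binary mask from `period` base patterns."""
    g = torch.Generator().manual_seed(seed)
    base = (torch.rand(period, in_ch, k, k, generator=g) < density).float()
    return base[torch.arange(out_ch) % period]

mask = periodic_kernel_mask(out_ch=64, in_ch=64, k=3)
# Applied by element-wise multiplication: conv.weight.data *= mask
```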

6. Making MCA-Based Crossbar Arrays More Compact

We propose a new form of pre-defined sparsity that can drastically reduce the crossbar array size, thus helping to make RRAM-based ML accelerators more compact.

(Accepted at ISVLSI 2019)
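
A minimal sketch of pre-defined sparsity for a fully connected layer, where each output neuron gets a fixed fan-in chosen before training; the class name, sizes, and fan-in below are illustrative:

```python
# Minimal sketch of pre-defined structured sparsity: connectivity is fixed
# before training, so only the connected weights are stored, and the weight
# matrix (hence the crossbar) shrinks accordingly.
import torch
import torch.nn as nn

class PredefinedSparseLinear(nn.Module):
    def __init__(self, in_features, out_features, fan_in=32, seed=0):
        super().__init__()
        g = torch.Generator().manual_seed(seed)
        # Fixed connectivity: `fan_in` randomly chosen inputs per output neuron.
        self.register_buffer(
            "idx",
            torch.stack([torch.randperm(in_features, generator=g)[:fan_in]
                         for _ in range(out_features)]))
        # Only connected weights are stored: (out_features, fan_in).
        self.weight = nn.Parameter(torch.randn(out_features, fan_in) * 0.01)

    def forward(self, x):
        # Gather each neuron's inputs, then take a per-neuron dot product.
        gathered = x[:, self.idx]           # (N, out_features, fan_in)
        return (gathered * self.weight).sum(dim=-1)
```

With, say, in_features=1024 and fan_in=32, the stored weight matrix is 32x smaller than its dense counterpart, which translates directly into a smaller crossbar footprint.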