Shikhar Tuli

Room B321, E-Quad

Princeton University


Princeton, NJ 08544


I am a Ph.D. candidate in the Department of Electrical and Computer Engineering at Princeton University, advised by Niraj K. Jha. My main research areas are machine learning, neuroscience-inspired AI, edge computing, and embedded systems. Previously, I was an undergraduate student in the Department of Electrical Engineering at the Indian Institute of Technology Delhi. I am also the founder and CEO of Qubit Inc., a company that develops next-generation solutions for industrial problems. In addition, I have worked as a visiting researcher at the Embedded Systems Laboratory (ESL), Institute of Electrical Engineering, EPFL, Switzerland.

My publications can be seen here. I have reviewed for top journals, including IEEE TCAD, TEVC, TETC, TII, and Wiley SPE, as well as for reputed conferences such as CISS, CogSci, and ICML. View my CV here.

selected publications

  1. BREATHE: Second-Order Gradients and Heteroscedastic Emulation based Design Space Exploration
    Tuli, Shikhar, and Jha, Niraj K.
    arXiv Preprint 2023
  2. TransCODE: Co-design of Transformers and Accelerators for Efficient Training and Inference
    Tuli, Shikhar, and Jha, Niraj K.
    IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 2023
  3. EdgeTran: Co-designing Transformers for Efficient Inference on Mobile Edge Platforms
    Tuli, Shikhar, and Jha, Niraj K.
    arXiv Preprint 2023
  4. AccelTran: A Sparsity-Aware Accelerator for Dynamic Inference with Transformers
    Tuli, Shikhar, and Jha, Niraj K.
    IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 2023
  5. CODEBench: A Neural Architecture and Hardware Accelerator Co-Design Framework
    Tuli, Shikhar, Li, Chia-Hao, Sharma, Ritvik, and Jha, Niraj K.
    ACM Transactions on Embedded Computing Systems 2022
  6. DINI: Data Imputation using Neural Inversion for Edge Applications
    Tuli, Shikhar, and Jha, Niraj K.
Scientific Reports 2022
  7. FlexiBERT: Are Current Transformer Architectures too Homogeneous and Rigid?
    Tuli, Shikhar, Dedhia, Bhishma, Tuli, Shreshth, and Jha, Niraj K.
    Journal of Artificial Intelligence Research 2022
  8. Generative Optimization Networks for Memory Efficient Data Generation
    Tuli, Shreshth, Tuli, Shikhar, Casale, Giuliano, and Jennings, Nicholas R.
    NeurIPS 2021 - Workshop on ML for Systems 2021
  9. Are Convolutional Neural Networks or Transformers more like human vision?
    Tuli, Shikhar, Dasgupta, Ishita, Grant, Erin, and Griffiths, Thomas L.
    Annual Meeting of the Cognitive Science Society (CogSci) 2021
  10. AVAC: A Machine Learning based Adaptive RRAM Variability-Aware Controller for Edge Devices
    Tuli, Shikhar, and Tuli, Shreshth
    IEEE International Symposium on Circuits and Systems (ISCAS) 2020
  11. RRAM-VAC: A Variability-Aware Controller for RRAM-Based Memory Architectures
    Tuli, Shikhar, Rios, Marco, Levisse, Alexandre, and Atienza, David
    Asia and South Pacific Design Automation Conference (ASPDAC) 2020
  12. Design of a Conventional-Transistor-Based Analog Integrated Circuit for On-Chip Learning in a Spiking Neural Network
    Tuli, Shikhar, and Bhowmik, Debanjan
    International Conference on Neuromorphic Systems (ICONS) 2020
  13. Predicting the Growth and Trend of COVID-19 Pandemic using Machine Learning and Cloud Computing
    Tuli, Shreshth, Tuli, Shikhar, Tuli, Rakesh, and Gill, Sukhpal Singh
    Internet of Things 2020