Shikhar Tuli

665 Clyde Ave

Mountain View, CA 94043

I am a senior research scientist at Samsung Research and a co-founder and CTO at Illumia AI. Before that, I was a Ph.D. candidate in the Department of Electrical and Computer Engineering at Princeton University, under the supervision of Niraj K. Jha. My main research areas are Machine Learning, Neuroscience-inspired AI, Edge Computing, and Embedded Systems. I did my undergraduate studies in the Department of Electrical Engineering at the Indian Institute of Technology Delhi. I have also worked as a visiting researcher at the Embedded Systems Laboratory (ESL), Institute of Electrical Engineering, EPFL, Switzerland.

My publications can be seen here. I have reviewed for top journals including IEEE TCAD, TEVC, TETC, TII, and Wiley SPE. I have also served as a reviewer for reputable conferences such as CISS, CogSci, and ICML. View my CV here.

selected publications

  1. EdgeTran: Device-Aware Co-Search Of Transformers for Efficient Inference on Mobile Edge Platforms
    Tuli, Shikhar, and Jha, Niraj K.
    IEEE Transactions on Mobile Computing 2023
  2. BREATHE: Second-Order Gradients and Heteroscedastic Emulation based Design Space Exploration
    Tuli, Shikhar, and Jha, Niraj K.
    arXiv Preprint 2023
  3. TransCODE: Co-design of Transformers and Accelerators for Efficient Training and Inference
    Tuli, Shikhar, and Jha, Niraj K.
    IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 2023
  4. AccelTran: A Sparsity-Aware Accelerator for Dynamic Inference with Transformers
    Tuli, Shikhar, and Jha, Niraj K.
    IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 2023
  5. CODEBench: A Neural Architecture and Hardware Accelerator Co-Design Framework
    Tuli, Shikhar, Li, Chia-Hao, Sharma, Ritvik, and Jha, Niraj K.
    ACM Transactions on Embedded Computing Systems 2022
  6. DINI: Data Imputation using Neural Inversion for Edge Applications
    Tuli, Shikhar, and Jha, Niraj K.
Scientific Reports 2022
  7. FlexiBERT: Are Current Transformer Architectures too Homogeneous and Rigid?
    Tuli, Shikhar, Dedhia, Bhishma, Tuli, Shreshth, and Jha, Niraj K.
    Journal of Artificial Intelligence Research 2022
  8. Generative Optimization Networks for Memory Efficient Data Generation
    Tuli, Shreshth, Tuli, Shikhar, Casale, Giuliano, and Jennings, Nicholas R.
    Workshop on ML for Systems at NeurIPS 2021
  9. Are Convolutional Neural Networks or Transformers more like human vision?
    Tuli, Shikhar, Dasgupta, Ishita, Grant, Erin, and Griffiths, Thomas L.
    Annual Meeting of the Cognitive Science Society (CogSci) 2021
  10. AVAC: A Machine Learning based Adaptive RRAM Variability-Aware Controller for Edge Devices
    Tuli, Shikhar, and Tuli, Shreshth
    IEEE International Symposium on Circuits and Systems (ISCAS) 2020
  11. RRAM-VAC: A Variability-Aware Controller for RRAM-Based Memory Architectures
    Tuli, Shikhar, Rios, Marco, Levisse, Alexandre, and Atienza, David
    Asia and South Pacific Design Automation Conference (ASPDAC) 2020
  12. Design of a Conventional-Transistor-Based Analog Integrated Circuit for On-Chip Learning in a Spiking Neural Network
    Tuli, Shikhar, and Bhowmik, Debanjan
    International Conference on Neuromorphic Systems (ICONS) 2020
  13. Predicting the Growth and Trend of COVID-19 Pandemic using Machine Learning and Cloud Computing
    Tuli, Shreshth, Tuli, Shikhar, Tuli, Rakesh, and Gill, Sukhpal Singh
    Internet of Things 2020