Avishek Biswas

CTO

San Francisco, California, United States · 13 yrs 4 mos experience
Highly Stable

Key Highlights

  • Expert in deep learning architecture and design.
  • Proven track record in energy-efficient computing.
  • Strong background in mixed-signal circuit design.
Stackforce AI infers this person is a specialist in Edge-AI and IoT technologies with a focus on energy-efficient hardware design.

Skills

Core Skills

Deep Learning · TensorFlow · RTL Design · Compute-In-Memory · Mixed-Signal Design · Neural Networks · Energy-Efficient Design · Machine Learning

Other Skills

Linux · Python · Energy Modeling · PyTorch · SRAM Design · Cadence · MATLAB · Verilog · Programming · Image Processing · Electronics · Simulations · Computer Vision · C · Microsoft Office

Experience

Nvidia

Senior Deep Learning Architect

Jul 2023 – Present · 2 yrs 8 mos

TensorFlow · Deep Learning · Linux

Facebook

Research Scientist

Nov 2021 – Jun 2023 · 1 yr 7 mos

  • Worked on Compute-In-Memory (CIM) based accelerator design and system-level analysis for various AR/VR workloads.
  • Designed new micro-architectures and dataflows in RTL, and performed SoC-level design verification.
  • Worked with different AR/VR product teams to map their workloads onto the CIM-based accelerator.
  • Built a Python-based simulator for energy and performance modeling with relevant CIM IPs, enabling rapid exploration of a large design space for optimal hardware design and evaluation of CIM's benefits at the system level.
Compute In-Memory · RTL Design · Python · Energy Modeling
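The Python-based energy/performance simulator described above can be pictured with a minimal, hypothetical sketch of CIM design-space exploration. Everything here (the `CIMConfig` fields, the per-MAC energy, the clock frequency, the first-order cost model) is an illustrative assumption, not a figure or interface from the actual work:

```python
from dataclasses import dataclass
from itertools import product

@dataclass
class CIMConfig:
    rows: int          # SRAM rows (dot-product length per array) -- assumed
    cols: int          # SRAM columns (parallel outputs) -- assumed
    e_mac_pj: float    # energy per multiply-accumulate, pJ -- assumed
    f_mhz: float       # array clock, MHz -- assumed

def evaluate(cfg: CIMConfig, macs: int) -> tuple[float, float]:
    """Return (energy_uJ, latency_ms) for a workload of `macs` MACs.

    First-order model: each array cycle performs rows*cols MACs in parallel.
    """
    macs_per_cycle = cfg.rows * cfg.cols
    cycles = -(-macs // macs_per_cycle)        # ceiling division
    energy_uj = macs * cfg.e_mac_pj * 1e-6     # pJ -> uJ
    latency_ms = cycles / (cfg.f_mhz * 1e3)    # cycles / (cycles per ms)
    return energy_uj, latency_ms

# Sweep a small design space and pick the lowest energy-delay product (EDP).
workload_macs = 10_000_000
space = [CIMConfig(r, c, e_mac_pj=0.05, f_mhz=200.0)
         for r, c in product([64, 128, 256], [64, 128, 256])]

def edp(cfg: CIMConfig) -> float:
    energy_uj, latency_ms = evaluate(cfg, workload_macs)
    return energy_uj * latency_ms

best = min(space, key=edp)
print(best.rows, best.cols)  # → 256 256 (largest array wins under this toy model)
```

With per-MAC energy held constant, this toy model always favors the largest array; a real simulator would make `e_mac_pj` and `f_mhz` functions of array size, ADC precision, and IP-specific data, which is what makes the design space non-trivial.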

Texas Instruments

Mixed-Signal Research Engineer

Jun 2018 – Nov 2021 · 3 yrs 5 mos · Kilby Labs, Dallas

  • Hardware/software co-design of low-power neural network accelerators for edge-AI applications, with a particular focus on in-memory computing approaches that address the limitations of traditional von Neumann digital architectures.
  • Mixed-signal circuit design from the transistor level all the way up to the RTL level, including Static Random Access Memory (SRAM) design and other key analog components such as DACs, ADCs, and PLLs.
  • RTL design of entire accelerator modules, including the datapath and full control logic, in SoCs/ASICs; also experienced with FPGA implementation of system-level RTL designs for proofs-of-concept.
  • Neural network training and modeling using PyTorch, MATLAB, and C.
Mixed-Signal Design · RTL Design · Neural Networks · PyTorch

CEA

Graduate Research Intern

Jun 2017 – Jul 2017 · 1 mo · Grenoble Area, France

Intel Corporation

Graduate Summer Intern

Jun 2015 – Aug 2015 · 2 mos

MIT

Graduate Student Research Assistant (PhD)

Sep 2012 – May 2018 · 5 yrs 8 mos

  • Pursued a PhD; dissertation: "Energy-Efficient Smart Embedded Memory Design for IoT and AI".
  • Worked on high-throughput, energy-efficient SRAMs with in-memory computation capability for low-power, real-time machine learning applications, under the guidance of Prof. Anantha Chandrakasan. Developed and demonstrated in silicon (65nm CMOS) a compute-in-memory SRAM array that runs massively parallel dot products for neural networks in a highly energy-efficient manner (published at ISSCC 2018 and in JSSC 2019).
SRAM Design · Energy-Efficient Design · Machine Learning
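The idea behind the in-memory dot-product SRAM above can be sketched as a functional model: weights live in the array, applying an input vector to all rows produces every column's dot product in one parallel accumulation, and a per-column ADC digitizes the result with limited precision. All dimensions, bit-widths, and the quantization model here are illustrative assumptions, not the published 65nm design:

```python
import numpy as np

rng = np.random.default_rng(0)

# Functional model of a compute-in-memory SRAM array: 1-bit (+1/-1) weights
# stored in the array (as in early binary-weight CIM designs), low-precision
# input activations applied to all rows at once.
rows, cols = 256, 64
weights = rng.choice([-1, +1], size=(rows, cols))   # 1-bit weights in SRAM
x = rng.integers(0, 16, size=rows)                  # 4-bit input activations

# Ideal parallel dot products -- what the analog column accumulation computes.
y_ideal = x @ weights

# A column ADC digitizes the accumulated value with limited precision;
# model it as uniform quantization (the ADC bit-width is an assumption).
adc_bits = 6
step = 2 * np.abs(y_ideal).max() / (2 ** adc_bits)
y_adc = np.round(y_ideal / step) * step

# Rounding to the nearest ADC level bounds the error by half a step.
print(float(np.abs(y_adc - y_ideal).max()) <= step / 2 + 1e-9)  # → True
```

The energy win comes from computing all `cols` dot products in a single array access instead of streaming 256 × 64 operands through a digital datapath; the ADC precision trade-off modeled here is one of the central design knobs in such arrays.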

Education

Massachusetts Institute of Technology

Doctor of Philosophy (PhD), Electrical Engineering and Computer Science

Jan 2014 – Jan 2018

Massachusetts Institute of Technology

Master of Science (MS), Electrical Engineering and Computer Science

Jan 2012 – Jan 2014

Indian Institute of Technology, Kharagpur

Bachelor of Technology (B.Tech.)

Jan 2008 – Jan 2012
