Aditi Viswanathan

CTO

San Francisco, California, United States
11 yrs 10 mos experience
Highly Stable

Key Highlights

  • Led engineering teams at Peer AI and Google.
  • Co-founded a successful startup in machine translation.
  • Conducted ML training for over 200 Googlers.
Stackforce AI infers this person is a Data Engineering and Machine Learning expert in the SaaS industry.

Skills

Core Skills

Machine Learning · Software Engineering · Data Analysis · Data Engineering

Other Skills

APIs · Web Applications · Neural Machine Translation · Spam Detection · Kafka · Spark Streaming · Graph Databases · Python · JavaScript · HTML · Microsoft Office · Social Media · C++ · C · Apache Spark

Experience

Peer AI

2 roles

Head of Engineering

Promoted

Dec 2024 – Present · 1 yr 3 mos

Technical Lead

Mar 2024 – Present · 2 yrs

Allofus.ai

Chief Technology Officer

Apr 2023 – Mar 2024 · 11 mos · San Francisco, California, United States · Hybrid

Google

2 roles

Software Engineer

Promoted

May 2018 – Aug 2023 · 5 yrs 3 mos · On-site

  • Initiated and co-founded Govoroon at Area 120 (Google's internal startup incubator), which delivered machine-translation solutions for customer support; authored and published a paper on neural machine translation; secured 10 signed customer contracts within 8 months.
  • Key contributor to AdLingo (Area 120) and Business Communications, focusing on APIs, measurement, reporting, and web applications. Built deep domain knowledge of Google's Ads ecosystem and experience architecting systems for scale.
  • Conducted ML training sessions for Googlers over 3 years, with 200+ Googlers attending in person. Also trained external community members from 15+ countries to become ML trainers within their own local communities.
Machine Learning · APIs · Web Applications · Neural Machine Translation · Software Engineering

Data Analyst

Nov 2016 – May 2018 · 1 yr 6 mos · On-site

  • Spam and abuse detection on Google Search
Data Analysis · Spam Detection

Noodle.ai

Data Engineer

Jan 2016 – Jan 2016 · 0 mo

Innovation Labs, [24]7

2 roles

Software Engineer

Jun 2014 – Jul 2016 · 2 yrs 1 mo · Bangalore

  • Key projects:
  • End-to-End Real Time Data Platform
  • I helped design a Kappa-architecture-based model for the next-generation Real Time Data Platform at [24]7 and implemented a proof of concept: it uses a streaming engine to pre-compute session-level variables and a graph database to store historical data and user profiles and to link data across omni-channels.
  • Technologies used: Kafka, Spark Streaming, Akka, Flink, Neo4j, OrientDB, Cassandra
  • Prediction-as-a-Service (PraaS)
  • This product automates the steps of the machine learning process: variable selection, binning, model training, simulation, monitoring, and model deployment. I implemented the deployment module, in which each step of the offline model-building process has an equivalent real-time representation (a JavaScript function). These functions are stitched together into a logical evaluation tree, which is published to a global repository from which the evaluator app can call it.
  • Languages used: Python, JavaScript, PMML (logistic regression, Naïve Bayes, decision tree, random forest, and SVM models are currently supported by PraaS)
  • Deployment of Prediction Entities
  • I created prediction entities (views and models) for five clients. These were deployed on [24]7’s Real Time Data Platform.
  • Languages used: JavaScript, MVEL, Java
  • Page Categorization Tool
  • This is a data visualization tool that represents page categories as a sunburst. Users can interactively delete, combine, and modify page categories, and the final result is stored as a JavaScript function that captures all the transformations the original URL goes through to form the customized page categories.
  • Languages used: JavaScript (d3.js)
  • Reporting Tool
  • This is also a data visualization tool that I built using d3.js. It allows users to view the number of visitors, visits, and chats on each page of a website by traversing URLs represented as a tree.
  • Languages used: Python, JavaScript (d3.js)
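The Kappa pattern behind the Real Time Data Platform project (a single append-only event log replayed through one streaming engine, pre-computing session-level variables) can be sketched in miniature. This is an illustrative stand-in, not [24]7's implementation: a plain Python list plays the role of the Kafka log, and a fold over it plays the role of the Spark Streaming job.

```python
from collections import defaultdict

def replay(log):
    """Fold over the event log, maintaining per-session variables.

    In a Kappa architecture the same code path serves batch
    reprocessing (replay from offset 0) and live streaming (replay
    from the current offset); here the log is just a Python list.
    """
    sessions = defaultdict(lambda: {"events": 0, "pages": set()})
    for event in log:
        s = sessions[event["session_id"]]
        s["events"] += 1
        s["pages"].add(event["page"])
    # Materialize the pre-computed session-level variables.
    return {sid: {"events": s["events"], "distinct_pages": len(s["pages"])}
            for sid, s in sessions.items()}

log = [
    {"session_id": "a", "page": "/home"},
    {"session_id": "a", "page": "/pricing"},
    {"session_id": "b", "page": "/home"},
    {"session_id": "a", "page": "/home"},
]
print(replay(log))  # → {'a': {'events': 3, 'distinct_pages': 2}, 'b': {'events': 1, 'distinct_pages': 1}}
```

In the real system the historical side of these variables would land in a graph database (Neo4j/OrientDB, per the project notes) rather than a dict.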
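The PraaS deployment idea (each offline model-building step gets a runtime equivalent, stitched into an evaluation chain and published to a repository the evaluator app reads) can be sketched as follows. All names here (`select_variables`, `bin_variables`, `score`, `churn_v1`) are hypothetical placeholders, and Python stands in for the JavaScript functions the actual product generated.

```python
REPOSITORY = {}  # stand-in for the global repository the evaluator app reads

def stitch(steps):
    """Stitch per-step functions into one published evaluation function."""
    def evaluate(record):
        for step in steps:
            record = step(record)
        return record
    return evaluate

# Hypothetical runtime equivalents of the offline model-building steps.
def select_variables(record):
    return {k: record[k] for k in ("age", "visits")}

def bin_variables(record):
    return {k: ("high" if v > 10 else "low") for k, v in record.items()}

def score(record):
    # Stand-in for evaluating a PMML model (e.g. logistic regression).
    return 1.0 if record["visits"] == "high" else 0.0

def publish(name, steps):
    REPOSITORY[name] = stitch(steps)

publish("churn_v1", [select_variables, bin_variables, score])
# The evaluator app looks the model up by name and calls it on a live record.
print(REPOSITORY["churn_v1"]({"age": 34, "visits": 12, "noise": "x"}))  # → 1.0
```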
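The reporting tool's core data structure (URLs arranged as a traversable tree with per-page metrics) is easy to illustrate. A hedged sketch in Python rather than the original d3.js, with made-up example URLs and counts:

```python
def build_tree(page_visits):
    """Arrange flat (url, visits) pairs into a tree keyed by path segment."""
    root = {"visits": 0, "children": {}}
    for url, visits in page_visits.items():
        node = root
        for segment in url.strip("/").split("/"):
            node = node["children"].setdefault(
                segment, {"visits": 0, "children": {}})
        node["visits"] = visits
    return root

def subtree_visits(node):
    """Total visits under a node -- what the tree view shows while traversing."""
    return node["visits"] + sum(subtree_visits(c)
                                for c in node["children"].values())

tree = build_tree({"/docs/api": 40, "/docs/guide": 25, "/home": 100})
print(subtree_visits(tree["children"]["docs"]))  # → 65
```

The same nesting feeds d3.js hierarchy layouts directly, which is presumably why a tree was the natural shape for both this tool and the sunburst categorizer.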
Kafka · Spark Streaming · Graph Databases · Machine Learning · Software Engineering · Data Engineering

Student Intern

Jul 2013 – Dec 2013 · 5 mos · Bangalore

  • Key projects:
  • Simulation and Modeling Ready Datasets
  • I wrote a series of Python and HiveQL scripts that consume raw events from the Hadoop file system and sessionize, process, and transform the data into variables that analysts can use directly to build models. The scripts ran as a daily job, and the output tables were stored in a column-oriented SQL database.
  • Technologies Used: Hadoop MapReduce, Hive
  • Modeling Workbench
  • This tool allows analysts to perform exploratory data analysis on their datasets and view the results visually. I wrote some of the backend R code for this.
  • Language Used: R
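The sessionization step in the dataset pipeline above is a standard pattern: group a user's raw events and split them into sessions wherever the gap between consecutive timestamps exceeds an inactivity threshold. A minimal sketch, assuming a 30-minute threshold (the original scripts' actual cutoff is not stated):

```python
SESSION_GAP = 30 * 60  # assumed inactivity threshold, in seconds

def sessionize(timestamps):
    """Split one user's time-ordered event timestamps into sessions."""
    sessions = []
    for ts in sorted(timestamps):
        if sessions and ts - sessions[-1][-1] <= SESSION_GAP:
            sessions[-1].append(ts)   # still within the current session
        else:
            sessions.append([ts])     # gap exceeded: start a new session
    return sessions

# Three clicks, then a 2-hour gap, then one more click → two sessions.
print(sessionize([0, 600, 1200, 9000]))  # → [[0, 600, 1200], [9000]]
```

In the daily HiveQL job the same logic would be expressed per user with window functions rather than an in-memory loop.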

GMR Varalakshmi Foundation

Student Intern

May 2012 – Jul 2012 · 2 mos · Bangalore, India

  • NGO meet and Exhibition
  • I organized an NGO meet to facilitate coordination among NGOs in Bangalore; it was attended by over 100 NGOs. I also helped organize an exhibition of traditional artwork by students of the Foundation.
  • Poster and Pamphlet Design
  • I designed posters and pamphlets that were used to publicize the NGO meet and the exhibition, as well as brochures and flyers that were used to promote the Foundation, its vocational training center and the work that it does.

Education

Birla Institute of Technology and Science, Pilani

Bachelor's degree — Computer Science

Jan 2010 – Jan 2014
