Utkarsh Sharma

CTO

Bengaluru, Karnataka, India · 9 yrs 2 mos experience

Key Highlights

  • Expert in building scalable data pipelines.
  • Proven track record in optimizing data processing.
  • Strong background in machine learning platforms.
Stackforce AI infers this person is a Big Data Engineer with expertise in SaaS and Healthcare solutions.

Skills

Core Skills

Apache Spark · Big Data · Machine Learning · Data Engineering · Software Development · Web Development · Full Stack Development

Other Skills

.NET · Airflow · Amazon Web Services (AWS) · Atlassian Bamboo · Azure Databricks · C# · CSS · Druid · FastAPI · HBase · HDFS · HTML · Hadoop · Hibernate

About

Software developer with a passion for solving complex algorithmic challenges. Skilled in Java, SQL Server, Apache Spark, Hive, Hadoop, and MongoDB (officially certified developer).

Experience

Apple

2 roles

Lead engineer - Backend & Big Data

Promoted

Nov 2022 – Present · 3 yrs 4 mos · On-site

  • Architected and led development, from scratch, of a self-serve ML analysis platform. It supports running large-scale ML models on Spark and Ray over massive volumes of data and displaying results back to the UI. Used Spark, Ray, Airflow, Iceberg, FastAPI, Postgres, Kafka, Python, Scala, Terraform.
  • Led backend and data pipeline development of a platform responsible for collecting petabytes of data from Apple factories; introduced several optimisations and led several features. Used Spark, Snowflake, Airflow, Druid, Terraform, Scala, Python, Java, HTTP4s.
Spark · Ray · Airflow · Iceberg · FastAPI · Postgres +7

Senior Engineer

Feb 2021 – Nov 2022 · 1 yr 9 mos · On-site

  • Worked on a manufacturing quality analysis tool, optimising and stabilising its data pipelines; improved data-processing performance by approximately 60%. Used Spark, Hive, Java, Spring Boot, Kubernetes.
Spark · Hive · Java · Spring Boot · Kubernetes · Big Data +1

Walmart Labs

Software Engineer 3 - Big Data

Nov 2018 – Feb 2021 · 2 yrs 3 mos · Bengaluru, Karnataka, India

  • Team: Smart Forecasting
  • About the team: Building one of the largest demand forecasting systems in the world. This system forecasts demand for every product across all Walmart stores in the US, Canada, Mexico, and the UK.
  • Key responsibilities:
  • 1. Design and build data pipelines from scratch for new features, processing data at the scale of a few TB every week.
  • 2. Optimize existing batch jobs that crunch huge amounts of data.
  • 3. Help migrate workloads from on-premise Hadoop clusters to Azure.
  • 4. Design, build, and maintain Airflow clusters that orchestrate our data pipelines.
  • Languages and tools used: Java, Python, Apache Spark, SQL Server, Hive, Hadoop, HDFS, Azure Databricks, Airflow, and Kubernetes.
Java · Python · Apache Spark · SQL Server · Hive · Hadoop +6

ABCO India Private Ltd

2 roles

Associate Software Developer

Jul 2017 – Nov 2018 · 1 yr 4 mos

  • Team: Crimson Quality Reporting (CQR)
  • Key Responsibilities:
  • 1. Enhanced Crimson Quality Reporting, a product that processes massive amounts of patient data; optimised the product and introduced new features, helping cut SLA times for large datasets by more than half. Technologies used: Java, Spring, RabbitMQ, Atlassian Bamboo, MyBatis, SQL Server.
  • 2. Built an application to enable regression tests on the entire CQR pipeline using Spring Boot, Java, MyBatis, PowerShell, and SQL Server.
  • 3. Built a measure processing engine based on the US healthcare MACRA/MIPS measures using Scala, Spark, SQL Server, and Java; it processes terabytes of data in a single batch.
  • 4. Built a Measure Management application to easily trigger loads for the CQR pipeline, monitor the health of individual services in the pipeline, and provide easy access to configuration changes. Technologies used: ReactJS, Spark, Java, Spring Boot, SQL Server.
Java · Spring · RabbitMQ · Atlassian Bamboo · MyBatis · SQL Server +4

Intern

Jan 2017 – Jul 2017 · 6 mos

  • Worked on Console 360, the client-side application for all ABCO India products: a single-point web application from which users can monitor the status of their products, access files sent for processing, and view comprehensive reports via plugins from other products. Worked as a full-stack developer on this project using ReactJS, HTML, CSS, NodeJS, C#, and .NET.
ReactJS · HTML · CSS · NodeJS · C# · .NET +2

Education

Vellore Institute of Technology

Bachelor of Technology (B.Tech.) — Computer Science

Jan 2013 – Jan 2017
