Pulkit Rathi

Backend Engineer

Bengaluru, Karnataka, India · 5 yrs 6 mos experience

Key Highlights

  • 5+ years of experience in backend engineering.
  • Expertise in building scalable cloud services.
  • Published researcher with multiple U.S. patent applications.

Skills

Core Skills

Cloud Services, Distributed Systems, Workflow Automation, Service Architecture, Infrastructure Automation, Data Engineering, Backend Development, Data Analysis, Data Science

Other Skills

Android, Android Development, Android Studio, Apache Kafka, Apache Spark, Authentication, Authorization, Azure CosmosDB, Azure Data Factory, Azure Databricks, Azure Event Hubs, Big Data, C (Programming Language), C++, Data Structures

About

  • Diligent, self-motivated, and analytical; I like to work with passion and a positive mindset for the advancement of the organization.
  • Backend engineer with 5+ years of experience building scalable cloud services and data platforms at Oracle and Dell. I have expertise in Java, Kubernetes, Docker, cloud, Terraform, AI, and distributed systems, with a strong foundation in infrastructure automation.
  • Published researcher with multiple U.S. patent applications in deep learning and distributed systems.
  • Overall, a strong software engineer with experience building large-scale distributed systems and solving the challenges that come with them; confident I would be a valuable asset to any team working on distributed systems.
  • GATE 2019 All India Rank: 706 (99.3 percentile).

Experience

Oracle

Member of Technical Staff

Oct 2021 – Present · 4 yrs 5 mos · Bengaluru, Karnataka, India

  • Oracle Cloud Infrastructure (OCI) Data Flow is a fully managed, serverless, cloud-native Apache Spark service that simplifies big data and ML applications. It enables rapid application delivery, as developers can focus on application development rather than infrastructure management. Customers can develop Big Data/ETL/ML/Delta Spark applications using Scala, Python, Java, and SQL.
  • Designed a multi-agent LLM workflow with LangGraph to triage on-call Jira tickets by integrating logs, runbooks, and Slack history, reducing incident investigation time by 40% and cutting manual triage effort for on-call engineers.
  • Spearheaded development of a multi-tenant gateway service responsible for authentication (AuthN), authorization (AuthZ), throttling, and intelligent routing of requests from Notebooks, Workflows, and SQL Tools to the compute clusters, cutting latency by ~80% (from 300 ms to 50 ms).
  • Provisioned highly available, scalable infrastructure using Terraform and Kubernetes, ensuring performance and resilience; automated security patching, reducing infrastructure and operations costs by 25% annually.
  • Acted as Subject Matter Expert (SME) for incident escalations, war-room handling, touchless region-build executions, and production mitigations within the Data Flow team, delivering 50+ regions.
  • Worked with Oracle Cloud Data Flow, a serverless Spark platform, gaining hands-on experience with control- and data-plane system design, data-sharing strategies, Java-based microservices architecture, and infrastructure automation using Docker, Kubernetes, and Terraform.
  • Tech Stack: Java, Python, Docker, Dropwizard, Terraform, Kubernetes, Apache Spark, LangChain, LangGraph

Dell Technologies

2 roles

Data Engineer

Aug 2020 – Sep 2021 · 1 yr 1 mo · Bengaluru, Karnataka, India

  • Developed backend services in Node.js for a fraud detection ecosystem, integrating GraphQL APIs to query and deliver insights from Azure Cosmos DB, reducing fraudulent returns by 30%.
  • Engineered robust data ingestion pipelines from various data lakes and warehouses into Cosmos DB using Azure Databricks and Azure Data Factory to support fraud analytics.
  • Built a telemetry collection system for Dell devices using the OpenTelemetry Collector and exposed metrics through a Prometheus + Grafana stack, supporting ingestion of 10M+ metrics daily for AI workloads.
  • Automated headcount reporting workflows by building SSIS packages to streamline data flow into SQL databases from multiple data sources.
  • Technologies Used: Azure CosmosDB, Azure Data Factory, Azure Databricks, Azure Datalake, GraphQL, MS Access, Node.js, Python, SQL Server, SSIS.

Graduate Intern

Jan 2020 – May 2020 · 4 mos · Bengaluru, Karnataka, India

  • Developed a Flask application in Python for analyzing and visualizing driver scan logs.
  • Built an ETL pipeline for Parquet file generation using Spark and Azure Event Hubs.
  • Technologies Used: Flask, Matplotlib, Pandas, Python, Scala, Spark.

Meesho

Data Science Intern

May 2019 – Jul 2019 · 2 mos · Bangalore

  • Analyzed user data to identify trends in user behaviour.
  • Developed a recommendation system to increase orders per user by showing users a personalized feed.
  • Technologies Used: Scala, Spark, Pandas, NumPy, Matplotlib, Python.

Education

ABV-Indian Institute of Information Technology and Management

Integrated Post Graduate (B.Tech + M.Tech) — Information Technology

Jan 2015 – Jan 2020

Kendriya Vidyalaya

Intermediate — PCM

Jan 2014 – Jan 2015

Kendriya Vidyalaya

High School

Jan 2012 – Jan 2013
