Arpit Garg

Software Engineer

Noida, Uttar Pradesh, India · 4 yrs 4 mos experience

Key Highlights

  • Expert in building scalable microservices architectures.
  • Proven track record in enhancing system reliability and performance.
  • Passionate about integrating AI into backend systems.

Skills

Core Skills

Spring Boot, Microservices

Other Skills

Google Gemini, Kafka, Redis, Prometheus, Grafana, REST APIs, Clean Coding, PostgreSQL, Java, Spring Cloud, Spring MVC, Hibernate, OpenAI API, MongoDB, MySQL

About

I’m a Backend Engineer with 4+ years of experience designing and scaling high-performance, distributed systems that drive real business impact. I love turning complex backend challenges into simple, elegant, and reliable solutions that keep products running seamlessly at scale.

My expertise lies in Java, Spring Boot, and microservices architecture, where I build resilient, fault-tolerant APIs and event-driven systems optimized for speed, scalability, and maintainability. I’ve engineered production-ready platforms using Kafka, Redis, MySQL, and PostgreSQL, and deployed them through Docker, Kubernetes, and Google Cloud (GCP).

I’m passionate about system design, scalability, and clean architecture, ensuring every service I build is observable, efficient, and easy to evolve. I use tools like Prometheus, Grafana, and ELK to monitor system health and drive continuous improvement.

Beyond backend systems, I’ve integrated AI and LLMs (OpenAI, Gemini) into data-driven workflows, blending intelligent automation with enterprise-grade engineering.

I believe great engineers don’t just ship code: they ship reliability, performance, and user trust. I thrive in product-driven environments that value ownership, craftsmanship, and innovation. 🚀 Always excited to collaborate on scalable backend architectures, AI-powered platforms, and systems that make technology invisible, yet powerful.

Experience

Oracle

Senior Member of Technical Staff

Feb 2026 – Present · 1 mo · Noida, Uttar Pradesh, India · On-site

Birdeye

2 roles

SDE-II

Promoted

Jan 2025 – Feb 2026 · 1 yr 1 mo · Remote

  • Built an AI-powered data processing pipeline using Google Gemini and Kafka Streams to validate and enrich 100K+ client records in real time, improving data accuracy and automation coverage.
  • Led development of a self-serve integration platform that automated partner onboarding, cutting manual support overhead by 70% and accelerating go-live times.
  • Enhanced reliability of distributed systems by introducing Redis-based distributed locks and Resilience4j for retries, improving fault tolerance and data integrity across services.
  • Architected scalable Java/Spring Boot microservices processing 1M+ webhook-based events with 99.99% uptime, monitored via Prometheus and Grafana.
  • Built a bi-directional appointment sync system across 20+ CRMs, improving data consistency and reducing operational overhead by 30%.
  • Integrated 40+ enterprise CRMs (HubSpot, Zoho, QuickBooks, etc.) via REST, SOAP, and SFTP, expanding interoperability and enabling cross-platform automation.
  • Optimized data pipelines using Apache NiFi and Aerospike TTL job scheduling, ensuring low-latency and fault-tolerant task execution.
  • Mentored a 4-member backend team, leading code reviews and enforcing SOLID and clean architecture practices for high-quality, scalable releases.
Google Gemini, Kafka, Redis, Spring Boot, Prometheus, Grafana +2
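The Redis-based distributed locking mentioned above follows a token-based acquire/release pattern. The sketch below is purely illustrative, not the actual production code: a ConcurrentHashMap stands in for the Redis keyspace, whereas a real deployment would use SET key token NX PX and a compare-and-delete release (typically a Lua script) so one service instance cannot free a lock held by another. All names here (LockSketch, tryAcquire) are hypothetical.

```java
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

// Token-based distributed-lock pattern. The map stands in for the Redis
// keyspace; in production the same semantics come from
// SET key token NX PX <ttl> plus a compare-and-delete release.
public class LockSketch {
    private final ConcurrentHashMap<String, String> store = new ConcurrentHashMap<>();

    // Try to take the lock; returns an ownership token on success, null otherwise.
    public String tryAcquire(String key) {
        String token = UUID.randomUUID().toString();
        return store.putIfAbsent(key, token) == null ? token : null;
    }

    // Release only if we still hold the lock (token matches) -- a stale or
    // foreign token is a no-op, so nobody frees someone else's lock.
    public boolean release(String key, String token) {
        return store.remove(key, token);
    }
}
```

The token returned on acquire acts as proof of ownership; releasing with a stale token silently fails, which is what protects data integrity when a lock holder stalls past its TTL.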

SDE-I

Jul 2023 – Jan 2025 · 1 yr 6 mos · Remote

Spring Boot, Clean Coding

EPAM Systems

3 roles

SDE-II

Promoted

Jun 2022 – Jul 2023 · 1 yr 1 mo

  • Designed scalable log ingestion microservices handling 10M+ daily events with sub-second latency, ensuring 99.99% reliability for Google Chronicle.
  • Developed a real-time Rule Engine leveraging Kafka Streams and algorithmic logic to detect security threats, boosting detection accuracy by 45% and reducing false positives.
  • Engineered data transformation pipelines converting raw logs into the Unified Data Model (UDM) using Chronicle Base Normalizers, achieving seamless schema validation and integration.
  • Optimized backend systems for performance and scalability through Spring Boot, Kafka, Redis, and PostgreSQL, contributing to faster data processing and reduced infrastructure costs.
  • Received the Delivery Excellence Award for on-time delivery, system optimization, and contributions to distributed system design.
Spring Boot, Kafka, Redis, PostgreSQL, Clean Coding
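The rule engine described above evaluates detection rules against each incoming event. A minimal sketch of just the evaluation step is below; the Kafka Streams topology that would run this per record is omitted, and all names (RuleEngineSketch, the rule names) are invented for illustration, not taken from Chronicle.

```java
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// Minimal rule-evaluation step: each rule is a named predicate over a parsed
// log event; an event "fires" every rule whose predicate matches. The real
// engine ran this per-record inside a streaming topology.
public class RuleEngineSketch {
    public record Rule(String name, Predicate<Map<String, String>> matches) {}

    // Return the names of all rules that fire for this event, in rule order.
    public static List<String> evaluate(Map<String, String> event, List<Rule> rules) {
        return rules.stream()
                .filter(r -> r.matches().test(event))
                .map(Rule::name)
                .collect(Collectors.toList());
    }
}
```

Keeping rules as data (name plus predicate) is what lets a system like this load and update detections without redeploying the service.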

SDE-I

Oct 2021 – May 2022 · 7 mos

  • Worked with Google to develop "Google Chronicle", a SIEM (Security Information and Event Management) tool that collects logs from various sources and parses them into structured data, on which signatures are run to trigger security forensic alerts.
  • Chronicle is a cloud service, built as a specialized layer on top of core Google infrastructure, that lets enterprises privately retain, analyze, and search the massive amounts of security and network telemetry they generate. Chronicle normalizes, indexes, correlates, and analyzes the data to provide instant analysis and context on risky activity, offering threat investigation, threat hunting and detection, and security analytics.
  • Designed and implemented a scalable microservices architecture for large-scale log ingestion and processing in Google Chronicle, ensuring efficient data handling and low-latency response times.
  • Spearheaded the development of the Rule Engine Service, leveraging advanced algorithms to detect malicious activities and significantly improving Chronicle’s security monitoring capabilities.
  • Engineered log parsing into the Unified Data Model (UDM) format using native parsers and CBNs, ensuring consistent data structures and seamless integration into Google Chronicle’s log analysis pipeline.
  • Collaborated with Google's internal teams to define technical requirements, optimizing API performance and functionality to meet high-security and operational demands.
  • Contributed to the continuous improvement of API capabilities, introducing innovative features, optimizing existing functionalities, and providing critical production environment support.
  • Performed unit testing using JUnit and Mockito to uncover bugs and troubleshoot issues prior to application launch.
  • Delivered technical knowledge-transfer (KT) sessions to the development team on topics such as Spring, Spring Boot, and Java 8.
Spring Boot, Clean Coding
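The parsing work above amounts to mapping vendor-specific log fields onto one shared schema so downstream detection rules query a consistent shape. The toy sketch below illustrates the idea only; the field names ("src_ip" to "principal.ip", etc.) and the class name NormalizerSketch are illustrative, not the actual Chronicle UDM mapping or CBN mechanism.

```java
import java.util.Map;
import java.util.TreeMap;

// Toy field-normalization step: known vendor fields are renamed to a shared
// schema; unknown fields are preserved under an "additional." prefix so no
// data is silently dropped.
public class NormalizerSketch {
    private static final Map<String, String> FIELD_MAP = Map.of(
            "src_ip", "principal.ip",
            "dst_ip", "target.ip",
            "msg", "metadata.description");

    public static Map<String, String> normalize(Map<String, String> raw) {
        Map<String, String> out = new TreeMap<>(); // sorted keys for stable output
        raw.forEach((k, v) -> out.put(FIELD_MAP.getOrDefault(k, "additional." + k), v));
        return out;
    }
}
```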

Software Engineer Intern

May 2021 – Oct 2021 · 5 mos

Spring Boot, Clean Coding

Education

Meerut Institute of Engineering and Technology (MIET)

Bachelor of Technology - BTech — Computer Science

Jan 2017 – Jan 2021
