Saurabh Singh

CEO

Pune, Maharashtra, India
11 yrs 8 mos experience

Key Highlights

  • Expert in cloud-native data solutions and architecture.
  • Proven track record in enterprise-scale data platform development.
  • Strong leadership in managing cross-functional teams.
Stackforce AI infers this person is a Cloud and Data Engineering expert specializing in SaaS solutions.

Skills

Core Skills

Data Engineering · Cloud Platforms

Other Skills

LangGraph · Google Cloud Platform (GCP) · Big Data · Amazon Web Services (AWS) · Extract, Transform, Load (ETL) · Ab Initio · ETL Tools · Apache NiFi · Insurance · Cloud Data · Cloud Spanner · Artificial Intelligence (AI) · Apache Airflow · Data Architecture · CDP

About

Cloud Architect & Data Engineer | 11+ Years | GCP & AWS Expert

Results-driven professional specialising in enterprise-scale data platforms and cloud migrations. Proven track record of delivering solutions that drive measurable business impact through innovative data architecture.

Key Expertise:
  • Data Engineering: End-to-end pipeline development, ETL/ELT, data warehousing, real-time processing
  • Cloud Platforms: GCP (BigQuery, Dataflow, GCS) & AWS (S3, Redshift, RDS)
  • Big Data Technologies: Apache Beam, Spark, Airflow, data lakes, microservices architecture
  • AI/ML: LangGraph, Vertex AI, data science model support
  • Programming: Python, Java, Shell scripting, SQL

Open to discussing cloud-native data solutions and digital transformation opportunities.

Experience

Accenture

2 roles

Data Architect - Manager

Promoted

Sep 2025 – Present · 6 mos

Data Engineering Associate Manager

Oct 2021 – Sep 2025 · 3 yrs 11 mos

  • Responsible for designing and developing a real-time Consumer Data Platform (CDP).
  • Designed and implemented core CDP features such as identity resolution, audience creation, customer personalisation, and marketing activation.
  • Developed a low-latency, high-performance, robust platform that supports high-volume requests from the front end, built on a cloud-based microservice architecture.
  • Leading and mentoring a large team of application developers, data engineers, functional/performance testers, and DevOps engineers.
  • Delivered multiple integrations with third-party tools to meet business requirements.
  • As a facilitator and enabler, responsible for bridging the gap between technical and business teams.
  • Highly appreciated for problem-solving skills, building a technically strong team, and leadership.
LangGraph · Google Cloud Platform (GCP) · Data Engineering · Cloud Platforms

Virtusa

Senior Consultant

Apr 2021 – Oct 2021 · 6 mos · Pune Division, Maharashtra, India

  • Responsible for migrating an on-prem database to Google Cloud.
  • Worked with BigQuery, GCS, Airflow, Dataproc, Cloud Functions, and other public-cloud services to build data pipelines and a data warehouse.
  • Developed automation that increased delivery velocity by 60% and improved code quality.
  • Responsible for developing robust and efficient data pipelines using Airflow.
  • Developed a standard process for creating data lakes, ETL pipelines, and data models.
  • Hands-on experience with real-time data streaming via Kafka, using Spark Streaming to clean and process data in batches.
  • Used GitHub for code deployment and version control.
Google Cloud Platform (GCP) · Big Data · Data Engineering · Cloud Platforms

Vodafone

2 roles

Deputy Manager

Jul 2020 – Apr 2021 · 9 mos

  • Working in a hybrid-cloud model involving GCP, AWS, and on-premise systems.
  • Experience with BigQuery, GCS, Airflow, Dataprep, Redshift, Redshift Spectrum, S3, AWS Glue, and other public-cloud services to build a data lake and data warehouse.
  • Responsible for developing robust and efficient data pipelines using Airflow, as well as ETL processes using Ab Initio and Google Cloud.
  • Hands-on experience with NiFi for moving data across multiple platforms as required.
  • Used GitHub for code deployment and version control.
Google Cloud Platform (GCP) · Big Data · Data Engineering · Cloud Platforms

Assistant Manager

Dec 2017 – Jun 2020 · 2 yrs 6 mos

  • Worked in a hybrid-cloud environment spanning AWS, GCP, and on-premise systems.
  • Responsible for developing robust and efficient data pipelines using Spark/Python.
  • Worked on various cloud POCs involving NiFi, Kafka, BigQuery, and related tools.
  • Developed and built pipelines from scratch to extract, model, and load data in the most efficient way to meet requirements.
  • Hands-on experience with NiFi, AWS, GCP, Spark, Scala, Ab Initio, Teradata, and UNIX.
  • Extensive experience developing UNIX scripts to meet requirements and automate processes.
Google Cloud Platform (GCP) · Amazon Web Services (AWS) · Data Engineering · Cloud Platforms

LTI - Larsen & Toubro Infotech

Software Engineer

Jan 2017 – Nov 2017 · 10 mos · Pune/Pimpri-Chinchwad Area

  • Developed and built processes from scratch to extract, model, and load data in the most efficient way to meet requirements.
  • Hands-on experience with Ab Initio, Teradata, and UNIX.
  • Created various generic, formal-parameter-driven jobs to meet requirements.
  • Extensive experience developing UNIX scripts to meet requirements and automate processes.
  • Experience with Teradata: creating tables, writing queries, applying constraints, and loading data.
  • Created high-level and low-level design documents, unit test plans, and test scripts.
Extract, Transform, Load (ETL) · Ab Initio · Data Engineering

Infosys

Senior System Engineer

Feb 2014 – Oct 2016 · 2 yrs 8 mos · Pune/Pimpri-Chinchwad Area

  • The client is one of the leading global health insurance companies. The project involved establishing and maintaining a robust data warehouse to meet the client's needs.
  • Tools: Ab Initio GDE, Ab Initio Co-Ops, IBM DB2 UDB, UNIX, IBM Rational ClearCase
  • Developed and built processes from scratch to extract, model, and load data in the most efficient way to meet requirements.
  • Hands-on experience with Ab Initio, DB2, and UNIX. Primary responsibility: extracting large volumes of data, applying transformation logic to meet requirements, and loading it to the target location.
  • Responsible for cleansing data from source systems using Ab Initio components such as Join, Dedup Sorted, Reformat, Filter by Expression, Read Separated Values, Rollup, and Scan.
  • Extensive experience developing UNIX scripts to meet requirements and automate processes.
  • Experience with DB2: creating tables, writing queries, applying constraints, and loading data.
ETL Tools · Ab Initio · Data Engineering

Education

Sir Padampat Singhania University, Udaipur

Bachelor of Technology (B.Tech.)

Jan 2009 – Jan 2013

Kendriya Vidyalaya

Senior School Certificate Examination

Jan 2007 – Jan 2009
