Ajay Morale

Software Engineer

Pune, Maharashtra, India · 6 yrs experience

Key Highlights

  • Expert in data migration and ETL processes.
  • Proficient in Google Cloud Platform technologies.
  • Strong background in building scalable data pipelines.

Skills

Core Skills

Data Engineering · Data Migration · Data Modeling · Software Development · Web Development · Big Data

Other Skills

Airflow · Android · Android Development · Android Studio · Apache Beam · Apache Kafka · Apache Oozie · Apache Spark · BMC Control-M · Bash · BigQuery · BigQuery SQL · Bitbucket · C++ · Cloud Function

About

I'm a Senior Data Engineer with a passion for learning, always looking for opportunities to try new technologies that can broaden my knowledge, experience, capabilities, and career. I am a quick learner, a strong team player, self-motivated, open to feedback, and skilled at problem solving.

Experience

Total Experience: 6 yrs
Average Tenure: 1 yr 10 mos
Current Experience: 5 mos

ServiceNow

Senior Software Engineer, Data Platform

Nov 2025 - Present · 5 mos · Hybrid

UST

Senior Data Engineer

Oct 2023 - Nov 2025 · 2 yrs 1 mo · Pune, Maharashtra, India

  • Data Migration from Teradata, BCBSA
  • Developed and implemented an ETL pipeline to migrate data from Teradata to BigQuery tables. Received client inputs from different sources (CSV, JSON, TXT, XLSX, or BigQuery tables) and used BigQuery SQL to apply the required transformations. Designed and implemented file-to-table and table-to-table load cases and generated extracts.
  • Worked with a range of BigQuery features, including windowing functions, analytical functions, and views, and implemented CDC and SCD logic (see the sketch after this role's skill tags).
  • Worked with the customer to understand requirements and collaborated with different teams to design and implement an optimized solution.
  • Developed and implemented data quality and governance procedures to ensure the accuracy and reliability of data.
  • Leveraged Bash for fundamental operations and Cloud Composer (Airflow) for pipeline orchestration.
  • Ensured data integrity and optimized performance throughout the migration process.
  • Data Modeling, NJACA
  • Designed and implemented a scalable data pipeline using Google Cloud Dataflow (Apache Beam with Java) to process and unify data from multiple heterogeneous input sources.
  • Developed and optimized Apache Beam Java pipelines within Dataflow to efficiently transform incoming datasets from varied formats into a standardized schema.
  • Implemented data quality enhancement rules to cleanse, validate, and enrich data before loading it into Google BigQuery for downstream analytics and reporting.
  • Ensured high availability and fault tolerance of the data pipeline, enabling seamless handling of large-scale streaming and batch data.
  • Improved overall data accuracy, processing speed, and maintainability through performance tuning and modular design.
Data Migration · ETL · BigQuery · Data Quality · Bash · Airflow (+1 more)
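
To illustrate the CDC/SCD work described above, here is a minimal sketch of a BigQuery MERGE driven from Python with the google-cloud-bigquery client. The project, table, and column names are hypothetical, since the actual schemas were client-specific:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table and column names; the real pipelines worked on
# client-specific schemas migrated from Teradata.
merge_sql = """
MERGE `my-project.dwh.customer_dim` AS tgt
USING `my-project.staging.customer_cdc` AS src
ON tgt.customer_id = src.customer_id AND tgt.is_current = TRUE
WHEN MATCHED AND src.op_type = 'U' THEN
  -- SCD Type 2: close out the current version of a changed row.
  UPDATE SET is_current = FALSE, valid_to = src.change_ts
WHEN NOT MATCHED THEN
  -- Brand-new keys get an open-ended current version.
  INSERT (customer_id, name, valid_from, valid_to, is_current)
  VALUES (src.customer_id, src.name, src.change_ts, NULL, TRUE)
"""
client.query(merge_sql).result()  # block until the merge job finishes
```

A full SCD Type 2 flow also re-inserts the new version of each changed row, typically via a second pass or by unioning the source with itself; this sketch shows only the expiry/insert step.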

Quantiphi

2 roles

Senior Data Engineer

Apr 2023 - Sep 2023 · 5 mos · Mumbai, Maharashtra, India

  • Data Migration for Mr. Cooper, Google
  • Built a data pipeline to load 6 billion+ documents from IBM FileNet into GCP Document Warehouse (an upcoming tool at the time).
  • Used the Storage Transfer Service to copy the documents from FileNet to GCS, then used Cloud Functions along with Cloud SQL and BigQuery to load and validate the documents in Document Warehouse. Used GCP Cloud Tasks to re-run the process for failed documents (see the sketch after this role's skill tags).
  • Worked with different Google Cloud product teams to build a high-performance data migration pipeline.
  • Carried out research and development to find the technologies best suited for the job, working with GCP tools such as Datastore, Firestore, Workflows, Cloud SQL, and Cloud Functions.
  • Web Application, Bayer
  • Built a web application for data scientists, using GCP cloud technologies in the backend and Angular in the frontend, to provide an environment for their machine learning work.
  • Functionality included fetching the data required to train and test models, providing the necessary environment, managing user access, and visualizing all usage data.
  • Built the web application with the Python Flask framework, using APIs of Google Cloud technologies such as VM instances, Datastore, Cloud Storage, Pub/Sub, and JupyterLab.
  • Worked with GCP Secret Manager to store the required user IDs, passwords, and keys, encrypting them and ensuring the security of the application.
  • Awards, Quantiphi Analytics (2 awards, 1 nomination)
  • Received the Unsung Hero Award.
  • Received the Kingsmen Award.
  • Nominated for the Group of Talent Award.
Data Pipeline · GCP · Cloud Function · Cloud SQL · Storage Transfer Service · Data Engineering (+1 more)
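
As an illustration of the retry mechanism mentioned above, here is a minimal sketch of enqueuing a Cloud Task for a failed document using the google-cloud-tasks Python client; the project, region, queue name, and handler URL are hypothetical placeholders:

```python
import json

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
# Hypothetical project, region, and queue name.
parent = client.queue_path("my-project", "us-central1", "doc-retry-queue")

def enqueue_retry(document_id: str) -> None:
    """Queue a re-run of the load/validate step for one failed document."""
    task = {
        "http_request": {
            "http_method": tasks_v2.HttpMethod.POST,
            "url": "https://retry-handler.example.com/reload",
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"document_id": document_id}).encode(),
        }
    }
    # Cloud Tasks delivers the request and retries per the queue's policy.
    client.create_task(request={"parent": parent, "task": task})
```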

Big Data Engineer

Jul 2021 - Apr 2023 · 1 yr 9 mos · Mumbai, Maharashtra, India

  • Hadoop Migration and Real-Time Data Processing Pipeline, HCA
  • Led the migration of an on-premise Hadoop-based data processing system (Scala, Java) to Google Cloud Platform (GCP) for improved scalability, performance, and maintainability.
  • Integrated existing Kafka topics as input streams to ingest real-time data into the new cloud architecture.
  • Developed and deployed GCP Cloud Functions in Java to process streaming data efficiently, replacing legacy Scala batch jobs (see the sketch after this role's skill tags).
  • Implemented a robust data pipeline to handle edge cases and ensure fault-tolerant, near real-time data processing.
  • Loaded processed data into BigQuery as the centralized data warehouse for analytics and reporting.
  • Optimized data flow and streaming performance, improving reliability and reducing latency across the pipeline.
  • Collaborated with cross-functional teams to validate data integrity, monitor pipelines, and automate deployment workflows.
  • Bayer, Web Application
  • Gained hands-on experience with the Python Flask framework, Datastore, Google Cloud Storage, Cloud Functions, Pub/Sub, VM instances, the JupyterLab API, Docker, Kubernetes, and Secret Manager while creating a platform that makes data easily available to data scientists for further research.
  • Created an API using Python's Flask framework, and used the Datastore and VM instance Python APIs for all types of operations, together with Cloud Functions and Pub/Sub. Used Secret Manager to store credentials and fetch them via the Python API.
  • Worked with the JupyterLab API and VM instances to create a research environment for data scientists.
  • Implemented the GATK pipeline in a GCP environment to fetch and process human genome data in a PoC.
Hadoop · GCP · Kafka · Cloud Functions · Data Engineering · Big Data
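
The production functions for this pipeline were written in Java, per the bullets above; as a rough sketch of the same pattern, here is a Pub/Sub-triggered Cloud Function in Python (using the functions-framework library) that transforms one streaming record and appends it to BigQuery. The table and field names are hypothetical:

```python
import base64
import json

import functions_framework
from google.cloud import bigquery

bq = bigquery.Client()

@functions_framework.cloud_event
def process_stream_event(cloud_event):
    """Handle one streaming record delivered through Pub/Sub."""
    # Pub/Sub wraps the payload in a base64-encoded "message" envelope.
    payload = json.loads(base64.b64decode(cloud_event.data["message"]["data"]))
    # Minimal transform: keep only the fields the warehouse table expects.
    row = {"event_id": payload["id"], "value": payload["value"]}
    errors = bq.insert_rows_json("my-project.analytics.events", [row])
    if errors:
        # Raising makes the platform retry delivery of the event.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```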

Datametica Solutions Private Limited

Big Data Developer

Feb 2020 - Jun 2021 · 1 yr 4 mos · Pune, Maharashtra

  • Data Migration from Teradata, RMG
  • Worked on the migration of 100+ objects for the client RMG (Royal Mail Group), from DataStage and Teradata to GCP.
  • Hands-on work included understanding the DataStage logic and implementing the same using BigQuery on GCP.
  • Worked on DataStage master sequences, sequences, and PX jobs to implement an end-to-end data pipeline for loading the data from Teradata into BigQuery, scheduling the jobs with Cloud Composer.
  • Worked on deployment of the project and helped stabilize it. The CI/CD tools used were Git, Bitbucket, Jenkins, and the Airflow job scheduler; documentation was maintained in Confluence.
  • On-Prem (Hadoop) Data Migration, Telus
  • Migrated from an on-prem Hadoop system to GCP: built a complex Dataflow pipeline involving multiple sources (DB2, Oracle, Hive, Hadoop) with BigQuery as the sink, using Python, SQL, HQL, Bash scripting, Oozie, and Airflow; it ran successfully in the production environment without major fixes.
  • Worked on historical data migration and on converting Spark, Scala, Hive, and shell-script code to BigQuery/Dataflow jobs.
  • Implemented a CIF (Common Ingestion Framework) that provided functionality such as data loss prevention, "do not touch" list records, dedup checks, NULL checks, data quality modules, alerting, and a record update strategy.
  • Worked on a DAG factory to create Airflow DAGs dynamically from YAML config files for orchestration of the Telus project (see the sketch after this role's skill tags).
  • Worked on Oozie workflows to execute the client's previous code for extracting data from tables for the historical data load.
  • Spot Awards, Datametica Solutions (count: 3)
  • Received for quickly understanding the project and providing quality delivery in a short amount of time.
  • Received for implementing Python utilities that automated many processes in the project.
  • Received for work on a PoC that included a Cloud Function capable of triggering an action whenever an operation is performed on BigQuery.
Data Migration · DataStage · BigQuery · Airflow · Data Engineering
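
As a sketch of the DAG-factory pattern referenced above, here is a minimal Python module that generates one Airflow DAG per entry in a YAML config file. The config path, keys, and commands are hypothetical stand-ins for the project-specific ones:

```python
from datetime import datetime

import yaml
from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical config path and schema; the real factory read per-table
# YAML files maintained in the project repository.
with open("/opt/airflow/configs/tables.yaml") as f:
    config = yaml.safe_load(f)

for table in config["tables"]:
    dag_id = f"load_{table['name']}"
    with DAG(
        dag_id=dag_id,
        start_date=datetime(2021, 1, 1),
        schedule_interval=table.get("schedule", "@daily"),
        catchup=False,
    ) as dag:
        BashOperator(task_id="extract_and_load", bash_command=table["cmd"])
    # Each generated DAG must land in the module's global namespace so
    # the Airflow scheduler can discover it.
    globals()[dag_id] = dag
```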

Education

MIT Kothrud Pune

Bachelor's degree — Information Technology

Jan 2016 - Jan 2019

Government Polytechnic, Pune

Diploma in Computer Engineering

Jan 2013 - Jan 2016
