Sachin Shekhawat

DevOps Engineer

Jaipur, Rajasthan, India · 4 yrs 10 mos experience

Key Highlights

  • Expert in architecting scalable data platforms.
  • Proven track record in data migration and modernization.
  • Strong leadership in mentoring data engineering teams.

Skills

Core Skills

Data Engineering · Data Migration · Data Pipeline Development · Data Integration

Other Skills

Apache Airflow · Apache NiFi · Apache Spark · Apache Kafka · Apache Sqoop · AWS · Big Data · Business Intelligence (BI) · Data Pipelines · DBT · Data Analysis · Data Modeling · Data Processing

About

I’m a Lead Data Engineer who genuinely enjoys building things that make data simple, reliable, and meaningful. Over the years, I’ve worked across the banking, retail, and supply chain industries—designing data platforms, migrating legacy systems into Snowflake, and leading teams through complex engineering challenges. What drives me is not just writing code or building pipelines—it’s solving problems that matter, improving systems that people rely on, and helping teams deliver better, faster, and smarter.

I love the mix of creativity and logic in data engineering: understanding how the business thinks, shaping the data to match that story, and architecting solutions that scale effortlessly. I’ve been working across Big Data ecosystems, cloud platforms, and modern tooling like Snowflake, DBT, Airflow, NiFi, Kafka, and Spark. Whether it’s designing transformations, optimizing SQL, tuning warehouses, or automating workflows, my goal is always the same: build something clean, efficient, and future-proof.

I’m at my best when:

  • I’m architecting a data platform that simplifies a messy legacy system
  • I’m leading engineers, reviewing designs, or helping someone unblock a tough problem
  • I’m optimizing a pipeline or troubleshooting something deep inside the data
  • I’m helping clients turn their vision into a real, scalable solution
  • I’m learning a new tool or experimenting with a better way of doing things

My technical world includes: Snowflake, DBT, Airflow, Apache NiFi, Spark, Hadoop, Hive, HBase, Kafka, Python, SQL, AWS, MS SQL, MySQL, Tableau, and end-to-end data engineering in all forms.

At the end of the day, I enjoy building data systems that people actually want to use—systems that perform well, run smoothly, and create impact. If you’re working on data modernization, Snowflake architecture, or large-scale transformation projects, I’m always open to connecting :)

Experience

Hakkōda

Lead Data Engineer

May 2024 – Present · 1 yr 10 mos · Jaipur, Rajasthan, India

  • Working as a Lead Data Engineer, helping enterprise clients migrate their data into modern Snowflake-based platforms and redesign legacy architectures into scalable, cloud-native data ecosystems. I lead the delivery of data engineering solutions, from pipeline design to deployment, while guiding teams, establishing best practices, and ensuring high performance, automation, and reliability across all data workflows.
  • Tools & Technologies Used: Big Data, Python, Snowflake, Apache Airflow, DBT, GitHub
  • Roles & Responsibilities:
  • Working on end-to-end data migration and modernization programs into Snowflake for retail and supply-chain clients.
  • Built high-performance Spark-based ETL/ELT jobs optimized for large-scale distributed data processing.
  • Designing and implementing data pipelines using Snowflake, DBT, and Airflow for automated and scalable workflows.
  • Mentoring a team of data engineers and driving best practices in ETL/ELT, testing, and performance tuning.
  • Building reusable DBT models, macros, and transformation frameworks.
  • Establishing automated data quality and monitoring solutions.
  • Working with clients and SMEs to turn business requirements into scalable technical solutions.
  • Ensuring strong data governance, documentation, and platform standards.
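The automated data-quality checks described above can be sketched in plain Python. This is an illustrative sketch only — the check names, the `order_id` column, and the report shape are hypothetical, not taken from any actual project:

```python
"""Minimal sketch of post-load data-quality checks for a pipeline.
All column names and check names are illustrative."""

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes, key=str)

def run_checks(rows):
    """Run all checks and collect failures into one report dict;
    an empty list for a check means it passed."""
    return {
        "null_order_id": check_not_null(rows, "order_id"),
        "duplicate_order_id": check_unique(rows, "order_id"),
    }

orders = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": 12.5},    # duplicate key
    {"order_id": None, "amount": 7.0},  # missing key
]
report = run_checks(orders)
```

In a real deployment this logic would typically live in DBT tests or a monitoring task rather than ad-hoc functions; the sketch only shows the check-and-report shape.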

Bot Consulting

Lead Data Engineer

May 2024 – Present · 1 yr 10 mos

DBS Bank

Data Engineer

Nov 2022 – May 2024 · 1 yr 6 mos · Hyderabad, Telangana, India

  • Worked on DBS’s enterprise data platform (ADA), designing and building scalable data engineering solutions that supported analytics, reporting, and regulatory initiatives across the bank.
  • Tools & Technologies Used: Big Data, Python, Apache Spark, Apache Airflow, Hive, Hadoop, AWS
  • Roles & Responsibilities:
  • Designed and delivered end-to-end data pipelines and data marts for smooth data ingestion, transformation, and consumption.
  • Built high-performance Spark-based ETL/ELT jobs optimized for large-scale distributed data processing.
  • Developed and orchestrated workflows in Apache Airflow with monitoring, scheduling, and automated error handling.
  • Worked extensively with the Hive/Hadoop ecosystem to process and manage large datasets with high data quality.
  • Engaged with stakeholders to translate business requirements into clear, actionable technical solutions.
  • Performed data integration, SQL optimization, and pipeline performance tuning to improve efficiency and scalability.
  • Used AWS services for storage, compute, and orchestration in cloud-enabled data workflows.
  • Created automation components, reusable frameworks, and standardized ingestion patterns to reduce manual work.
  • Ensured strong data governance through data quality checks, audit controls, and compliance with internal standards.
  • Provided production support by monitoring pipelines, resolving incidents, and ensuring high availability.
  • Contributed to improving ADA platform best practices and assisted in onboarding new data sources and use cases.
  • Impact: Delivered reliable, scalable, and efficient data workflows that supported analytics, dashboards, and operational processes across multiple business units.
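The automated error handling mentioned above follows the retry-with-backoff pattern that orchestrators like Airflow apply per task. A minimal plain-Python sketch of that pattern — the function names, retry policy, and the flaky extract task are all hypothetical:

```python
"""Sketch of automated retry-and-error-handling for a pipeline task,
the pattern an orchestrator such as Airflow applies per task run.
Names and the retry policy are illustrative."""
import time

def run_with_retries(task, retries=3, delay_seconds=0.0):
    """Call `task()`; on failure, retry up to `retries` attempts total,
    then re-raise the last error so the failure is surfaced upstream."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            print(f"attempt {attempt} failed: {exc}")
            if attempt == retries:
                raise
            time.sleep(delay_seconds)

calls = {"n": 0}

def flaky_extract():
    """Simulated source that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("source unavailable")
    return "rows loaded"

result = run_with_retries(flaky_extract, retries=5)
```

In Airflow itself this behavior is configured declaratively (per-task retry counts and delays) rather than hand-written; the sketch only makes the control flow explicit.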

Bomisco

Data Engineer

May 2021 – Nov 2022 · 1 yr 6 mos · Bengaluru, Karnataka, India

  • Worked as a Data Engineer responsible for ingesting data from multiple source systems such as SFTP and SMTP servers, RDBMS, and REST APIs.
  • Tools & Technologies Used: Big Data, Apache NiFi, Python, MS SQL, Power BI, Tableau, Qlik Sense
  • Roles & Responsibilities:
  • Designed and built ETL pipelines for processing data from various sources, including structured, semi-structured, and unstructured formats, using Apache NiFi.
  • Automated reporting processes using Python to reduce manual effort and improve report accuracy.
  • Designed and developed data solutions and data integration workflows, including performance tuning for different client requirements.
  • Developed complex reports and dashboards using BI tools such as Tableau, Power BI, and Qlik Sense.
  • Managed and processed large data volumes by designing efficient workflows and data pipelines to ensure optimal performance and reliability.
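Python report automation of the kind described above usually follows an aggregate-then-render shape. A minimal stdlib-only sketch — the `region` and `amount` column names and the summary metric are made up for illustration:

```python
"""Sketch of automated reporting: aggregate raw records, then render
a summary as CSV. Column names and the metric are illustrative."""
import csv
import io
from collections import defaultdict

def summarize_sales(rows):
    """Sum `amount` per `region`; return one summary row per region,
    sorted by region name for a stable report layout."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += float(row["amount"])
    return [{"region": r, "total_amount": totals[r]} for r in sorted(totals)]

def write_report(summary):
    """Render the summary as CSV text (a real job would write a file
    or push to a BI tool instead of returning a string)."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["region", "total_amount"])
    writer.writeheader()
    writer.writerows(summary)
    return buffer.getvalue()

raw = [
    {"region": "north", "amount": "100.0"},
    {"region": "south", "amount": "50.5"},
    {"region": "north", "amount": "25.0"},
]
report_csv = write_report(summarize_sales(raw))
```

Scheduling such a script (cron, NiFi, or Airflow) is what removes the manual step; the script itself only needs to be deterministic and idempotent.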
