Gaznavie Ahad — Data Engineer
Data Engineer with hands-on experience building ETL pipelines, cloud data infrastructure, and analytics-ready data models. I've worked across the full data stack: ingesting raw data via APIs and cloud storage, transforming it with dbt and Python, and delivering it into Snowflake warehouses that power real business decisions. At PwC, I built and maintained production ETL pipelines, deployed 30+ dbt models with automated quality tests, and worked with AWS services (S3, Glue, Lambda) to support scalable data processing. I'm comfortable working independently and asynchronously, documenting data flows clearly and resolving pipeline issues end to end. My background also includes building data preprocessing pipelines for sensor datasets in an autonomous vehicle research environment at MulticoreWare, where I worked on high-volume data ingestion, validation, and annotation tooling. I enjoy problems where data quality and pipeline reliability actually matter, and I'm always looking to work with teams building meaningful data infrastructure. Open to data engineering, analytics engineering, and ETL development opportunities.
Location: Chennai, Tamil Nadu, India
Experience: 1 yr 9 mos
Skills
- Data Engineering
- ETL
- Social Media Management
Career Highlights
- Built and maintained production ETL pipelines at PwC, including 30+ dbt models with automated quality tests.
- Hands-on with AWS services (S3, Glue, Lambda) for scalable data processing.
- Strong background in data quality, validation, and pipeline reliability.
Work Experience
Outlier
AI Trainer (1 mo)
PwC Acceleration Center India
Data Engineer (11 mos)
CYBERNERDS KARE
Social Media Lead (1 yr 2 mos)
MulticoreWare Inc
Research Intern (Sensor Fusion) (1 yr)
Education
B.Tech in Computer Science and Engineering at Kalasalingam University