Shreyas Gopi Koundinya

Data Engineer

Toronto, Ontario, Canada · 1 yr 3 mos experience
AI Enabled · AI/ML Practitioner

Key Highlights

  • Expert in building scalable data pipelines on AWS and Azure.
  • Proficient in Python, PySpark, and data engineering best practices.
  • Strong focus on automation and data quality in engineering solutions.

Skills

Core Skills

Data Engineering · Cloud Services · Data Development · Automation · Data Analysis · Backend Development · Data Analytics · Statistical Analysis · Web Development · Frontend Development

Other Skills

Azure Data Factory · Azure Databricks · PySpark · SQL · Datadog · Analytics · Data Integrity · Python (Programming Language) · JavaScript · API Development · Data Extraction · Docker Products · Apache Airflow · Data Build Tool (DBT) · AWS Lambda

About

I’m a Data Engineer who enjoys building systems that make data reliable, useful, and easy to work with. At EY, I built cloud-native data pipelines on AWS, working with Snowflake and dbt to deliver scalable, maintainable workflows. I focused on automation, data quality, and helping teams turn raw data into something they could actually trust and use.

Currently, I’m a Data Engineering Consultant at CGI, working with TD. I build, test, and validate data pipelines using Azure Data Factory, Databricks, and PySpark, making sure data workflows are efficient, resilient, and production-ready. I care about writing clean, maintainable code and using the right tools for the job. I’m always learning and focused on building systems that just work.

Experience

1 yr 3 mos
Total Experience
1 yr 3 mos
Average Tenure
--
Current Experience

CGI

Consultant - Data Engineer

Nov 2025 – Present · 6 mos · Hybrid

  • Built and enhanced data pipelines using Azure Data Factory, Azure Databricks, Azure Data Lake, and Blob Storage
  • Developed PySpark jobs for large-scale data transformation, segregation, and enrichment of financial and crime-related datasets
  • Used Python for data ingestion, pipeline logic, and integration, with PySpark handling distributed processing
  • Implemented modular, reusable transformation logic aligned with enterprise data frameworks (JTMF)
  • Integrated monitoring and observability using Datadog to track pipeline health and performance
  • Supported SIT and UAT cycles to ensure pipelines were stable, compliant, and production-ready
Azure Data Factory · Azure Databricks · PySpark · SQL · Datadog · Data Engineering +1
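The modular, reusable transformation logic described above can be sketched as small composable functions applied in order. All names here are illustrative only; they are not taken from the JTMF framework or any TD/CGI codebase, and the real jobs ran this kind of logic on PySpark rather than plain Python.

```python
# Illustrative sketch: small transformation steps composed into a pipeline.
# Field names and thresholds are hypothetical.

def normalise_amount(record):
    """Coerce the amount field to a float rounded to 2 decimal places."""
    out = dict(record)
    out["amount"] = round(float(out["amount"]), 2)
    return out

def tag_high_value(record, threshold=10_000.0):
    """Flag records above a threshold for downstream segregation."""
    out = dict(record)
    out["high_value"] = out["amount"] >= threshold
    return out

def run_pipeline(records, steps):
    """Apply each transformation step to every record, in order."""
    for step in steps:
        records = [step(r) for r in records]
    return records

rows = [{"txn_id": 1, "amount": "12500.50"}, {"txn_id": 2, "amount": "42.1"}]
result = run_pipeline(rows, [normalise_amount, tag_high_value])
```

Keeping each step a pure function of one record is what makes the logic reusable across pipelines and easy to unit-test in SIT/UAT cycles.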

Stacked Pancake & Breakfast House

Freelance Data Developer

Dec 2024 – Nov 2025 · 11 mos · Hybrid

  • Created a staff login/logout system using Python and Google Sheets to reduce errors and make tracking easier.
  • Automated payroll reports, saving management over 10 hours each week.
  • Built an inventory app with Flask and PostgreSQL to track stock and reduce waste.
  • Set up automatic backups and notifications to protect data and keep the team updated.
  • Helped improve how data is collected and reported to support better decisions.
  • Worked closely with the owner to make sure the tools fit the business needs.
SQL · Analytics · Data Integrity · Python (Programming Language) · JavaScript · Automation +3
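The stock-tracking core of an inventory app like the one above can be sketched in a few lines. The real build used Flask and PostgreSQL; this sketch uses the stdlib `sqlite3` module so it is self-contained, and the table and item names are made up.

```python
import sqlite3

# Minimal sketch of inventory stock tracking: record every stock movement,
# then derive current levels by summing per item. Schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE stock_moves (item TEXT NOT NULL, qty INTEGER NOT NULL)"
)

# Positive qty = delivery received, negative qty = stock used.
moves = [("flour_kg", 50), ("flour_kg", -12), ("eggs", 360), ("eggs", -90)]
conn.executemany("INSERT INTO stock_moves (item, qty) VALUES (?, ?)", moves)

# Current stock on hand is the sum of movements per item.
levels = dict(
    conn.execute("SELECT item, SUM(qty) FROM stock_moves GROUP BY item")
)
```

Storing movements rather than overwriting a single count preserves an audit trail, which is what makes waste visible.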

EY

Senior Analyst

Jan 2022 – May 2023 · 1 yr 4 mos · Hybrid

  • Built cloud-native ETL pipelines using AWS Glue, Lambda, and Redshift, as well as Azure Data Factory and Blob Storage.
  • Used Apache Airflow to orchestrate 20+ workflows with retries, SLAs, and alerting.
  • Created modular data models and transformations using dbt with Snowflake and Redshift.
  • Processed large-scale batch and streaming data using PySpark on Databricks.
  • Developed streaming pipelines with Kafka and Kinesis for real-time processing.
  • Built backend APIs with Python frameworks like FastAPI, Flask, and Django.
  • Worked with relational and NoSQL databases including PostgreSQL, MongoDB, and DynamoDB.
  • Automated CI/CD using GitHub Actions and infrastructure as code with Terraform and CloudFormation.
  • Implemented monitoring and logging with CloudWatch, Datadog, and Azure Monitor.
Docker Products · Apache Airflow · Data Build Tool (DBT) · AWS Lambda · Analytics · Amazon Web Services (AWS) +21
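The retry-and-alert behaviour those Airflow workflows relied on can be sketched in plain Python. In Airflow itself this is configured declaratively (`retries`, `retry_delay`, and `sla` on a task); the helper below only illustrates the pattern and is not Airflow API.

```python
import time

def run_with_retries(task, max_retries=3, delay=0.0, alert=print):
    """Run task(); retry up to max_retries attempts, alerting on final failure."""
    for attempt in range(1, max_retries + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == max_retries:
                alert(f"task failed after {attempt} attempts: {exc}")
                raise
            time.sleep(delay)  # back off before the next attempt

# A task that fails twice, then succeeds on the third attempt.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "ok"

result = run_with_retries(flaky)
```

Pushing retries into the orchestrator (rather than into each job) keeps pipeline code simple and makes failure handling uniform across all 20+ workflows.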

Xpheno

Senior Analyst

Dec 2021 – Jan 2022 · 1 mo

  • Deputed to EY as a contractor, later absorbed onto EY's direct payroll.
  • Streamlined data collection, analysis, and Excel automation.
  • Developed backend applications and API integrations with Python (Flask and Django), following clean-code practices.
Analytics · Flask · Python (Programming Language) · SQL Server Integration Services (SSIS) · Django · Data Extraction +2

Exposys Data Labs

Student Intern

Jun 2020 – Jul 2020 · 1 mo · Bengaluru

  • Developed Python scripts for data modeling and statistical analysis using NumPy and pandas.
  • Implemented regression models to analyze and interpret data for AI and ML applications.
  • Collaborated with team members to optimize algorithms for data-driven innovations.
Data Analytics · NumPy · Analytics · pandas · Flask · Python (Programming Language) +4
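The regression modelling from this internship boils down to an ordinary-least-squares fit. The original scripts used NumPy and pandas; the sketch below computes the same closed-form simple linear regression with the stdlib only, on made-up sample data.

```python
# Sketch: closed-form simple linear regression (ordinary least squares).
# Sample data is invented for illustration.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through the data."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = cov_xy / var_x
    return slope, mean_y - slope * mean_x

# Points lying exactly on y = 2x + 1 should be recovered exactly.
slope, intercept = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

With NumPy the same fit is a one-liner (`numpy.polyfit(xs, ys, 1)`); the closed form above is what that call computes for degree 1.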

iNube Software Solutions Pvt Limited

Student Intern

Jan 2020 – Feb 2020 · 1 mo · Bengaluru, Karnataka, India

  • Developed dynamic user interfaces using JavaScript, Node.js, and React.js.
  • Contributed to scalable .NET applications and software architecture.
  • Supported database integration and deployment work.
Analytics · Agile Methodologies · SQL Server Integration Services (SSIS) · JavaScript · Web Development · Web Design +1

Education

Durham College

Postgraduate Degree — Data Analytics for Business Decision Making

May 2024 – Dec 2024

Durham College

Post Graduate Certificate — Cloud Computing

Sep 2023 – Aug 2024

JSS Academy of Technical Education, Bangalore

Bachelor of Engineering - BE — Computer Science

Jan 2017 – Jan 2021
