Ganapathy Subramanian N

Director of Engineering

Chennai, Tamil Nadu, India · 18 yrs 6 mos experience
Most Likely To Switch · AI ML Practitioner

Key Highlights

  • 100+ certifications in cloud and data technologies.
  • Expert in multi-cloud data architecture and engineering.
  • Proven track record in optimizing data strategies and governance.
Stackforce AI infers this person is a multi-cloud data engineering expert with a focus on Fintech and SaaS solutions.

Skills

Core Skills

Data Engineering · Cloud Architecture

Other Skills

AWS · Databricks · Collibra · Unity Catalog · Iceberg · Data Strategy · Data Governance · Machine Learning · PySpark · Delta Lake · Gen AI · Spark · LangChain · Azure · Airflow

About

Databricks SA Champion, Snowflake Data SuperHero & Squad Member. Designs data architectures using multi-cloud technologies. Senior Director of the Data Engineering (DE) practice with deep knowledge of AWS, Azure, GCP, Databricks & Snowflake. Cloud-native analytics: DE, DW, big data, data platforms, databases, AI/ML, DevSecOps, security, administration & networking. Good experience with DE/DW technologies (Teradata, Airflow, DBT, Druid, Dremio, Denodo, Collibra, Alation & Aviatrix) and open-source tools (Docker, K8s, Terraform, Vault, Consul, GitOps using ArgoCD, and service mesh using Istio).

Certifications:

  • K8s: CKA; CKAD
  • AWS: Professional (Solutions Architect; DevOps Engineer; Gen AI Developer); Specialty (Advanced Networking; Security); Associate (Solutions Architect; Developer; SysOps Admin; Data Engineer; ML Engineer); (Cloud & AI) Practitioner
  • Azure: Expert (Solutions Architect; DevOps Engineer); Specialty (Cosmos DB Developer); Associate (Data Scientist; Database Admin; Administrator; Developer; Data Engineer; Network Engineer; AI Engineer; Security Engineer; Fabric Analytics Engineer); Fundamentals (Azure; AI; Data)
  • GCP: ACE; Cloud Digital & Gen AI Leader; Professional (Cloud Architect; Data Engineer; Database Engineer)
  • OCI: Gen AI Professional
  • Databricks: Associate (Spark Developer; Data Analyst; ML; Data Engineer; Gen AI); Data Engineer Professional; Databricks, Lakehouse, Gen AI & LLM, Gen AI Fundamentals & Advantages; Platform (Admin; Architect for AWS, Azure & GCP); Solutions Architect (Essentials & Champion)
  • Snowflake: SnowPro (Core; Architect; Data Engineer); DCDF; SnowConvert; Hands-On Essentials for Data (Warehouse; Sharing; Applications; Lake; Engineering)
  • HashiCorp Associate: Terraform; Vault; Consul
  • Teradata 12: Enterprise Architect; Solutions Developer; DBA; Technical Specialist; Professional
  • Apache Druid (from Imply): Basics; Data Modelling; Monitoring; Streaming
  • Astronomer Airflow: Fundamentals; DAG Authoring
  • Aviatrix: Associate
  • GitOps for Argo: Fundamentals; Scale
  • Data Governance: Alation Advocate; Collibra Foundations & Architect
  • Solo.io fundamentals: Envoy; Istio; Cilium; eBPF; Ambient Mesh; Network Foundations; Intermediate for Istio
  • Dremio: Lakehouse; Data Analyst; Reflections; Data Products; AI Architect
  • Others: ISC2-CC; Denodo Architect; Netezza; Intro to FinOps; Certified Fundamentals on GitHub; DataOps & Data Quality by DataKitchen; DBT; Informatica (IDMC, DE, CDGC, MDM & B360)

Experience

18 yrs 6 mos
Total Experience
3 yrs 1 mo
Average Tenure
6 yrs 5 mos
Current Experience

Tiger Analytics

4 roles

Senior Director - Data Engineering

Promoted

Jan 2025 - Present · 1 yr 4 mos

  • Designed a data mesh strategy for a banking client on AWS + Databricks + Collibra + Unity Catalog + Iceberg. Delivered the end-to-end data strategy, including assessment, target-state definitions, a data product operating model, real-time and unstructured data, a semantic layer, AI for BI, a data marketplace, data contracts, etc.
  • Designed an end-to-end cloud-based data platform on Databricks for an APAC-based automotive distributor. Data pipelines are built using Databricks Workflows, with data ingestion via APIs. The data lake is on Delta Lake using the medallion architecture (Bronze, Silver and Gold), transformations use PySpark in Databricks notebooks, data governance uses Unity Catalog, and machine learning models are built using AutoML & MLflow.
  • Spark code optimization using Gen AI: a RAG-based implementation with a rule engine, using Claude/GPT/Gemini/Llama as the LLM and LangChain for orchestration. Reduced running time and cost for many Spark jobs across multiple customers.
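The medallion (Bronze/Silver/Gold) flow described above can be sketched conceptually. Plain Python stands in for PySpark here, and the record shapes, field names and values are purely illustrative assumptions, not details from any client project:

```python
# Conceptual sketch of a medallion (Bronze/Silver/Gold) pipeline.
# Plain Python stands in for PySpark; all data is illustrative.

raw_records = [
    {"vin": "abc123", "price": "18500", "region": "apac"},
    {"vin": "abc123", "price": "18500", "region": "apac"},  # duplicate
    {"vin": None, "price": "9000", "region": "apac"},       # invalid row
]

def to_bronze(records):
    """Bronze: land raw data as-is, tagged with layer metadata."""
    return [{**r, "_layer": "bronze"} for r in records]

def to_silver(bronze):
    """Silver: clean and conform -- drop bad rows, dedupe, cast types."""
    seen, silver = set(), []
    for r in bronze:
        if r["vin"] is None:
            continue                 # filter invalid records
        if r["vin"] in seen:
            continue                 # deduplicate on the business key
        seen.add(r["vin"])
        silver.append({"vin": r["vin"], "price": float(r["price"]),
                       "region": r["region"]})
    return silver

def to_gold(silver):
    """Gold: business-level aggregate ready for BI/ML consumption."""
    totals = {}
    for r in silver:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["price"]
    return totals

gold = to_gold(to_silver(to_bronze(raw_records)))
```

In the real platform each layer would be a Delta table written by a Databricks Workflow task rather than an in-memory list; the point is the progressive refinement from raw to curated data.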
AWS · Databricks · Collibra · Unity Catalog · Iceberg · Data Strategy +5

Director - Data Engineering

Jan 2023 - Jan 2025 · 2 yrs

  • Designed a data lake solution for a small insurance company. The data lake is built on AWS S3 (landing and refined zones); data pipelines ingest the required data into AWS OpenSearch Service, source data scraping uses multiple AWS Lambda functions, and ETL uses PySpark on AWS Glue.
  • Designed an end-to-end cloud-based data platform on Azure for a US-based pet company using Airflow, ADLS, Azure Databricks, Azure Synapse, Azure ML, Azure Key Vault, AAD and Power BI.
  • Designed an end-to-end cloud-based data platform on GCP for a US-based telecom company: data orchestration using Cloud Composer, data lake on GCS, transformations using PySpark on Dataproc, a curated data layer in BigQuery, and data visualization using Looker.
  • Migrated multiple terabytes of data from Salesforce to AWS Redshift using Fivetran and DBT. Created DBT models with pre-hooks and post-hooks to handle incremental data loads, with DBT tests for data quality.
  • Designed a Snowflake-based data lake for a real estate investment company. Designed various AWS-based data pipelines (API Gateway, AWS Lambda, AWS EC2, etc.) for data ingestion from multiple sources: Azure, Box, SYSLOG using Vector on AWS, etc.
  • Built multiple POCs and artifacts on data lakes, data warehousing, data analytics and migrations across AWS, Azure, GCP, Databricks and Snowflake.
  • Created solution accelerators: an AWS CFT generator, a UI-based Spark code generator using Gen AI, etc.
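The incremental-load pattern the DBT models above implement (hooks around a high-water mark, plus a data-quality check) can be sketched in plain Python. Table contents, column names and the watermark logic here are illustrative assumptions, not the actual DBT project:

```python
# Conceptual high-water-mark incremental load, in the style of a DBT
# incremental model with pre-/post-hooks. All data is illustrative.

source = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-02-01"},
    {"id": 3, "updated_at": "2024-03-01"},
]
target = [{"id": 1, "updated_at": "2024-01-01"}]  # previously loaded

def incremental_load(source, target):
    # "pre-hook" stage: read the current high-water mark from the target
    watermark = max((r["updated_at"] for r in target), default="")
    # load only rows newer than the watermark (ISO dates sort lexically)
    new_rows = [r for r in source if r["updated_at"] > watermark]
    target.extend(new_rows)
    # "post-hook" stage: data-quality test, e.g. primary-key uniqueness
    ids = [r["id"] for r in target]
    assert len(ids) == len(set(ids)), "duplicate primary keys"
    return target

result = incremental_load(source, target)
```

In DBT itself the watermark filter lives in the model's `is_incremental()` block and the uniqueness check is a schema test; the sketch just shows why incremental models avoid reprocessing already-loaded rows.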
AWS · Azure · Airflow · PySpark · DBT · Snowflake +3

Principal Architect

Promoted

Jan 2021 - Jan 2023 · 2 yrs

  • Complete end-to-end AWS architecture design for a major food chain company's highly available, easily scalable pricing recommendation portal, using AWS S3, EC2, VPC, Route 53, ALB with cross-zone load balancing, NAT gateways and Amazon RDS with Multi-AZ. The entire infrastructure is defined as a CloudFormation template for reusability.
  • Architected an AWS S3-based data lake for a capital company: end-to-end implementation from ingestion of multiple data sources into S3, plus a data pipeline to load structured data into Amazon RDS (MySQL).
  • Designed an end-to-end data analytics platform for various projects at a manufacturing company: data ingestion using Transfer Family, APIs, Lambda, etc.; data lake using S3 and Lake Formation; data harmonization using Glue & EMR; data provisioning using RDS, Redshift & DynamoDB; BI using AWS QuickSight. Involved in data mesh design (creation of data-as-a-product domains) on AWS.
  • Designed a hybrid data platform solution for a telecom company using AWS Aurora; data from the on-premises data lake is ingested into Aurora using AWS Glue.
AWS · Data Lake · Data Analytics · CloudFormation · Cloud Architecture

Data Architect

Dec 2019 - Jan 2021 · 1 yr 1 mo

Cognizant

Data Architect/AWS Architect

Sep 2015 - Nov 2019 · 4 yrs 2 mos · Chennai Area, India

  • Built a complete data lakehouse using AWS-native components: S3 and PySpark on Glue, with Redshift & Snowflake as the consumption layer. Data orchestration using Airflow.
  • Complete CI/CD lifecycle using Git, GitHub, Docker, Kubernetes, Docker Swarm, Jenkins, etc.
  • Created microservices applications using Docker and Docker Compose YAML files.
  • Hands-on experience in container orchestration using Kubernetes.
  • Senior DBA and project lead for cloud-based analytics (Redshift and Snowflake), covering user administration, access management, data loading, etc.
  • Managed petabytes of data on production Snowflake: user access management, warehouse provisioning, and data migration activities using stages.
  • Architected various environments in the AWS cloud.
  • Involved in various project implementations on the Teradata production environment.
  • Performed various backup, restore and replication activities across the production and DR boxes.
  • Prepared dormant-object lists, generated ID-sharing reports, prepared stale-statistics lists, etc.
  • As project lead, handled various management activities such as reporting.
  • Wrote various Teradata macros to simplify DBA activities.
  • Drove project automation to reduce time spent on redundant activities.
  • Provided solutions to various Teradata-related issues across the organization.
  • In recognition of this work, received EA (the maximum performance rating in CTS) for financial year 2018-2019.
AWS · Snowflake · Teradata · Docker · Kubernetes · Data Engineering +1

Infosys

Senior Consultant

Feb 2015 - Sep 2015 · 7 mos · Chennai Area, India

  • Involved in complete end-to-end development for the Counterparty Credit Risk workstream.
  • As technical lead, involved in various development and implementation activities.
  • Developed Teradata projects from initial requirements gathering through deployment to the production environment.
  • Used Teradata SQL expertise to resolve various critical issues.
  • Performed performance tuning and metrics preparation so that Teradata queries run better in the production environment.
Teradata · SQL · Performance Tuning · Data Engineering

Tata Consultancy Services

Project Lead/Senior DBA

Sep 2013 - Jan 2015 · 1 yr 4 mos · Chennai Area, India

  • Maintained two Teradata servers (BBY8 and BBY11) using Teradata Viewpoint and other Teradata utilities.
  • As DBA, involved in various implementation activities on the Teradata and Netezza servers.
  • Maintained weekly BAR backup jobs, automatic statistics collection, user administration, space management, capacity planning, etc.
  • Automated many manual, dependency-driven jobs such as statistics collection and DBC table purging.
  • Participated in critical discussions and provided solutions that helped win new Teradata DBA projects for TCS.
  • For this work, graded Band A (the maximum performance level in TCS) at the first-year anniversary cycle.
Teradata · Netezza · Data Engineering

Cognizant Technology Solutions

Teradata Technical lead and DBA

Mar 2010 - Aug 2013 · 3 yrs 5 mos · Chennai Area, India

  • Created technical specifications based on the mid-level platform design.
  • Unit testing, project tracking, version control, defect logging and resolution.
  • Developed the savings product using Teradata utilities such as BTEQ, MLOAD, FASTLOAD and FASTEXPORT.
  • Developed additions to presentation-layer tables using the Teradata utilities.
  • Involved in various crucial projects and provided solutions for various critical issues.
  • As DBA, involved in environment builds for development projects.
  • Promoted to the next level (Associate - Projects) within six months of joining Cognizant.
  • Awarded the STAR Award, Q3 2011.
  • Received EA (the maximum performance rating in CTS) for financial year 2011-2012.
  • Awarded the GUIDING STAR Award in Q1 2013.
Teradata · SQL · Data Engineering

Tata Consultancy Services

Developer/Module lead

Jul 2007 - Feb 2010 · 2 yrs 7 mos · Mumbai, India

  • Developed the Safe Deposit Vault (SDV) module and added the Fees and Charges module in SDV.
  • Added branch-based fee products in the base version of TCS BaNCS.
  • Developed the maker-and-checker functionality implementation for B.I.N. Bank, Trade Finance for import LC bills, and the account-number change from 17-digit to 20-digit standards in base reports and across the entire SDV module.
  • Developed payments functionality and worked with various types of SWIFT messages.
  • Developed various report processing using PCO programs that interface with Oracle.
  • Received EARLY CONFIRMATION from TCS for dedication to work.
  • Awarded TCS GEM for the month of December 2008.
  • Graded Band A (the maximum performance level in TCS) for financial year 2008-2009.

Education

Government College of Technology, Coimbatore

Bachelor of Engineering (B.E.) — Electrical and Electronics

Jan 2003 - Jan 2007

G.G.S.M.M. Hr. Sec. School

12th — Computer Science

Jan 1991 - Jan 2003
