Poornima Nag Ponthagani — Product Engineer
With 10+ years of experience in data engineering, I help organizations build scalable, cloud-native data solutions that accelerate insight delivery, improve governance, and reduce operational overhead. I architect batch, streaming, and event-driven pipelines on AWS, Azure, and Databricks, aligned with business goals. I've led Data Lakehouse implementations with Delta Lake and Iceberg, with governance through Unity Catalog and Azure Purview. I deploy production-ready pipelines using Airflow, ADF, Argo Workflows, CI/CD, Docker, and Kubernetes, collaborating closely with data scientists and mentoring engineers to drive impact and best practices.
🚀 Impact highlights:
✦ Developed a reusable data product framework for pharma use cases enabling YAML-based data ingestion, transformation, and modeling. This accelerated use-case delivery by 40%, reducing development time from 3 weeks to under 2 weeks and enabling faster insights.
✦ Leveraged Generative AI (Claude, GPT) to automate data summarization, reducing manual analyst workload by 40% (~60 hours/month) and cutting insight delivery time by 30%. This freed data analysts and medical researchers to focus on accelerating decision-making for product and ops teams.
🚀 Technical Skills:
✦ Languages: Python, SQL, COBOL
✦ Data Tools & Frameworks: Hadoop, PySpark, Hive, PostgreSQL, dbt, Great Expectations, Kedro, Snowflake
✦ Data Architecture: Batch processing, event-driven pipelines, real-time streaming (Kafka), Data Lakehouse designs (Delta Lake, Apache Iceberg)
✦ Pipeline Orchestration: Apache Airflow, Azure Data Factory, Databricks Workflows, Argo Workflows
✦ Data Governance & Cataloging: Databricks Unity Catalog, Azure Purview
✦ Cloud Platforms: AWS (EC2, Glue, Lambda, Kinesis, EMR, S3, Redshift, RDS, DynamoDB), Azure (Data Factory, ADLS Gen2, Synapse, Cosmos DB, Azure Databricks)
✦ DevOps: CI/CD pipelines (CircleCI, Azure DevOps, GitHub Actions), Docker, Kubernetes
✦ Generative AI: RAG architecture, Amazon Bedrock, Anthropic Claude models
✦ Agentic AI Platforms: Google Agentspace, AWS AgentCore
✦ Agentic AI Integration: A2A protocol server development, multi-agent communication systems
🚀 Certifications and Accreditation:
- AWS Certified Developer - Associate
- AWS Certified Solutions Architect - Associate
- Databricks Certified Associate Developer for Apache Spark 3.0
- Certified Kubernetes Application Developer (CKAD)
- Academy Accreditation - Databricks Lakehouse Fundamentals
- Academy Accreditation - Generative AI Fundamentals
Location: London, England, United Kingdom
Experience: 10 yrs 10 mos
Skills
- Data Architecture
- Cloud Platforms
- Data Engineering
- Data Governance
- Pipeline Orchestration
- ETL Development
- Data Migration
- Data Quality Assurance
- Embedded Systems
- Control Systems
Career Highlights
- 10+ years in data engineering and architecture
- Accelerated insights delivery by 40% with reusable frameworks
- Mentored engineers, increasing team delivery by 25%
Work Experience
QuantumBlack, AI by McKinsey
Full-time parenting (1 yr 1 mo)
Lead Data Engineer – Data Architecture & Agentic AI Systems (3 yrs 10 mos)
Senior Data Engineer (1 yr 11 mos)
Data Engineer (1 yr 2 mos)
Deloitte
Data Engineer (6 mos)
Junior Data Engineer (2 mos)
Cognizant
Big Data Developer (2 yrs 1 mo)
Informatica Developer (10 mos)
ISRO - Indian Space Research Organization
Embedded Software Developer (5 mos)
Education
Bachelor of Technology (B.Tech.) at International Institute of Information Technology Hyderabad (IIITH)