Dillibabu Ramadhoss

Associate Consultant

India · 12 yrs 8 mos experience
AI Enabled · AI/ML Practitioner

Key Highlights

  • Expert in architecting scalable AI systems.
  • Proven track record in enterprise data transformation.
  • Strong leadership in cross-functional AI initiatives.

Skills

Core Skills

Google Cloud Platform (GCP) · Vertex AI · Large Language Models (LLM) · MLOps · DBT · BigQuery

Other Skills

Chatbot Development · GenAI · Machine Learning · Google Gemini · Cloud Run · CI/CD · Looker · GitHub Actions · Cloud SQL · Dataflow · EDM · EDH · Apache Spark · SQL · Azure Databricks

About

I build systems that don’t just process data — they understand it, act on it, and collaborate like digital teammates. My work sits at the intersection of Agentic AI, cloud-native engineering, and real-world enterprise operations.

I design intelligent, scalable architectures on GCP that bring together data, AI, and automation into one cohesive ecosystem. Whether it’s a streaming pipeline from Kafka, a PI server feeding industrial telemetry, or an enterprise workflow buried inside legacy apps, I use Vertex AI and multi-agent frameworks to turn these moving parts into coordinated, AI-powered systems.

Over the last decade, I’ve helped global teams in retail, pharma, media, and manufacturing modernize how they work, moving them from fragmented legacy environments to real-time, AI-augmented platforms.

A few areas I’ve been deeply involved in recently:

  • Cloud Run–powered agent orchestration using Gemini 2.x, enabling fast, intelligent decision flows
  • Secure, enterprise-ready BigQuery platforms with automated DLP, governance, and policy-driven controls
  • MLOps pipelines built with CI/CD, observability, model retraining, and lineage transparency
  • AI adoption roadmaps, capability uplift, and team enablement across engineering and business groups

My approach blends hands-on engineering depth with strategic leadership. I enjoy solving complex architectural problems just as much as guiding teams, simplifying ideas for stakeholders, and shaping long-term AI strategies.

At my core, I’m driven by a simple idea: AI should feel less like a tool and more like a partner in how organizations think, work, and innovate.

Passionate about: Agentic AI, multi-agent orchestration, scalable GCP architectures, real-time data platforms, and mentoring engineers who want to build the future.

Experience

12 yrs 8 mos
Total Experience
2 yrs
Average Tenure
2 mos
Current Experience

Tata Consultancy Services

Associate Consultant – GCP Architect & Agentic AI

Mar 2026 – Present · 2 mos · On-site

LatentView Analytics

Senior Data Program Manager

Sep 2024 – Mar 2026 · 1 yr 6 mos · Hybrid

  • Drove enterprise transformation at the intersection of Agentic AI, Generative AI, Data Engineering, and Cloud Architecture.
  • Led cross-functional teams to build scalable, production-grade AI systems leveraging Vertex AI, Cloud Run, BigQuery, and agentic architectures that enable automation, intelligence, and measurable business impact.
  • Architected multi-agent AI systems using Vertex AI Agent Engine + Cloud Run, improving response accuracy and reducing manual intervention across enterprise workflows.
  • Delivered next-gen LLM applications with bias/latency mitigation, prompt governance, grounding, and real-time context integration.
  • Modernized large-scale data platforms using BigQuery, Dataflow, Kafka, and DBT, enabling both batch and real-time analytics.
  • Established MLOps and LLMOps best practices—CI/CD, monitoring, retraining pipelines, and governance frameworks.
  • Mentored engineering teams, defined AI adoption roadmaps, and improved delivery velocity through reusable accelerators and platform patterns.
  • Partnered with leadership and business teams to align AI use cases with measurable ROI and operational value.
Chatbot Development · GenAI · Machine Learning · Large Language Models (LLM) · Google Gemini · Google Cloud Platform (GCP) · +2
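The MLOps and LLMOps practices in this role include automated retraining pipelines. As an illustrative sketch only (function names and thresholds here are hypothetical, not from the original role), a monitoring job might gate retraining on metric drift like this:

```python
# Hypothetical sketch of a drift-gated retraining trigger, one small piece of
# a CI/CD + monitoring + retraining loop. Thresholds and names are
# illustrative; a production setup would read metrics from a monitoring
# backend and launch a training pipeline instead of printing.

def should_retrain(baseline_acc: float, recent_acc: float, max_drop: float = 0.05) -> bool:
    """Trigger retraining when recent accuracy drops too far below baseline."""
    return (baseline_acc - recent_acc) > max_drop

if __name__ == "__main__":
    baseline, recent = 0.91, 0.84
    if should_retrain(baseline, recent):
        print(f"drift {baseline - recent:.2f} exceeds 0.05: schedule retraining")
    else:
        print("model within tolerance")
```

Keeping the decision rule in one pure function makes it trivial to unit-test the governance policy separately from the pipeline that enforces it.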

LTIMindtree

Senior Specialist - Data Engineering

Jul 2023 – Aug 2024 · 1 yr 1 mo · Hybrid

  • Designed end-to-end GCP data models enabling migration from legacy systems to BigQuery and Looker.
  • Built scalable DBT pipelines deployed via Cloud Run + GitHub Actions CI/CD, improving developer efficiency and release reliability.
  • Automated PII detection, masking, and policy tag generation using DLP + Cloud Functions, ensuring privacy-by-design.
  • Created a centralized metadata scanning tool to classify high-sensitivity fields across datasets.
  • Deployed Composer-to-Cloud Run hybrid orchestration for operational efficiency and cost optimization.
  • Worked with data science teams to standardize unstructured → structured data classification pipelines.
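The automated PII masking described above used Cloud DLP and Cloud Functions; a much-simplified local stand-in (regex patterns in place of DLP info types, with hypothetical field names) conveys the shape of the transform:

```python
import re

# Simplified stand-in for a DLP-style masking step: detect likely-PII values
# by pattern and replace them with an info-type token. A production version
# would call the Cloud DLP inspect/de-identify APIs rather than regexes.

PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{8,}\d"),
}

def mask_pii(record: dict) -> dict:
    """Return a copy of `record` with values matching PII patterns masked."""
    masked = {}
    for key, value in record.items():
        text = str(value)
        for info_type, pattern in PII_PATTERNS.items():
            text = pattern.sub(f"[{info_type}]", text)
        masked[key] = text
    return masked

row = {"name": "A. User", "email": "a.user@example.com", "note": "call +31 20 123 4567"}
print(mask_pii(row))
```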

Searce Inc

Principal Data Architect

Jan 2023 – Jun 2023 · 5 mos · Remote

  • Designed CDC-based pipelines using Cloud SQL → GCS → Pub/Sub → Dataflow → BigQuery for real-time and batch ingestion.
  • Built automated PII masking and policy tag frameworks using BigQuery, DLP, and Cloud Functions.
  • Partnered with product owners to define enterprise data models, spanning master, transactional, and analytical datasets.
  • Drove governance alignment, ensuring privacy, identity resolution, and compliance were embedded into the core data model.
  • Developed repeatable cloud-first architecture patterns for ingestion, cleansing, transformation, and lineage.
  • Unnested complex JSON using scalable Dataflow pipelines, enabling analytics-ready structures.
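The JSON-unnesting transform in the last bullet ran as scalable Dataflow pipelines; a pure-Python miniature (illustrative only, with made-up field names) shows the core idea of exploding nested records into flat, analytics-ready rows:

```python
# Illustrative unnesting of one nested-JSON record into flat rows, analogous
# to BigQuery's UNNEST cross join. The production version ran this logic
# inside Dataflow (Apache Beam) over batch and streaming inputs.

def unnest(record: dict, prefix: str = "") -> list[dict]:
    """Explode one nested-JSON record into a list of flat row dicts."""
    rows = [{}]
    for key, value in record.items():
        col = f"{prefix}{key}"
        if isinstance(value, dict):
            expansions = unnest(value, prefix=f"{col}_")
        elif isinstance(value, list):
            expansions = []
            for item in value:
                if isinstance(item, dict):
                    expansions.extend(unnest(item, prefix=f"{col}_"))
                else:
                    expansions.append({col: item})
        else:
            expansions = [{col: value}]
        # Cross join the expansions with the rows built so far. Note that an
        # empty list drops the record, like UNNEST without a LEFT JOIN.
        rows = [dict(r, **e) for r in rows for e in expansions]
    return rows

print(unnest({"id": 1, "items": [{"sku": "a"}, {"sku": "b"}]}))
```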

Bayer

2 roles

Technical Lead (Google Cloud)

Promoted

Oct 2022 – Jan 2023 · 3 mos · Amsterdam, North Holland, Netherlands · On-site

  • Migrated enterprise systems (MS SQL, Dynamics, Salesforce) to BigQuery using EDM/EDH frameworks.
  • Built advanced ETL pipelines using Airflow, Dataflow, and BigQuery, enabling high-frequency reporting.
  • Created Authorized Views to implement secure, governed access models across business teams.
  • Designed real-time ingestion patterns across multiple enterprise sources and messaging systems.
  • Conducted architecture workshops for stakeholders driving modernization strategy and capability uplift.

Senior GCP Data Engineer

May 2021 – Oct 2022 · 1 yr 5 mos · Amsterdam, North Holland, Netherlands · On-site

  • Developed automated pipelines for daily extracts, ad-hoc reporting, and federated data consumption.
  • Built monitoring and optimization frameworks for query performance and warehouse efficiency.
  • Collaborated with analytics and data science teams for data mart tuning and metadata modeling.
  • Led development teams, ensuring scalable, secure, and high-quality data engineering deliverables.

Cognizant

Senior Data Engineer

May 2016 – Apr 2021 · 4 yrs 11 mos · Chennai, Tamil Nadu, India

  • Built end-to-end ETL pipelines using Informatica and SQL, optimizing transformations and reducing runtime.
  • Implemented metadata-driven frameworks for scheduling, validation, and quality checks.
  • Designed reusable maplets and transformations, improving development productivity and standardization.
  • Handled release readiness reviews, versioning, and environment preparation for large-scale deployments.
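The metadata-driven validation framework above was built in Informatica; the same idea, sketched in Python with hypothetical column names and checks, is rule definitions kept as metadata and applied generically to every row:

```python
# Illustrative metadata-driven validation: rules live as data, so adding a
# check means editing metadata, not pipeline code. Column names and check
# types here are hypothetical examples.

RULES = [
    {"column": "order_id", "check": "not_null"},
    {"column": "amount", "check": "non_negative"},
]

CHECKS = {
    "not_null": lambda v: v is not None,
    "non_negative": lambda v: v is not None and v >= 0,
}

def validate(row: dict, rules=RULES) -> list[str]:
    """Return descriptions of every rule this row fails (empty list = clean)."""
    failures = []
    for rule in rules:
        value = row.get(rule["column"])
        if not CHECKS[rule["check"]](value):
            failures.append(f"{rule['column']}: {rule['check']}")
    return failures

print(validate({"order_id": None, "amount": -1}))
```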

Azure IT Solutions India Private Limited

ETL Developer

May 2013 – Apr 2016 · 2 yrs 11 mos

  • Developed complex ETL workflows using Informatica: Lookups, Aggregations, Joins, Routing, and Update Strategy.
  • Created automated data validation and exception-handling pipelines for end-user reconciliation.
  • Designed reusable ETL components and supported release and build tool automation.
  • Performed performance tuning at mapping and functional levels to reduce load time and improve SLAs.

Education

Sri Venkateswara College of Engineering

MCA — Computer Applications

Jun 2010 – May 2013
