Manjunath K.

Director of Engineering

San Antonio, Texas, United States · 19 yrs experience

Key Highlights

  • Expert in Data Engineering with extensive healthcare experience.
  • Proficient in Big Data technologies and ETL processes.
  • Strong leadership in managing data engineering teams.

Skills

Core Skills

Microservices · Continuous Integration and Continuous Delivery (CI/CD) · Snowflake · Cloud Computing · Big Data · ETL · Data Integration · Data Warehousing

Other Skills

837 · Amazon Web Services (AWS) · Apache Spark Streaming · Azure Data Lake · Azure Databricks · Azure Kubernetes Service (AKS) · B2B · Bash · DB2 · Docker · ETL Tools · Electronic Data Interchange (EDI) · Electronic Medical Record (EMR) · Feature Engineering · HL7 Standards

About

Experienced Data Engineer with a demonstrated history of streamlining data processes by building reliable frameworks. Extensive experience in the hospital and health care industry. Skilled in SQL, Python, Kubernetes, Snowflake, Palantir, PySpark, AWS, Azure, ETL tools, shell scripting, Hadoop, and Informatica.

Experience

Clarivate

2 roles

Senior Director Data Engineering

Promoted

May 2025 – Present · 10 mos · United States

Jenkins · Docker · Microservices · Continuous Integration and Continuous Delivery (CI/CD)

Director, Data Engineer

Nov 2023 – May 2025 · 1 yr 6 mos · United States

Snowflake · Apache Spark Streaming · Jenkins · Cloud Computing · Amazon Web Services (AWS) · Docker +8

Deloitte

Senior Solutions Specialist | Lead Data Engineering

Apr 2022 – Nov 2023 · 1 yr 7 mos · Texas, United States

Big Data · Snowflake · Jenkins · PySpark · Cloud Computing · Macro +16

Agilon health

Senior Manager Data Engineer

Oct 2021 – Apr 2022 · 6 mos · Bengaluru, Karnataka, India

Big Data · Snowflake · HL7 Standards · Cloud Computing · Amazon Web Services (AWS) · Macro +12

Cognizant

2 roles

Manager Data Engineer

Jul 2019 – Oct 2021 · 2 yrs 3 mos

  • Ingested claim data into partitioned Hive tables. Denormalized diagnosis tables using complex Array and Map data types.
  • Cleaned and loaded data in append mode using Spark DataFrames.
  • Developed Spark Streaming pipelines for CMS response file processing (999, HRR, MAO, etc.).
Big Data · PySpark · Cloud Computing · 837 · Healthcare · ETL +14

ETL Lead

Aug 2014 – Jul 2019 · 4 yrs 11 mos

  • Designed data pipelines to load EDI files into the data warehouse using Informatica and B2B Data Transformation. Built ETL pipelines to regenerate claim EDI files for CMS submission without losing the claim hierarchy. Developed crucial response workflows for CMS response file processing (277, 999, HRR, MAO).
  • Wrote Unix shell scripts and stored procedures to build the ETL framework, including runtime parameter file generation for concurrent runs. Developed a generic auditing process to track job and batch status, with appropriate logging.
  • Sourced data from a variety of source systems, including XML, CCF, database tables, CSV, and other regular file formats.
  • Developed a metadata-driven framework for best performance in processing the claim life cycle.
  • Added a high degree of restartability for ease of maintenance; the majority of failure-handling scenarios are controlled and automated.
  • Designed a controlled job-looping technique for job scheduling.
  • Coordinated with offshore teams to ensure quality deliverables.
HL7 Standards · Informatica · 837 · Healthcare · ETL · Medicare +12

Tata Consultancy Services

3 roles

Senior ETL Developer

Jul 2013 – Aug 2014 · 1 yr 1 mo

  • Worked with business analysts to gather requirements and prepared development designs.
  • Involved in project activities from analysis and build through unit testing and deployment of components.
  • Developed ETL mappings that read COBOL files as source and produced output with complete details for each mail transaction.
  • Wrote PL/SQL packages for the reconciliation and count-auditing process.
  • Coded UNIX shell scripts to handle file validation, file movement, and logging of all activities.
Informatica · ETL · Shell Scripting · Data Integration · DB2 · SQL +3

ETL Developer

Promoted

Nov 2012 – Jun 2013 · 7 mos

  • Supported the data warehouse and data mart by working on small modules that added features to the codebase: root-cause fixes for repeated production failures, improved functionality, preventative fixes for potential issues, and batch wellness and performance monitoring.
  • Developed ETL workflows to populate data metrics from third-party member prior-insurance details, which helped the business set accurate insurance prices for its members.
  • Implemented complex ETL pipelines for slowly changing dimensions, dataflows that evaluate MSR performance metrics for all LOBs, and a reconciliation process.
  • Wrote SQL for unit testing of the developed mappings and ETL code; performed peer reviews.
Informatica · ETL · Shell Scripting · Data Integration · DB2 · SQL +3

Software Engineer

May 2010 – Oct 2012 · 2 yrs 5 mos

  • Senior Software Engineer at TCS.
Informatica · ETL · Shell Scripting · DB2 · SQL · Data Warehousing +3

Cognizant Technology Solutions

Programmer Analyst

Feb 2007 – May 2010 · 3 yrs 3 mos · Chennai / Pune

  • Developed and unit tested ETL pipelines using Informatica, including data mappings, workflows (pipelines), and sessions.
  • Provided technical support to troubleshoot production issues in DataStage pipelines.
  • Prepared job metrics and failure metrics for weekly analysis and status reports.
Informatica · ETL · Oracle · Data Warehousing · ETL Tools

Education

Visvesvaraya Technological University

Bachelor of Engineering (BE) — Electronics and Communications

Jan 2002 – Jan 2006
