Preethi Kathi

Associate Consultant

New Britain, Connecticut, United States · 10 yrs 10 mos experience

Key Highlights

  • 8 years of experience in ETL and Data Warehousing.
  • Proficient in AWS services and data integration techniques.
  • Strong background in Agile methodologies and team leadership.

Skills

Core Skills

Informatica PowerCenter · Data Warehousing · AWS · Data Integration

Other Skills

Agile Development · Agile Methodologies · Amazon Redshift · Amazon Web Services (AWS) · CA Workstation · Data Analysis · Data Migration · Databases · Extract · Git · IICS · Informatica · Informatica PowerCenter 10.x · Jira

About

  • 8 years of experience on projects involving ETL, data warehousing, IICS, and Informatica PowerCenter.
  • Experience with various data integration techniques using ETL tools such as Informatica PowerCenter 10.5/9.6 and Informatica Intelligent Cloud Services (IICS).
  • Hands-on experience with Amazon EC2, Amazon S3, VPC, IAM, Lambda, Glue, Airflow, and other AWS services.
  • Solid understanding of Agile/Scrum methodology.
  • Hands-on experience in different modules of the Healthcare, Insurance, and Banking domains.
  • Working knowledge of cloud-based database solutions such as Snowflake and Amazon Redshift.
  • Excellent knowledge of Unix and Python scripting.
  • Used transformations such as Joiner, Expression, Connected and Unconnected Lookups, Filter, Aggregator, Update Strategy, Router, Sorter, and Sequence Generator.
  • Experience with DevOps tools: Linux, Jenkins, GitHub, GitLab, JIRA, Confluence.
  • Effectively implemented mappings for slowly changing dimensions.
  • Proficient in mapping performance optimization, session- and database-level performance tuning, and pipeline partitioning to speed up data processing.
  • Good knowledge of reporting with Power BI.
  • Skilled in creating complex data integration and transformation logic, as well as data integration between systems and applications.
  • In-depth knowledge of testing tools such as DVO through Informatica PowerCenter.
  • Transferred and retrieved data between FTP servers and Oracle databases; good experience with materialized views for data replication in distributed environments.
  • Demonstrated ability to use technical and interpersonal skills to fulfil project goals.
  • Excellent programming skills, fast learning curve, and the ability to quickly ramp up and start contributing.
  • Effective communication, presentation, and customer interaction skills.

Experience

Health Care Service Corporation

Sr. Analytics Consultant

Nov 2025 – Present · 4 mos · Connecticut, United States · Remote

Cigna Healthcare

Business Analytics Advisor

Dec 2024 – Nov 2025 · 11 mos · Connecticut, United States · Remote

CGI

Senior ETL Developer

Jun 2023 – Jan 2025 · 1 yr 7 mos · Connecticut, United States

Microsoft SQL Server · Oracle SQL Developer · Informatica PowerCenter · Data Warehousing

HCL Enterprise

Team Lead

May 2021 – Mar 2022 · 10 mos

  • Project: Data Forge
  • The main goal is to use AWS services to create a data lake of Salesforce tables, including history, for analytics and machine learning consumption.
  • Analyzed and listed the Snowflake tables required for loading data into S3 buckets.
  • Implemented data integration using IICS CDI to extract data from Salesforce via the Salesforce connector; the resulting data was uploaded to an S3 bucket as a CSV file.
  • Created Glue jobs based on the requirements.
  • A Lambda function triggers the Airflow job whenever a file is placed in the S3 bucket.
  • Airflow runs a series of Glue tasks that compare the new file to previously processed files to detect changes.
  • If changes are detected, the file is loaded into Amazon Redshift tables as SCD Type 2.
  • The Redshift tables, including history, are used for analytics and machine learning consumption.
  • Technologies Used: AWS, IICS, Unix, SQL, Oracle, Informatica PowerCenter, Python
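The trigger step described above (a Lambda function starting an Airflow run when a file lands in S3) could be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the Airflow host, DAG name, and credentials are placeholders.

```python
import base64
import json
import urllib.request

AIRFLOW_URL = "http://airflow.example.com:8080"  # placeholder Airflow host
DAG_ID = "salesforce_s3_ingest"                  # hypothetical DAG name


def s3_object_from_event(event):
    """Pull the bucket and key out of a standard S3 put-event payload."""
    record = event["Records"][0]["s3"]
    return record["bucket"]["name"], record["object"]["key"]


def handler(event, context=None):
    """Lambda entry point: trigger an Airflow DAG run for the new file
    via Airflow 2.x's stable REST API."""
    bucket, key = s3_object_from_event(event)
    body = json.dumps({"conf": {"bucket": bucket, "key": key}}).encode()
    req = urllib.request.Request(
        f"{AIRFLOW_URL}/api/v1/dags/{DAG_ID}/dagRuns",
        data=body,
        headers={
            "Content-Type": "application/json",
            # placeholder basic-auth credentials
            "Authorization": "Basic " + base64.b64encode(b"user:pass").decode(),
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

In practice the DAG run's `conf` carries the bucket and key so the downstream Glue tasks know which file to process.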
  • Project: Fire Flood Heritage Initiative
  • Role: Project lead
  • Ingestion of pipeline deals into the data warehouse: previously only settled deals were loaded into the DWH, but the business now wants to load all stages of deals.
  • Developed UTRs and reviewed ETL code and technical design documents produced by the team.
  • Gave functional and technical knowledge transfers (KTs) to new hires and lateral joiners.
  • Technologies Used: Informatica PowerCenter 10.x, SQL Server, UNIX
Unix · Microsoft SQL Server · Amazon Web Services (AWS) · Python (Programming Language) · Informatica PowerCenter · IICS +2

CGI

Senior Software Engineer

Sep 2017 – Apr 2021 · 3 yrs 7 mos · Bengaluru, Karnataka, India

  • Role: ETL Developer
  • This project handles utilization, case management, and health information online activities for members. Sunsetting the legacy system and migrating to TruCare reduces manual effort, increases efficiency, and lowers costs.
  • Analyzed requirements received from the System Analysts and created the design/mapping documents.
  • Actively involved in development, design, testing, production support, PI planning, and sprint planning with the off/onshore team for all project-related activities.
  • Worked in Agile methodology, actively participated in stand-up calls and PI planning, and reported daily work in Rally.
  • Created mappings and workflows; performed performance tuning at the query level and workflow-level partitioning to improve workflow performance.
  • Created unit test cases and project-related design documents.
  • Worked extensively with Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Designer. Developed Informatica mappings and tuned them for better performance.
  • Analyzed source data and worked with business users to develop the model; shared requirements with the offshore team and ensured they were understood; set up environments for brand-new projects and for the Informatica team.
  • Extracted data from flat files, SQL Server (using T-SQL), and Oracle databases, applied business logic, and loaded the results into the Oracle warehouse.
  • Developed mappings, reusable objects, and transformations using the Mapping Designer and Transformation Developer in Informatica PowerCenter.
  • Implemented Informatica recommendations, methodologies, and best practices.
  • Implemented slowly changing dimensions to maintain current and historical information in dimension tables.
  • Technologies Used: Informatica PowerCenter 10, SQL Server, UNIX
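The slowly-changing-dimension work mentioned above can be illustrated with a minimal SCD Type 2 merge in plain Python. This is a sketch of the general technique, not the project's mappings; the key and column names are illustrative.

```python
from datetime import date


def scd2_merge(dimension, incoming, key="customer_id", today=None):
    """Apply incoming records to a Type 2 dimension: expire the current
    row when a tracked attribute changes, then insert a new current row.
    Unchanged records are left alone, preserving full history."""
    today = today or date.today()
    current = {r[key]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old is not None:
            changed = any(old.get(c) != v for c, v in rec.items() if c != key)
            if not changed:
                continue
            old["is_current"] = False  # close out the previous version
            old["end_date"] = today
        dimension.append({**rec, "start_date": today,
                          "end_date": None, "is_current": True})
    return dimension
```

In PowerCenter the same effect is typically achieved with a Lookup on the dimension, an Expression comparing attributes, and an Update Strategy routing expire/insert rows.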
Databases · Data Migration · PowerCenter · Informatica PowerCenter · Snowflake · Data Analysis +1

Tech Mahindra

Software Associate

Dec 2013 – Aug 2017 · 3 yrs 8 mos · Hyderabad, Telangana, India

  • Role: ETL Developer
  • The DSM Variable Development application pulls information from various external and internal sources and presents it to the Commercial Lines Automated System (CLAS) for use in the underwriting process. The process also includes monthly scoring of in-force policies to support reporting and the conditional renewal process.
  • Involved in variable development and presenting the information to CLAS for use in the underwriting process.
  • Worked on migration of data mart code from SQL Server to Netezza.
  • Designed and developed documents covering requirements, design, and all phases of testing.
  • Developed mappings, sessions, and workflows using parameters, tasks, etc.
  • Worked with different sources such as relational tables and flat files.
  • Automated testing using Informatica DVO.
  • Worked on TFS (Team Foundation Server) activities such as check-in and check-out for the project.
  • Made effective use of the debugger to trace the flow of data.
  • Prepared SQL queries for testing the data, including balancing and comparing source and target.
  • Created parameters for connectivity with the database.
  • Maintained product quality to achieve customer satisfaction.
  • Performed unit and integration testing; used the DVO tool for QA testing.
  • Generated health check reports to maintain standards.
  • Prepared promote checklists and carried out production promotes.
  • Created deployment groups to promote development code to the QA and production environments.
  • Fixed production-related issues based on priority.
  • Technologies Used: Informatica PowerCenter 9.5, SQL Server 2012/2014, DVO, SOAP UI, Zena, Power BI
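The balancing checks described above (comparing source against target after a load) usually reduce to count and sum reconciliation. A minimal sketch, with an illustrative measure column that is not taken from the actual project:

```python
def balance_check(source_rows, target_rows, amount_col="premium"):
    """Reconcile a load: the row counts must match and the summed
    measure must agree between source and target. Returns a list of
    discrepancies (empty means the load balanced)."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(
            f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_sum = sum(r[amount_col] for r in source_rows)
    tgt_sum = sum(r[amount_col] for r in target_rows)
    if src_sum != tgt_sum:
        issues.append(f"{amount_col} total mismatch: {src_sum} vs {tgt_sum}")
    return issues
```

In practice the same comparison is often written as two SQL aggregate queries (one per system) whose results are compared, since source and target live in different databases.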
Data Migration · PowerCenter · Informatica PowerCenter · Snowflake · Data Analysis · Data Warehousing

Education

Geetanjali College of Engineering & Technology

Bachelor of Technology - BTech

Jan 2009 – Jan 2013
