Rohith Kumar

Product Manager

Bengaluru, Karnataka, India · 5 yrs 8 mos experience

Key Highlights

  • 5+ years of experience in Data Engineering
  • Expertise in Snowflake Cloud Data Platform
  • Proven track record in building scalable data pipelines
Stackforce AI infers this person is a Data Engineer specializing in Snowflake and cloud data solutions within the Fintech and Healthcare sectors.

Skills

Core Skills

Snowflake · Data Warehousing

Other Skills

Snowpipe · COPY command · Snowflake SQL · Data validation · Data cleansing · Data quality · SQL · Zero Copy Cloning · Time Travel · Data analysis · ETL · Database management · Transact-SQL (T-SQL) · Data Build Tool (DBT) · T-SQL Stored Procedures

About

I am a Snowflake Data Engineer with 5+ years of experience in Data Warehousing, ETL development, and Cloud Data Engineering, including 4+ years of hands-on experience with the Snowflake Cloud Data Platform. I specialize in building scalable data pipelines, cloud data warehouse solutions, and analytics-ready datasets that support business intelligence and data-driven decision making.

My expertise includes Snowflake architecture, Snowflake SQL, and cloud-based data integration using tools like Azure Data Factory and AWS S3. I have strong experience designing and developing data ingestion pipelines using Snowpipe, COPY commands, Streams, and Tasks for both full load and incremental data processing.

I have worked extensively with structured and semi-structured data formats such as JSON, Parquet, ORC, and CSV, integrating data from multiple sources into Snowflake for analytics and reporting. I also have strong experience in data modeling (Star Schema, Snowflake Schema), query optimization, and performance tuning to ensure efficient data processing.

In addition, I have experience in schema migration from Oracle and SQL Server to Snowflake, implementing Change Data Capture (CDC) pipelines, and performing data validation, reconciliation, and data quality checks to maintain data accuracy. I enjoy collaborating with data analysts, BI teams, and business stakeholders in Agile environments to deliver reliable, scalable, and high-performance data solutions.

🔹 Core Skills: Snowflake | Snowflake SQL | Azure Data Factory | ETL | Data Warehousing | Snowpipe | Streams | Tasks | CDC | AWS S3 | SQL | Data Modeling | Query Optimization

Experience

5 yrs 8 mos
Total Experience
1 yr 10 mos
Average Tenure
2 yrs 3 mos
Current Experience

Alicanto Consultants Private Limited

Snowflake Data Engineer

Jan 2024 – Present · 2 yrs 3 mos · Bengaluru, Karnataka, India · Hybrid

  • Standard Life Aberdeen is a global investment management firm offering asset management and financial services. The project involved developing an Investment Data Mart in Snowflake by integrating financial data from Azure Blob and SQL Server for reporting and analytics.
  • Responsibilities:
  • Implemented Snowpipe for near real-time data ingestion from Azure Blob Storage into Snowflake.
  • Performed bulk data loading using COPY command to load data from internal and external stages into Snowflake tables.
  • Created and managed Snowflake objects including databases, schemas, tables, stages, and file formats.
  • Developed complex Snowflake SQL queries using joins, aggregate functions, and analytical functions for business reporting.
  • Worked with Snowflake Streams for Change Data Capture (CDC) and implemented Slowly Changing Dimensions (SCD).
  • Implemented Snowflake features such as Data Sharing, Zero Copy Cloning and Time Travel for data recovery and environment management.
  • Loaded data into Snowflake tables from internal stages, external stages, and local file systems.
  • Used PUT, GET, LIST and COPY commands for managing and validating staged data files.
  • Created table DDLs and database structures in Snowflake development environments based on business requirements.
  • Performed data validation and reconciliation between Oracle source systems and Snowflake target tables.
  • Involved in data cleansing and transformation processes to ensure data quality and consistency.
  • Developed SQL scripts to support business analysis and reporting requirements.
  • Cloned production databases for development and testing using Zero Copy Cloning.
  • Collaborated with team members and stakeholders to document development activities and technical solutions.
  • Supported unit testing, integration testing and UAT processes.
  • Troubleshot data pipeline and data loading issues to ensure smooth data processing.
Snowflake · Snowpipe · COPY command · Snowflake SQL · Data validation · Data cleansing · +2

Capitalgram Marketing & Technology Private Limited

Executive

Oct 2022 – Feb 2023 · 4 mos · Hyderabad · On-site

  • NYC Health + Hospitals is the largest public healthcare system in the United States, providing essential medical services to millions of New Yorkers. The organization uses modern data platforms like Snowflake to manage healthcare data and support reporting, analytics, and operational insights.
  • Role and Responsibilities:
  • Understood the business functionality and data flow of the system to support data warehouse operations.
  • Involved in migrating database objects from SQL Server to Snowflake Cloud Data Warehouse.
  • Implemented Snowpipe for continuous data ingestion and used COPY command for bulk data loading into Snowflake tables.
  • Created and managed internal and external stages for loading and transforming data into Snowflake.
  • Used Zero Copy Cloning to clone production data for development and testing purposes.
  • Shared sample datasets with business users by granting appropriate access for User Acceptance Testing (UAT).
  • Retrieved historical data using the Snowflake Time Travel feature for data recovery and analysis.
  • Participated in Snowflake testing activities to evaluate and optimize cloud resource utilization.
SQL · Snowpipe · COPY command · Zero Copy Cloning · Time Travel · Snowflake · +1

Devagiri Enterprises

Associate

Apr 2018 – May 2021 · 3 yrs 1 mo · Bengaluru, Karnataka, India · On-site

  • Elevance Health is a leading healthcare company that focuses on improving health outcomes through advanced data and technology solutions. The organization manages large volumes of healthcare data to support analytics, reporting, and operational decision-making. The DW Support project ensures reliable data integration, maintenance, and performance of enterprise data warehouse systems.
  • Role and Responsibilities:
  • Analysed service improvement areas and implemented solutions to enhance system performance and efficiency.
  • Developed SQL queries based on business requirements using joins, aggregations, and filtering techniques.
  • Participated in all phases of SDLC including requirement gathering, analysis, design, development, unit testing and deployment.
  • Created and maintained database objects such as views, materialized views, and indexes, and applied query hints for performance.
  • Executed automated ETL scripts and monitored execution logs to ensure successful data processing.
  • Used Git repository for version control and management of developed scripts.
  • Reported issues, clarified requirements, and raised defects for product related problems.
  • Participated in daily stand-up meetings and provided status updates on development tasks.
  • Supported daily regression testing to validate system functionality and data accuracy.
  • Contributed to knowledge transfer sessions and mentoring activities within the team.
  • Prepared weekly and monthly reports on project activities and performance metrics.
  • Created solution design documents, requirement traceability matrices (RTM) and technical documentation.
  • Worked on effort estimation for change requests and new enhancements.
  • Performed data analysis and issue resolution as part of production support activities.
SQL · Data analysis · ETL · Database management · Data Warehousing

Education

Acharya Nagarjuna University

B. Com

Govt. High School

SSC

Narayana Junior College - India

MPC
