Shreyas M C

Software Engineer

Mangaluru, Karnataka, India · 3 yrs 4 mos experience

Key Highlights

  • Expert in building and optimizing Snowflake data pipelines.
  • Proficient in ETL processes and data pipeline management.
  • Strong background in Python and data analysis.

Skills

Core Skills

Snowflake · Data Engineering · ETL · Web Development

Other Skills

AWS S3 · Amazon S3 · Apache Airflow · Aptitude · Autosys · CNC Programming · COPY INTO · CSS · Camd · Cascading Style Sheets (CSS) · Change Data Capture (CDC) · Cloud Computing · Communication · Computer-Aided Design (CAD) · Data Analysis

About

Snowflake Data Engineer with 3.4 years of experience specializing in building and optimizing Snowflake data pipelines. Proficient in leveraging Snowflake features such as Snowpipe, data sharing, data masking, COPY INTO, and clustering to deliver secure, efficient, and scalable data solutions. Skilled in Python and in debugging and monitoring Keystone data pipeline workflows. Adept at managing complex ETL processes and ensuring high-quality, reliable data pipelines.

Experience

Infosys

3 roles

Senior Systems Engineer

Promoted

Oct 2024 – Present · 1 yr 5 mos · Hybrid

  • Implemented Snowpipe for continuous data loading into Snowflake, using the COPY INTO command for efficient file uploads and ingestion; monitored the pipeline and resolved issues promptly to keep operations smooth.
  • Queried data from staging files before ingestion using external tables; loaded structured and semi-structured data (e.g., JSON, XML, CSV) into Snowflake.
  • Uploaded CSV files from local instances to AWS S3 and loaded them into Snowflake for processing.
  • Designed, developed, and maintained Snowflake database objects (tables, views, and user-defined functions).
  • Created internal and external stages, transformed data during load, and ensured smooth data pipeline operations.
  • Cloned production data for testing and code modifications, using Snowflake Time Travel for data recovery.
  • Managed large, complex datasets (e.g., XML, JSON, CSV) from various sources.
  • Created share objects for controlled access to reader accounts, ensuring secure data sharing.
  • Used micro-partitioning to optimize storage and improve query performance.
  • Implemented data pipelines using CDC (Change Data Capture) and SCD (Slowly Changing Dimensions) patterns.
  • Created non-materialized, secured, and materialized views to ensure data consistency, security, and performance.
  • Improved query performance by defining clustering keys, resizing warehouses, and increasing cluster counts.
  • Monitored Keystone jobs, troubleshooting and resolving failures to keep pipelines running, and collaborated with business users to address data-related issues.
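As an illustration of the continuous-loading pattern described above (a sketch, not the author's actual code), the COPY INTO statement that a Snowpipe definition wraps can be rendered by a small helper; all table, stage, and format names here are hypothetical placeholders.

```python
def build_copy_into(table: str, stage: str, file_format: str = "CSV") -> str:
    """Render a Snowflake-style COPY INTO statement for loading staged files.

    In Snowpipe, the same statement sits inside a
    CREATE PIPE ... AS COPY INTO ... definition. Object names are
    hypothetical examples, not real pipeline objects.
    """
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (TYPE = {file_format}) "
        f"ON_ERROR = 'SKIP_FILE'"
    )

# Example: statement a pipe might run against an external S3 stage.
print(build_copy_into("raw.events", "ext_s3_stage/events/", "JSON"))
```

Keeping the statement generation in one place makes it easy to reuse the same load logic across ad-hoc backfills and the pipe definition itself.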
Snowpipe · COPY INTO · AWS S3 · Snowflake · Data Sharing · Data Masking · +6

Systems Engineer

Jan 2023 – Sep 2024 · 1 yr 8 mos · Hybrid

  • Worked as a Data Engineer responsible for data cleaning, data analysis, and data handling.
  • Created objects including tables, views, indexes, and referential integrity constraints, translating business requirements into technical specifications.
  • Modified CI views and stage tables in Snowflake to meet business requirements, performing operations such as UPDATE, INSERT, DELETE, ALTER, COPY INTO, CLONE, and DROP while ensuring data integrity and streamlining processes.
  • Handled job promotion and deployment through data pipelines: managed changes in Git, promoted jobs through development, test, and production environments, and configured job and process settings such as SLA adjustments and scheduling to meet business requirements.
  • Used Apache Airflow to check job status and error logs and resolved issues that occurred on job failure, keeping data moving smoothly from source to destination.
  • Developed Python automation scripts for data validation, replacing manual checks and reducing validation time.
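The automated validation scripts mentioned above might look something like this minimal sketch: compare row counts and per-key content hashes between a source extract and the loaded target. The function name, key column, and row format are assumptions for illustration, not the author's actual code.

```python
import hashlib

def validate_load(source_rows, target_rows, key="id"):
    """Hypothetical load validation: report missing keys and rows whose
    content differs between source and target."""
    def digest(row):
        # Hash a canonical key=value rendering of the row.
        joined = "|".join(f"{k}={row[k]}" for k in sorted(row))
        return hashlib.md5(joined.encode()).hexdigest()

    src = {r[key]: digest(r) for r in source_rows}
    tgt = {r[key]: digest(r) for r in target_rows}
    return {
        "count_ok": len(src) == len(tgt),
        "missing": sorted(set(src) - set(tgt)),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

src = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
tgt = [{"id": 1, "v": "a"}, {"id": 2, "v": "B"}]
print(validate_load(src, tgt))
```

A script like this turns an eyeball comparison into a single pass over both datasets, which is where the reduction in validation time would come from.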
Data Cleaning · Data Analysis · Snowflake · Apache Airflow · Python · Git · +2

Systems Engineer Trainee

Sep 2022 – Dec 2022 · 3 mos · Hybrid

  • Assigned to the "Open-Source Technologies" stream, where I learnt open-source technologies including Java, Python, RDBMS, MySQL, Unix, MongoDB, HTML5, CSS, and the Python Flask framework.
  • Built a capstone project, "Bank Management System", consisting of three modules (Customer, Admin, and Cashier); I owned the Customer module.
  • Skills applied:
  • ◉ Frontend: HTML5, CSS3, and JavaScript
  • ◉ Backend: Python Flask, MySQL, Unix
Java · Python · RDBMS · MySQL · Unix · MongoDB · +2

Tata Consultancy Services

Intern

Jun 2021 – Sep 2021 · 3 mos · Remote

  • ◾Completed training under TCS Youth Employment Program aimed at making graduates industry-ready.
  • ◾Developed expertise in Python, Database Management Systems (DBMS), and MySQL.
  • ◾Gained hands-on experience in building and managing databases and writing optimized SQL queries.
  • ◾Received a Certificate of Accomplishment for successfully completing the program.
Python · Database Management Systems (DBMS) · MySQL

Education

Visvesvaraya Technological University

Bachelor of Engineering (BE), Mechanical Engineering

Jan 2018 – Jan 2022

Sri Subrahmanyeshwara Pre University College

PCMB (Physics, Chemistry, Mathematics, Biology)

Jun 2016 – Jun 2018

Blessed Kuriakose English Medium School

Jun 2015 – Jun 2016
