Shubham Chawla

DevOps Engineer

Hyderabad, Telangana, India · 10 yrs experience

Key Highlights

  • Expert in Big Data technologies and cloud platforms.
  • Proven track record in data migration and automation.
  • Strong experience in developing analytics tools for healthcare.

Skills

Core Skills

Cloud Data Engineering · GCP · Big Data · Data Engineering

Other Skills

AWS · AWS Glue · AWS Lambda · Amazon Elastic MapReduce (EMR) · Amazon Redshift · Apache Airflow · Apache Kafka · Automation · Big Data Analytics · Cloud Technologies · Data Migration · Data Validation · EMR · Elasticsearch · GenAI

About

A confident and versatile engineer with competency in Big Data technologies such as Hadoop, Spark, Hive, Teradata, and Snowflake; proficiency in Python, Scala, and SQL; and extensive experience across cloud platforms including AWS, Azure, and GCP. I currently work as a Strategic Cloud Data Engineer at Google, solving complex customer challenges around GCP onboarding and optimisation. I also speak about cloud technologies and have delivered technical trainings to diverse groups of people across various regions.

Experience

Google

Strategic Cloud Data Engineer

Dec 2021 – Present · 4 yrs 3 mos · Hyderabad, Telangana, India

GCP · Cloud Technologies · Data Engineering · Cloud Data Engineering

Legato Health Technologies

Senior Cloud Data Engineer

Feb 2020 – Dec 2021 · 1 yr 10 mos · Greater Hyderabad Area

  • Worked on an in-house analytics tool that gives users deep insights into various healthcare parameters.
  • Led one of the Data Foundation teams responsible for providing reliable data in Snowflake to the Business Intelligence team.
  • Developed a Quality Analysis Tool to automate various table-specific validations, earning user trust by delivering reliable data.
  • Worked as lead AWS/Big Data developer on an end-to-end framework to securely move data from on-premises systems to the AWS Cloud.
  • Developed various EMR/Glue jobs and pipelines using Step Functions and AWS Lambda to process petabytes of data in the AWS Cloud.
  • Worked on various data migration projects, mainly focused on moving data from Teradata to Snowflake.
Snowflake · AWS · Data Migration · Big Data · EMR · Glue +2
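The table-specific validations mentioned above can be illustrated with a minimal, hypothetical sketch: compare per-table metrics (row counts, null counts, checksums) computed on the source and target sides of a migration. In practice these metrics would come from SQL queries against Teradata and Snowflake; here they are plain dicts, and the function name and check names are assumptions for illustration.

```python
# Hypothetical sketch of table-level validation between a migration
# source (e.g. Teradata) and target (e.g. Snowflake). Real metrics
# would come from SQL queries; here they are hard-coded dicts.

def validate_table(source_metrics, target_metrics,
                   checks=("row_count", "null_count", "checksum")):
    """Compare per-table metrics and return a list of mismatches."""
    mismatches = []
    for check in checks:
        src, tgt = source_metrics.get(check), target_metrics.get(check)
        if src != tgt:
            mismatches.append({"check": check, "source": src, "target": tgt})
    return mismatches

# Example: the target lost two rows during migration, so the
# row count and checksum disagree while null counts match.
source = {"row_count": 1_000_002, "null_count": 17, "checksum": "a9f3"}
target = {"row_count": 1_000_000, "null_count": 17, "checksum": "b411"}
print(validate_table(source, target))
```

A real tool would also handle column-level checks and sampling, but the core idea is the same: compute cheap invariants on both sides and alert on any disagreement.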

Deloitte India (Offices of the US)

Consultant

Aug 2018 – Jan 2020 · 1 yr 5 mos · Greater Hyderabad Area

  • Client: the biggest health insurance company in the US.
  • Project: developed a Spark-based analytics platform to quickly identify adjudicated members in a state, using Spark-Scala, Hadoop, and Hive, with Control-M to schedule the jobs.
  • Developed a Test Automation Engine to help the SIT team automate their testing, reducing manual work to almost zero.
  • Client: the world's largest mobile phone manufacturer.
  • Project: optimised and automated various Spark and Teradata jobs using Python, Spark, data pipelines, and an in-house cloud platform to eliminate redundant manual work. Automations completed over a span of three months saved the company almost 70% of the man-hours involved.
  • Worked with various data scientists as a Data Engineer to automate, optimise, and schedule their Spark jobs.
Spark · Scala · Hadoop · Hive · Automation · Data Engineering +1
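The core of a test-automation engine like the one described for the SIT team can be sketched very simply: run the same query against a trusted baseline and a candidate build, then diff the result sets. This is a hypothetical plain-Python stand-in (the actual engine and its data sources are not described in detail here); the member IDs and table shapes are invented for illustration.

```python
# Hypothetical sketch of the diff step in a SIT test-automation engine:
# compare a baseline result set against a candidate result set and
# report rows that went missing or appeared unexpectedly.

def diff_results(expected_rows, actual_rows):
    """Return rows missing from, and unexpected in, the candidate output."""
    expected, actual = set(expected_rows), set(actual_rows)
    return {
        "missing": sorted(expected - actual),
        "unexpected": sorted(actual - expected),
    }

# Invented example rows: (member_id, state)
baseline = [("M001", "NY"), ("M002", "CA")]
candidate = [("M001", "NY"), ("M003", "TX")]
print(diff_results(baseline, candidate))
```

Automating this diff for every regression query is what turns a manual SIT comparison into a near-zero-effort check.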

Cognizant

2 roles

Program Analyst

Jan 2017 – Jul 2018 · 1 yr 6 mos · Greater Chennai Area

  • Worked as a core PySpark developer with one of the biggest American multinational financial services corporations. CCAR, CECL, and IFRS 9 are regulatory and accounting frameworks used to assess, regulate, and supervise large banks and financial institutions; CCAR was introduced by the US Federal Reserve. Developed and automated reporting for these frameworks using PySpark, Hadoop, Hive, and other Big Data technologies on the Cornerstone machine learning platform.
PySpark · Hadoop · Hive · Big Data · Data Engineering
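The calculations behind frameworks like CECL and IFRS 9 can be illustrated with a toy version of the standard expected-credit-loss formula, ECL = PD × LGD × EAD (probability of default × loss given default × exposure at default), aggregated per portfolio segment. The production work ran as PySpark jobs over Hive tables; this plain-Python sketch uses invented loan records purely to show the shape of the computation.

```python
# Toy expected-credit-loss aggregation in the spirit of CECL/IFRS 9.
# ECL = PD * LGD * EAD, summed per portfolio segment.
# All loan records below are invented for illustration.

loans = [
    {"segment": "retail",    "pd": 0.02, "lgd": 0.4, "ead": 10_000.0},
    {"segment": "retail",    "pd": 0.05, "lgd": 0.5, "ead": 8_000.0},
    {"segment": "corporate", "pd": 0.01, "lgd": 0.6, "ead": 250_000.0},
]

def expected_loss_by_segment(loans):
    """Sum per-loan expected credit loss into per-segment totals."""
    totals = {}
    for loan in loans:
        ecl = loan["pd"] * loan["lgd"] * loan["ead"]
        totals[loan["segment"]] = totals.get(loan["segment"], 0.0) + ecl
    return totals

print(expected_loss_by_segment(loans))
```

In PySpark the same logic would be a column expression plus a `groupBy(...).sum(...)`, but partitioned across a cluster rather than a Python loop.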

Program Analyst Trainee

Jan 2016 – Jan 2017 · 1 yr · Greater Chennai Area

  • Worked as a Spark/Hive developer to migrate a legacy framework to a Big Data platform for a US-based healthcare insurance company. Converted DB2 queries to Hive queries and COBOL code to Spark-Scala using best-practice optimisation techniques. The new framework is now widely used for the company's data analytics work and serves as a model for migrating the remaining segments to the Big Data platform.
Spark · Hive · Data Migration · Data Engineering
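One well-known dialect difference in a DB2-to-Hive query migration is row limiting: DB2 uses `FETCH FIRST n ROWS ONLY`, while Hive uses `LIMIT n`. A hypothetical sketch of one such rewrite rule (real converters apply many rules and handle far more syntax; the function name and example query are invented):

```python
import re

# Hypothetical single rule from a DB2 -> Hive query converter:
# rewrite DB2's "FETCH FIRST n ROWS ONLY" as Hive's "LIMIT n".

def db2_to_hive(query: str) -> str:
    """Apply the row-limit rewrite rule to a DB2 query string."""
    return re.sub(
        r"FETCH\s+FIRST\s+(\d+)\s+ROWS?\s+ONLY",
        r"LIMIT \1",
        query,
        flags=re.IGNORECASE,
    )

print(db2_to_hive("SELECT member_id FROM claims FETCH FIRST 10 ROWS ONLY"))
```

A production converter would parse the SQL rather than rely on regexes, but a rule table of this shape covers the bulk of mechanical dialect differences.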

Education

Lovely Professional University

Bachelor of Engineering - BE — Computer Science

May 2011 – Jun 2015
