Ram Kumar G

Data Engineer

Bengaluru, Karnataka, India · 10 yrs 7 mos experience

Key Highlights

  • 8+ years of experience in Cloud and Data Engineering.
  • Expertise in Azure and GCP data solutions.
  • Strong background in Data Warehousing and Business Intelligence.

Skills

Core Skills

Data Engineering, Cloud Technologies

Other Skills

Apache Kafka, Apache Spark, Azure AI, Azure Data Factory, Azure Data Lake, Azure Databricks, Business Intelligence & Data Warehousing, Data Analysis, Data Warehousing, Delta Lake, E-Learning, Google Cloud Platform (GCP), Hadoop, Hive, Insurance

About

8+ years of experience in Cloud, Business Intelligence, and Data Warehousing using a range of tools and methodologies. Worked on Data Modelling, Data Integration, and Data Analytics.

Skills:

  • Azure: Azure Data Factory, Databricks, Azure Data Lake
  • GCP: BigQuery, GCS, Dataproc, Composer (Apache Airflow)
  • Hive, Spark, Spark SQL, BigQuery
  • MSBI (SSIS, SSAS)
  • SQL Server
  • Python for Data Science

Experience

Deloitte

Senior Consultant – Data Engineer

Jun 2024 – Present · 1 yr 9 mos · Bengaluru, Karnataka, India · Hybrid

Azure Data Factory · Apache Spark · Azure Databricks · Azure AI · Data Warehousing · Azure Data Lake · +5

Singapore Department of Statistics (DOS)

Lead Data Engineer

Jul 2023 – Feb 2024 · 7 mos · Singapore · On-site

Walmart Global Tech

Data Engineer III

Aug 2020 – Jul 2023 · 2 yrs 11 mos · Bengaluru, Karnataka, India

Azure Data Factory · Apache Spark · Azure Databricks · Google Cloud Platform (GCP) · Data Warehousing · Azure Data Lake · +5

Accenture

Senior Software Engineer

Jul 2019 – Jul 2020 · 1 yr · Bengaluru, Karnataka, India

  • Nearly 5 years of experience in the analysis, development, integration, and maintenance of web-based and client/server applications using Python and Big Data technologies such as Kafka, Azure Data Factory, Databricks, Spark, and PySpark.
  • 2.5 years of solid experience in the Insurance domain, with additional experience in E-learning.
  • Skilled in automating repetitive tasks using PySpark and the AutoSys job scheduler.
  • Hands-on experience with Spark SQL and Spark Streaming, with good knowledge of performance-tuning techniques.
  • Good knowledge and hands-on experience with Azure Data Factory, Azure Databricks, Azure DW, and Azure Data Lake.
  • Extensive experience using GitHub for version control.
  • Experienced in using Kafka as a distributed publisher-subscriber messaging system.
  • In-depth understanding of Spark architecture, including Spark Core, Spark SQL, and DataFrames.
  • Experienced in building ETL pipelines using Azure Data Factory and Databricks.
  • Good knowledge of ETL and hands-on experience in ELT.
  • Experienced with Azure services such as Azure SQL, Azure DW, Blob Storage, and Azure Data Lake.

Skience (formerly Athene Group)

Software Engineer

Jul 2018 – Jun 2019 · 11 mos · Bengaluru, Karnataka, India

  • Gathered requirements and created analysis documents for migration.
  • Developed generalized Databricks notebooks for code reuse based on configurations.
  • Developed U-SQL scripts to extract, transform, and load files in Data Lake, creating views and custom transformations.
  • Built ETL pipelines using Azure Data Factory and Databricks.
  • Good knowledge and hands-on experience across the Hadoop and Spark ecosystems, including Apache Hive, Kafka, Apache Impala, NiFi, Oozie, Phoenix, HBase, Elasticsearch, and Kibana.
  • In-depth understanding of Spark architecture, including Spark Core, Spark SQL, and DataFrames.
  • Developed and executed database queries and conducted analyses.
  • Developed SQL Server objects such as tables, views, stored procedures, triggers, indexes, and UDFs according to business requirements.
  • Created data integration and technical solutions with Azure Data Lake Analytics, Azure Data Lake Storage, Azure Data Factory, and Azure SQL databases, providing analytics and reports to improve marketing strategies.
  • Created ADL, Azure Storage, ADB, and resource groups in the Azure portal.
  • Created linked services to connect to Azure Storage and on-premises SQL Server.
  • Created U-SQL scripts for transform activities and developed complex queries to transform data from multiple sources and output it into Azure Analytics.

Excelsoft Technologies

Software Engineer

Feb 2015 – Jul 2018 · 3 yrs 5 mos · Mysuru, Karnataka, India

  • Gathered requirements from the business, interacted with the team, and analyzed, designed, and developed databases using ER diagrams, normalization, and relational database concepts.
  • Involved in the design, development, and enhancement of the system's ETL packages.
  • Developed SQL Server objects such as tables, views, stored procedures, triggers, cursors, indexes, and UDFs according to business requirements.
  • Designed cubes with snowflake schemas using SQL Server Analysis Services 2008 (SSAS).
  • Created and designed data sources and data source views, and configured OLAP cubes (star and snowflake schemas) using SSAS.
  • Extensively involved in SSAS storage, partitions, and aggregations; calculated queries with MDX; and developed reports using MDX and SQL.
  • Extensively used joins and sub-queries to simplify complex queries involving multiple tables.
  • Extensively worked on data extraction, transformation, and loading from source to target systems using SQL Server Integration Services.
  • Created workflows with Data Flow Task, Script Task, Execute SQL Task, and containers.
  • Used transformations such as Derived Column, Conditional Split, Aggregate, Multicast, Sort, Merge Join, Data Conversion, and Union All.
  • Created configurations, checkpoints, and breakpoints.
  • Maintained log information in a text file to track errors and record package execution status.
  • Developed general and parameterized Azure Data Factory pipelines, activities, datasets, and linked services.
  • Implemented Copy, Stored Procedure, Lookup, Metadata, ForEach, and other pipeline activities for on-cloud ETL processing.

Education

Bangalore University

Master of Computer Applications - MCA — Computer Science
