Ayush Gupta

DevOps Engineer

Gurgaon, Haryana, India · 8 yrs 1 mo experience

Key Highlights

  • Over 7.5 years of experience in Azure data engineering.
  • Gold badge holder for SQL and Python on HackerRank.
  • Expert in building complex data pipelines and dashboards.

Skills

Core Skills

Azure Data Engineering · Data Pipeline Development · Data Visualization · Data Engineering · Exploratory Data Analysis · Data Analysis

Other Skills

Airflow · Algorithms · Analytical Skills · Apache · Azure Data Factory · Azure Data Lake · Azure DevOps · C · C++ · Data Modeling · Data Transformation · Database Development · Database Queries · Databases · Matlab

About

Experienced Azure cloud data engineer with more than 7.5 years of experience. Skilled in Azure Data Factory, Azure Databricks, Azure DevOps, Azure Logic Apps, Azure Data Lake Storage, Azure analytics services, Azure Synapse, and other Azure services related to data engineering. Strong in big data analysis using Python pandas DataFrames, Apache Spark, and Excel, and in data visualization with Tableau Desktop, Tableau Server, Matplotlib, Seaborn, Plotly, and Cufflinks. Proficient in writing complex SQL queries; gold badge holder for SQL on HackerRank. Good at problem solving and algorithms in Python, Java, and C++; gold badge holder for Python on HackerRank. Experienced with machine learning algorithms in scikit-learn, including linear regression, logistic regression, k-nearest neighbors, k-means clustering, SVM, NLP, decision trees, and random forests, and with building deep learning and neural network models in TensorFlow and Keras. Strong engineering professional with a Bachelor of Technology (B.Tech.) in Electrical and Electronics Engineering from Galgotias College of Engineering and Technology.

Experience

HCLTech

Senior Technical Lead

Jun 2025 – Present · 9 mos · Gurugram, Haryana, India

Fractal

Lead Data Engineer

Jan 2023 – May 2025 · 2 yrs 4 mos

  • Developed the Azure CI/CD build and release pipelines in Azure DevOps that migrate Databricks notebooks, Azure Data Factory pipelines, and Azure Logic Apps into QA and production environments.
  • Built end-to-end ETL pipelines in Azure Data Factory.
  • Developed PySpark Databricks notebooks for all kinds of data transformation per client requirements.
  • Created external Delta-format tables for the existing fact and dimension tables in ADLS under Unity Catalog, and migrated an existing Databricks workspace to Unity Catalog.
  • Loaded datasets into Power BI and developed dashboard frontends to clients' needs.
  • Automated client reports by creating four dataflows on Power BI Service, removing all manual intervention previously done by the client.
  • Handled ad hoc requests to extract data from the backend per client requirements.
  • Served as the Power BI SME for optimizing and developing dashboards across all projects at Unilever.
  • Served as the productionization SME for migrating Azure services to higher environments across all projects at Unilever.
  • Reduced the execution time of two client dashboards to one-tenth by moving heavy, complex calculations from Power BI DAX to PySpark, resulting in revenue visibility of EUR 800K.
Azure DevOps · Azure Data Factory · PySpark · Power BI · Data Transformation · Azure Data Engineering
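The DAX-to-PySpark optimization above amounts to pre-aggregating heavy calculations in the pipeline so the dashboard only reads a small summary table. A minimal sketch of the idea, using pandas as a stand-in for PySpark; the table and column names are hypothetical, not the actual client model:

```python
import pandas as pd

# Hypothetical fact table standing in for the client's dimensional model.
fact = pd.DataFrame({
    "region": ["EU", "EU", "NA", "NA"],
    "product": ["A", "B", "A", "B"],
    "revenue": [100.0, 200.0, 150.0, 50.0],
})

# Pre-aggregate in the data layer (PySpark in practice) so the dashboard
# reads a small summary table instead of recomputing measures in DAX on
# every refresh.
summary = (
    fact.groupby("region", as_index=False)["revenue"]
        .sum()
        .rename(columns={"revenue": "total_revenue"})
)
print(summary)
```

The dashboard then binds to `summary` directly; the expensive row-level work runs once in the pipeline rather than interactively in the report.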

Deloitte India (Offices of the US)

Data Engineer

Dec 2020 – Jan 2023 · 2 yrs 1 mo · Gurugram, Haryana, India

  • Leading healthcare client.
  • Performed exploratory data analysis using PySpark in Azure Databricks to build a personalization engine for the client, running multiple experiments to achieve the desired outcome.
  • Enhanced the framework based on source data of patient responses from Azure Data Lake to achieve 100 percent adherence in the order-ready return-to-stock program; the work also involved scheduling and testing Airflow jobs.
  • Developed a PySpark feature introducing MMS functionality that embeds media content, including pictures and videos, into SMS messages sent to target patients; including MMS content increased Rx pick-up rates from stores. Implemented multiple narrow transformations in the experimentation framework to fulfill the business requirement.
  • Developed a PySpark feature adding a new set of verbiage to the SMS content for the flu season for target patients, involving multiple narrow transformations using best optimization techniques.
  • Worked on a development to make target patients aware of store lunch hours in the states of California and Illinois, again using multiple narrow transformations with best optimization techniques.
  • Created validation notebooks and performed end-to-end testing of Airflow DAGs in Azure Databricks.
  • Solely responsible for understanding client requirements and ensuring deliverables were on time.
PySpark · Azure Data Lake · Airflow · Data Analysis · Data Engineering · Exploratory Data Analysis
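The narrow transformations mentioned above process each record independently (no shuffle), which is why they map cleanly onto Spark's map/filter operations. A minimal sketch in plain Python; the record fields, states, and message text are hypothetical, not the client's schema:

```python
# Hypothetical patient records; each transformation below touches one
# record at a time, i.e. a narrow transformation in Spark terms.
patients = [
    {"id": 1, "state": "CA", "msg": "Your Rx is ready."},
    {"id": 2, "state": "IL", "msg": "Your Rx is ready."},
    {"id": 3, "state": "NY", "msg": "Your Rx is ready."},
]

TARGET_STATES = {"CA", "IL"}  # illustrative: states getting the extra verbiage
LUNCH_NOTE = " Note: the store closes for lunch 12-1 pm."

def add_lunch_hours(record):
    """Append the lunch-hours verbiage only for patients in target states."""
    if record["state"] in TARGET_STATES:
        return {**record, "msg": record["msg"] + LUNCH_NOTE}
    return record

# In PySpark this would be df.withColumn(...) or rdd.map(add_lunch_hours);
# a list comprehension stands in here.
enriched = [add_lunch_hours(p) for p in patients]
```

Because no grouping or join is involved, Spark can apply this per-partition without moving data between executors.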

Ericsson India

Data Analyst

Oct 2017 – Sep 2020 · 2 yrs 11 mos · Gurgaon, India

  • Sprint USA. Key responsibilities:
  • Analyzed the cell-availability KPI of 2G, 3G, and 4G technology sites for the whole market from datasets received from the NOC team, using pandas.
  • Performed statistical analysis on data at daily, weekly, and monthly frequencies, including trend comparison and prediction.
  • Prepared fault-categorization reports for all sites in the market on a weekly and monthly basis, visualized with Tableau.
  • Prepared outage summary reports of the sites on a weekly and monthly basis, visualized with Tableau.
  • Identified the top 10 longest non-performing sites in the whole market.
  • Performed NH monitoring of sites for a week by extracting KPIs and running RCA when a KPI trend fell short; on any major degradation at a site, raised a flag and contacted the market lead about the issue as soon as possible.
  • Monitored KPIs of special events taking place in the USA on a daily basis, such as CFR, FAILURES, CDR, DROPS, ATTEMPTS, and HOSR; prepared Tableau dashboards for the same KPIs on a regular basis and shared them with the customer.
  • Created and analyzed Root Metrics reports for a whole market; identified sleeping cells by extracting data for all sites in the market and performed root-cause analysis to determine why those cells were not taking traffic.
Pandas · Tableau · Statistical Analysis · Data Analysis · Data Visualization
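The weekly trend comparison and degradation flagging described above can be sketched with the standard library alone; the KPI values and the 1-percentage-point threshold are illustrative assumptions, not Sprint's actual figures:

```python
from statistics import mean

# Hypothetical daily cell-availability KPI (%) for one site over two weeks;
# in practice this came from NOC team extracts loaded with pandas.
week1 = [99.2, 99.5, 99.1, 99.4, 99.3, 99.6, 99.2]
week2 = [97.8, 98.1, 97.5, 98.0, 97.9, 98.2, 97.7]

# Illustrative rule: flag for RCA if the weekly average drops by more
# than one percentage point week over week.
THRESHOLD = 1.0

drop = mean(week1) - mean(week2)
flag_for_rca = drop > THRESHOLD

print(f"week-over-week drop: {drop:.2f} pp, RCA flag: {flag_for_rca}")
```

A flagged site would then be escalated to the market lead, mirroring the workflow in the bullets above.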

Uttar Pradesh Rajya Vidyut Utpadan Nigam Limited

Summer Trainee

Jun 2016 – Jul 2016 · 1 mo · Kanpur Area, India

Education

D P Vidyaniketan

Intermediate — PCM with Computer Science

Galgotias College Of Engineering And Technology

Bachelor of Technology (B.Tech.) — Electrical and Electronics Engineering
