Nelwin J.

Data Engineer

Mülheim an der Ruhr, North Rhine-Westphalia, Germany · 5 yrs 11 mos experience

Key Highlights

  • Expert in data engineering across multiple cloud platforms.
  • Proven track record in enhancing retail operations through data analytics.
  • Strong collaboration skills with diverse teams across time zones.

Skills

Core Skills

Data Engineering · Microsoft Azure · Google Cloud Platform (GCP) · Web Development · AWS · Sustainability Analytics · Hadoop · Software Engineering · Middleware Integration

Other Skills

Apache Spark · Azure Databricks · GitLab · Hive · Continuous Integration and Continuous Delivery (CI/CD) · Azure Data Factory · Enterprise Data Warehouses (EDW) · Python (Programming Language) · Jira · Technical Documentation · Onboarding · Exploratory Data Analysis · Agile Project Management · Confluence · PySpark

About

Experienced data analytics and engineering professional adept in diverse sectors including banking, real estate, and retail. With a proven track record in harnessing the power of AWS and GCP cloud environments, I am currently delving into data engineering within the Azure cloud sphere. A dynamic team builder with a knack for steering initiatives, I've collaborated on projects spanning locations and time zones. Thriving in challenging work settings, I'm also fueled by a deep passion for music. Let's connect to explore synergies, share insights, and perhaps even exchange some favorite tunes. Looking forward to connecting with fellow professionals who share a drive for innovation and a love for music! 🎶📊 #DataAnalytics #CloudEngineering #TeamLeadership #MusicProduction

Experience

Total Experience: 5 yrs 11 mos · Average Tenure: 1 yr 5 mos · Current Experience: 2 yrs 8 mos

ALDI DX

Senior Expert - Data Engineer

Aug 2023 – Present · 2 yrs 8 mos · Mülheim an der Ruhr, North Rhine-Westphalia, Germany · Hybrid

  • Collaborating with stakeholders such as data scientists, machine learning engineers, and product owners to define requirements and objectives for enterprise data analytics projects aimed at enhancing Aldi store operations.
  • Leveraging Azure Data Factory to design and implement robust data pipelines for efficient extraction, transformation, and loading (ETL) of data from diverse source systems into Azure cloud environments.
  • Ensuring data quality and consistency through rigorous preprocessing and validation techniques within the data pipelines, in alignment with business needs and analytics objectives.
  • Implementing scalable and reliable solutions to handle large volumes of data and optimize performance for building high-performing analytics applications.
  • Continuously iterating and improving data pipelines and analytics products based on feedback, emerging technologies, and evolving business requirements to drive innovation and value for Aldi's store operations.
Apache Spark · Microsoft Azure · Azure Databricks · Hadoop · GitLab · Hive +10

PriceHubble

Data Engineer

Mar 2022 – Aug 2023 · 1 yr 5 mos · Hamburg, Germany · Remote

  • Worked closely with stakeholders to design, build, and maintain the data platform for the GeoInsights feature team at PriceHubble
  • Developed ETL workflows using Airflow, Luigi, PySpark, and BigQuery to collect, transform, and ingest the underlying data for PriceHubble's API products
  • Responsible for identifying and assessing data sources and providers for the introduction and improvement of features
  • Improved logging and monitoring capabilities for various data workflows
  • Worked on a data pipeline to generate datasets from offers and transactions data for training machine learning models for property valuation and for analysing real-estate market trends
  • Contributed to building the team's data infrastructure on Google Cloud Platform in close collaboration with the SRE team
  • Helped with technical onboarding of new engineers and with documentation efforts of the GeoInsights team
Confluence · PySpark · Docker · Google BigQuery · PostgreSQL · Data Pipelines +15

Liminalytics

2 roles

Working Student - Data Science

Jan 2022 – Mar 2022 · 2 mos · Frankfurt am Main, Hesse, Germany

  • Worked closely with the founders to develop the back end and front end of the company's product suite using HTML, CSS, Django, and PostgreSQL
  • Worked on the processing and transformation of geospatial data using Python libraries such as GeoPandas and Shapely for use in the development of a physical risk model
  • Collaborated with the founders to identify and evaluate different technologies for achieving the company's goals and vision
  • Developed interactive maps using Mapbox for the visualisation of various risk indicators on a web application
Amazon Relational Database Service (RDS) · AJAX · HTML · GitHub · Docker · PostgreSQL +10

Data Science Intern

May 2021 – Aug 2021 · 3 mos · Frankfurt, Hesse, Germany

  • Integrated analytics that assess climate and other sustainability impacts, both physical and transitional, across value chains, and translated them into financial risks and opportunities
Amazon Relational Database Service (RDS) · AJAX · GitHub · Docker · PostgreSQL · Data Pipelines +9

Yttrium

Associate Intern - Data Science

Aug 2021 – Dec 2021 · 4 mos · Frankfurt, Hesse, Germany

  • Developed solutions for sourcing and analysing data about B2B tech startups to support the fund's investment decisions
  • Established a data pipeline combining on-premise systems and AWS for storing and processing data sourced from multiple sources
  • Created scrapers to gather data on important KPIs from various online sources
  • Designed APIs to integrate the data sourced by the data science team with the Salesforce environment used by the investment team
  • Successfully completed a data migration project to migrate data from an on-premise database to AWS Cloud
Amazon Relational Database Service (RDS) · GitHub · PySpark · Docker · REST APIs · PostgreSQL +11

EY

Associate Analyst

Oct 2018 – Apr 2020 · 1 yr 6 mos · Bangalore

  • Performed data pre-processing and transformation tasks to support analyst teams working on retail lending issues of a bank
  • Worked on a data migration project to migrate data from a Hadoop data lake to a MongoDB database for API development
  • Supported the implementation of a data lake to store and manage credit-card-related data for a bank
  • Developed an ETL workflow for report generation from access logs and timesheet data as part of internal processes
Hadoop · PySpark · MongoDB · Data Pipelines · Problem Solving · Microsoft Office +6

IBM

Associate System Engineer

Jul 2018 – Oct 2018 · 3 mos · Bengaluru

  • Worked on the development of a middleware integration solution with a messaging system based on the pub-sub model
  • Contributed to project management and technical documentation activities of the team
  • Worked on L1 support of the middleware integration product
Java · Problem Solving · Microsoft Office · Git · Microsoft Excel · SQL +3

Education

University of Koblenz and Landau

Master's degree — Web and Data Science

Jan 2020 – Jan 2022

Cochin University of Science and Technology

Bachelor of Technology - BTech — Computer Science

Jan 2014 – Jan 2018
