Neeraj Goyal

Data Engineer

Bengaluru, Karnataka, India · 10 yrs 8 mos experience

Key Highlights

  • 10 years of experience in data engineering and software development.
  • Expert in designing robust data warehouse solutions.
  • Proficient in cloud platforms like AWS and Azure.

Skills

Core Skills

Data Engineering · ETL · Database Management · Product Engineering

Other Skills

API · Agile Software Development · Alumni Engagement · Amazon Redshift · Apache Airflow · AutoSys Scheduling Expertise · Big Data Processing · Blockchain · CICD Pipeline Implementation · CICD Pipelines · Cloud and On-premises Relational Databases · DB · DWH Design · Data Architects

About

Experienced Data Warehouse Professional with 10 years in software development and a Computer Science degree. Proficient in Python, data engineering, and cloud platforms (AWS, Azure), I specialize in designing and implementing robust data warehouse solutions, optimizing data management, and collaborating with teams to ensure efficient, scalable systems.

Experience

Morgan Stanley

Data Engineer

Sep 2025 – Present · 6 mos · Bengaluru, Karnataka, India · On-site

  • Data Engineering
  • ETL
  • Snowflake
  • Python
  • Problem Solving
Data Architects · Python (Programming Language) · Data Maintenance · Data Modeling · Snowflake · Data Engineering +1

Wissen Technology

3 roles

Senior Principal Engineer

Apr 2024 – Sep 2025 · 1 yr 5 mos · Bengaluru, Karnataka, India

  • Data Engineering
  • CICD pipelines
  • ETL
  • Python
  • Snowflake
  • SQL

Principal Engineer

Promoted

Oct 2023 – Apr 2024 · 6 mos · Bengaluru, Karnataka, India

  • On-Prem to Snowflake Migration: migrating from on-premises databases to Snowflake, ensuring a seamless transition and optimized performance.
  • ETL Solutions Development (Python, YAML, Informatica): developed robust extract, transform, load (ETL) solutions, streamlining data workflows and improving data quality.
  • Integration of New Data Sources (DB, API): onboarded diverse data sources, including databases and APIs, expanding the scope of information available for analysis.
  • AutoSys Scheduling Expertise: applied the AutoSys scheduling tool to ensure timely execution of critical data processes and optimize resource utilization.
  • CICD Pipeline Implementation: orchestrated continuous integration and continuous deployment (CI/CD) pipelines, streamlining the development-to-production process.
  • Agile Software Development (Scrum): used Agile methodologies, particularly Scrum, enabling iterative, collaborative development cycles and effective project execution.

Senior Software Engineer

Sep 2020 – Oct 2023 · 3 yrs 1 mo · Bengaluru, Karnataka, India

  • Worked on cloud and on-premises relational databases, including database design, data modeling, SQL, and stored procedures.
  • Worked with big-data processing technologies such as Snowflake, Hadoop, Spark, and Kafka.
  • Worked on UNIX/Linux platforms with shell scripting.
  • Migrated on-premises systems to the cloud.
  • Developed ETL solutions using Python, YAML, and Informatica.
  • Used the RDBMSs Sybase and/or DB2, writing both simple and complex SQL queries.
  • Worked with data in JSON format and/or NoSQL databases.
  • Scheduled jobs with tools such as AutoSys.
  • Practiced Agile (Scrum) software development.

Zoomcar

Data Engineer

Dec 2019 – Sep 2020 · 9 mos · Bengaluru Area, India

  • Develop, construct, test and maintain architectures
  • Align architecture with business requirements
  • Data acquisition and migration
  • Develop data set processes
  • Use programming language and tools for ETL
  • Identify ways to improve data reliability, efficiency and quality
  • Conduct research for industry and business questions
  • Use large data sets to address business issues
  • Deploy sophisticated analytics programs, machine learning and statistical methods
  • Prepare data for predictive and prescriptive modeling
  • Find hidden patterns using data
  • Use data to discover tasks that can be automated
  • Deliver updates to stakeholders based on analytics
  • Reporting, dashboarding, and storytelling using Tableau

Birla Institute of Technology and Science, Pilani

Database Manager-Alumni Relations

Oct 2018 – Dec 2019 · 1 yr 2 mos · Pilani

  • Cultivate Partnerships with Industry Stakeholders: initiate and oversee collaborative ventures and events with industry partners for mutual growth and success.
  • Strengthen Alumni Engagement: forge and nurture connections with alumni and alumni groups to foster a sense of belonging and support.
  • Maintain the Alumni Database: administer the alumni database, ensuring accurate, up-to-date information on contacts and committee members.
  • Implement and Validate Data Systems: establish and rigorously test new database and data-handling platforms to optimize operations.
  • Monitor Database Performance: oversee the efficiency and functionality of the database, ensuring smooth, reliable operation.
  • Generate Management Reports: design and compile insightful reports that support management decision-making.
  • Standardize Data-Processing Protocols: develop and enforce protocols for streamlined, consistent, and accurate data processing.
  • Formulate Complex Queries: create advanced query structures to extract specific data, deepening analysis and insight.

Cartesian Consulting

Product Engineer

Sep 2017 – Sep 2018 · 1 yr · Bangalore · On-site

  • [Product Engineer: Python | ETL | SQL] Worked on development of SOLUS, Cartesian's flagship product. SOLUS sends highly personalized communications to customers and is an end-to-end platform for running campaigns and campaign analysis. Its goal is to drive customers' engagement with their brands and repeat business, helping clients grow their top-line revenue.
Product Development · Python · ETL · SQL · Product Engineering · Data Engineering

Wipro Limited

Project Engineer

Jun 2015 – Aug 2017 · 2 yrs 2 mos · Kolkata, West Bengal, India · On-site

  • Worked for Novartis Pharmaceuticals, a Switzerland-based pharmaceutical company. Responsibilities on the project included designing and developing the data warehouse (DWH) and implementing ETL pipelines using Informatica PowerCenter and Mettle (Novartis's in-house DWH tool). Built a KPI-driven dashboard showing the vulnerability of servers across the globe, as well as effort-management and DWH-management dashboards that help business users keep an eye on business activities.
DWH Design · ETL Pipeline Implementation · KPI-Driven Dashboard Development · Data Engineering · ETL

Education

Madhav Institute of Technology and Science, Gwalior

Master of Computer Applications (MCA) — Computer Applications

Jan 2012 – Jan 2015

City College Gwalior

BCA - Bachelor of Computer Application — Computer Science

Jan 2008 – Jan 2011
