Manoj Pandey

Senior Software Engineer

Hyderabad, Telangana, India · 19 yrs experience

Key Highlights

  • 18+ years of experience in Data Engineering.
  • Expert in Azure Data Platform and BI solutions.
  • Microsoft Certified Professional in SQL Server.

Skills

Core Skills

Azure Data Platform · Data Engineering · Data Migration · Data Warehouse Development · Business Intelligence · Traffic Management Systems · Data Quality Management · Database Development · Reporting And Analysis · CRM Data Migration · CRM Application Support

Other Skills

Azure Data Lake · Azure Databricks · Spark/Scala · PySpark · Python · SQL · Azure Data Factory (ADF) · Azure CosmosDB · Azure Analysis Service (ROLAP Cube) · Power BI · Excel · Data Factory · Databricks · Azure Event Hub · SQL Server

About

Overall 18+ years of rich experience as a Data Engineer, with end-to-end implementations of Big Data, DW-BI, ETL, and Reporting solutions on the Microsoft Azure data platform stack.

  • Currently working as Lead Data Engineer with Microsoft India for 12+ years.
  • Experienced in large-scale, big-data, and modern DW-BI data solutions on the Microsoft Azure cloud using Azure Data Lake, Databricks, Synapse, Azure SQL DB, CosmosDB, Data Factory (ETL), AAS Cube, and Power BI, as well as on-premises solutions.
  • Leading a team of Data Engineers responsible for developing data engineering solutions: data model design and development, ETL development, and reporting & analytical solutions.
  • Currently working on a Data Lakehouse and custom/self-serve reporting solution using the Databricks/Synapse (Spark, Scala), Data Factory ETL pipelines, AAS Cube (ROLAP/Tabular), and Power BI tech stack.
  • Expert in implementing end-to-end DW/BI solutions by deriving Data Warehouses, Data Lakehouses & Data Marts from existing OLTP, relational, NoSQL, and heterogeneous source systems.
  • Strong in SQL querying with Spark & SQL Server: writing complex & optimized SQL scripts and stored procedures, query tuning & optimization, and administration.
  • Hands-on experience with data & dimensional modelling, and database design and development.
  • Managed several upgrades and migrations of VLDBs with TBs of data, complex/custom business logic and code, and 3rd-party tool dependencies.
  • Microsoft Certified Professional (MCP), 70-461 – Querying Microsoft SQL Server 2012/2014.
  • Evangelizing "SQL and Data Analytics" through my personal blog, SQLwithManoj.com.

Experience

Microsoft

3 roles

Senior Software Engineer

Sep 2020 – Present · 5 yrs 6 mos

  • Working as a Senior Data Engineer and supporting Microsoft's mission-critical business applications and data platform for its CE&S (Premier) and Consulting (ISD/MCS) business.
  • Currently leading a team of 10 engineers on a project where we are building an end-to-end Data Lakehouse, BI & Reporting solution on the Azure data platform. This includes Azure Data Lake, Data Factory (ADF), Databricks, Synapse, Spark, Scala, Analysis Services, and Power BI.
  • Led a project with five engineers to automate manual, error-prone, and time-consuming processes in our Finance Data Lake. This included providing a UI for upstream and downstream partners to manage data contracts; using these contract details, we provisioned access security and data availability for users of our Azure Data Lake.
  • Lead the team through Scrum planning & meetings, task prioritization, and allocation.
  • Help new team members grow and adapt to the org culture through coaching.
  • Contributed to the org by bringing in the right talent (freshers & lateral hires) through participation in hiring events.
  • Technology: Azure Data Lake, Azure Databricks, Spark/Scala, PySpark, Python, SQL, Azure Data Factory (ADF), Azure CosmosDB, Azure Event Hub, Azure Analysis Service (ROLAP Cube), Power BI & Excel for self-service BI.
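The data-contract automation described above can be sketched in outline. This is an illustrative, pure-Python sketch; the class, function, and dataset names (`DataContract`, `derive_acl_entries`, the lake paths) are hypothetical, and the actual solution ran on the Azure data platform rather than plain Python.

```python
from dataclasses import dataclass

# Hypothetical shape of a data contract; field names are illustrative,
# not the actual schema used in the Finance Data Lake project.
@dataclass(frozen=True)
class DataContract:
    partner: str    # upstream/downstream team that registered the contract
    dataset: str    # data-lake path the contract covers
    direction: str  # "upstream" (publishes data) or "downstream" (consumes it)

def derive_acl_entries(contracts):
    """Turn registered contracts into (dataset, partner, permission) ACL
    entries, replacing the manual access-provisioning step."""
    entries = []
    for c in contracts:
        # Publishers need write access; consumers need read access.
        permission = "write" if c.direction == "upstream" else "read"
        entries.append((c.dataset, c.partner, permission))
    return sorted(entries)

contracts = [
    DataContract("finance-ops", "lake/finance/ledger", "upstream"),
    DataContract("reporting", "lake/finance/ledger", "downstream"),
]
print(derive_acl_entries(contracts))
```

Registering contracts once and deriving access from them keeps provisioning consistent and auditable, which is the point of replacing the manual process.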

Software Engineer 2

Dec 2016 – Aug 2020 · 3 yrs 8 mos

  • Worked as a Data Engineer & DW-BI developer supporting Microsoft's mission-critical business for its Premier partners & customers. The system was built on legacy, on-premises applications and tools that could not scale with the increasing data volume & velocity and the changing needs of the business.
  • The most impactful project I delivered for the line of business was the decommissioning of a legacy application, MSSolve, and its migration to a new application. As an IC, I worked with the partner engineering & PM teams to identify gaps, gather breaking changes, and ensure the migration was done seamlessly at our BI end. This also involved moving from the old data-pull model to a data-push model from the partner source team, streaming data through Azure Event Hub into Azure Data Lake, and creating a module for handling and routing events to the SQL Server database using Databricks, Spark & Scala. I led this project with three engineers and completed it successfully without any downtime or issues.
  • As a senior resource, I led a team of 8 engineers in migrating the legacy ETL framework, built on Informatica, to Azure Data Factory (ADF), a cloud-based ETL service.
  • As a senior IC, I worked on various transformations & migrations at the data platform level within the group, such as SAP data onboarding and the retirement of a legacy application.
  • Technology: Azure Data Lake, Azure Event Hub, Databricks, Spark/Scala, SQL Server, T-SQL, Informatica PowerCenter, Azure Data Factory (ETL)
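The event-handling and routing module mentioned above can be outlined as follows. This is a minimal, pure-Python sketch with hypothetical event types and table names; the production module was built in Scala on Databricks/Spark reading from Azure Event Hub, which is not reproduced here.

```python
import json

# Hypothetical routing rules: event type -> target SQL Server table.
ROUTES = {
    "case_created": "dbo.Cases",
    "case_updated": "dbo.Cases",
    "contract_changed": "dbo.Contracts",
}

def route_event(raw_event: str):
    """Parse one pushed event and decide which table its payload lands in.
    Unknown event types go to a dead-letter table for later inspection."""
    event = json.loads(raw_event)
    table = ROUTES.get(event.get("type"), "dbo.DeadLetter")
    return table, event.get("payload", {})

table, payload = route_event('{"type": "case_created", "payload": {"id": 42}}')
print(table, payload)  # dbo.Cases {'id': 42}
```

Keeping the routing table declarative means new event types can be onboarded without touching the handler logic, which matters when the upstream partner controls the event schema.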

Software Engineer

Jan 2012 – Nov 2016 · 4 yrs 10 mos

  • Worked as a database & DW-BI developer supporting PSR, a mission-critical system for Microsoft IT (MSIT org) and its Premier-Support business. The system comprised a Data Warehouse built on SQL Server and PDW, an SSAS Cube, a reporting application built with SSRS plus custom Excel reports for self-service reporting, and ETL using Informatica & SSIS.
  • Owned the DW-BI system end to end: acquiring data from various sources through ETLs, storing it in a Factory/Staging database, and integrating it under multiple complex business rules using optimized stored procedures; creating Data Marts and maintaining a dimensional model for the OLAP Cube, with a reporting application built on top for business users.
  • As an IC, I also performed upgrades of the Data Warehouse, OLTP database, and OLAP SSAS Cube from the aging SQL Server 2008 to SQL Server 2012 (and later from 2012 to 2014) without any production downtime. This involved identifying & replacing incompatible tools & apps and ensuring that functionality relying on discontinued or deprecated features continued to work seamlessly after the upgrade, with the same or better performance.
  • Technology: SQL Server 2012, SQL Azure (PaaS), T-SQL, Informatica Power Center, METaL (SSIS framework), SSAS, SSRS

NIIT Technologies Limited

Senior Software Engineer

Nov 2011 – Jan 2012 · 2 mos · Noida, Uttar Pradesh, India

  • Tenix, an automated traffic management system for the Australian government.
  • Technology: MS SQL Server 2008, TSQL

RMS India

Senior Software Engineer

Apr 2011 – Nov 2011 · 7 mos · Noida, Uttar Pradesh, India

  • The RMS® Data Quality Toolkit, a robust client-server application, enables insurers and reinsurers to assess exposure data quality, measure its impact, and target data improvements where they matter most for catastrophe risk models. Data can be viewed at a macro level, e.g. by cedant or portfolio, or drilled down to the location level. The Toolkit delivers objective, independent insight into data quality, providing transparent, actionable, and easy-to-communicate metrics to inform portfolio management and underwriting decisions.
  • Unlike conventional metrics, the scoring metrics produced by the Data Quality Toolkit use RMS' Data Quality Analytics, built upon the foundations of its catastrophe models, to weight the quality of data by the importance of vulnerability, hazard, and net exposure. Scores account for the severity and gradient of the hazard, the relative importance of modeling attributes (e.g. occupancy, construction class, year built, and number of stories) for that region and peril, and the implications of financial structures, including attachment points.
  • Responsibilities:
  • Understood business needs and functional specifications, worked with the UI team in collaboration with onsite staff in the US & UK, and provided suggestions & solutions.
  • Technology: MS SQL Server 2008, T-SQL, MS SQL Server Integration Services (SSIS)

Eli India

3 roles

Software Developer

Dec 2010 – Mar 2011 · 3 mos · Faridabad, Haryana, India

  • The AdvantageCS system (discussed in the previous project) also lacked the features of a CRM system; tracking the sales pipeline and reporting on customer sales and service had become essential.
  • We evaluated many CRM solutions on the market to suit our needs and budget; FreeCRM & Salesforce were the two viable candidates. SFDC was expensive but more flexible, robust, and user-friendly: a cloud application with fast response times. FreeCRM, on the other hand, was very cheap but less flexible than SFDC, with a complex architecture and slow response times as an on-premises application. We finally settled on SFDC's Sales Cloud Professional Edition.
  • Roles & responsibilities:
  • Customized the SFDC screens to capture all information related to leads, contacts, accounts, and sales.
  • Set up backend integration to push leads and related data from our main system, AdvantageCS (SQL Server backend), to SFDC.
  • Used MS SQL Server Integration Services (SSIS) and Apatar (a free, open-source 3rd-party ETL tool) as Business Intelligence (BI) tools to integrate them.
  • With prior experience in CRM systems, I also trained the Sales Reps & Managers on SFDC.

Software Developer

Promoted

Apr 2010 – Nov 2010 · 7 mos · Faridabad, Haryana, India

  • AdvantageCS holds information on millions of customers, publications, newsletters, and other inventory items such as CDs, transcripts, and Training Camps for medical coding. It is a robust system in itself, but every system has limitations and drawbacks.
  • For example, there was no provision for maintaining information related to Training Camps and live products (CDs/DVDs). With new business processes coming online, a new system was needed that could support those processes while remaining extensible & flexible for future needs.
  • The new system would also reduce the number of Advantage licenses, reducing operating costs globally.
  • "Interface" acts as an extension to AdvantageCS, maintaining extended information for the various items it contains. It has two parts: inventory management, managed by admins; and reporting, used by various users & management, which would eliminate many AdvantageCS licenses.
  • Technology: MS SQL Server 2005, T-SQL, ASP.net Maker, ASP.net Report Maker, MS ASP.net HTML, JavaScript, AJAX
  • Role: Technical Analyst, Database Developer

Circulation Reporting Analyst

Aug 2009 – Mar 2010 · 7 mos · Faridabad, Haryana, India

  • Eli uses Advantage Computing System (AdvantageCS) as its internal tool to store customer data and preferred subscriptions. Subscriptions include various publications, magazines, newsletters, audio & live conferences, etc. related to health care, sports, adventure, finance, travel, business, and more. Advantage tracks every outgoing publication, ensures its timely delivery to customers, and helps identify new and prospective customers with the support of the internal marketing team.
  • As a team, we provided production support to the system from the backend. My work included creating and modifying complex SQL queries, stored procedures, triggers, and refreshable/pivot Excel reports on an ad-hoc basis. I also provided end-user support to Advantage users by handling admin activities such as user management, process/job creation, and issue tracking & troubleshooting in a timely manner.
  • Technology: MS SQL Server 2005, T-SQL, Advantage CS, Crystal Reports, MS Excel, VBA
  • Role: Reporting Analyst, Business Analyst, Database Developer

Sapient

Associate Technology L2

May 2008 – Jul 2009 · 1 yr 2 mos · Gurugram, Haryana, India

  • Toronto-Dominion Bank Financial Group is headquartered in Toronto, ON, Canada. TD bought two banks in the US, Commerce Bank and Banknorth; at Sapient we were involved in integrating these banks to provide a seamless TD Bank experience to all customers.
  • The banking mainframe system is based on the FIS (Fidelity Information Services) system, with external system interfaces including Image Archive (Titan & AFS), Solimar (eStatements), Metavante (card management system), RETI (holding the bank's branch information), and the SQN fraud-detection system. Services for consumer apps (channels) include EH-Teller, the Call Center app, and Retail & Small Business Banking. The contract for this application was to expose a number of the core services provided by Fidelity and other external systems (webMethods Broker, JaQI, IBM CICS adapter, etc.) as Web Services that can be consumed by different channels/clients.
  • Responsibilities:
  • Created and modified business rules at the backend for the Business Support Center CRM application (stored procedures), and wrote SQL queries to fetch data for reporting.
  • Technology: MS SQL Server 2005, T-SQL, Internal CRM Application
  • Role: Database Developer

Janus Henderson Investors U.S.

Sapient Consultant for JANUS

Jul 2007 – Apr 2008 · 9 mos · Denver Metropolitan Area

  • Janus Capital Group (Denver, CO, USA) decided to phase out its old CRM tool (Onyx) and switch to a new, robust, and dynamic CRM, i.e. Salesforce (SFDC). With Sapient as its existing support partner and Salesforce as the vendor, Janus decided to use SFDC as its CRM tool going forward.
  • Played a key role in the process of data migration from Onyx CRM to SFDC. This involved data mining, cleansing, reporting, and creation of various stored procedures, triggers, Informatica mappings & Autosys jobs. It also involved the migration of integrations with 3rd parties like ADP, Wall Street on Demand (WSoD), iMake News, Moore, BARS, etc.
  • Data includes the personal information of millions of Janus customers, their accounts, funds, fund value, newsletters, gifts, and collaterals.
  • Technology: Onyx CRM, SFDC, T-SQL, MS SQL Server 2000, DTS, Informatica Power Center, Autosys
  • Role: Onsite POC for India team, Technical Analyst, Database Developer

Sapient

Associate Technology L1

Feb 2006 – Jun 2007 · 1 yr 4 mos · Gurugram, Haryana, India

  • Onyx is a web-based CRM application that manages all customer information in a central database. Internal users at Janus (Janus Capital Group, Denver, CO, USA) are its customers: Janus client information and tracking are handled through Onyx. Onyx helps manage every business interaction with customers, from initial contact through sales cycles and customer-care programs; using it, users can gather, integrate, and share valuable customer information across the organization.
  • At Sapient, we were engaged in ongoing production support for the CRM application.
  • Responsibilities:
  • Wrote and updated T-SQL code (scripts, stored procedures, triggers) to implement new & existing business rules.
  • Ran data loads, created/updated UIs, and monitored 3rd-party integrations.
  • Responded to ad-hoc client requests and handled critical issues in a timely manner.
  • Technology: MS SQL Server 2000, T-SQL, DTS, Onyx CRM, Informatica Power Center, Autosys, ASP, JavaScript
  • Role: Technical Analyst, Database Developer, Business Analyst

Education

APEEJAY School of Management (GGS IP Univ. New Delhi)

MCA — Computer Science & IT

Jan 2003 – Jan 2006

Amrapali Institute (Kumaon Univ. Nainital, UA)

BCA — Computer Science & IT

Jan 2000 – Jan 2003

Beersheba Sr. Sec. School, Haldwani, Nainital, UA

10+2

Jan 1992 – Jan 2000

