Mandeep Cheema

Operations Associate

Gurugram, Haryana, India · 16 yrs 11 mos experience
Highly Stable · AI Enabled

Key Highlights

  • 16 years of experience in project and engineering management.
  • Expert in Enterprise Big Data & Analytics solutions.
  • Recognized as Most Valuable Professional at Databricks.

Skills

Core Skills

Data Engineering · Cloud Storage

Other Skills

Analytical Skills · Analytics · Apache Pig · Apache Spark · Artificial Intelligence (AI) · Big Data Analytics · Business Analysis · Business Analytics · Business Development · Business Intelligence (BI) · Business Process Improvement · Business Relationship Management · Capacity Planning · Cloud Consulting · Cloud-Native Applications

About

Over 16 years of expertise and proven experience in project and engineering management, sales, solution design, and implementation of Enterprise Big Data & Analytics solutions, both on-premises and in the cloud. Experience includes broad global exposure from working in various geographies, including Europe, APAC and EMEA, and remote collaboration with teams spread across the globe. Hands-on experience with Azure, Databricks, Hadoop, Python and Spark.

Experience

Atlan

Technical Account Manager

May 2024 – Present · 1 yr 10 mos

Machine Learning · Cloud Storage · Business Analytics · Large Language Models (LLM) · Data Strategies · Stakeholder Management +5

Databricks

2 roles

Senior Solutions Architect

Feb 2023 – May 2024 · 1 yr 3 mos

Machine Learning · Cloud Storage · Business Analytics · Large Language Models (LLM) · Data Strategies · Optimization +13

Solutions Architect

Aug 2021 – Jan 2023 · 1 yr 5 mos

  • Serve as a trusted advisor in Big Data & AI solutions, with a focus on Databricks, multi-cloud, Python, SQL and Spark.
  • Leverage hands-on experience in Databricks to align with customer priorities.
  • Present technology solutions and strategic vision to both executive and technical stakeholders, establishing credibility as a Trusted Advisor.
  • Conduct engaging demos and workshops to illustrate the practical applications and benefits of the Databricks platform.
  • Collaborate closely with sales and leadership teams to align technical solutions with business priorities.
  • Lead engagements with customers and stakeholders, showcasing the value of the Databricks Platform from demo to proof of concept to workshops and implementation.
  • Maintain a deep understanding of competitive and complementary technologies, facilitating effective positioning of Databricks.
  • Create repeatable processes and documentation based on customer engagements, ensuring effective knowledge transfer.
  • Collaborate on and create industry-based solutions that enhance the value of Databricks for diverse customers.
  • Achieved multiple quarterly awards for closing the highest number of logos at Databricks.
  • Recognized as the Most Valuable Professional twice consecutively at Databricks.
  • Successfully initiated and organised a global LLM Cup Hackathon at Databricks, promoting innovative solutions.
Machine Learning · Cloud Storage · Business Analytics · Large Language Models (LLM) · Data Strategies · Optimization +12

Tata Consultancy Services

10 roles

Senior Solution Architect - Big Data & AI at ABN AMRO

Promoted

Jun 2020 – Jul 2021 · 1 yr 1 mo

  • Design and implement solutions for a data-driven bank using cloud and Big Data technologies
  • Provide thought leadership to the department for data and AI capabilities in Azure
  • Lead Azure solution architecture for all Domain Data teams of the bank
  • Work closely with Microsoft Azure, Databricks & other vendors to conduct POCs and workshops
  • Liaise with Microsoft Azure, Databricks & other vendors for business as usual activities
  • Work with Databricks as a part of their product advisory board
  • Enable data value creation for Retail Banking, Corporate Banking, Operations, Risk, HR, Finance & Taxation, Private Banking
  • Assess various tools & technologies such as Delta Lake & Synapse
  • Identify & create best practices for related teams for continuous improvement
  • Guide & coach development teams to help with the implementations
  • Design streaming, batch ingestion, ETL, orchestration, metadata & lineage, governed file transfer & other usage and consumption patterns for generic usage and capability for various domains
  • Create designs to handle complex unstructured data in a domain data store
  • Create Delta Lake architecture to handle standard workloads
  • Ensure end to end quality delivery of the defined architecture
  • Align with all involved POs on solution designs & architecture, consistent with enterprise architecture
  • Help the POs to define episodes, epics and user stories
  • Conduct workshops with domain teams, Microsoft, Databricks & other involved vendors
  • Review implemented solutions to close gaps if any with the required architecture
  • Promote and advocate inner source way of working for the development community of the bank
  • Align with multiple stakeholders including business & Enterprise Architects
  • Define data engineer platform and architecture roadmap
  • Enable engineers to showcase their developed solutions to the wider development community in the bank
  • Technologies: Spark3, Python, Delta Lake, Synapse, ADF, Azure Functions, Airflow, AKS, Event Hub, Azure DevOps
Machine Learning · Cloud Storage · Business Analytics · Data Strategies · Optimization · Business Relationship Management +11

Product Owner at ABN AMRO

Mar 2020 – Jun 2020 · 3 mos

  • Responsible for creating a high-potential one data platform team to support the growing business demands of the bank
  • Define the vision & scope for the newly formed data platform team to manage all existing & upcoming domain data stores of the bank
  • Connect with various stakeholders for enhanced visibility and required recognition of the team
  • Engage with POs of related domain data stores to identify their product roadmap & priorities
  • Create one roadmap for the one data platform team on the basis of stakeholders' priorities and needs
  • Co-ordinate with Enterprise Architects, Solution Architects, Governance teams & other important stakeholders to identify needs for the successful journey of the team
  • Create episodes, epics and user stories for the one data platform team
  • Manage the backlog for the one platform team
  • Coach one data platform team in various capacities
  • Ensure timely and quality delivery of the committed objectives
  • Organize and chair sprint demos after each iteration
  • Enable the development team to enhance their technical skills
Cloud Storage · Business Analytics · Optimization · Stakeholder Management · Strategic Leadership · Data Engineering +6

Strategy Consultant & Solution Architect

Aug 2019 – Mar 2020 · 7 mos

  • Responsible for setting up Inner source working principles for all the teams within the Data and BI department
  • Review cloud and data solutions for 10+ data teams
  • Create & chair the inner source community of more than 400 members, including data engineers, software engineers, solution architects, POs, scrum masters and enterprise architects
  • Identify gaps and create a common roadmap for all the involved teams
  • Connect with POs of all data teams to define epics and identify requirements along with shared capacity
  • Build a high potential virtual development team
  • Create people, process, technology & data roadmap for the department
  • Assist the department head to define the target operating model for the involved teams
  • Connect with various stakeholders to realize the inner source way of working
  • Advocate the need for collaboration and re-usability
  • Manage the backlog & delivery for inner source virtual team
  • Design Ingestion, ETL & orchestration frameworks to be used as a part of inner source way of working
  • Identify and create reference architecture for the data teams
  • Create generic blueprint and platform structure required for all data teams
  • Create standards for data processing needs of the teams involved
  • Create a platform for multiple data teams to collaborate & showcase their solutions/frameworks to promote re-usability
  • Market the frameworks created through the inner source way of working across the bank to improve adoption of the solutions
  • Liaise between enterprise architects and involved data teams
  • Grow a virtual team of 3 into a physical horizontal team of 17
  • Create working streams within the one horizontal team
  • Organize workshops & hackathons including the vendors and various domains of the bank
Cloud Storage · Business Analytics · Optimization · Stakeholder Management · Strategic Leadership · Data Engineering +6

Product Engineering Lead

Mar 2019 – Dec 2020 · 1 yr 9 mos

  • Define the technical and strategic vision of the TCS IP based products
  • Create & implement end-to-end architecture and design for an Azure-native data quality tool
  • Design & create a Python-based AI/ML framework
  • Develop a Python- and Spark-based, platform-agnostic data quality tool
  • Manage and prioritize backlog of the product
  • Identify & create a reliable team to enable and support the product
  • Guide, coach and motivate the team to achieve the goals
  • Conduct demos & presentations for various clients with help of sales team
Cloud Storage · Business Analytics · Optimization · Stakeholder Management · Strategic Leadership · Data Engineering +6

Solution Architect & Big Data AI Consultant

Aug 2018 – Aug 2019 · 1 yr

  • Work as a part of Analytics & Insights horizontal in TCS
  • Provide consulting for various clients across the globe, mainly in the APAC, EMEA, Europe & North America regions
  • Connect with C-level executives to define & present 3- to 5-year people, process, technology & data roadmaps
  • Conduct assessments to identify technology-related gaps in the Big Data & AI space
  • Connect with various business stakeholders, from developers to C-level executives, to identify gaps
  • Present the assessment results with long term implementation roadmap plan and quick wins
  • Drive end to end RFP process for various business lines
  • Coach the newly joined associates to grow into the field of big data
  • Conduct POCs and present the solutions as a part of the presentations to senior management
  • Help existing delivery teams with code reviews & technical solution guidance
  • Provide knowledge sharing sessions and industry best practices to the wider engineering team
Cloud Storage · Business Analytics · Optimization · Business Relationship Management · Stakeholder Management · Strategic Leadership +7

Solution Designer at ABN AMRO

Feb 2017 – Jul 2018 · 1 yr 5 mos

  • Design & implement a central data distribution hub (data lake)
  • Reviewing and providing inputs in defining the future state architecture.
  • Design of the security framework for the data layer.
  • Co-ordinate with multiple business teams to gather inputs and make the Data Layer GDPR compliant.
  • Set up the CI/CD pipeline for Development, Test, Acceptance, Production environments.
  • Create a metadata driven python framework for data ingestion.
  • Implementation of end to end process from data extraction from data providers till hive tables.
  • Worked with Kerberos implementation in Hadoop, integrating various applications such as MicroStrategy, SAS, Ab Initio and Informatica Data Quality.
  • Identify and implement the best-fit tool for data exploration on Hadoop (Ambari, MicroStrategy).
  • Implemented Informatica Enterprise Data Catalog (EDC) with Hadoop for metadata exploration and lineage tracking.
  • Creation of a web-based access form integrated with Informatica EDC.
  • Prepare and deliver demos to business management.
  • Arrange and deliver knowledge sharing sessions across the Analytics and Insights group.
  • Provide a solution to integrate on-premise Hadoop with Azure and AWS, along with a hybrid solution.
  • Tools used: Azure, AWS, Bitbucket, Jenkins, Python, Hive, HBase, Phoenix, Oozie, MicroStrategy, Apache Ranger, Atlas, Informatica EDC, HTML5
Cloud Storage · Business Analytics · Optimization · Stakeholder Management · Strategic Leadership · Data Engineering +4

Lead Engineer at Chrysler

Dec 2015 – Jan 2017 · 1 yr 1 mo

  • Responsible for creation of data lake at FCA
  • Detailed analysis to understand the functional and technical requirements.
  • Design and build Python based framework for Hadoop Data Ingestion using Sqoop jobs and HAWQ tables.
  • Assisting architects in designing high performance big data Applications based on functional and technical requirements.
  • Set up a 3 node CDH cluster for internal learnings and POCs.
  • Coordination with Infrastructure team and production support teams for daily monitoring and quick issue resolution.
  • Create and execute linear/branching workflows with actions that include Hadoop jobs, Pig jobs, shell scripts, custom actions, etc.
  • Orchestrate a workflow to execute regularly at predefined times, including workflows that have data dependencies.
  • Evaluate and provide feedback on future big data technologies and new releases/upgrades.
  • Tools: HDFS, Pig, Hive, Sqoop, HAWQ, Hue, Oozie, Spark, Microsoft Azure.
Cloud Storage · Business Analytics · Optimization · Problem Solving · Computation · Data Engineering

Lead Engineer

Promoted

Sep 2015 – Nov 2015 · 2 mos

  • Provide solutions to various projects on Big Data.
  • Help project teams identify the best-fit tool for their use case.
  • Provide training to associates to help deliver those projects with Big Data technologies.
  • Help draft RFP responses for various projects as part of the presales team.
  • Proactively deliver POCs to identify use cases for new projects.
  • Help with installation of Hadoop distributions as per requirements.
  • Started a platform to share best practices & experiences across the teams under ABIM Mumbai.
  • Manage allocation of associates to various projects.
  • Connect with alliance partners for better solutions.
  • Tools: Hortonworks, Cloudera, Pivotal HD, Microsoft Azure, MarkLogic, MongoDB, QlikView
Cloud Storage · Business Analytics · Optimization · Problem Solving · Computation · Data Engineering

Functional Analyst

Dec 2012 – Aug 2015 · 2 yrs 8 mos

  • Set up a test lab for testing and analysing various aspects of HDFS in comparison to RDBMS; tested CDR and services datasets using Pig and Hive queries to achieve faster results than traditional RDBMS.
  • Help marketing team chalk out new strategies by analysing customer behaviour on live products to help result in increased revenue under test conditions with Hive.
  • Coordination with Infrastructure team and production support teams for daily monitoring and quick issue resolution.
  • Analyst role with direct interaction with Business Support Application of CPOS and UPSS.
  • Analyze current processes / procedures and perform required proactive and reactive reconciliations to assure smooth service delivery.
  • Fulfil ad-hoc report requirements of client users at the circle level.
  • Developed internal dashboard to consolidate and display data required for everyday activities.
  • Tools: MySQL, SQL Developer, PL/SQL, Unix, Cloudera, Pig Latin, Hive, HUE
Problem Solving

Developer

Feb 2012 – Nov 2012 · 9 mos

  • Worked on Unix and Mainframe systems to grant access across multiple platforms and create new users.
  • Configured Exchange servers for users.
  • Manage security and access of all related applications.
  • Responsible for managing termed employee data.
  • Designed and developed internal dashboard site for client hosted on SharePoint acting as a central repository for SOPs and other important documents.
  • Tools: AD, Mainframe, Windows 2003 Advanced Server, Unix
Business Analytics · Problem Solving

Dell

Individual Contributor

Jun 2008 – Jan 2012 · 3 yrs 7 mos · India

Problem Solving

Education

Birla Institute of Technology and Science, Pilani

Master of Technology - MTech — Data and Analytics

Oct 2018 – Jan 2021

Guru Gobind Singh College of Modern Technology

Bachelor's degree — Mechanical Engineering

Jan 2004 – Jan 2010

Lawrence Public Senior Secondary School

High School

Jan 2002 – Jan 2004

St. Xavier Chandigarh
