Shreyam Kesarwani

Business Development Executive

Mumbai, Maharashtra, India · 4 yrs 6 mos experience

Key Highlights

  • Over 4 years of experience in BFSI sector
  • Expert in Hadoop and PySpark for data solutions
  • Led migration projects to Datalake infrastructure

Skills

Core Skills

PySpark · Hadoop · SQL

Other Skills

Agile Methodologies · Azure Databricks · Bash · Bitbucket · C (Programming Language) · C++ · Data Structures · Git · GitHub · Hive · HiveQL · IBM Tivoli Workload Scheduler (TWS) · Microsoft Azure · Microsoft Excel · Microsoft PowerPoint

About

Data Engineer with 4+ years of experience crafting effective solutions for complex business challenges within the BFSI (Banking, Financial Services, and Insurance) sector. Skilled in PySpark, Hadoop, Hive, HDFS, Python, SQL, shell scripting, CDP (Cloudera Data Platform), CDH (Cloudera Distribution Hadoop), IBM TWS scheduler, Bitbucket, TeamCity, Agile methodologies, JIRA, and data warehousing.

Experience

Morgan Stanley

Associate

Oct 2025 – Present · 5 mos · Mumbai, Maharashtra, India · Hybrid

Mathco

Senior Data Engineer

Aug 2025 – Sep 2025 · 1 mo · Bengaluru, Karnataka, India · Hybrid

Sigmoid

Senior Data Engineer

Apr 2025 – Aug 2025 · 4 mos · Bengaluru, Karnataka, India · Hybrid

Capgemini

3 roles

Data Engineer

Apr 2024 – Apr 2025 · 1 yr · Pune, Maharashtra, India · Hybrid

  • Client: a Fortune 500 company and global leader in the financial services industry
  • Developed, managed, and supported the platform that generates trade data for 10,000+ external clients, including Ultra-High-Net-Worth (UHNW) individuals with a net worth exceeding $30 million, ensuring delivery of diverse feeds within SLA.
  • Contributed to integrating a client's newly acquired electronic trading platform: executed the Comet setup, established test data, conducted PROD connectivity testing, finalized converted accounts for the acquired firm, and validated feeds for all clients/vendors involved.
  • Actively involved in migrating jobs, data, and all customers from Mainframe to Datalake infrastructure.
  • My day-to-day responsibilities:
  • 1. Built scalable distributed data solutions using big data technologies.
  • 2. Developed and orchestrated 10+ Hadoop and PySpark jobs through the TWS scheduler, ensuring seamless dependencies and data processing.
  • 3. Designed and developed analytical reports using complex Spark transformations and Spark SQL, improving data accessibility for operations teams and enhancing decision-making efficiency by 20%.
  • 4. Loaded data into Hive tables, using HDFS for storage and SQL to generate reports and surface ad-hoc insights for business requirements.
  • 5. Coordinated with multiple teams, ensuring issues were resolved through to closure.
  • 6. Used Bitbucket for version control and TeamCity for build and deployment configurations.
  • 7. Involved in requirements gathering, design, development, and testing.
  • 8. Created a reconciliation framework for data processing.
  • 9. Addressed clients' customer queries and ensured prompt resolution.
  • 10. Performed root cause analysis for production issues/defects.
  • 11. Implemented logging mechanisms.
  • 12. Scheduled IBM TWS jobs and provided platform support.
  • 13. Managed a team, boosting productivity with agile workflows, technical guidance, and regular feedback.
PySpark · Python (Programming Language) · SQL · Hadoop · Hive · HDFS +5

Senior Software Engineer

Promoted

Apr 2023 – Mar 2024 · 11 mos · Pune, Maharashtra, India · Hybrid

PySpark · SQL · Python (Programming Language) · Hadoop · Hive · HiveQL +5

Software Engineer

Aug 2021 – Mar 2023 · 1 yr 7 mos · Pune, Maharashtra, India · Hybrid

  • Completed training in Azure, Python, SQL, and data warehousing
Python (Programming Language) · Microsoft SQL Server · Azure Databricks

IEO Makers Fablab Pvt. Ltd.

Machine Learning Intern

Apr 2020 – Jun 2020 · 2 mos · Remote

  • Contributed to analyzing and performing image processing on license plates using OpenCV
  • Contributed to segmenting the alphanumeric characters from the license plate
  • Implemented a CNN model in Keras to recognize the characters from the license plate

Udacity

Secure and Private AI Scholarship Challenge Scholar

May 2019 – Aug 2019 · 3 mos

  • The Secure and Private AI Scholarship Challenge by Udacity and Facebook highlights the importance of security and privacy in a world that is rapidly moving towards AI.

Education

KIET Group of Institutions

Bachelor of Technology - BTech — Computer Science

Jan 2017 – Jan 2021

St. Michael's Convent School

Intermediate in Science (Physics)

Jan 2016 – Jan 2017
