Mintu Agarwal

Product Manager

Mumbai, Maharashtra, India · 3 yrs 9 mos experience

Key Highlights

  • Expert in DuckDB and in-memory databases.
  • Led multiple complex projects in credit risk modeling.
  • Strong foundation in data engineering and quantitative research.

Skills

Core Skills

Data Analysis · Quantitative Research · Software Development · Data Engineering

Other Skills

API development · Abstract Syntax Tree · Algorithms · Analytical Skills · Apache Kafka · Apache Spark · C (Programming Language) · C++ · CVS · Code Review · Communication · Competitive Programming · Computer Science · Data Structures · Debugging

About

Hi, Mintu here. I am a Model Developer on the Wholesale Credit Risk Quantitative Research (WCR QR) team at JP Morgan. Before joining JP Morgan, I graduated from IIT Kharagpur with a Major in Industrial Engineering and a Minor in Computer Science. During my time at JP Morgan, I have become an operating systems and databases enthusiast while working on state-of-the-art technology. I have had the privilege of multiple mobilities within the WCR QR organisation, which allowed me to tackle different kinds of complex problems and learn new skills to solve them. In my last role with WCR QR Data, I came across DuckDB and the concept of in-memory databases, and I have been excited about databases, workflow orchestration systems, and design ever since. In my current role, I am learning more about data science and modelling techniques, and contributing whatever I have learnt throughout my career here.

Experience

JPMorgan Chase

3 roles

Model Development Associate

Promoted

Jun 2025 – Present · 9 mos · Mumbai, Maharashtra, India

  • Implement various Scenario Credit Loss (SCL) forecasting models for the Commercial and Industrial (C&I) portfolio of JP Morgan's Wholesale Credit division.
  • Currently leading the onboarding of workflows that forecast the potential benefit of financial instruments taken against the C&I portfolio to hedge SCL losses.
  • Migrated the ETL processes to DuckDB, publishing workflow results faster onto the Dremio Analytics Dashboard.
DuckDB · ETL processes · Scenario Credit Loss forecasting · Dremio Analytics Dashboard · Data Analysis · Quantitative Research

Data Engineering Associate / Quantitative Research

Jan 2025 – Jun 2025 · 5 mos · Mumbai, Maharashtra, India

  • Implemented parsing and evaluation for a low-code framework that lets users define data transformation rules as a list of waterfall decisions in a YAML list; currently exploring migration of the core evaluator to DuckDB tables for faster rule execution.
  • Created an API for the in-house workflow orchestration platform to override inputs and rerun selected workflow tasks against staged code, letting developers assess the impact of code changes and significantly increasing productivity.
  • Removed manual input-tagging dependencies on the Operations team by adding dynamic search functionality to the workflow orchestration platform, which resolves and reuses certain datasets from the input context of an existing tagged "official" workflow.
Low Code Framework · API development · DuckDB · Workflow orchestration · Software Development · Data Engineering
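The waterfall-decision rules described above could look something like the sketch below. The rule format, field names, and function are hypothetical illustrations (the dicts stand in for a parsed YAML list), not the actual framework's API:

```python
# Hypothetical sketch of a waterfall-decision rule evaluator like the
# low-code framework described above. In real use, `rules` would come
# from yaml.safe_load() on a YAML file; plain dicts are used here so the
# sketch is self-contained.

def evaluate_waterfall(record, rules, default=None):
    """Return the value of the first rule whose conditions all match.

    Each rule is a dict with:
      - "when": {field: expected_value} conditions (all must hold)
      - "set":  the value to return when the rule fires
    Rules are tried top to bottom, waterfall-style: first match wins.
    """
    for rule in rules:
        if all(record.get(f) == v for f, v in rule.get("when", {}).items()):
            return rule["set"]
    return default

# Example: map a loan record to an illustrative risk bucket.
rules = [
    {"when": {"region": "EMEA", "segment": "C&I"}, "set": "high"},
    {"when": {"segment": "C&I"}, "set": "medium"},
    {"when": {}, "set": "low"},  # catch-all: empty condition always matches
]

print(evaluate_waterfall({"region": "EMEA", "segment": "C&I"}, rules))  # high
print(evaluate_waterfall({"region": "APAC", "segment": "C&I"}, rules))  # medium
print(evaluate_waterfall({"segment": "Retail"}, rules))                 # low
```

First-match-wins ordering is what makes the rule list a "waterfall": more specific rules sit above broader fallbacks.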

Software Engineer I / Quantitative Research

Jul 2022 – Jan 2025 · 2 yrs 6 mos · Mumbai, Maharashtra, India

  • As a Data Engineering Analyst on the Wholesale Credit Risk QR team:
  • Created ETL pipelines as workflows on an in-house orchestration platform built on Python, S3, Oracle SQL, and Dremio, for creating and analysing loan loss forecasting and back-testing datasets.
  • Automated workflow scheduling by configuring the specifications with the Strategy design pattern, reducing the team's operational workload by over 30 hours per quarter.
  • Added support to persist time series datasets as Parquet partitions in S3, reducing data redundancy and enabling efficient regression testing for model and transformation updates.
  • Contributed to the in-house pandas-like API for DuckDB tables, implementing GroupBy class methods for numeric and string aggregations and transform, mirroring pandas semantics.
ETL pipelines · Python · Oracle SQL · Dremio · Time series datasets · Data Engineering +1
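The Strategy design pattern mentioned in the scheduling bullet can be sketched as follows. The `Workflow` class, strategy names, and scheduling rules here are illustrative assumptions, not the in-house platform's actual interfaces:

```python
# Minimal sketch of the Strategy pattern applied to workflow scheduling,
# as described above. All names are hypothetical illustrations.
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Callable

# A scheduling strategy maps "today" to the next run date.
ScheduleStrategy = Callable[[date], date]

def daily(today: date) -> date:
    return today + timedelta(days=1)

def month_end(today: date) -> date:
    """Next run on the last day of the current month."""
    first_of_next = (today.replace(day=28) + timedelta(days=4)).replace(day=1)
    return first_of_next - timedelta(days=1)

@dataclass
class Workflow:
    name: str
    schedule: ScheduleStrategy  # behaviour injected, not hard-coded

    def next_run(self, today: date) -> date:
        return self.schedule(today)

# Swapping the schedule requires no change to Workflow itself:
backtest = Workflow("backtest-dataset", schedule=month_end)
refresh = Workflow("daily-refresh", schedule=daily)
print(backtest.next_run(date(2024, 2, 10)))  # 2024-02-29 (leap year)
print(refresh.next_run(date(2024, 2, 10)))   # 2024-02-11
```

Injecting the schedule as a strategy is what lets one configuration-driven platform run many workflows on different cadences without per-workflow scheduler code.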

Morgan Stanley

Technology Analyst Intern

May 2021 – Jul 2021 · 2 mos · India · Remote

  • Demonstrated a proof of concept for re-architecting the ETL layer of the Morgan Stanley Global Reconciliation (MSGR) project. Revamped the model pipeline into a scalable distributed architecture using a pub-sub model: Apache Kafka for reading financial data, Apache Spark for applying transformations, and MongoDB (NoSQL) for persistent storage. Projected a 70% reduction in inefficiencies.
ETL layer re-architecture · Apache Kafka · Apache Spark · MongoDB · Software Development · Data Engineering
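The pub-sub decoupling behind that re-architecture can be illustrated in-process with standard-library queues. This is a sketch of the pattern only, with a queue standing in for a Kafka topic, a worker thread for a Spark stage, and a list for MongoDB; none of it is the actual MSGR code:

```python
# In-process illustration of the pub-sub decoupling described above.
# All components here are stand-ins: queue.Queue for a Kafka topic,
# the consumer thread for a Spark transformation stage, and the `store`
# list for the MongoDB sink. Record fields are invented for the example.
import queue
import threading

topic = queue.Queue()   # "Kafka topic": buffer decoupling producer and consumer
store = []              # "MongoDB": persistent sink
SENTINEL = object()     # signals end of stream

def producer(records):
    for rec in records:
        topic.put(rec)  # publish raw financial records
    topic.put(SENTINEL)

def consumer():
    while True:
        rec = topic.get()
        if rec is SENTINEL:
            break
        # "Spark" stage: normalise the amount field before persisting
        store.append({"id": rec["id"], "amount": round(rec["amount"], 2)})

worker = threading.Thread(target=consumer)
worker.start()
producer([{"id": 1, "amount": 3.14159}, {"id": 2, "amount": 2.5}])
worker.join()
print(store)  # [{'id': 1, 'amount': 3.14}, {'id': 2, 'amount': 2.5}]
```

The point of the pattern is that the producer never waits on the transformation or storage layers, which is what makes the Kafka/Spark/MongoDB version horizontally scalable.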

Education

Indian Institute of Technology, Kharagpur

Bachelor of Technology - BTech — Industrial Engineering

Jan 2018 – Jan 2022

Hariyana Vidya Mandir

Student

Jan 2016 – Jan 2018
