Purshartha Srivastava

CEO

Charlotte, North Carolina, United States · 20 yrs 10 mos experience

Key Highlights

  • Over two decades of experience in capital markets.
  • Expert in counterparty credit risk management.
  • Multiple patents pending for technical innovations.

Skills

Core Skills

Counterparty Credit Risk Management · Data Integration · ETL Development · Performance Tuning · ETL Solutions · Business Intelligence · Data Quality Assurance

Other Skills

Hadoop · Terraform · Data Quality · Python (Programming Language) · Google Analytics · Systems Design · Google Cloud Platform (GCP) · Databases · Microsoft Azure · Unix Shell Scripting · Teradata · Ab Initio · Oracle

About

I am a Principal and Executive Director, Capital Markets at Wells Fargo, where I provide transformational leadership, vision, and relationship management to stakeholders in the Counterparty Credit Risk Management group. With over two decades of experience, I have designed, developed, and implemented resilient systems and high-quality solutions that address business needs and conform to the requirements of Technology, Risk, and Information Security. I have extensive knowledge of capital markets and investment banking, as well as cloud, big data, and containerization frameworks such as Azure, AWS, Google Cloud, Hadoop, Spark, Kafka, Docker, and Kubernetes. I leverage these skills on counterparty credit risk measurement and architecture projects for central clearing of interest rate swaps, credit default swaps, and exchange-traded products. I also support the development and implementation of methodologies and stress testing for client cleared portfolios. Additionally, I mentor and foster talent on the technology team and hire strong talent from the market with integrity. I have multiple patents pending and have received several awards and honors for technical excellence and innovation.

Cloud Platforms
  • Azure: ADF, ADLS Gen2, Azure SQL Database, Azure DevOps, Databricks
  • AWS: IAM, EC2, EMR 5.32.0, S3, Athena, Glue Catalog, CloudFormation, Lambda, Boto3, Elastic Container Service (ECS), Fargate, RDS
  • Google Cloud: IAM, VMs, Cloud Storage, VPC, BigQuery, Dataproc
  • Gen AI: building agentic agents on top industry models

Big Data Frameworks: Apache Hadoop (HDFS, Hive, Pig, Sqoop, Impala, Hue, Kudu, HBase); Apache Spark (Spark Core (RDDs, DataFrames), Spark SQL, Spark Streaming); Apache Kafka, Confluent Kafka, Kafka Connect, KSQL
Hadoop Distributions: Cloudera
Databases: MySQL, SQL Server, Oracle
NoSQL: MongoDB
Containerization Engines: Docker, Kubernetes
Languages: Python, Scala, Java, SQL, Shell Scripting, Terraform, Ansible, Ab Initio PDL, Ab Initio product suite
Version Control: Git
Scripting: Shell/Bash
Workflow Management Tools: Airflow, Cron, Microsoft Scheduler, AutoSys, $U

Experience

20 yrs 10 mos
Total Experience
2 yrs 11 mos
Average Tenure
11 yrs
Current Experience

Wells Fargo

Architect - Executive Director Capital Markets

Apr 2015 – Present · 11 yrs · Charlotte, North Carolina Area

  • Counterparty Credit
  • The Enterprise Counterparty Risk Management (ECRM) group is responsible for defining and managing the counterparty risk framework for risk measurement across different trading products.
  • Engage in counterparty credit risk measurement and architecture projects for central clearing of interest rate swaps (IRS), credit default swaps (CDS), and exchange traded products.
  • Support development and implementation of methodologies used for measuring Central Counterparty (CCP) exposure in ETL platform.
  • Support development and implementation of stress testing methodologies used for client cleared portfolios.
  • Ensure consistent counterparty risk factor methodologies and aggregation methodologies within counterparty risk systems used.
  • Evaluate new LoB initiatives including new product launch to ensure that counterparty risk incurred by these activities is properly captured.
  • Coordinate with Counterparty Risk Analytics group on risk methodology used for trading products, with a strong emphasis on cleared products.
  • Partner with the Counterparty Technology group on delivering robust and scalable counterparty risk solutions.
  • Perform data analysis for various risk projects, and define data validation rules and default rules.
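As an illustrative aside, the stress testing of netted counterparty exposures described above can be sketched in miniature. The trade values, sensitivities, and the simple multiplicative shock below are hypothetical, not the actual ECRM methodology:

```python
# Toy sketch only: stressed counterparty exposure with netting.
# All numbers and the shock model are illustrative assumptions.

from collections import defaultdict

def stressed_exposure(trades, shock):
    """Aggregate positive exposure per counterparty after applying a
    multiplicative market-value shock to each trade."""
    netted = defaultdict(float)
    for t in trades:
        netted[t["counterparty"]] += t["mtm"] * (1 + shock * t["sensitivity"])
    # Exposure is floored at zero: a negative netted value means we owe them.
    return {cp: max(v, 0.0) for cp, v in netted.items()}

trades = [
    {"counterparty": "CCP-A", "mtm": 100.0, "sensitivity": 1.0},
    {"counterparty": "CCP-A", "mtm": -40.0, "sensitivity": 0.5},
    {"counterparty": "CCP-B", "mtm": -10.0, "sensitivity": 1.0},
]

base = stressed_exposure(trades, 0.0)       # {'CCP-A': 60.0, 'CCP-B': 0.0}
stressed = stressed_exposure(trades, 0.25)  # shock market values by 25%
```

Real counterparty-risk systems would replace the one-factor shock with full scenario sets and netting-agreement logic, but the aggregate-then-floor shape is the same.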

JPMorgan Chase

Sr Abinitio Consultant

Aug 2013 – Apr 2015 · 1 yr 8 mos · Columbus, Ohio Area

  • Worked in the Chase environment on a high-profile project with a direct impact on customer profiling.
  • Understood business requirements and provided the most efficient solutions, weighing approaches against the volume of data processed, SLA requirements, scalability, maintainability, and parallel processing.
  • Delivered end-to-end business solutions; day-to-day activities included performance tuning of code, automating processes with ETL tools, guiding peer colleagues, and improving standards.

Wipro Technologies

Tech Lead

Jul 2011 – Aug 2013 · 2 yrs 1 mo · New Jersey

  • Worked on NextGen analytics and business intelligence product offerings for Citigroup, providing ETL and performance solutions for their Global Transaction Services (GTS) business.
  • Harness Utility:
  • Skills used: Ab Initio 3.02, Oracle, Unix, Java
  • A performance-testing utility that gives the user the flexibility to scale data up and down as required; it also supports end-to-end integration of the application.
  • An innovative approach for making application performance testing robust.
  • Mart Applications:
  • Skills Used: Ab Initio 3.02, Oracle, Unix
  • This project is critical to success of the bank and its share in a tough environment for Financial Services Industry.
  • Responsibilities:
  • Team Lead / Sr. Software Engineer
  • Supervise the day-to-day activities of ETL developers in the capacity of Team Lead
  • Assign tasks and track progress.
  • 30/30 (30 minutes every 30 days) feedback sessions with my team members
  • Design/Build Graphs
  • Design and build reusable Ab Initio graphs and sub-graphs
  • Design custom Ab Initio graphs
  • Perform design and code reviews
  • Use knowledge as a designer to provide insight into the designs to developers
  • Provide design approach to designers/developers when they are stuck
  • Forum for discussing common issues
  • Onshore/offshore coordinator:
  • Coordinating between onshore and offshore activities/tasks
  • Task assignments and reporting progress to onshore counterparts.
  • Issue Driver/Tracker/Reporter:
  • Providing developers with quick paths to resolution for their issues
  • Tracking and status reporting on issues
  • Helping developers determine the severity of the issues
  • Filter out issues that can be handled offshore versus those that need to be followed up with the designer

Atos Origin

Software Engineer

Oct 2009 – Jul 2011 · 1 yr 9 mos · Pune, France

  • FT-Herman was responsible for extracting data from the Enterprise Data Store (Oracle Apps), performing the required transformations per business requirements, and loading the data into the Oracle Apps and DW-STORES databases.
  • The Ab Initio team is responsible for ensuring that the data generated for end users meets business expectations. Because processes run 24x7, a huge volume of data is processed daily, so it is critical to verify that data quality meets the standards. To do so, audit checks run against another data mart, DW-MARTS, at defined audit points, tracking the quality of data throughout the process.
  • The goal of DATA INTERFACER is to consolidate data from CDW, FDR, Fraud, and ADS Warehouses into a single source of consumer card data which supports business requirements for reporting and analysis. It provided a single view of consumer data and allowed new applications (e.g. RDC) to be added in a timely fashion with reduced cost.
  • DATA INTERFACER was made up of 3 Projects:
  • Data Model Consolidation
  • ETL Standardization
  • Technical Infrastructure Upgrade
  • Customer-centric consumer model, without explicit customer-account relationship
  • Retail Dual Telephone Card
  • Authorization (ASC & CD tables)
  • Billing Promotions Subject Area
  • Fraud Subject Area

Bitwise Solutions Pvt. Ltd.

Programmer

May 2008 – Sep 2009 · 1 yr 4 mos · Pune

  • Worked on the development of various data warehousing projects to automate the processing of customer-related information and deliver the required data to our client, Discover Financial Services.
  • Extracted, transformed, cleansed, and loaded (ETL) various telemarketing solicitation information for DFS into a Teradata database.
  • Provided support to various production applications developed by the Card Member Warehouse team.
  • Created MLOAD scripts on UNIX to load processed load files into the Teradata database.
  • Created design documents for various applications per client requirements.
  • Card Member Decision Support System (CDSS) Team:
  • CDSS was responsible for extracting data from the Enterprise Data Store (EDS), performing the required transformations per business requirements, and loading the data into the IDS and CIMS databases.
  • Integrated Data Store and Data Quality (IDS) Team:
  • Worked on the development and maintenance of applications for the Integrated Data Store. The key feature of this data store is maintaining the quality of the data loaded into the Teradata tables. To achieve this, a special data mart of standard reference data is maintained, and various referential-integrity checks were built against it. The IDS holds the current data, maintained in the database over a month's time.
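The referential-integrity audit described above can be sketched as follows; the table layout, column names, and keys are hypothetical illustrations, not the actual IDS schema:

```python
# Hypothetical sketch of a referential-integrity audit: incoming rows are
# checked against keys from a reference data mart, and orphan records are
# flagged for reporting instead of being loaded.

def ri_audit(rows, reference_keys, key_field):
    """Split incoming rows into (clean, orphans) based on whether their
    key exists in the reference data mart."""
    clean, orphans = [], []
    for row in rows:
        (clean if row[key_field] in reference_keys else orphans).append(row)
    return clean, orphans

reference_keys = {"C001", "C002"}  # keys loaded from the reference mart
incoming = [
    {"cust_id": "C001", "balance": 120.0},
    {"cust_id": "C999", "balance": 55.0},  # no match in reference data
]

clean, orphans = ri_audit(incoming, reference_keys, "cust_id")
# clean holds the C001 row; orphans holds the C999 row, which the audit
# would report rather than load into the Teradata tables.
```

In practice such checks would run as set-based SQL or ETL-tool joins against the reference mart rather than row-by-row in Python, but the clean-versus-orphan split is the same idea.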

PlayerX Mobile Entertainment

Trainee

Jan 2008 – Jan 2008 · 0 mo · Pune Area, India

  • Developed an in-house application, the Royalty Reporting System, in a .NET environment.
  • Worked with the end client to understand their needs and translate them into technical requirements.
  • Generated unit test cases and tested the application.
  • Also worked on mobile game testing, reporting bugs to the development team.

Wipro Infotech

Technical Support Engineer

Sep 2004 – Mar 2005 · 6 mos · Pune Area, India

  • Provided on-call support for day-to-day customer requirements.
  • Maintained the ADS directory.
  • Network troubleshooting and permanent issue resolution, including switching the network from primary to backup.

Indian Railways

Technical Executive

Jan 2002 – Jul 2004 · 2 yrs 6 mos · Jabalpur, Madhya Pradesh

  • Worked in an RDBMS environment for report generation.
  • Generated customized forms in Oracle Forms 6i.
  • Worked with end-terminal customers on their technical issues, providing resolutions within SLAs.
  • Maintained high data volumes for daily transactions.

Education

Savitribai Phule Pune University

Master's degree, Computer Applications

Jan 2005 – Jan 2008

INDIRA
