Varun Joshi

CEO

Bengaluru, Karnataka, India · 10 yrs 4 mos experience
Most Likely To Switch · AI ML Practitioner

Key Highlights

  • Over a decade of experience in data engineering and analytics.
  • Expertise in AI, data warehousing, and data migration.
  • Proven track record of delivering high-impact data solutions.
Stackforce AI infers this person is a Data Engineering and Analytics expert in the Fintech and SaaS industries.

Skills

Core Skills

Data Engineering · Artificial Intelligence (AI) · Data Migration · Data Modeling · Google Cloud Platform (GCP)

Other Skills

Data Pipelines · Data Warehousing · Amazon Web Services (AWS) · Apache Spark · Apache Airflow · Enterprise Solution Design · Google BigQuery · Designing · Pub/Sub · Python · SQL · SQL Server · Bash · Extract, Transform, Load (ETL)

About

With over a decade of experience in data engineering and analytics, I develop high-impact data solutions at Vivriti Capital as Associate Director – Data Engineering – Analytics. My work centers on advancing AI, data engineering, and analytics strategies that align with organizational goals, while fostering cross-functional collaboration to address complex data challenges effectively. My expertise spans artificial intelligence, data warehousing, and data migration. I am committed to enhancing data infrastructure, improving analytics capabilities, and ensuring data quality. By building cohesive teams, I focus on delivering data-driven solutions that create value and empower decision-making.

Experience

10 yrs 4 mos
Total Experience
1 yr 1 mo
Average Tenure
3 yrs
Current Experience

Vivriti Capital

2 roles

Associate Director - Data Engineering - Analytics

Promoted

Oct 2025 – Present · 7 mos · Bengaluru, Karnataka, India

  • I am dedicated to delivering high-impact data solutions at Vivriti Capital, focusing on organizational value.
  • Spearheaded the development of AI, Data Engineering, and Analytics strategies to align with business objectives.
  • Fostered cross-team collaboration to effectively identify and address data challenges.
  • Enhanced data infrastructure, significantly improving analytics capabilities and data quality.
  • Built and maintained strong collaboration within the team and across functions at Vivriti Capital.
Artificial Intelligence (AI) · Data Engineering · Data Modeling · Data Pipelines · Data Warehousing · Amazon Web Services (AWS) · +2

Data Engineering Manager

May 2023 – Oct 2025 · 2 yrs 5 mos · Bengaluru, Karnataka, India

Artificial Intelligence (AI) · Data Engineering · Data Modeling · Data Pipelines · Data Warehousing · Data Migration

Confluent

Data Engineer II

Feb 2022 – May 2023 · 1 yr 3 mos · Bengaluru, Karnataka, India · Hybrid

  • As a Data Engineer II at Confluent, I worked closely with the Go-to-Market (GTM), FieldOps, and Customer Success business partners to build new end-to-end solutions through data pipelining, data modeling, and data warehousing, while maintaining existing workflows
  • Independently led communication and developed relationships with business partners to understand their requirements and deep business context
  • Designed optimized, efficient end-to-end solutions
  • Performed data modeling for requirements involving development of a full-fledged data warehouse
  • Built pipelines using Apache Airflow
  • Created and accessed new data sources for business partners using APIs or third-party integrations
  • Gained expertise in the business knowledge and context around the designed solutions for selected verticals
Apache Airflow · Data Modeling · Enterprise Solution Design · Data Pipelines · Google BigQuery · Data Engineering

Searce Inc.

Senior Data Engineer

Sep 2020 – Feb 2022 · 1 yr 5 mos · Bengaluru, Karnataka, India · Remote

  • Searce Inc., one of Google Cloud's top Managed Service Partners worldwide, designs and implements solutions for clients adopting GCP services. As a Sr. Cloud Data Engineer, I worked cross-functionally with the Pre-Sales, Solutioning, and Delivery teams in different roles; Searce's main goal is to replace clients' existing architectures with better, more efficient solutions.
  • Independently worked with clients to understand their existing architecture and identify the problem statement
  • Collaborated with the design team to design efficient data-migration solutions on GCP
  • Created, tested, and implemented data pipelines to migrate data of various sizes to the GCP environment using GCP products such as Dataflow, Cloud Functions, and Pub/Sub
  • Independently performed proofs of concept with several GCP products to validate the efficiency and durability of the proposed architecture
  • Delivered end-to-end solutions for clients with possible add-on value
  • Conducted hands-on demonstrations/workshops of GCP products for clients
Designing · Pub/Sub · Google Cloud Platform (GCP) · Google BigQuery · Data Engineering

Self-employed

Data Science and Engineering Consultant

Aug 2019 – Nov 2020 · 1 yr 3 mos · Kolhapur Area, India

  • As an independent consultant, I help Indian software companies design end-to-end data solutions for their clients across the world. I use my experience in data to give organisations insight into their business by organising their data and building interactive reports on top of it, as per their requirements.
  • Designing an end-to-end solution includes choosing the right tools to automate the extraction, transformation, and loading of raw data into a suitable data warehouse, along with creating advanced interactive reports that can be presented to business owners.
  • My current client is Mobifilia, a leading software firm based in the Kolhapur area specialising in mobile and web-based solutions. Data has been a byproduct of these solutions, and I am helping them harness its power by creating end-to-end data solutions for their clients.

SOMOS Community Care

Data Engineer

Oct 2017 – Mar 2019 · 1 yr 5 mos · New York, New York

  • Our aim at SOMOS was to ensure that as many people supported by Medicaid as possible stayed healthy and received proper medication and hospitalisation in time. The solution was purely data-driven: we received Medicaid data from several sources such as the New York Department of Health, insurance companies, EMR data vendors, etc.
  • As a Data Engineer with SOMOS, I designed data pipelines to ingest data from these sources and built a single-source-of-truth data warehouse that was used for reporting and analysis. A major part of my job was automating this ingestion to maintain maximum efficiency and minimal error.
  • My expertise in Python, SQL, AWS, and Bash was instrumental in achieving SOMOS's goals. The experience there was one I cherish and will benefit from throughout my career.
Python · SQL · AWS · Data Engineering

Citadel LLC

Data Production Analyst

Feb 2017 – Oct 2017 · 8 mos · Raleigh, North Carolina

  • Responsible for handling requests from, and coordinating with, business owners, financial profile managers, sector analysts, and stakeholders to process unstructured vendor data using Python modules (pandas) and to mine web data using web-scraping and text-mining modules like Scrape, Selenium WebDriver, and lxml in Python.
  • Understood the purpose of the data and added insight through processing expertise. Bulk-inserted the processed data into SQL Server using SQL modules in Python, performed thorough Quality Assurance (QA) checks on the data before handing it over, and scheduled these scripts to run at a desired frequency to study and analyze data trends.
  • Solutions:
  • Developing Python scripts to process unstructured vendor data into structured data and writing it to SQL Server.
  • Scraping unstructured data off the requested websites into structured data and writing it to SQL Server.
  • Performing QA checks on the data using SQL queries.
  • Scheduling Python scripts to run at a set frequency to track data trends.
  • Technologies:
  • Python (pandas, scrape, lxml, selenium, pdb), SQL Server, Bash and shell scripts, Git commands
Python · SQL Server · Bash · Data Engineering

Perficient

Technical Consultant

Jul 2016 – Feb 2017 · 7 mos · Charlotte Metro

  • As a Technical Consultant specializing in Data Science and Analytics at Perficient, I was responsible for creating complex data extraction, transformation, and modeling solutions in several industries such as Pharmaceutical, Utilities, Retail, and Marketing and Sales, along with building solutions for visualizing this data on several on-premise and cloud platforms, assuring thorough analytical insight for our clients.
  • TECHNOLOGY:
  • Database Processing and Analysis: Apache Spark (using Python), SQL, PL/SQL, Transact-SQL, SPSS, Informatica ETL
  • Analytics & BI Solutions: SAS Desktop/Enterprise, R (RStudio), Microsoft Power BI, Tableau, Qlik, Minitab, Advanced Microsoft Excel (VBA, Macros)
  • Coding Tools: Python, Java (Object Oriented), C, MATLAB, Keil, ModelSim, Visual Basic Scripting, AutoHotkey
  • Project Management Tools: Confluence, JIRA, Siemens Teamcenter
  • SOLUTIONS:
  • Apache Spark Solutions for Efficient, High-Speed Data Processing on Hadoop Clusters
  • Short-Term and Long-Term Load Forecasting Using Predictive Modeling in the Utilities Industry
  • Java Object Oriented Programming (Conversant)
  • Big Data Analysis & Market Forecasting for an Internal Sales and Marketing Division
  • Data Visualization dashboards for several industries
  • Designing and Implementing Solutions on Amazon Web Services
  • INDUSTRY:
  • Electric Utilities and Energy
  • Information Technology
  • Automotive Manufacturing
  • Pharmaceutical
  • Retail
  • Marketing and Sales

Godwin Global, Inc.

Engineering Systems Consultant

Feb 2016 – Jun 2016 · 4 mos · Charlotte Metro

  • SUMMARY:
  • As an Engineering Systems Consultant, I was responsible for gathering specific requirements to design customized solutions for clients' Product Lifecycle Management (PLM) tools, especially Siemens Teamcenter. Apart from requirement gathering and solution design, a major part of my responsibilities was articulately converting these business requirements into technical queries for the solution development team and assuring the solution's precision.
  • TECHNOLOGY:
  • Project Management Tools: Atlassian Confluence, Atlassian JIRA, Siemens Teamcenter
  • Coding Tools: Python, Java Object Oriented, C
  • SOLUTIONS:
  • Customized Product Lifecycle Management (PLM) Vendor Solutions
  • Systems Engineering and Database Design
  • Java Object Oriented Programming
  • Agile Development, Sprint Sessions, UAT and Production Testing
  • INDUSTRIES:
  • Automotive Manufacturing
  • Racing

UNC Charlotte

2 roles

Graduate Teaching Assistant

Aug 2015 – Dec 2015 · 4 mos

  • Assisted professors in teaching the subjects of Energy Systems and Quality Systems.

Graduate Assistant and Photographer

Feb 2015 – Dec 2015 · 10 mos

  • Tutored undergraduate students in various courses to help them perform at their best and see it reflected in their grades.
  • College media photographer.

Value Plus

Business Data and Development Consultant

Sep 2013 – Jul 2014 · 10 mos · Pune/Pimpri-Chinchwad Area

  • SUMMARY:
  • As a Business Data and Development Consultant, part of my responsibility was to study the client's product/service and analyze market data for the respective products. The other part entailed generating leads for my clients in innovative and creative ways to promote the sale of their products/services.
  • TECHNOLOGY:
  • Zoho CRM
  • Microsoft Excel
  • Microsoft Power Point
  • Virtual Desktop Integration
  • INDUSTRY:
  • Virtual Desktop and Server Integration
  • Mobile Application Development
  • Software Development

Education

University of North Carolina at Charlotte

Master of Science (MS) — Engineering/Industrial Management

Jan 2014 – Jan 2015

KLS Gogte Institute of Technology, Belgaum

Bachelor of Engineering (BEng)

Jan 2009 – Jan 2013
