
Vaishnavi Muley

Data Engineer

Aurangabad, Maharashtra, India · 5 yrs 2 mos experience

Key Highlights

  • Strong background in Data Science and Engineering.
  • Proficient in cloud technologies and data ingestion automation.
  • Passionate mentor with teaching experience in Python and DSA.


Skills

Core Skills

Data Engineering · Cloud Computing · DevOps · Teaching

Other Skills

AWS DynamoDB · AWS Managed Grafana · AWS Managed Prometheus · AWS S3 · AWS SES · Airflow · Amazon DynamoDB · Amazon EC2 · Amazon EKS · Amazon S3 · Apache Airflow · Apache Kafka · Appsmith · C (Programming Language) · C++

About

Hi, this is Vaishnavi. Thank you for reaching out to my profile; I hope you are doing well. Let me introduce myself:

  • I am a tech geek and a data lover, striving to get better by putting in consistent effort and upskilling every day, because data drives me.
  • I am highly organized, motivated, and diligent, with a significant background in Data Science, Data Engineering, and Operations (DevOps/MLOps).
  • I am a passionate, agile, curious, quick-learning, and detail-oriented engineer.
  • I am always ready to take on new challenges, own my work as well as my mistakes, share my knowledge, and learn from others.
  • For me, learning is a never-ending process, so I have built a habit of learning on the job.

My skill set includes:

1. Programming languages: C, C++, and Python
2. Databases: relational (MySQL) and NoSQL (DynamoDB)
3. DSA in Python
4. Linux (Ubuntu)
5. Version control: GitHub
6. CI/CD: GitHub Actions
7. Cloud: AWS
8. Container management and orchestration: Docker, Kubernetes
9. Data streaming platform: Apache Kafka
10. Data collection from files, databases, APIs, and websites through web scraping (Python libraries: requests, json, BeautifulSoup, Selenium, boto3, etc.)
11. Data preprocessing: NumPy, Pandas
12. Data analysis: Pandas
13. Data visualization: Matplotlib, Seaborn
14. Communicating the results and insights gained from analysis by writing detailed reports

I have applied this knowledge in the following projects:

1. Predicting a student's score based on hours of study
2. A case study of Indian startups
3. Analyzing restaurant data collected from the Zomato API
4. An InstaBot that collected data from selected Instagram pages using Selenium and BeautifulSoup
5. A heart disease prediction system using decision tree and KNN algorithms
6. Twitter sentiment analysis

Certifications:

1. Python for Everybody Specialization (University of Michigan, Coursera)
2. Crash Course on Python (Google, Coursera)
3. An Introduction to Programming through C++ (IIT Bombay, NPTEL)
4. MTA: Introduction to Programming Using Python (Microsoft)
5. Data Science and Machine Learning Complete (Coding Ninjas)

Apart from technical skills, I also have communication, presentation, and interpersonal skills. Currently, I am focusing on the Data Engineering domain. Thank you.

Experience

Optum

2 roles

Data Engineer 2

Promoted

Apr 2024 - Present · 1 yr 11 mos · Mumbai, Maharashtra, India · Remote

Data Engineer 1

Feb 2023 - Apr 2024 · 1 yr 2 mos · Mumbai, Maharashtra, India · Remote

Episource

2 roles

Associate Data Engineer

Jul 2022 - Feb 2023 · 7 mos · Mumbai, Maharashtra, India · Remote

  • Worked on a large amount of data ingestion automation in Snowflake along with Apache Airflow, AWS S3, and AWS DynamoDB.
  • Deployed AWS Managed Prometheus and AWS Managed Grafana, created dashboards over Airflow and Karpenter metrics, also built alerts on top of them.
Snowflake · Apache Airflow · AWS S3 · AWS DynamoDB · AWS Managed Prometheus · AWS Managed Grafana +2

Data Engineer Intern

Jan 2022 - Jun 2022 · 5 mos · Mumbai, Maharashtra, India · Remote

  • Built GitHub Actions automation to set up a semantic release versioning workflow on the GitHub repository.
  • Ported an internal dashboard from a standard React-based setup to a no-code tool (Appsmith), along with its end-to-end deployment on Kubernetes.
  • Built a PoC and tested the autoscaling capability of a custom GitHub Actions runner on AWS-managed Kubernetes.
  • Prototyped an end-to-end streaming pipeline with Apache Flink/Beam.
GitHub Actions · Kubernetes · No-Code tools · Appsmith · DevOps · Cloud Computing

Pregrad

Python Mentor

Jul 2021 - Jul 2022 · 1 yr

  • Mentored students in Python and DSA at Pregrad.
  • Gave students an overview of Data Analysis.
  • Taught and guided students throughout their learning journey with Pregrad.
Python · DSA · Data Analysis · Teaching

SG Analytics

Data Engineer Intern

Jun 2021 - Oct 2021 · 4 mos

  • Worked on data extraction from various websites, APIs, and JSON formats, converting the unstructured data into structured form.
  • Also worked on tasks related to SQL and the AWS S3 service.
Data extraction · SQL · AWS S3 · Data Engineering
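The extraction-and-structuring step described above can be sketched in Python. This is a minimal, hypothetical example using only the standard library; the payload, field names, and `structure` function are illustrative, not taken from an actual SG Analytics task:

```python
import csv
import io
import json

# Hypothetical API payload: a nested, semi-structured JSON response.
raw = json.dumps({
    "restaurants": [
        {"name": "Cafe A", "location": {"city": "Mumbai"}, "rating": "4.2"},
        {"name": "Cafe B", "location": {"city": "Pune"}, "rating": "3.9"},
    ]
})

def structure(payload: str) -> list:
    """Flatten the nested JSON into uniform rows (unstructured -> structured)."""
    data = json.loads(payload)
    return [
        {
            "name": r["name"],
            "city": r["location"]["city"],
            "rating": float(r["rating"]),  # normalize string ratings to numbers
        }
        for r in data["restaurants"]
    ]

rows = structure(raw)

# Write the structured rows out as CSV, ready for a SQL load or an S3 upload.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "city", "rating"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

The same flatten-then-serialize pattern applies whether the source is a scraped page, an API response, or a raw JSON file: pull the nested fields into flat rows with consistent types, then emit a tabular format downstream tools can consume.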

The Sparks Foundation

Data Science and Business Analytics Intern

May 2021 - May 2021 · 0 mo · Maharashtra, India

  • Worked on Data Exploration and Visualization (Data Analysis).
  • Worked on: Supervised and Unsupervised Machine Learning, Text Analysis, and Timeline Analysis problems.

Tutorialsinhand

Technical Writer

Jan 2021 - Jun 2021 · 5 mos

  • Wrote technical content on Data Science and Machine Learning.

Knowledge Solutions India

Intern

Aug 2020 - Aug 2020 · 0 mo

  • Performed data cleaning, preprocessing, and visualization; built models using Decision Tree Classifier and KNN algorithms for prediction.
  • Worked on Heart Disease Prediction project.

Indian Opensource Community

Summer Intern

Jun 2020 - Jul 2020 · 1 mo

  • A multi-cloud, project-based internship in which I performed a few basic tasks as follows:
  • 1. Launched a t2.micro Windows Server instance on AWS and on Azure, and configured an IIS server with the default web page.
  • 2. Created two EC2 Windows free-tier instances, configured an IIS server on both, and configured a classic load balancer between the two instances with the HTTP service port on AWS.
  • 3. Accessed my AWS account from the Windows command prompt, launched an instance with a free-tier AMI on a free-tier instance type, took remote access of it, created a 3 GB EBS volume, attached it to the running instance, created a 1 GB partition on it, and mounted it on a directory.

Education

Deogiri Institute of Engineering and Management Studies, Aurangabad

B.Tech — Computer Science and Engineering

Jan 2018 - Jan 2022

Shri Santaji Jr College

HSC — Science

Maharashtra Public School

SSC
