Rajesh Kumar

Data Engineer

Noida, Uttar Pradesh, India · 7 yrs 2 mos experience

Key Highlights

  • Expert in AWS cloud and serverless solutions.
  • Proven track record in data engineering and automation.
  • Strong experience in developing scalable APIs.
Stackforce AI infers this person is a Data Engineer specializing in cloud solutions and data automation for digital marketing.

Skills

Core Skills

Data Engineering · Cloud Solutions · Web Development

Other Skills

API Crawling · AWS Athena · AWS EC2 · AWS Lambda · AWS S3 · Apache Airflow · Boto3 · Data Analysis · Data Architects · Data Maintenance · Django · Django REST Framework · Elasticsearch · Extract, Transform, Load (ETL) · FastAPI

About

Professional Summary: I am a versatile technology professional with expertise in designing and managing cloud infrastructures, AWS EC2 environments, data pipelines, and automation systems. My role encompasses developing serverless solutions on AWS, building robust ETL pipelines, automating repetitive tasks through RPA, and creating secure APIs. I also focus on continuous process improvement, quality assurance, and effective collaboration to support business growth and innovation. We are currently introducing Elasticsearch to enhance data handling and management.

Key Responsibilities:

  • AWS Cloud & Serverless Solutions: Design, deploy, and manage AWS Lambda functions alongside related services such as S3, Secrets Manager, and CloudWatch. Implement Infrastructure as Code (IaC) with tools like CloudFormation to automate deployments, while managing AWS EC2 instances and Windows servers for script deployment. Ensure optimal performance, scalability, and security across cloud infrastructures.
  • Data Engineering & Pipeline Development: Build and maintain robust ETL pipelines using Apache Airflow to extract, transform, and load data from diverse sources. Ensure data quality through real-time validations and comprehensive error handling, supporting accurate reporting and analytics.
  • RPA Automation & Scripting: Develop automation scripts in Selenium and Python for tasks such as API crawling and collecting large CSV data sets, streamlining digital campaign management and data synchronization. Integrate robust error handling and automated testing to maintain high process reliability and minimize downtime.
  • Project Development & Redevelopment: Lead new project initiatives and oversee the iterative redevelopment of legacy systems to incorporate new features and optimize performance. Collaborate with cross-functional teams using agile methodologies to drive innovation and improve overall efficiency.
  • API Development & Integration: Design and develop secure, scalable APIs that enable efficient communication between front-end applications and back-end systems, ensuring seamless data exchange.
  • Collaboration, Quality Assurance & Documentation: Mentor team members, conduct regular code reviews, and facilitate technical workshops to promote continuous learning and adherence to best practices. Monitor systems proactively with tools such as AWS CloudWatch, and maintain detailed documentation to ensure transparency and support continuous improvement.
  • Elasticsearch Integration: Currently introducing Elasticsearch to enhance data handling and management.
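As a minimal illustration of the row-level validation the ETL pipelines above describe ("real-time validations and comprehensive error handling"), the sketch below partitions extracted records into loadable rows and rejects. The field names (`sku`, `price`, `scraped_at`) are hypothetical stand-ins, not the real pipeline's schema.

```python
from datetime import datetime

def validate_row(row: dict) -> list:
    """Return a list of validation errors for one extracted record."""
    errors = []
    if not row.get("sku"):
        errors.append("missing sku")
    try:
        if float(row.get("price", "")) < 0:
            errors.append("negative price")
    except (TypeError, ValueError):
        errors.append("price is not numeric")
    try:
        datetime.fromisoformat(row.get("scraped_at", ""))
    except ValueError:
        errors.append("bad scraped_at timestamp")
    return errors

def split_valid(rows: list) -> tuple:
    """Partition rows into loadable records and rejects with their errors."""
    good, bad = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            bad.append((row, errs))
        else:
            good.append(row)
    return good, bad
```

In a real Airflow pipeline, a function like `split_valid` would typically run inside a transform task, with rejects routed to a quarantine location for review rather than silently dropped.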

Experience

AGL - Hakuhodo

Data Engineer

Aug 2023 – Present · 2 yrs 7 mos · Gurugram, Haryana, India · Hybrid

  • Project Title: Automated Data Extraction and Processing for Digital Transformation
  • Project Overview: This project aims to enhance dglobal360 India Pvt. Ltd.'s capabilities in digital marketing, data analytics, design thinking, and business intelligence by implementing automated data extraction and processing solutions. By leveraging web scraping, API crawling, and RPA actions, we aim to streamline data collection, improve decision-making processes, and drive business growth for our clients across various verticals.
  • Key Responsibilities:
  • Developed and maintained web scraping tools using AWS Lambda and Boto3 for extracting data from Flipkart, Amazon, Blinkit, and Zepto.
  • Implemented API crawling for efficient third-party data extraction.
  • Designed and executed RPA workflows to automate repetitive tasks.
  • Utilized AWS S3 for centralized data storage.
  • Leveraged AWS Athena for data querying and analysis.
  • Created dashboards and reports to visualize key performance indicators (KPIs).
  • Provided real-time data insights to support strategic decision-making.
  • Achievements:
  • Improved data-driven decision-making for clients.
  • Enhanced client engagement through personalized marketing strategies.
  • Streamlined data collection and processing, increasing operational efficiency.
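The Lambda-plus-S3 pattern in the responsibilities above could be sketched roughly as follows. The bucket name and key layout are hypothetical; partitioning keys by source and date is one common way to make the landed data queryable from AWS Athena.

```python
import json
from datetime import datetime, timezone

# Hypothetical bucket; the real project's naming is not known.
BUCKET = "example-scrape-landing"

def s3_key_for(source: str, when: datetime) -> str:
    """Partition raw scrape dumps by source and date for downstream querying."""
    return f"raw/{source}/{when:%Y/%m/%d}/items.json"

def lambda_handler(event, context):
    """AWS Lambda entry point: store one batch of scraped items in S3."""
    import boto3  # imported lazily so the pure helper stays testable offline
    s3 = boto3.client("s3")
    key = s3_key_for(event["source"], datetime.now(timezone.utc))
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(event["items"]))
    return {"bucket": BUCKET, "key": key}
```

With this layout, an Athena table partitioned on source and date can scan only the relevant prefixes instead of the whole bucket.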
Python (Programming Language) · Flask · AWS Lambda · Boto3 · Web Scraping · API Crawling (+5 more)

Dgmarket International

Python Developer

Sep 2020 – Aug 2023 · 2 yrs 11 mos · Delhi, India · Hybrid

  • Built software for data extraction, responsible for extracting and ingesting data from websites using web scraping tools.
  • Owned the creation of these tools, services, and workflows to improve scrape analysis, reporting, and data management.
  • Tested the data and the scrapes to ensure accuracy and quality, and loaded the data into the database.
  • Technologies: Python, Django, Pandas, Selenium WebDriver.
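In miniature, the extraction step described above might look like the sketch below. A real scraper in this role would drive live pages with Selenium WebDriver; this sketch instead parses a static HTML snippet with the standard library, and the `product-name` class is made up for illustration.

```python
from html.parser import HTMLParser

class ProductNameParser(HTMLParser):
    """Collect text inside <span class="product-name"> tags (class is illustrative)."""
    def __init__(self):
        super().__init__()
        self.names = []
        self._in_name = False

    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "product-name") in attrs:
            self._in_name = True

    def handle_endtag(self, tag):
        if tag == "span":
            self._in_name = False

    def handle_data(self, data):
        if self._in_name:
            self.names.append(data.strip())

def extract_names(html: str) -> list:
    """Return all product names found in an HTML fragment."""
    parser = ProductNameParser()
    parser.feed(html)
    return parser.names
```

The extracted records would then be cleaned (e.g. with Pandas) and loaded into the database, as the role description outlines.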
Python (Programming Language) · Data Maintenance · Web Scraping · Selenium · Data Engineering

DaaS Labs: Data and Analytics Services

Python Developer

Dec 2018 – Aug 2020 · 1 yr 8 mos · Gurugram, Haryana, India

Django · Django REST Framework · Web Development

Education

SUS College of Engineering and Technology (SUSCET)

Bachelor of Technology - BTech — ECE

Jan 2013 – Jan 2017
