
Botu Prasad

Product Engineer

Hyderabad, Telangana, India · 11 yrs 11 mos experience

Key Highlights

  • 15+ years in cloud data engineering
  • Expert in AWS and data platform modernization
  • Proven track record in leading cross-functional teams

Skills

Core Skills

Amazon Web Services (AWS) · Data Warehousing · Data Engineering · ETL

Other Skills

AWS · Apache Airflow · Apache Kafka · Apache Spark · Business Analysis · C++ · COBOL · Continuous Delivery (CD) · DB2 · Data Lineage · Data Quality · Databricks Products · English · Enterprise Data · Extract

About

Results-driven Technical Manager with 15+ years of experience in cloud data engineering, leading the design and delivery of modern data platforms across healthcare, pharma, and enterprise domains. Hands-on expertise in AWS (Glue, Athena, EMR, Redshift, Lambda, Step Functions), Apache Spark, Snowflake, Databricks, dbt, and Python. Adept at building high-performance ETL/ELT pipelines, managing full SDLC, and driving platform modernization at scale. Known for leading high-performing, cross-functional teams and delivering critical projects under tight deadlines. Skilled in stakeholder engagement, solution architecture, team mentorship, and production-grade data solutions that are secure, scalable, and business-aligned. Passionate about transforming data into decisions and empowering teams to succeed through automation, best practices, and cloud-native thinking.

Experience

11 yrs 11 mos
Total Experience
2 yrs 3 mos
Average Tenure
7 mos
Current Experience

Amgen

Principal Data Engineer

Oct 2025 – Present · 7 mos · Hyderabad, Telangana, India · On-site

Apache Spark · Amazon Web Services (AWS) · Data Warehousing · Enterprise Data · Databricks Products · Apache Airflow +4

ValueMomentum

Technical Lead Manager

Oct 2020 – Jun 2021 · 8 mos · Hyderabad, Telangana, India

  • Developed a data management platform that harnesses the rapid growth of healthcare data to improve people's lives.
  • As Technical Manager, designed and implemented an AWS cloud-based data lake and foundational data marts (Redshift) to integrate data from multiple vendors — easing the build of the reporting layer and communication between marts, and allowing the framework to extend to new subject areas through configuration changes alone — alongside an ABC framework and automatic cluster sizing based on batch size.
  • Responsibilities:
  • 1) Took part in the design, development, and delivery of a complex ingestion framework.
  • 2) Designed framework features such as standardization rules for data quality, deduplication criteria, automatic sizing of the EMR cluster based on batch file sizes, data lineage tracking, and audit data for job runs.
  • 3) Performed in-depth data profiling on all source-system entities to select the distribution styles, sort keys, and encodings for Redshift tables.
  • 4) Automated the rollout of data model changes (new tables, column additions, etc.) through config changes alone, minimizing build effort.
  • 5) Performance tuning: Spark memory optimization, pool implementations, and design of AWS Redshift distribution and sort keys.
  • 6) Estimated system capacity to meet near- and long-term processing requirements.
English · Snowflake Cloud · Amazon Web Services (AWS) · Enterprise Data · Data Warehousing

Optum

Senior Data Analyst

Aug 2015 – Sep 2020 · 5 yrs 1 mo · Greater Hyderabad Area

  • Optum is a leading health services and innovation company focused on improving the affordability, quality, and efficiency of care. The RQNS (Risk Quality and Services) portfolio covers services provided to Members, Payers, and Providers in US healthcare, assigning risk scores to members that in turn drive payer revenue.
  • Responsibilities:
  • Designed ETL and reporting solutions using Spark, Python, Sqoop, Hive, and HBase.
  • Developed data pipelines to extract data from S3, HDFS, and NoSQL stores such as Cassandra and MongoDB.
  • Enhanced the application per business requirements.
  • Generated weekly and monthly reports based on client requirements.
  • Implemented incremental loads to improve performance, saving 300+ hours of productivity.
  • Onboarded new clients, both external and internal.
  • Designed and implemented new ETLs using Sqoop to ingest and retrieve data from RDBMSs such as Oracle DB and MySQL.
  • Designed a recirculation flow to fetch data missing across domains due to timing differences.
  • Designed and recommended the best-suited approach for data movement from various sources into HDFS using Apache Kafka.
  • Automated tasks that previously required manual intervention, reducing manual effort.
  • Proven ability to manage time and resources effectively and prioritize tasks for smooth delivery to the business.
  • Experienced in team management; part of the training program for new joiners.
English · Enterprise Data

LiquidHub

Senior Software Engineer

Aug 2014 – Aug 2015 · 1 yr · Hyderabad

  • Migrated legacy SAS applications to the Hadoop environment. Analyzed the existing legacy applications, identified performance issues, and delivered solutions in Hadoop. The migration used Sqoop, Hive, and HBase for data extraction and storage, and MapReduce for data processing.
English

UnitedHealthcare

Software Engineer

May 2012 – Aug 2014 · 2 yrs 3 mos · Greater Hyderabad Area

  • The BSYS SAS Data Mart provides a single environment for reporting and analysis of Medicare data, built from all of the disparate sources of Medicare data. It is used by the Clinical Performance & Compliance (CPC) department to support Medicare Risk Adjustment activities, fulfill contractual arrangements, and support information products. Reports and analyses are delivered through the appropriate CPC channel via SAS Web Report Studio, SAS datasets in the shared folder, and reports in Excel and Tableau.
English

GE

Data Analyst

Jan 2010 – May 2012 · 2 yrs 4 mos · India

  • Joined GE as a GET and worked in supply chain management; maintained legacy applications and ETL processes while identifying opportunities for process improvement.
English

Education

JNTU University

Master of Computer Applications - MCA — Computer Science

Apr 2006 – Jun 2009

Smt. Godavari Devi Saraf High School

Schooling — SSC

Jan 1995 – Jan 2000
