Sumit Prakash

CEO

Frisco, Texas, United States · 15 yrs 3 mos experience
Most Likely To Switch · AI/ML Practitioner

Key Highlights

  • Over a decade of experience in Big Data and AI/ML.
  • Expert in designing modern data architectures.
  • Proven track record in driving customer success.

Skills

Core Skills

Cloud Data Engineering · Data Science & AI/ML · Big Data Strategy · Cloud Strategy · Big Data/Hadoop Solution Architecture · Web Services Development · Software Development

Other Skills

Amazon Web Services (AWS) · Ant · Ant Scripting · Apache Kafka · Apache Pig · Apache Spark · Artificial Intelligence (AI) · Azure DevOps · Cloud Computing · Cloudera Stack · Core Java · Customer Success · Data Warehouse Architecture · Data-driven Outcomes · Databricks

About

Sumit is a seasoned Big Data, Cloud, and AI/ML enterprise architect with more than a decade of experience helping organizations build disruptive products by accelerating their Data Analytics and Machine Learning/AI competencies. Areas of expertise include:

  • Data Science & AI/ML
  • Cloud Data Engineering
  • Cloud Strategy
  • Building Modern Data Architecture
  • Hadoop, NoSQL, and real-time analytics
  • Data Streaming
  • Big Data/Hadoop solution architecture, design, and development
  • Big Data strategy
  • DR setup
  • Apache Hadoop, Apache Spark, MapReduce, Apache HBase, Apache Solr, Apache Phoenix, Apache NiFi, Apache Ranger, YARN, Sqoop, Oozie, Apache Storm, and Apache Kafka

Experience

Databricks

Enterprise Solution Lead

Jan 2021 – Present · 5 yrs 2 mos · Frisco, Texas, United States · Remote

  • Trusted advisor helping customers achieve data-driven outcomes through adoption of the Databricks Lakehouse platform.
Databricks Lakehouse · Data-driven Outcomes · Cloud Data Engineering · Data Science & AI/ML

Cloudera

Sr Solution Engineer

Jan 2017 – Jan 2021 · 4 yrs · Nashville, Tennessee, United States

  • Driving customer success in the modern data era.
  • Responsible for managing 5 strategic accounts with an average ARR of $2 million.
  • Bi-weekly cadence with each LOB to guide customers toward a successful journey.
  • Assisting Account Executives in identifying new use-case opportunities and developing action plans.
  • Actively assisting customers with conducting POCs within the Cloudera stack.
  • Actively involved in customers' hybrid and multi-cloud strategies, aligning the Cloudera stack to match each customer's cloud journey.
  • Working with Cloudera's strategic customers to ensure successful implementations and drive CDH/CDP/HDP/HDF expansions and renewals.
  • Working with the engineering team to bridge the gap between customer expectations and product capabilities.
  • Designing security and compliance requirements across my customers, including GDPR, CCPA, and HIPAA.
  • Spreading awareness across my customer base of new features being introduced in the Cloudera stack.
Cloudera Stack · Hybrid Cloud Strategy · GDPR Compliance · Customer Success · Big Data Strategy · Cloud Strategy

Hortonworks

System Architect

Jan 2016 – Jan 2021 · 5 yrs · Columbus, Ohio

  • Analyze complex distributed production deployments and make recommendations to optimize performance.
  • Attend speaking engagements as needed and travel to various customer sites across the USA.
  • Document and present complex architectures for customers' technical teams.
  • Help design and implement Hadoop architectures and configurations for customers; drive projects with customers to successful completion.
  • Keep current with Hadoop and Big Data ecosystem technologies.
  • Work directly with customers' technical resources to devise and recommend solutions based on the understood requirements.
  • Areas of expertise include Apache Hadoop, Apache Spark, MapReduce, Apache HBase, Apache Solr, Apache Phoenix, Apache NiFi, Apache Ranger, YARN, Sqoop, Oozie, Apache Storm, and Apache Kafka.
Hadoop Architecture · Performance Optimization · Distributed Deployments · Big Data/Hadoop Solution Architecture

TCS USA

3 roles

Senior Hadoop Developer at Walmart

Promoted

Aug 2013 – Jan 2016 · 2 yrs 5 mos

  • Writing MapReduce programs and implementing different design patterns.
  • Implementing different compression codecs and other configuration parameters to achieve better performance.
  • Developing Kafka consumers and parsing the stream data per business policies.
  • Configuring Spark Streaming to receive real-time data from Kafka and store the stream data to HDFS.
  • Exporting data using Sqoop from Teradata to Hadoop and Hive.
  • Managing and scheduling jobs on a Hadoop cluster using Oozie.
  • Creating Hive generic UDFs in Java to process business logic that varies based on the policies.
  • Writing, developing, and testing optimized Pig Latin scripts.
  • Converting Hive queries to Spark SQL, using Parquet files as the storage format.
  • Implementing different Hive storage formats, such as ORC, to achieve better read performance.
  • Writing Hive queries to aggregate data to be pushed to Cassandra tables.
  • Creating Oozie workflows to run multiple jobs.
  • Making JDBC connections to pull data from Cassandra tables.
  • Created a POC making JDBC connections through the Spark Thrift Server, which runs Spark SQL on top of it.
  • Implementing Jersey RESTful web services for communication between the UI and the Cassandra database.
  • Implementing REST APIs with the Spring MVC framework.
  • Used Apache Shiro LDAP authorization to authenticate users and perform role-specific tasks.
  • Environment: JDK 1.6, Eclipse 3.3, DB2, RESTful Web Services, XML, UNIX,
MapReduce · Kafka · Spark Streaming · Oozie · Big Data/Hadoop Solution Architecture
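The compression-codec work above can be sketched in plain Java. In Hadoop itself a codec is enabled through job configuration (e.g. compressing map output), but the underlying idea is just a lossless round trip that trades CPU for smaller I/O; the sketch below uses only `java.util.zip` gzip and is an illustration, not the production Hadoop code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Minimal gzip round-trip: the same trade-off Hadoop makes when a
// compression codec is enabled for map output or HDFS files.
public class GzipRoundTrip {
    static byte[] compress(byte[] raw) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(raw); // closing the stream flushes the gzip trailer
        }
        return bos.toByteArray();
    }

    static byte[] decompress(byte[] packed) throws Exception {
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(packed))) {
            return gz.readAllBytes();
        }
    }

    public static void main(String[] args) throws Exception {
        // Repetitive data compresses well, which is typical of log-style records.
        byte[] raw = "repetitive payload ".repeat(200).getBytes(StandardCharsets.UTF_8);
        byte[] packed = compress(raw);
        System.out.println(packed.length < raw.length);           // smaller on disk/wire
        System.out.println(java.util.Arrays.equals(decompress(packed), raw)); // lossless
    }
}
```

Smaller intermediate data means less shuffle traffic, which is where most of the performance gain in a MapReduce job comes from.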

Hadoop Developer and Web-Service developer at Home Depot

May 2013 – Aug 2013 · 3 mos

  • Designing and developing REST APIs.
  • Implementing a web service layer exposing order services to other platform-independent applications.
  • Writing SQL queries to generate reports using iText PDF.
  • Used the Data Access Object (DAO) pattern to introduce an abstraction layer between the business logic tier and the persistent storage tier.
  • Providing extensive pre-delivery support through bug fixing and code reviews.
  • End-to-end testing of the product.
  • Used Ant for building the application WAR for deployment on Tomcat servers.
  • Used SAX and DOM parsers for parsing XML data and the JAXB API for binding.
  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data.
REST API · SQL · MapReduce · XML · Web Services Development
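The DAO bullet above can be sketched in a few lines of plain Java. The `Order` type and in-memory store here are hypothetical stand-ins (the real implementation sat in front of a relational database); the point is that business code depends only on the interface:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Data Access Object pattern: the business tier talks to this interface,
// never to the persistence technology directly, so the storage backend
// (JDBC, a mock, an in-memory map) can be swapped without touching callers.
public class DaoDemo {
    record Order(String id, double total) {}

    interface OrderDao {
        void save(Order order);
        Optional<Order> findById(String id);
    }

    // In-memory implementation; a JDBC-backed one would implement the
    // same interface with SQL against the real database.
    static class InMemoryOrderDao implements OrderDao {
        private final Map<String, Order> store = new HashMap<>();
        public void save(Order order) { store.put(order.id(), order); }
        public Optional<Order> findById(String id) { return Optional.ofNullable(store.get(id)); }
    }

    public static void main(String[] args) {
        OrderDao dao = new InMemoryOrderDao();
        dao.save(new Order("A-100", 59.99));
        System.out.println(dao.findById("A-100").isPresent());
        System.out.println(dao.findById("missing").isEmpty());
    }
}
```

The same seam is what makes pre-delivery testing cheap: unit tests run against the in-memory DAO, integration tests against the database-backed one.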

Point of Sale Developer at Walgreens

Dec 2010 – Apr 2013 · 2 yrs 4 mos

  • Used the Beanstore framework to design and develop the POS application.
  • Designing and developing new patch releases to introduce new functionality at the Walgreens POS.
  • Used JMS to post POS transactions into the POS database (DB2).
  • Creating Java service calls for receiving and responding to client requests; used a Jetty server and exposed it as a service on Windows.
  • Writing Ant scripts to automate the deployment process.
  • Writing shell scripts to automate processes.
  • Used ORM technology (Hibernate) for database connections.
  • Environment: JDK 1.5, Eclipse 3.3, DB2, Web Services, Shell Scripting, Swing, iBATIS, Ant Scripting, WebSphere Application Server, PHP, jQuery,
POS Application · JMS · Ant Scripting · Software Development

Education

JSSATE, Bangalore

B.E — CSE

Jan 2006 – Jan 2010

Chinmaya Vidyalaya, Bokaro

12th — Science

Jan 2003 – Jan 2005
