Sandeep Khurana

VP of Engineering

Bengaluru, Karnataka, India · 22 yrs 11 mos experience

Key Highlights

  • Over 20 years of experience in software architecture.
  • Expert in Big Data technologies and cloud-based solutions.
  • Proven track record in compliance implementations.
Stackforce AI infers this person is a Big Data and Cloud Solutions Architect with extensive experience in enterprise software.

Skills

Core Skills

Data Engineering · Big Data · Identity Management · Cloud Services · Software Development

Other Skills

AWS · AWS Lambda · Airflow · Apache Flink · Apache NiFi · Apache Storm · ArgoCD · Azure Cloud · Databricks · Distributed Systems · DynamoDB · EJB · Elasticsearch · Flume · Hadoop

About

Software and Data Architect with more than two decades of experience and a demonstrated history of working in the computer software industry. Strong engineering professional skilled in developing large-scale, cloud-based enterprise software products in Data Engineering and JEE, in both B2B and B2C spaces. Hands-on, with good exposure to Big Data technologies such as Spark, Kafka, Hive, Storm, Hadoop (YARN, MapReduce, etc.), Java EE, Spring, Spring Boot, Kubernetes, Docker, ArgoCD, microservices, Java Message Service (JMS), and Hibernate. Experienced in implementing CCPA, SOX, and 7216 compliance in the data platform, and in securing the platform and its pipelines with encryption and access control. I also run a YouTube channel where I share knowledge on technical topics such as containerization, Spark, Kubernetes, and Scala. Please do join: https://www.youtube.com/channel/UCOVWgUTWVsTZy8XHItccUjA

Experience

Arcesium

VP, Senior Principal Engineer

Feb 2023 – Present · 3 yrs 1 mo · Bengaluru, Karnataka, India

Intuit

Principal Data Engineer

Jul 2020 – Feb 2023 · 2 yrs 7 mos · Bengaluru, Karnataka, India

  • Working on a unified ingestion platform. The platform ingests data from RDBMS (Postgres, MySQL, Oracle, SQL Server), NoSQL (DynamoDB, Cassandra), Kafka, APIs, and files, and reconciles it into the lake. Ingested data can be mutable, and changes are merged (creates, updates, and deletes) into the Hive lake. A self-serve layer lets users onboard themselves to the platform and have their data ingested. Streams of data are ingested in real time, while batched data (e.g. from files) is ingested in batch. The platform operates at scale, ingesting tens of thousands of tables/entities, and the ingested data in the lake is of petabyte scale. The platform's components (self-serve, adaptors, reconciler, etc.) have automated smart monitoring and alerting that detects issues in both a preventive and a reactive manner. Technologies used: Spark batch and streaming, Hive lake with Parquet and Delta formats, Java, Scala, Spring Boot, and Python.
  • Implemented CCPA, SOX, and 7216 compliance in the platform.
Spark · Hive · Java · Scala · Spring Boot · Python +3
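The merge semantics described above — mutable records reconciled into the lake as creates, updates, and deletes — can be reduced to a small sketch. This is an illustrative fragment in plain Python, not the platform's actual code; the record shape and the `op` field are hypothetical.

```python
# Illustrative sketch of CDC-style merge semantics: a stream of change
# records (upsert/delete) is reconciled into a table keyed by primary key.
# The record shape and the "op" field are hypothetical, not the platform's schema.

def merge_changes(table, changes):
    """Apply change records to a dict mapping primary key -> row.

    changes -- iterable of dicts like
               {"op": "upsert" | "delete", "key": ..., "row": {...}}
    """
    for change in changes:
        if change["op"] == "delete":
            table.pop(change["key"], None)   # deletes are idempotent
        else:                                # create and update collapse into upsert
            table[change["key"]] = change["row"]
    return table

# Example: a create, an update of the same key, and a delete.
lake = {}
merge_changes(lake, [
    {"op": "upsert", "key": 1, "row": {"name": "a", "v": 1}},
    {"op": "upsert", "key": 1, "row": {"name": "a", "v": 2}},
    {"op": "upsert", "key": 2, "row": {"name": "b", "v": 1}},
    {"op": "delete", "key": 2, "row": None},
])
```

In a Spark/Delta setting this role would typically be played by a transactional `MERGE INTO` against the lake table rather than an in-memory dict.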

Apttus

Principal Architect

Dec 2017 – Jul 2020 · 2 yrs 7 mos · Bengaluru Area, India

  • Quote-to-Cash company. Solved complex problems in the Pricing domain. Built an incentive management system using Big Data technologies: Spark, Hive, Vertica, Airflow, Databricks, and Azure Cloud.
Spark · Hive · Big Data · Airflow · Databricks · Azure Cloud +1

VMware

Staff Engineer 2

Nov 2016 – Sep 2017 · 10 mos · Bengaluru Area, India

  • Worked on the VMware Identity Management product. I was part of the team that built the Authorization service into the product, implementing the OAuth2 and OIDC specs. The REST service was built using Spring Boot and Spring Security, used DynamoDB as the storage system, and was deployed on AWS using Elastic Beanstalk.
OAuth2 · Spring Boot · DynamoDB · AWS · Identity Management · Cloud Services
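A minimal sketch of the kind of check an OAuth2 authorization service performs: once the bearer token has been validated, deciding whether a request may proceed reduces to comparing the token's granted scopes against what the endpoint requires. This is an illustrative fragment, not the actual service code; the scope names are hypothetical, and in the real service Spring Security would validate the token (signature, expiry, issuer) before any scope check.

```python
# Illustrative OAuth2 scope enforcement. Assumes the access token has
# already been validated; authorization is then a set-containment check.
# The scope names below are hypothetical examples.

def is_authorized(granted_scopes, required_scopes):
    """Return True iff every scope the endpoint requires was granted to the token."""
    return set(required_scopes) <= set(granted_scopes)

# A token granted read and write access may read, but not administer.
token_scopes = ["users.read", "users.write"]
can_read = is_authorized(token_scopes, ["users.read"])
can_admin = is_authorized(token_scopes, ["users.admin"])
```

The same containment rule is what a Spring Security `hasAuthority("SCOPE_...")` expression evaluates against the scopes in a validated JWT.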

Nokia

Architect - Big Data & Analytics

Sep 2013 – Oct 2014 · 1 yr 1 mo · Bangalore

  • Built a complete Big Data processing and analytics pipeline, with an ingestion layer, an ETL layer, and reporting and dashboards. Most Big Data technologies were used: Flume, Hive, Oozie, Pig, and a data-warehouse framework. Since it was built on the AWS platform, most AWS technologies were also used: SWF (Simple Workflow), S3, Redshift, Hive on AWS, EMR, job flows, etc. A few examples of the work are given below.
  • Developed a real-time processing and analytics system using Storm, Kafka, Flume, and HDFS (AWS S3).
  • Customized a distributed file system, an extension of HDFS, for security. It used an in-house Key Management System (KMS) built with Tomcat, MySQL, and Java cryptography.
  • Built a Big Data streaming ingestion layer using Scribe, Apache Flume, and AWS components such as the load balancer and HDFS (AWS S3).
Flume · Hive · Oozie · Pig · AWS · Big Data +1
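A streaming ingestion layer like the one described above typically buffers events and flushes them to the sink in batches. The sketch below illustrates that micro-batching idea in plain Python; it is not the pipeline's code — in the real system Flume handled the batching and the sink was HDFS/S3, whereas here the sink is just a callable.

```python
# Illustrative micro-batching buffer for a streaming ingestion layer:
# events accumulate until a size threshold is reached, then flush to the
# sink as one batch. Batch size and sink are hypothetical stand-ins.

class BatchingBuffer:
    def __init__(self, batch_size, sink):
        self.batch_size = batch_size
        self.sink = sink        # callable invoked with each full batch
        self.pending = []

    def add(self, event):
        self.pending.append(event)
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.pending:
            self.sink(list(self.pending))
            self.pending.clear()

batches = []
buf = BatchingBuffer(batch_size=3, sink=batches.append)
for e in range(7):
    buf.add(e)
buf.flush()   # drain the remainder at end of stream
```

Real ingestion layers usually flush on a time threshold as well as a size threshold, so slow streams do not hold events indefinitely.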

Yahoo!

Principal Engineer

Jul 2010 – Sep 2013 · 3 yrs 2 mos

  • Worked on the Local domain. Was involved in migrating a legacy application (developed in Perl) to Yahoo's Hadoop grid. The application handles millions of records and was not scalable on the legacy system. Used Oozie as the workflow engine and Pig and MapReduce programs for the business logic; data was stored in HDFS and pushed to the search engine.
  • Later worked on a live event streaming project and video transcoding pipeline projects.
Hadoop · Oozie · Pig · Big Data

Oracle

JEE Architect

Apr 2008 – Jul 2010 · 2 yrs 3 mos

  • Was involved in developing health-check solutions for "On Demand" customers. Oracle would provide hardware and software to its On Demand customers, which are business enterprises.
  • Designed, architected, and developed a JEE application that would predict software and hardware issues/failures, and their solutions, in advance.
JEE · Java · Software Development

Robert Bosch

Technical Manager

May 2005 – Feb 2007 · 1 yr 9 mos

  • Was part of the technical team in Germany that came up with a workflow platform for Bosch applications worldwide. Later led the workflow competency centre from Bangalore, India.

IBM Global Business Services

Senior Developer

May 2000 – Sep 2004 · 4 yrs 4 mos

  • Was involved in developing a telematics-based solution for the auto industry. It received good media coverage, e.g. http://www.computerworld.com/s/article/91266/International_Truck_to_offer_full_range_of_telematics.
  • Also developed applications for an advertising customer.

eFunds

Developer

Feb 1999 – Apr 2000 · 1 yr 2 mos

  • Earlier, eFunds was called HCL Deluxe.

Education

Thapar Institute of Engineering & Technology

B.E. — Mechanical

Jan 1992 – Jan 1996
