Vikas Sharma

Software Engineer

Bengaluru, Karnataka, India · 11 yrs 11 mos experience

Key Highlights

  • Over 13 years of backend service development experience.
  • Expertise in real-time data processing and analytics.
  • Strong foundation in distributed systems and data engineering.

Skills

Core Skills

Big Data · Data Engineering · Backend Development · Software Development · Networking

Other Skills

API Development · Aerospike · Algorithms · Apache Kafka · Apache Spark · Apache Spark Streaming · Avro · C · C++ · Core Java · Data Aggregation · Data Structures · Druid · Elasticsearch · Hive

About

I am an experienced engineer with a Master's in Engineering in Computer Science from BITS-Pilani. I have a strong foundation in computer science, with an interest in distributed systems and data engineering. I am proficient in Java and know my way around Golang. I have close to 11 years of experience designing and developing data pipelines and access layers using Hadoop, Spark, columnar stores, OLAP, and similar technologies, and over 13 years of experience developing backend services.

Experience

Cred

Individual Contributor

May 2022 – Present · 3 yrs 10 mos · Bengaluru, Karnataka, India · On-site

Flipkart

3 roles

Software Developer IV

Promoted

Jan 2021 – May 2022 · 1 yr 4 mos

  • Led the data platform team in the Flipkart M3 group that powers system feedback, reporting, and analytics insights for Ads, Merchandising, Recommendations, and CRM.
  • Led the team to deliver a real-time reporting system built on top of Spark Streaming, Kafka, Aerospike, Elasticsearch, and Druid.
  • The system shows live reports to advertisers by providing an aggregated view over historical and real-time datasets.
  • Scaled the system to handle an ingestion load of 1M QPS.
Apache Spark Streaming · Kafka · Aerospike · Elasticsearch · Druid · Big Data +1

Software Developer III

Jan 2019 – Dec 2020 · 1 yr 11 mos

  • Unified Processing Framework:
    ● Led a pod of 4 to develop a custom data processing framework on top of Spark, abstracting out the common parts of our processing pipelines and reducing the turnaround time for building business pipelines for all new product use cases.
    ● Collaborated with the team, mentoring and enabling junior engineers to develop the framework.
  • Unified Reporting Platform:
    ● Developed a single reporting platform for all reporting use cases. The platform is designed around a set of queryable RESTful API interfaces that encapsulate the needs of all kinds of advertisement reporting dashboards.
    ● Feature set: time-series and tabular APIs, logical views over datasets, dimension enrichments, pluggable business logic, application-side lookups, etc.
Apache Spark · RESTful APIs · Data Engineering · Backend Development

Software Developer II

Jan 2017 – Dec 2018 · 1 yr 11 mos

  • In-house Slice and Dice Platform:
    ● Designed and developed an API layer that provides a seamless view over data sources, abstracting out the physical data store (MySQL, CH, Druid).
      ○ SQL query interface.
      ○ Designed and developed an Apache MetaModel module for integration with Druid.
    ● Collaborated with different team members to develop a custom in-house slice-and-dice reporting solution on top of the Data View API layer.
MySQL · Druid · API Development · Backend Development · Data Engineering

Samsung Research India, Bangalore

Lead Engineer

Apr 2014 – Jul 2015 · 1 yr 3 mos · Bangalore

  • I worked on the Samsung Sports Application backend. The app had a major audience in European and Latin American countries and supported all the major soccer leagues and the soccer World Cup. I was part of the team that handled the following responsibilities:
  • 1. Data aggregation from different sports content providers.
  • 2. Parsing XML and JSON feeds and pushing them to an RDB and Redis.
  • 3. Synchronizing between multiple content providers and correcting data.
  • 4. Aggregation from different SNS content providers.
  • 5. Pushing real-time updates (match scores and commentary) and notifications to the clients.
  • 6. Admin portal: developed an admin portal for the aggregator module covering data verification, remote log analysis, content monitoring, health checks, and maintaining and archiving data feed files from content providers.
Data Aggregation · XML · JSON · Redis · Backend Development · Data Engineering

Samsung Research India

Senior Software Engineer

Jul 2012 – Mar 2014 · 1 yr 8 mos · Bangalore

  • As a senior software engineer I worked on multiple projects at Samsung, listed below:
  • Application backend development for Samsung Android applications (~1 year): I worked as part of a small team (4 members) to develop a backend service for an application similar to Google Now on Tap, but with very limited functionality, as it was started as a proof of concept. I was involved in the following modules:
  • 1. Used the Spring Framework for building the whole system.
  • 2. Tag generation using text filtering and DBpedia, based on the content provided by the application.
  • 3. Content aggregation from different open/free web services (e.g. Twitter, Facebook, news aggregators).
  • 4. (Experimental) Generated recommendations using EasyRec and Apache Mahout based on user actions.
  • 5. Used MongoDB and MySQL as databases and Lucene for indexing the content.
  • ChatOn instant messaging application for Android-based phones (~2 months):
  • 1. Mostly worked on commercialization of the product, fixing bugs and adding features.
  • 2. For a short time worked on refactoring and optimizing the ChatOn application code.
  • ChatOn instant messaging application for J2ME-based phones:
  • 1. This was the first project/team I was part of when I joined Samsung.
  • 2. Here I also worked on commercialization of the product and fixing bugs.
  • 3. Spent a good amount of time on feature enhancements such as chat backup and restore for J2ME phones.
Spring Framework · MongoDB · MySQL · Backend Development · Software Development

Telecom Bretagne

Intern Project Developer

Jun 2011 – Aug 2011 · 2 mos · Rennes, France

  • Implementation of Dynamic DNS for Wireless Sensor Networks (WSN).
  • I spent my summer working as an intern in one of the research labs of Telecom Bretagne in Rennes, under Professor Laurent Toutain. The objective of the internship was to send Dynamic DNS updates from wireless sensor nodes to facilitate easy monitoring and data collection. I worked on this independently, which involved:
  • 1. Understanding the complete workings of WSNs, their constructs, and their restrictions.
  • 2. The sensor nodes' computing and storage capabilities, which were quite low in order to conserve power.
  • 3. The operating system (Contiki) used on the nodes.
  • 4. DAG-based routing algorithms and their implementations used in WSNs.
  • 5. 6LoWPAN, IPv6, and DNS protocols.
  • 6. C++ and Linux networking.
  • I successfully registered the sensor nodes on DNS whenever they changed IP addresses, using Dynamic DNS updates.

Education

Birla Institute of Technology and Science, Pilani

ME — Computer Science

Jan 2010 – Jan 2012

Aligarh Muslim University

BTech (Computer Science)

Jan 2006 – Jan 2010

City High School(AMU)

10th

Jan 2004 – Jan 2006

Senior Secondary School(AMU)

10+2

Jan 2004 – Jan 2006
