Indra Bardhan

CTO

India · 21 yrs 8 mos experience

Key Highlights

  • Expert in building real-time analytics pipelines.
  • Proven track record in large-scale data processing.
  • Strong background in personalized content delivery systems.
Stackforce AI infers this person is a B2C data engineering expert with extensive experience in real-time analytics and content ingestion.

Skills

Core Skills

Apache Kafka · Hadoop · HBase · Hive · Apache Storm

Other Skills

Android SDK · Apache Flink · Core Java · DB2 · ElasticSearch · Kibana · Logstash · Lucene · Oracle PL/SQL Development · PL/SQL · Perl · Software Development · Sybase Adaptive Server · Sybase IQ

Experience

Walmart Global Tech

3 roles

Distinguished Engineer

Promoted

May 2023 – Present · 2 yrs 10 mos

Principal Engineer

Promoted

Nov 2020 – May 2023 · 2 yrs 6 mos

Staff Engineer

Feb 2018 – Nov 2020 · 2 yrs 9 mos

Dailyhunt (Versé Innovation Pvt Ltd)

Software Architect for Dailyhunt

Dec 2014 – Jan 2018 · 3 yrs 1 mo · Bengaluru Area, India

  • Dailyhunt is one of India's leading news mobile apps, bringing news to users in 12 different languages with around 16–18 million MAU (Monthly Active Users).
  • 1. Architected and implemented an Analytics pipeline based on Lambda Architecture
  • Real time business insights using Kafka, Storm and ELK stack
  • Batch processing over Cloudera Hadoop using Camus and Apache Flink.
  • Built custom aggregation frameworks using Flink to produce pre-aggregated data.
  • The volume of analytics data processed in the platform is close to 1 billion events per day, i.e., around 1 TB per day, with peak-hour throughput close to 75,000–80,000 rps.
  • 2. Architected and implemented a User Profile store serving as input for the personalization platform, which delivers personalized news streams to users based on their preferences and affinity.
  • Modelled the storage of various time- and action-based aspects of the profile in HBase, for close to 40 million users.
  • Edge ranking of affinity scores to give better attribution to recency and to specific actions indicating strong affinity.
  • Transformation of the profile data into Hive for ad hoc analytics by Business and MIS.
  • 3. Modelled storage of analytics event data in Hive for various ad hoc analytics by Business and MIS.
  • 4. Built an Android SDK that ships with the Dailyhunt app to generate analytics data from the app and deliver it to an HTTP endpoint. This involves buffering and storing events in the device database (SQLite) and delivering them to the HTTP endpoint in batches.
  • 5. Built a geo-tagging library for content enrichment of English content.
  • Used Stanford NER library for identifying locations in content
  • Enriched the locations and tagged the content using Lucene, adding district, state and country data from a corpus downloaded from geonames.org.
  • Helps in delivering localized content to users.
  • 6. Keyword based Topic Mapping
  • Built a TF-IDF based keyword extraction and topic mapping pipeline that enriches content based on the extracted keywords.
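The recency-weighted affinity ranking described above could be sketched as below. The action types, weights, and half-life are illustrative assumptions, not the production values:

```python
import math

# Hypothetical per-action weights; the real signals and weights are assumptions.
ACTION_WEIGHTS = {"view": 1.0, "like": 3.0, "share": 5.0}

def affinity_score(events, now_days, half_life_days=7.0):
    """Sum per-action weights with exponential time decay, so recent,
    strong actions (e.g. shares) contribute more than stale views.

    events: iterable of (action, timestamp_in_days) pairs.
    """
    lam = math.log(2) / half_life_days  # decay rate for the chosen half-life
    return sum(
        ACTION_WEIGHTS.get(action, 0.0) * math.exp(-lam * (now_days - ts_days))
        for action, ts_days in events
    )
```

A short half-life makes the score chase the user's latest behaviour, while heavier weights on explicit actions such as shares let strong signals dominate passive views.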
Apache Kafka · Apache Storm · Apache Flink · ElasticSearch · Logstash · HBase +2
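The TF-IDF keyword extraction behind the topic mapping above can be sketched in a few lines; the whitespace tokenization and IDF smoothing here are assumptions for illustration:

```python
import math
from collections import Counter

def tfidf_keywords(doc, corpus, top_k=3):
    """Rank words in `doc` by TF-IDF computed against a corpus of documents."""
    docs = [d.lower().split() for d in corpus]
    words = doc.lower().split()
    tf = Counter(words)
    n = len(docs)
    scores = {}
    for w, count in tf.items():
        df = sum(1 for d in docs if w in d)      # document frequency
        idf = math.log((1 + n) / (1 + df)) + 1   # smoothed inverse document frequency
        scores[w] = (count / len(words)) * idf   # term frequency x IDF
    return [w for w, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]]
```

Words that are frequent in an article but rare across the corpus score highest, which is what makes them usable as topic keys for enrichment.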

Yahoo

Principal Engineer

Feb 2012 – Dec 2014 · 2 yrs 10 mos · Bengaluru, India

  • 1. Apache Storm based real-time content ingestion platform - This project dealt with building a near real-time content ingestion platform using Storm (one of the first at Yahoo) to replace the existing Hadoop-based ingestion platform.
  • This was one of the key projects in the HP&V business unit, aimed at retiring a number of platforms identified as redundant in the overall content ingestion architecture at Yahoo.
  • 2. Video Content Ingestion Platform (Hadoop based) - The project dealt with building an ingestion platform for videos in line with the existing article & photo ingestion platforms.
  • Subsequently, we also built new solutions to improve the video serving experience at Yahoo, e.g., working across teams to identify ideal transcoding profiles for Yahoo's player, video watermarking and end-card integrations, visual seek, closed captioning, etc.

Infosys (for Goldman Sachs)

Technology Lead

Jul 2004 – Feb 2012 · 7 yrs 7 mos

  • Was a part of Regulatory Operations Technology for my client Goldman Sachs.
  • 1. Common Business Reporting Architecture (COBRA) - This project dealt with building a unified model to store positions, products, contractuals and market data. This would support the common position reporting use case of retrieving positions and shares equivalent in a consistent and seamless manner across all product classes and types.
  • 2. Section 23A Regulation Reporting System - This project dealt with building a reporting system for affiliate exposures arising through "Covered Transactions". This reporting obligation arose from the client's capacity as a member bank of the US Federal Reserve System.
  • The reporting system aided in monitoring the covered transactions done by the bank and its related entities with its affiliates, to avoid breaching limits set by the Federal Reserve Act (Section 23A).
  • 3. Beneficial Ownership Reporting System - This project dealt with developing and maintaining a reporting system for the client's reporting obligations arising, in its capacity as a member firm, from SEC Regulations 13D and 13G and Form 13F.

Education

National Institute of Technology Rourkela

Bachelor's Degree — Civil Engineering

Jan 2000 – Jan 2004

Ispat English Medium School

Higher Secondary — Science

Jan 1998 – Jan 2000

Sri Aurobindo's School, Rourkela

High School

Jan 1985 – Jan 1998
