Sri Ganesh Venkataraman

Software Engineer

Bengaluru, Karnataka, India · 14 yrs 1 mo experience

Key Highlights

  • Expert in designing data pipelines with Hadoop ecosystem.
  • Proficient in large-scale data processing and cloud technologies.
  • Strong background in Java and enterprise application development.

Skills

Core Skills

Google Cloud Platform (GCP) · Large-scale Data Processing · Data Engineering · Big Data · Java · Real-time Data Processing

Other Skills

Spring Boot · Apache Spark · Low-Code Development Platforms · Apache Hadoop · Apache NiFi · Kafka · HBase · In-memory Data Grid · Core Java · J2EE · Web Services · Enterprise Search Tools · Web Content Management · Enterprise Portal Development

About

  • Expertise in designing data pipelines with the Hadoop ecosystem, including implementation of a Data Lake.
  • Good experience in Core Java, J2EE, Spring Boot, JPA, web services, and enterprise search tools.
  • Expertise in Web Content Management and Enterprise Portal development.
  • Ability to learn new concepts and adapt quickly to changing environments.

Experience

14 yrs 1 mo
Total Experience
2 yrs 6 mos
Average Tenure
1 yr 7 mos
Current Experience

Coupang

Staff Software Engineer

Oct 2024 – Present · 1 yr 7 mos

  • Core Data and Ingestion Platform

Walmart Global Tech India

3 roles

Staff Software Engineer

Promoted

Sep 2021 – Sep 2024 · 3 yrs

  • Data Platform - ETL and Data Exploration
Spring Boot · Apache Spark · Google Cloud Platform (GCP) · Large-scale Data Processing · Low-Code Development Platforms

Senior Software Engineer

Promoted

Jan 2020 – Sep 2021 · 1 yr 8 mos

  • Global Data and Analytics

Software Engineer III

Sep 2018 – Dec 2019 · 1 yr 3 mos

E2open

Software Engineer II - R&D

Aug 2017 – Aug 2018 · 1 yr

  • Designed and developed Zyme/E2open's Data Lake system (EZLake) for big data analytics. The project was built from scratch; participated in various POCs and architecture design discussions to finalize the technology stack for EZLake's implementation.
  • The first phase of the project focused on data ingestion, building the data pipelines needed to replace the traditional ETL-based approach.
  • Installed and configured Apache Hadoop, Apache NiFi, Kafka, and HBase on a prototype server.
  • Developed a REST-based management service application to configure data sources for ingestion.
  • Implemented NiFi data flows for data extraction from disparate data sources.
  • Developed a Spark-based streaming application for data transformation and load into Apache HBase.
  • Added support for multi-tenancy and configurable customer-based schemas/namespaces.
  • Implemented real-time MySQL change data capture (CDC).
  • Worked on performance optimization to accelerate the data ingestion process.
  • POCs: Apache Kylin, Hive, and Kafka Connect.
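
The extract → transform → load shape of such a streaming ingestion pipeline can be sketched with plain JDK primitives. This is an illustrative stand-in, not the actual EZLake code: the `BlockingQueue` plays the role of a Kafka topic, the `ConcurrentHashMap` stands in for an HBase table, and all names are hypothetical.

```java
import java.util.Map;
import java.util.concurrent.*;

// Minimal extract -> transform -> load sketch. The queue stands in for a
// Kafka topic and the map for an HBase table; names are illustrative only.
public class IngestSketch {
    static final BlockingQueue<String> topic = new LinkedBlockingQueue<>();
    static final Map<String, String> table = new ConcurrentHashMap<>();

    // Extract: push raw records onto the "topic".
    static void extract(String... rows) {
        for (String r : rows) topic.add(r);
    }

    // Transform + load: consume records, normalize them, store by key.
    static void transformAndLoad(int expected) throws InterruptedException {
        for (int i = 0; i < expected; i++) {
            String raw = topic.take();                    // blocking consume
            String[] kv = raw.split("=", 2);              // "key=value" records
            table.put(kv[0], kv[1].trim().toUpperCase()); // simple transform
        }
    }

    public static void main(String[] args) throws InterruptedException {
        extract("device1= active ", "device2= retired ");
        transformAndLoad(2);
        System.out.println(table.get("device1")); // ACTIVE
    }
}
```

In the real system each stage would be a separate process (NiFi flow, Kafka topic, Spark streaming job), but the per-record flow is the same.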

MaaS360 by Fiberlink, an IBM Company

Staff Software Engineer

Jul 2016 – Jul 2017 · 1 yr · Bengaluru Area, India

  • Implemented a distributed in-memory data grid with data replicated from transactional systems. IBM MaaS360 is a SaaS-based enterprise mobile device management platform; the in-memory data grid provides real-time insights into vulnerabilities prevalent in registered devices and reports them to customer admins so they can initiate timely action.
  • Designed and developed a Java-based multi-threaded utility to load data into the in-memory grid.
  • Developed a Kafka Connect-based solution to stream data to the grid.
  • Developed a back-end application supporting the reporting and insights infrastructure.
  • Performed various POCs to finalize an optimal approach for data ingestion.
  • Mentored the team on Java and in-memory distributed systems.
  • POCs: Apache Sqoop, Apache Zeppelin.
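
A multi-threaded grid loader of the kind mentioned above can be sketched with an `ExecutorService`: chunks of source data are written into the grid in parallel. The `ConcurrentHashMap` is a stand-in for the actual in-memory data grid, and all names are illustrative, not from the MaaS360 codebase.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.*;

// Sketch of a multi-threaded loader: chunks of source rows are written into
// an in-memory map in parallel. ConcurrentHashMap stands in for the grid.
public class GridLoader {
    static Map<String, String> load(List<Map<String, String>> chunks)
            throws InterruptedException {
        Map<String, String> grid = new ConcurrentHashMap<>();
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (Map<String, String> chunk : chunks)
            pool.submit(() -> grid.putAll(chunk)); // one task per chunk
        pool.shutdown();                           // accept no new tasks
        pool.awaitTermination(10, TimeUnit.SECONDS);
        return grid;
    }

    public static void main(String[] args) throws InterruptedException {
        Map<String, String> grid = load(List.of(
                Map.of("dev1", "patched"),
                Map.of("dev2", "vulnerable")));
        System.out.println(grid.size()); // 2
    }
}
```

A thread-safe target structure is what makes the parallel writes safe here; a real grid client would additionally batch writes and handle retries.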

Intel Corporation

Intern - Application Developer

Jan 2016 – Jun 2016 · 5 mos · Bengaluru Area, India

  • ETL developer – Ab Initio.
  • Learned the intricacies of ETL and successfully re-architected a traditional VBScript-based flow as an Ab Initio data flow and deployed it.

Tata Consultancy Services

2 roles

IT Analyst

Apr 2013 – Jun 2014 · 1 yr 2 mos

  • EMC Documentum:
  • Upgraded Documentum Content Server from 6.5 SP1 to 6.7 SP1.
  • Installed the FAST/xPlore index server.
  • Carried out a POC on PolySpot enterprise search.

System Engineer

Mar 2010 – Mar 2013 · 3 yrs

  • Experience with EMC Documentum:
  • Created new object types and alias set objects in the Docbase to support Docbase users and applications.
  • Created workflows, ACLs, and groups.
  • Created content templates, rules, and XSLT presentation files.
  • Integrated Documentum with WebLogic Portal 9.2 MP2 and Oracle WebLogic Portal 10.3.
  • Customized Web Publisher according to business requirements using the WDK framework.
  • Created a DFC-based utility to automate content import and publishing.
  • Worked extensively with DFC to automate the end-to-end setup enabling content authoring/publishing for new clients (cabinets, folders, ACLs, groups, LDAP config, SCS config).
  • Created TBOs, presets, and custom methods as per business requirements.
  • Migrated the complete Documentum system from an HP-UX (PA-RISC) to an HP-UX (Itanium) server.
  • Experience with WebLogic Portal development:
  • Created desktops, books, pages, and portlets; page flows; and the portal look and feel (.laf, skins, skeletons, shell).
  • Created/configured WebLogic connection pools for Oracle.
  • Developed custom tag libraries as per business needs.
  • Custom-implemented interfaces provided by WebLogic to replace the bridge between Documentum and WebLogic Portal 9.2 MP2, thereby upgrading from WebLogic Portal 9.2 MP2 to Oracle WebLogic Portal 10.3 while retaining Documentum as the underlying CMS.

Education

International Institute of Information Technology Bangalore

Master of Technology (M.Tech.) — Computer Software Engineering

Jan 2014 – Jan 2016

Velammal Engineering College

Bachelor of Engineering (B.E.) — Electrical and Electronics Engineering

Jan 2005 – Jan 2009
