Ajinkya Chatufale — Software Engineer
With 10 years of experience across all phases of the Software Development Life Cycle (SDLC).
- Specializes in Big Data, AWS, GCP, Azure, GenAI, Data Engineering, Real-Time Data Processing, and Cloud-Native & Serverless Architectures.
- Extensive expertise in scalable system design (HLD & LLD) with design patterns and concepts including CAP, CDNs, caching, and load balancing, plus advanced topics such as Data Mesh, Data Fabric, Data Governance, Data Security, and Data Privacy.
- Proficient in LLMs, Prompt Engineering, Vector Databases, Agentic AI, LlamaIndex, LangChain, and model tuning (LoRA, QLoRA) with RAG solutions and n8n workflow automation.
- Expertise in AWS services such as S3, Glue, EMR, EC2, Redshift, Lambda, RDS, Secrets Manager, Amazon Managed Workflows for Apache Airflow, and CloudWatch, with experience building end-to-end frameworks on these services.
- Proficient in AWS messaging and streaming services: SQS, SNS, EventBridge, and Kinesis for real-time data processing.
- AWS CloudFormation: designed and implemented complex CloudFormation templates to automate infrastructure provisioning, including integration with services like AWS Service Catalog.
- Data Governance: proficient in addressing security, compliance, access control, auditing, and data management using AWS services such as Lake Formation, AWS Glue Data Catalog, IAM, CloudTrail, and KMS.
- Spark: extensive work with DataFrames, Datasets, and RDDs, building end-to-end frameworks with transformations such as withColumn, filter, groupBy, orderBy, join, and count, and creating temporary views.
- Python: skilled in data analysis, manipulation, and processing using Pandas and NLTK, with experience building frameworks using OOP concepts and LangChain.
- NLP: expertise in Tokenization, Stop Words, Stemming, Lemmatization, Bag of Words, and Word2Vec.
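The NLP techniques listed above (tokenization, stop-word removal, Bag of Words) can be sketched in pure Python; the tiny stop-word list and sample sentence below are illustrative, not taken from any project named in this profile.

```python
import re
from collections import Counter

# Tiny illustrative stop-word list (real pipelines use NLTK's corpus)
STOP_WORDS = {"the", "a", "is", "of", "and"}

def tokenize(text):
    """Lowercase the text and split on non-alphabetic characters."""
    return [t for t in re.split(r"[^a-z]+", text.lower()) if t]

def bag_of_words(text):
    """Return token counts with stop words removed (Bag of Words)."""
    return Counter(t for t in tokenize(text) if t not in STOP_WORDS)

bow = bag_of_words("The pipeline reads the data and the pipeline writes results")
# bow["pipeline"] == 2; stop words like "the" are dropped
```

In practice NLTK supplies the tokenizer, stop-word corpus, stemmers, and lemmatizers; this sketch only shows the shape of the preprocessing steps.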
- GenAI and Vector DB knowledge: experience with OpenAI, Gemini, Hugging Face, and Groq Cloud; vector stores such as FAISS, ChromaDB, and Pinecone, with LlamaIndex for retrieval; prompt engineering with few-shot, one-shot, and hybrid prompts.
- Deep Learning basics: familiarity with ANNs, RNNs, LSTMs, Transformers, and BERT.
- Cassandra, Snowflake, and DynamoDB: proficient in Cassandra data replication (replication factor, replication strategy) and DynamoDB capacity modes (on-demand and provisioned); skilled in managing Snowflake external tables and automating data ingestion with Snowpipe.
- Experienced in producing both high-level design (HLD) and low-level design (LLD) documents.
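The core operation behind the vector stores mentioned above (FAISS, ChromaDB, Pinecone) is nearest-neighbor search over embeddings. A minimal pure-Python sketch of cosine-similarity ranking, with a hypothetical three-dimensional toy store (real embeddings have hundreds of dimensions and come from an embedding model):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "vector store": document id -> embedding (illustrative values)
store = {
    "doc1": [1.0, 0.0, 0.0],
    "doc2": [0.9, 0.1, 0.0],
    "doc3": [0.0, 1.0, 0.0],
}

def top_k(query, k=2):
    """Rank stored documents by cosine similarity to the query vector."""
    return sorted(store, key=lambda d: cosine(store[d], query), reverse=True)[:k]

result = top_k([1.0, 0.05, 0.0])
# doc1 and doc2 rank above doc3 for this query
```

In a RAG pipeline the top-k documents retrieved this way are inserted into the LLM prompt as context; libraries like FAISS replace the linear scan with an approximate index for scale.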
Stackforce AI infers this person is a Data Engineer specializing in Big Data and Cloud technologies across Fintech and Healthcare industries.
Location: Pune, Maharashtra, India
Experience: 11 yrs 3 mos
Skills
- Big Data
- AWS
- Data Engineering
- Software Development
Career Highlights
- 10 years of experience in Software Development Life Cycle.
- Expertise in AWS and Big Data technologies.
- Proficient in Data Engineering and Real-Time Data Processing.
Work Experience
Barclays
Lead Data Engineer (4 yrs 8 mos)
ConnectWise
Senior Data Engineer (1 yr)
FIGmd, Inc.
Data Engineer (2 yrs 5 mos)
IBM
Data Engineer (1 yr 6 mos)
Tech Mahindra
Software Developer (1 yr 7 mos)
Education
Master’s Degree at Sinhgad School of Computer Studies
Bachelor’s Degree at Mangalvedhakar Institute of Management