Swapnil P.

Cloud Engineer

Leidschendam, South Holland, Netherlands · 7 yrs 6 mos experience

Key Highlights

  • Over 10 years of IT industry experience
  • Expertise in AWS and Azure cloud solutions
  • Proven track record in CI/CD and DevOps practices
Stackforce AI infers this person is a Cloud Infrastructure and DevOps expert in the SaaS industry.

Skills

Core Skills

Cloud Infrastructure · DevOps · Big Data · Data Transformation

Other Skills

AWS · Ansible · Apache Pig · Apache Spark · Azkaban · Azure DevOps · CircleCI · Cloud Consulting · Cloud Services · Cloud-Native Architecture · Core Java · Docker · ELK · ETL · Elasticsearch

About

As a Cloud Engineer at NN Group, I am responsible for designing, implementing, and managing cloud-based solutions for the company's business needs. I have over 10 years of professional experience in the IT industry, with a strong background in AWS, Azure, Kubernetes, Docker, and DevOps. I am passionate about leveraging cloud technologies to optimize performance, scalability, security, and cost-efficiency. I have successfully migrated several applications from on-premises to cloud environments, using continuous integration and automation tools. I have also contributed to multiple publications on open source tools and best practices for enterprise software applications. My goal is to keep learning and innovating in the cloud domain and deliver value to my team and organization.

Experience

NN Group

Cloud Engineer

Feb 2019 - Present · 7 yrs 1 mo · The Hague, South Holland, Netherlands · On-site

  • Streamlined software delivery by developing CI/CD pipelines with Azure DevOps and Jenkins, cutting time-to-market by 40% (a minimal pipeline sketch follows below).
  • Implemented security measures such as SSO, MFA, SIEM, and TSCM across multiple applications.
  • Improved system reliability and efficiency, preventing P1 incidents.
Azure DevOps · Jenkins · SSO · MFA · SIEM · TSCM +2
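
For illustration, a minimal azure-pipelines.yml sketch of the kind of build-and-deploy pipeline described above. The trigger branch, stage layout, and script names are hypothetical placeholders, not NN Group's actual pipeline.

```yaml
# Minimal Azure DevOps pipeline sketch (hypothetical branch, stages, and scripts).
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

stages:
  - stage: Build
    jobs:
      - job: BuildAndTest
        steps:
          - script: ./build.sh              # hypothetical build script
            displayName: Build application
          - script: ./run-tests.sh          # hypothetical test script
            displayName: Run unit tests

  - stage: Deploy
    dependsOn: Build
    jobs:
      - job: DeployToStaging
        steps:
          - script: ./deploy.sh staging     # hypothetical deploy script
            displayName: Deploy to staging
```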

MyCujoo

Cloud Engineer

Jul 2018 - Dec 2018 · 5 mos · Amsterdam Area, Netherlands · On-site

  • Role: Cloud Engineer.
  • Tools: Kubernetes, Docker, Google Cloud, AWS, CircleCI, ELK, Prometheus, Kibana, Elasticsearch, Git, Python, Shell scripting.
  • Responsibilities:
  • Implemented the deployment process in CircleCI for the development team.
  • Handled and debugged production issues on the GKE (Kubernetes) production cluster.
  • Implemented logging and monitoring for the development team using Prometheus, Grafana, Kibana, Elasticsearch, and Fluent Bit.
  • Wrote Kubernetes YAML manifests for the environment (a minimal manifest sketch follows below).
  • Implemented API gateways such as Ambassador for the development team.
  • Created and launched Kubernetes clusters, including cert-manager and ExternalDNS.
  • Built Docker images for the different components of the product.
Kubernetes · Docker · Google Cloud · AWS · CircleCI · ELK +8
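
For illustration, a minimal sketch of the kind of Kubernetes manifest mentioned above. The app name, image, port, and scrape annotations are hypothetical, not MyCujoo's actual configuration.

```yaml
# Minimal Kubernetes Deployment sketch (hypothetical names, image, and port).
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-api
  labels:
    app: example-api
spec:
  replicas: 3
  selector:
    matchLabels:
      app: example-api
  template:
    metadata:
      labels:
        app: example-api
      annotations:
        # Conventional annotations consumed by a Prometheus kubernetes_sd
        # scrape config, matching the monitoring setup described above.
        prometheus.io/scrape: "true"
        prometheus.io/port: "8080"
    spec:
      containers:
        - name: example-api
          image: registry.example.com/example-api:1.0.0   # hypothetical image
          ports:
            - containerPort: 8080
          readinessProbe:
            httpGet:
              path: /healthz
              port: 8080
```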

IDEMIA

Cloud Engineer

Jan 2018 - Jul 2018 · 6 mos · Lodz, Lodz District, Poland · On-site

  • Project 1: M-Connect 3 (IoT tool)
  • Role: Senior Integration Engineer (DevOps).
  • Tools: Kubernetes, Docker, SaltStack, AWS, Jenkins, Git, Python, Shell scripting.
  • Responsibilities:
  • Integrated and deployed the end-to-end IoT product for customers such as Etisalat and Saudi Telecom Company.
  • Handled and debugged issues on the validation and production clusters.
  • Managed MNO profiles end to end.
  • Created validation and production environments for the product.
  • Wrote Kubernetes YAML manifests for the environment.
  • Built Docker images for the different components of the product.
  • Wrote Ansible playbooks for configuration management (a minimal playbook sketch follows below).
Kubernetes · Docker · SaltStack · AWS · Jenkins · Git +4
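
For illustration, a minimal Ansible playbook sketch of the configuration-management work described above. The host group, packages, template, and service names are hypothetical.

```yaml
# Minimal Ansible playbook sketch (hypothetical hosts, packages, and service).
- name: Configure application servers
  hosts: app_servers          # hypothetical inventory group
  become: true
  tasks:
    - name: Ensure required packages are installed
      apt:
        name:
          - docker.io
          - python3
        state: present
        update_cache: true

    - name: Deploy application configuration
      template:
        src: app.conf.j2       # hypothetical Jinja2 template
        dest: /etc/app/app.conf
      notify: Restart app

  handlers:
    - name: Restart app
      service:
        name: app              # hypothetical service name
        state: restarted
```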

Exadatum

Cloud Engineer

Jun 2016 - Jan 2018 · 1 yr 7 mos · Pune Area, India · On-site

  • Project 1: EVoC (Enterprise View of Customers)
  • Client: Kohl's (American department store retailing chain)
  • Tools: Hadoop 2.4, HDFS, MapReduce, Pig, Hive, Azkaban, Shell Script, Sqoop.
  • Description:
  • 1) An ESP job at the client side initiates both full and delta ingestion data flows by triggering the EVoC Extract Job, which fetches the data from Oracle Exadata and uses Protegrity to tokenize PII columns.
  • 2) The EVoC Extract Job writes six extracts in XML format. The four main data extracts are Profile, Customer, Customer Group Association, and Account; it also generates an accompanying Reference Values extract containing a reference-code dictionary and a Deleted Entities extract with deletion information.
  • 3) Each set of extracts is supplied with a control file that serves two purposes: triggering the MFT transfer and providing metadata. MFT transfers the data extracts and accompanying metadata to DDH, using a directory on the Hadoop Access Node as the target.
  • 4) The MFT agent on the Access Node writes the data files to a directory mounted with NFS Gateway, so the files land directly in HDFS.
  • 5) The Ingestion Job uses Hive to define dataset structures as Hive tables and uses Pig to transform the data at the different stages of transformation.
  • Responsibilities:
  • Wrote Pig scripts, Pig UDFs, and Hive scripts to process semi-structured data.
  • Wrote shell scripts to automate jobs.
  • Wrote Azkaban jobs to automate the job flow (a minimal flow sketch follows below).
  • Handled client calls and clarified requirements on a daily basis.
  • Project 2: Xsuite
  • Role: DevOps Engineer.
  • Tools: Docker, Ansible, Packer, Vagrant, Jenkins, SonarQube, Maven, Git, Python, Shell scripting.
  • Responsibilities:
  • Built Docker images containing a single-node HDFS cluster with the product installed, using Packer and Vagrant.
  • Set up a Jenkins server and automated the CI and CD process.
  • Wrote Ansible playbooks for upgrading and maintaining Ubuntu servers.
  • Wrote Python and shell scripts to automate processes.
Hadoop · HDFS · MapReduce · Pig · Hive · Azkaban +14
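
For illustration, a minimal Azkaban flow sketch of the kind of job chaining described above, written in Azkaban's YAML-based Flow 2.0 format (earlier Azkaban versions expressed the same flow as .job property files). The node names, scripts, and commands are hypothetical.

```yaml
# Minimal Azkaban Flow 2.0 sketch (hypothetical node names and commands).
nodes:
  - name: ingest_extracts
    type: command
    config:
      command: sh scripts/ingest.sh             # hypothetical shell wrapper

  - name: transform_with_pig
    type: command
    dependsOn:
      - ingest_extracts
    config:
      command: pig -f scripts/transform.pig     # hypothetical Pig script

  - name: load_hive_tables
    type: command
    dependsOn:
      - transform_with_pig
    config:
      command: hive -f scripts/load.hql         # hypothetical Hive script
```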

Savy Software Private Limited

ETL Developer

Aug 2014 - Aug 2015 · 1 yr · Pune Area, India · On-site

  • Description:
  • The IRIS Registry is a registry built for ophthalmology hospitals in the USA.
  • The registry fetches ophthalmology-related clinical data from the client (hospital) database. The data is stored and run through various quality checks; once data quality is confirmed, reports are generated from it. Doctors at the hospitals submit these reports to the AAO, which grants them incentives based on the performance the reports show.
  • Responsibilities:
  • Developed ETL programs with the company's in-house ETL tool to implement business requirements.
  • Carried out the data extraction, transformation, and report-generation phases.
  • Performed data manipulations using Informatica transformations such as Filter, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Normalizer, Joiner, Router, Sorter, and Union.
  • Scheduled jobs through the PINNACLE job scheduler.
  • Worked with MS SQL, MySQL, and Oracle.
  • Worked independently and completed assigned project responsibilities under limited supervision and aggressive deadlines.
  • Environment: PINNACLE QMS 1.6, PINNACLE 2.0.
ETL · Informatica · MS SQL · MySQL · Oracle · Data Transformation

Education

Sinhgad Institute Of Management

Master’s Degree

Jan 2011 - Jan 2014

Sinhgad Institute Of Management

Bachelor’s Degree, BCA (Computer Application)

Jan 2008 - Jan 2011
