TINA KALBENDE – Associate Consultant
DevOps Engineer
Contact: +91-967******* ************@*****.***
Current Location: Mumbai
Profile Summary
● B.E. in Computer Engineering with 4.5+ years of IT experience; currently associated with Capgemini
● 3.5+ years' experience in DevOps and related technologies, providing support using DevOps tools
● 1.1 years' experience in Hadoop technologies and in implementing different Hadoop components
● Good knowledge of and hands-on experience with technologies like MapReduce, Sqoop, HBase, and Hive
● Worked on configuration of alerts and monitoring systems like Nagios
● Worked extensively on application configuration and health-rule setup in AppDynamics
● Good knowledge of tools like Kibana and Splunk
● Worked with Jenkins and GitHub during automated deployments using Ansible playbooks and shell scripts
● Troubleshot alarms and issues raised in the production environment and performed root cause analysis
● Worked on an Agile project with a two-week sprint cadence
● Hands-on knowledge and experience with AWS components, including creating EC2, EMR, and S3 resources
● Completed POCs on newly adopted technologies such as Apache Airflow, Snowflake, and GitLab
● Used Airflow for orchestration and scheduling of ingestion scripts
● Developed ingestion scripts in Python and PySpark
● Well acquainted with coding-standard tools such as Pylint, pep8, and pycodestyle
● Created CI/CD pipelines using GitLab and Ansible
● Worked with Docker containers for running Airflow
● Experience with the issue-tracking and documentation tools JIRA and Confluence
● Worked on different databases like Oracle and MSSQL
Professional Experience
Capgemini Pvt. Ltd. Associate Consultant May 2018 – till date
Infosys Limited, Pune Senior System Engineer Feb 2014 – November 2017
Key Domain and Technical Knowledge
IDEs : Eclipse, PyCharm
Database : HBase, MySQL, MSSQL, Oracle, Snowflake
Operating System : Windows, Unix, Linux
Scripting Languages : Bash, Python
Technologies : Java, HDFS, MapReduce, Pig, Hive, Sqoop, PySpark
Web Related : Apache Tomcat
Tools : PuTTY, SVN, Nagios, AppDynamics, Kibana
Frameworks : Hadoop
Cluster Management : Cloudera Manager
Domain : Networking
Technical : Java, Hadoop (MapReduce), NoSQL
Bug Tracking : JIRA
Documentation Tools : Confluence
Code Repository : GitHub
CICD Tools : GitLab, Git, Jenkins, Ansible
Scheduling Tools : Airflow
Academic Achievements
• B.E. in 2013 from College of Engineering & Technology, Akola, Maharashtra, with 67.77%
Project Details
Role : DevOps Engineer
Duration : May 2018 – till date
The client is from the Media and Entertainment domain, so the major work is on data related to its different channels. The main work area is AWS components such as EC2, EMR, and S3. We are currently working on the Snowflake layer for data ingestion. All ingestion scripts, developed in Python and PySpark, are orchestrated and scheduled using Airflow, and the entire pipeline for the different environments is automated using CI/CD tools such as GitLab along with Ansible playbooks.
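As an illustration of this orchestration pattern, here is a minimal sketch of an Airflow DAG chaining ingestion steps, assuming Airflow 2-style imports; the DAG id, schedule, task names, and script paths are hypothetical stand-ins, not the project's actual pipeline.

```python
# Minimal sketch of an ingestion DAG; names and paths are hypothetical examples.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",          # hypothetical owner
    "retries": 2,                          # retry failed ingestion tasks
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="channel_data_ingestion",       # hypothetical DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    # Pull raw channel data to S3 with a Python ingestion script.
    extract = BashOperator(
        task_id="extract_to_s3",
        bash_command="python /opt/ingestion/extract.py",          # hypothetical path
    )

    # Transform on the cluster with a PySpark job.
    transform = BashOperator(
        task_id="pyspark_transform",
        bash_command="spark-submit /opt/ingestion/transform.py",  # hypothetical path
    )

    # Load the transformed data into the Snowflake layer.
    load = BashOperator(
        task_id="load_to_snowflake",
        bash_command="python /opt/ingestion/load_snowflake.py",   # hypothetical path
    )

    extract >> transform >> load
```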
Responsibilities:
● Working on an Agile project with a two-week sprint cadence
● Hands-on knowledge and experience with AWS components, including creating EC2, EMR, and S3 resources
● Completed POCs on newly adopted technologies such as Apache Airflow, Snowflake, and GitLab
● Used Airflow for orchestration and scheduling of ingestion scripts
● Developed ingestion scripts in Python and PySpark
● Well acquainted with coding-standard tools such as Pylint and pep8
● Created CI/CD pipelines using GitLab and Ansible
● Worked with Docker containers for running Airflow
● Experience with the issue-tracking and documentation tools JIRA and Confluence
● Worked on different databases like Oracle and MSSQL
Role : DevOps Engineer
Duration : August 2015 to November 2017
Provided overall support for a Telecommunication-domain client with respect to the servers configured in an environment divided across onshore and offshore teams. As part of the offshore support team, responsible for keeping the servers up to date and healthy. The environment is configured using DevOps tools such as Nagios, AppDynamics, and Kibana.
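As an illustration of the archival and clean-up automation mentioned in the responsibilities below, here is a minimal sketch; the original scripts were written in Bash, and this Python rendering with its directory paths and retention window is a hypothetical example only.

```python
#!/usr/bin/env python3
"""Sketch of a server clean-up job: archives logs older than a cutoff
and removes the originals. Paths and retention are hypothetical."""
import gzip
import shutil
import time
from pathlib import Path

LOG_DIR = Path("/var/log/app")        # hypothetical log directory
ARCHIVE_DIR = Path("/data/archive")   # hypothetical archive target
RETENTION_DAYS = 14                   # hypothetical retention window


def archive_old_logs() -> None:
    cutoff = time.time() - RETENTION_DAYS * 86400
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    for log_file in LOG_DIR.glob("*.log"):
        if log_file.stat().st_mtime < cutoff:
            target = ARCHIVE_DIR / (log_file.name + ".gz")
            # Compress into the archive directory, then delete the original.
            with log_file.open("rb") as src, gzip.open(target, "wb") as dst:
                shutil.copyfileobj(src, dst)
            log_file.unlink()


if __name__ == "__main__":
    archive_old_logs()
```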
Responsibilities:
● Performed health checks on the servers using tools such as AppDynamics, Nagios, and Kibana
● Performed configuration-related tasks for Nagios and AppDynamics
● Debugged code cloned via Gerrit/GitHub and identified code flaws
● Wrote bash scripts to automate archival and clean-up tasks on the servers
● Kept alerts and issues well documented using Agile tools such as JIRA and Confluence
● Assisted the onsite team during deployments, providing support with Jenkins and Ansible
● Checked issues related to database servers using Tableau dashboards
● Led shift-handover discussions with the onsite teams, covering technical obstacles
● Conducted tool overview sessions and took a major initiative in knowledge sharing with other team members
● Scheduled snapshots of volumes for backup, performed root cause analysis of failures, documented bugs and fixes, and scheduled downtimes and maintenance of the data center
Role : Hadoop Developer
Duration : July 2014 to August 2015
Worked for a healthcare organization that was taking a leap into big data analytics. The client offers a suite of population health management and healthcare analytics solutions powered by a data intelligence platform; the suite transforms data into insights that drive action through decision support. Disease Management is an analytic module of Caradigm that gives insight into patients' health statistics and predicts what measures the hospital can take to avoid further illness. Population and insurance management is another module, in which insurance plans can be devised according to patients' age and health.
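As an illustration of this kind of MapReduce analysis, here is a minimal Hadoop Streaming-style mapper and reducer in Python; the project's actual jobs are not shown, and the record layout and disease-code counting here are hypothetical examples.

```python
#!/usr/bin/env python3
"""Hadoop Streaming-style sketch: counts patient records per disease code.
The CSV layout (patient_id,disease_code,...) is a hypothetical example."""
import sys
from itertools import groupby


def mapper(lines):
    # Emit (disease_code, 1) for each patient record on stdin.
    for line in lines:
        fields = line.rstrip("\n").split(",")
        if len(fields) >= 2:
            print(f"{fields[1]}\t1")


def reducer(lines):
    # Hadoop delivers mapper output sorted by key, so consecutive lines
    # with the same disease code can be summed with groupby.
    pairs = (line.rstrip("\n").split("\t") for line in lines)
    for code, group in groupby(pairs, key=lambda kv: kv[0]):
        print(f"{code}\t{sum(int(count) for _, count in group)}")


if __name__ == "__main__":
    # Run as "map" or "reduce" stage, reading records from stdin.
    if len(sys.argv) > 1 and sys.argv[1] == "map":
        mapper(sys.stdin)
    else:
        reducer(sys.stdin)
```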
Responsibilities:
● MapReduce implementation for population health management, health insurance management, and disease management
● Extracted data from various data warehouses using Sqoop
● Wrote MapReduce jobs for predictive analysis in disease management
● Participated in code and design reviews
● Participated in HBase schema design reviews
● Installed and configured the HBase module
● Handled Cloudera Manager configurations
● Monitored the health of the nodes in the Hadoop cluster
● Wrote custom scripts to monitor the NameNode, DataNode, Secondary NameNode, JobTracker, and TaskTracker daemons, and configured the alerting system
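A minimal sketch of such a daemon check follows; it assumes the Hadoop 1.x daemon names listed above are visible in `jps` output on the node being checked, and it leaves alert delivery (for example, to Nagios) as a placeholder.

```python
#!/usr/bin/env python3
"""Sketch of a Hadoop daemon health check based on `jps` output.
The expected daemon set and the alerting hook are illustrative only."""
import subprocess
import sys

# Hadoop 1.x daemons; on a real cluster the set per node would differ.
EXPECTED = {"NameNode", "DataNode", "SecondaryNameNode",
            "JobTracker", "TaskTracker"}


def running_daemons() -> set:
    # `jps` prints "<pid> <ClassName>" for each JVM process on the node.
    out = subprocess.run(["jps"], capture_output=True,
                         text=True, check=True).stdout
    return {parts[1] for parts in
            (line.split() for line in out.splitlines()) if len(parts) == 2}


def main() -> int:
    missing = EXPECTED - running_daemons()
    if missing:
        # A real setup would raise an alert here; this sketch just reports.
        print(f"CRITICAL: daemons down: {', '.join(sorted(missing))}")
        return 2
    print("OK: all expected daemons running")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```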