
DevOps

Location:
Schaumburg, IL
Salary:
$80 per hour
Posted:
February 25, 2021


Resume:

Lopamudra Satpathy

Email id: adkhgr@r.postjobfree.com

Mobile: +1-224-***-**** / +1-715-***-****

LinkedIn: https://www.linkedin.com/in/lopamudra-satpathy-8711ba127/

Profile Summary:

●4+ years of experience as a Cloud DevOps Engineer, Operations Engineer, and Data Analyst in the software design, development, integration, implementation, and maintenance of data warehousing and database applications in the digital e-commerce and retail industries.

●8 years of experience in piping specification, piping stress analysis, and piping layout across diversified industries such as power, oil & gas, and carbon black.

●Strong knowledge of the Software Development Life Cycle (SDLC) and SCM practices, along with software build and release management methodologies; in-depth knowledge of Agile, Waterfall, and Scrum methodologies.

●Experienced in integrating Maven with Jenkins to automate the application deployment process.

●Experience in setting up enterprise infrastructure on Azure: VMs, Internal Load Balancer (ILB), Storage Accounts, VNets, subnets, Network Security Groups (NSG), autoscaling, Azure Marketplace images, Azure SQL, IAM, Azure Functions, Azure App Service, ARM templates, AKS, Application Insights, and Azure Service Bus.

●Experienced in setting up environments and networks using Chef.

●Experienced in creating Docker images and Docker containers.

●Experienced in container management using Docker: writing Dockerfiles, setting up automated builds on Docker Hub, and installing and configuring Kubernetes.
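
To illustrate the kind of Dockerfile described above, here is a minimal sketch; the base image, JAR name, and port are hypothetical placeholders, not taken from any actual project:

```dockerfile
# Hypothetical example: containerize a Maven-built Java application
# Base image providing a Java 17 runtime
FROM eclipse-temurin:17-jre
WORKDIR /app
# The artifact produced by `mvn package`
COPY target/app.jar app.jar
# Port the application listens on
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

An automated Docker Hub build effectively runs the equivalent of `docker build` against such a file on each pushed commit.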

●Experienced in creating Jenkins CI pipelines to automate most build-related tasks.

●Good experience setting up an automated build, test, and release platform using Jenkins pipeline-as-code and SonarQube, triggered on every code commit.

●Good knowledge of Terraform for provisioning resources and building and changing infrastructure in the Azure cloud.
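
As a sketch of what such Terraform code can look like (the resource names, location, and account settings below are illustrative placeholders, not any project's actual configuration):

```hcl
# Hypothetical sketch: provision a resource group and a storage account in Azure
provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "rg" {
  name     = "example-rg"
  location = "East US"
}

resource "azurerm_storage_account" "sa" {
  name                     = "examplestorageacct"
  resource_group_name      = azurerm_resource_group.rg.name
  location                 = azurerm_resource_group.rg.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}
```

Running `terraform plan` then `terraform apply` against this creates or updates the declared resources.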

●Experienced in monitoring tools like Dynatrace, Graphite, ELK Kibana, Splunk, Zabbix.

●2+ years of experience in scripting languages such as Bash, Shell, and Python.

●Experienced in setting up baselines, branching, merging, and automation processes using shell and Bash scripts.

●Experienced in working with Git to store and commit code and integrate it with Jenkins and Azure DevOps (ADO).

●Excellent understanding of SCM tools such as SVN, Git, Bitbucket, and GitHub.

●Experienced in the deployment of applications on Apache Web Server, Nginx, JBoss, WebLogic, and WebSphere application servers.

●Experience in Installation, Configuration, Backup, Recovery, Maintenance, and Support of Windows & Linux servers.

●Experienced in JIRA, Confluence.

●A highly motivated, energetic individual and a team player with excellent communication and interpersonal skills.

●Strong verbal and written Communications Skills with the ability to effectively distill critical details into actionable highlights.

Technical Skills:

DevOps Tools: Jenkins, Docker, Kubernetes, SonarQube, Azure DevOps, Chef

Build Tools: Maven, Gradle

Version Control: Git

Monitoring Tools: Dynatrace, Graphite, Zabbix, ELK/Kibana, Splunk, Prometheus, Application Insights

Scripting/Programming: R, HTML, Java, Python, Shell, Bash, PowerShell

Application Servers: Apache Tomcat, JBoss, WebLogic, Nginx

Databases/Servers: SQL Server, Oracle, MongoDB, Azure SQL

Operating Systems: Windows, Red Hat Linux, and UNIX

Company: InSigma Inc Jan 2020 – Sep 2020

Client: TCS/Walgreens Digital

Role: RunOps Engineer

Domain: Digital Ecommerce

Environment: CI/CD, Jenkins, Maven, Kubernetes, Ant, Docker, SonarQube, Git, Shell, Bash, Python, Windows and Linux.

Responsibilities:

•Coordinated offshore development and managed day-to-day activities

•Performed functional and performance testing of solutions

•Researched and evaluated current and upcoming technologies and frameworks

•Maintained documentation of production schedules and production runbooks, and assisted in documenting operational best practices

•Ensured that all facets of the Operations Command Center were executed per approved procedural documentation

•Performed code migration to the production environment

•Built strong partnerships with Development, Account Managers, and the Production Support Team

•Analyzed monitor calibration reports and workstation logs to identify and solve the issue

•Created and maintained platform documentation, such as platform details, runbooks, escalation procedures, and any other documentation that became necessary

•Provided support to customer/user inquiries or issues regarding Walgreens products or services

•Provided after hours on-call production support

•Assisted in post-implementation and continuous-improvement efforts to enhance performance and improve future outcomes

•Worked with ITIL concepts (incident management, change management, and problem management)

•Handled incident and problem management: joined bridge lines, provided timely updates, troubleshot production issues, and engaged vendors

•Managed the daily workload based on priorities and maintained SLAs to provide quality services to end users

•Worked with the Release management team to organize, plan, manage and execute new releases

•Documented and created a new knowledge base to provide the most effective solutions to application issues; solutions could be new code development or defect fixes

•Provided documentation and made technical presentations to management

•Participated in the release cycle of the product which involved environments like Development, QA, UAT and Production.

•Performed PoCs (Proofs of Concept) and helped the department with the selection of vendor solutions, technologies, methodologies, and frameworks

•Documented the entire CI/CD pipeline process to make sure that all steps complete successfully and for future reference by Dev teams.

•Installed and configured Jenkins, SonarQube, and Docker on the build server.

•Created ARM template stacks to automate the creation of various Azure resources.

•Created quality gates in the SonarQube dashboard and enforced them in the pipelines to fail builds when conditions are not met.
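
A declarative Jenkinsfile sketch of this pattern; the SonarQube server name, Maven goals, and timeout are illustrative assumptions rather than the project's actual configuration:

```groovy
// Hypothetical pipeline-as-code sketch: build, analyze, and enforce the quality gate
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B clean package'
            }
        }
        stage('Static Analysis') {
            steps {
                // 'sonar-server' must match the SonarQube server configured in Jenkins
                withSonarQubeEnv('sonar-server') {
                    sh 'mvn sonar:sonar'
                }
            }
        }
        stage('Quality Gate') {
            steps {
                // Abort the build if the SonarQube quality gate fails
                timeout(time: 10, unit: 'MINUTES') {
                    waitForQualityGate abortPipeline: true
                }
            }
        }
    }
}
```

The `waitForQualityGate` step comes from the SonarQube Scanner plugin for Jenkins and relies on a webhook from the SonarQube server.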

•Built scripts using the Maven and Gradle build tools in Jenkins to move artifacts from one environment to another.

•Created, expanded, and configured automation in Maven/Gradle to automatically build, package, and deploy Java applications to multiple development and testing environments.

•Installed and Configured the GIT repository for sharing the artifacts within the company.

•Integrated Docker with the Kubernetes container-orchestration framework by creating pods, ConfigMaps, and deployments.
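
A minimal manifest illustrating that combination; all names and the image are placeholders, not the project's actual objects:

```yaml
# Hypothetical sketch: a Deployment whose pods read settings from a ConfigMap
apiVersion: v1
kind: ConfigMap
metadata:
  name: app-config
data:
  LOG_LEVEL: "info"
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: app
  template:
    metadata:
      labels:
        app: app
    spec:
      containers:
        - name: app
          image: example/app:1.0
          # Expose every key in the ConfigMap as an environment variable
          envFrom:
            - configMapRef:
                name: app-config
```

Applying this with `kubectl apply -f` creates the ConfigMap and a two-replica Deployment, whose pods Kubernetes schedules and restarts automatically.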

•Used Nagios as a monitoring tool to identify and resolve infrastructure problems before they affect critical processes and worked on Nagios event handlers in case of an automatic restart of failed applications and services.

•Responsible for tagging and maintaining code in the Git version control system; created branches and tags in the Git repository and granted branch access permissions to the dev team.

•Worked on integrating GIT into the continuous Integration (CI) environment along with Jenkins.

•Created branches in GIT implementing parallel development process.

•Wrote shell scripts to apply the Integration label to all files that need manual labelling.

•Built and released software baselines, performed code merges and branch and label creation, and interfaced between development and infrastructure.

•Analyzed application logs in Elasticsearch/Kibana and monitored applications.

•Monitored application logs and troubleshot issues using Dynatrace.

•Monitored metrics such as network utilization, disk space consumption, and CPU load using Zabbix.

•Implemented a Continuous Delivery framework using Jenkins pipelines.

•Installed Jenkins on a Linux machine and created a master and slave configuration to implement multiple parallel builds through a build farm.

•Prepared the release plan and coordinated activities with Release Management.

•Actively participated in scrum meetings, reporting progress and maintaining effective communication with each team member and manager.

Company: Walmart Sept 2019 to Jan 2020

Role: Asset Protection Customer Specialist

Domain: Retail/Supply chain

Environment: Windows, MS Excel, MS Office

•Primarily responsible for preventing financial loss caused by theft and fraud and supporting safety and environmental program compliance in their assigned store or multiple stores.

•Utilizing tools to minimize loss to the Company, including but not limited to identifying incidents of theft and fraud, reviewing CCTV and exception reports, monitoring the store's physical security, and auditing Electronic Article Surveillance (EAS) equipment

•Driving a "shrink elimination" culture in the store.

•Preparing accurate and detailed case reports, documenting apprehensions and recoveries, and preserving evidence.

•Interacting with law enforcement and testifying in criminal and civil court actions.

•Reporting any hazardous or unsafe condition to the manager on duty and carrying out job responsibilities in a manner that minimizes the risk of injury to themselves, other associates, vendors, customers, and the Company.

Company: Camel Marketing Pvt. Ltd January 2016 to Feb 2019

Role: Data Analyst/DevOps

Domain: Digital Ecommerce

Environment: SQL Server 2005, RStudio 3.4.3, Shiny APP, MS Excel, MS Word

•Scheduling jobs for batch processing of data to run at a specified time

•Creation and configuration of database

•Creating data model for getting actionable and predictable result

•Interpreting data, analyzing results using statistical techniques and providing ongoing reports

•Getting and cleaning data using R programming, MS Excel.

•Acquiring data from primary or secondary data sources and maintaining databases/data analysis systems

•Keep track of every data cleaning operation, so changes can be altered or operations removed if required

•Identify, analyze, and interpret trends or patterns in complex data sets

•Filter and “clean” data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems

•Sort data by different attributes

•For large datasets, cleanse stepwise, improving the data with each step until good data quality is achieved

•To handle common cleansing tasks, create a set of utility functions/tools/scripts; these might include remapping values based on a CSV file or MySQL database

•Analyze the summary statistics for each column (standard deviation, mean, number of missing values)
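
A small standard-library sketch of this kind of per-column summary; the column names and values are made-up illustration data, not from any real dataset:

```python
# Illustrative sketch (hypothetical data): per-column summary statistics,
# including the count of missing values, as used during data cleaning.
import statistics

columns = {
    "price": [10.0, 12.5, None, 11.0],
    "qty": [1, 2, 2, None],
}

def summarize(values):
    """Return mean, standard deviation, and missing-value count for one column."""
    present = [v for v in values if v is not None]
    return {
        "mean": statistics.mean(present),
        "stdev": statistics.stdev(present) if len(present) > 1 else 0.0,
        "missing": len(values) - len(present),
    }

report = {name: summarize(vals) for name, vals in columns.items()}
print(report["price"]["missing"])  # → 1
```

In practice the same report flags which columns need imputation or row removal before modeling.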

•Design of a customer enquiry application using a Shiny app:

•Understanding the business requirement and finding the objective of the problem

•Exploration of data, preparation of dataset (missing data, handling Outlier, validation)

•Training the data and modeling to get the customer insight and its implementation

•Visualization with the right code for the objective

Bill of Material:

•Create, maintain, and validate the bill of material

•Part, routing, and manufacturing part orders

•Communicating about the inconsistencies to the concerned party and supervisor

•Auditing the bill of material and resolving discrepancies

DevOps:

•Creating stories and helping the scrum master to add the story in the current sprint

•Participating in the sprint planning to plan out the sprint deliverables

•Checking in code from the local system to the Azure DevOps repository

•Running Azure build pipelines to build and deploy the code to different runtime environments

Company: GS E&C Corporation Feb 2011 to September 2011

Role: Piping Engineer

Domain: Oil & Gas/EPC / Powerplant

•Co-ordination between Piping and other Engineering departments.

•Extraction of isometrics using PDS; stress report generation from CAESAR II.

•Locate and identify pipe supports on isometrics along with type of supports.

•Review and sign-off piping isometrics considering flexibility and supporting.

•Prepare complete documentation of the stress analysis and support design

Company: Magnasoft Consulting India Pvt Limited June 2008 to October 2008

Role: Piping Engineer

Domain: Oil & Gas/EPC

•Perform pipe stress analysis for different equipment like steam turbine, compressor, centrifugal pumps, tanks, exchangers, heaters, columns, etc.

•Pipe thickness calculation.

•Preparation of Piping Material Specification.

•Preparation of General Bill of material.

•Preparing Valve Material Specification for Gate valve, Ball Valve & Blow down Valve

Company: Chemtex Consulting India Limited Dec 2007 to June 2008

Role: Piping Engineer

Domain: Oil & Gas/EPC/Carbon Black

•Pipe thickness calculation and branch reinforcement calculations.

•Preparing Piping Material Specification.

•Preparation of Valve material specification.

•Calculation of Test Pressure.

•Preparation of technical specifications for pipes, flanges, fittings, valves, fasteners, gaskets, and special items, e.g. strainers (T, Y, and temporary type)

Company: Technip KT India Limited Oct 2006 to Nov 2007

Role: Piping Engineer

Domain: Oil & Gas/EPC/ Refinery

•Pipe thickness calculation. Preparation of Piping Material Specification.

•Preparation of General Bill of material.

•Preparing Valve Material Specification for Gate valve, Ball Valve & Blow down Valve.

•Requisitions to different vendors for all the piping items & Valves.

•Evaluation of offers from different vendors & giving the Technical Bid Tabulation report.

•Preparing the Purchase Order Summary & Issue them to respective Vendors.

•Preparing specification for Color coding, Valves, Gasket, Fasteners

Company: Lurgi India Pvt. Limited Mar 2005 to Sep 2006

Role: Piping Engineer

Domain: Oil & Gas/EPC/ Refinery

•Modeled static equipment such as pressure vessels, storage tanks, heat exchangers, and columns.

•Perform pipe stress analysis for different equipment like steam turbine, compressor, centrifugal pumps, tanks, exchangers, heaters, columns, etc. Using PDS, performed Equipment Modeling, Piping Designer, Isometric Extraction, and Drawing Manager.

•Preparation of General Bill of material.

•Preparing Valve Material Specification for Gate valve, Ball Valve & Blow down Valve. Requisitions to different vendors for all the piping items & Valves.

•Evaluation of offers from different vendors & giving the Technical Bid Tabulation report.

•Preparing the Purchase Order Summary & Issue them to respective Vendors.

Professional Qualification

•Bachelor of Engineering (Mechanical) from Orissa Engineering College, Biju Patnaik Technical University (BPUT), Odisha, India securing 65.26% marks in the year 2003.

•Diploma in Engineering (Mechanical) from Rourkela Institute of Technology, State Council of Technical Education, Odisha, India securing 66.14% marks in the year 1999.

Certifications: (Please refer to the Accomplishments section of my LinkedIn profile)

(https://www.linkedin.com/in/lopamudra-satpathy-8711ba127/)

•Earned AZ-900 certification by Microsoft Corporation.

•Earned certificates of professional achievement in Data Science from Johns Hopkins University:

1. The Data Scientist's Toolbox

2. R Programming

3. Getting and Cleaning Data

4. Exploratory Data Analysis

5. Reproducible Research

6. Regression Models

7. Statistical Inference

8. Practical Machine Learning

9. Developing Data Products

•Earned certificates of professional achievement in Big Data from the University of California:

1. Introduction to Big Data

2. Big Data Modeling and Management Systems

3. Big Data Integration and Processing

4. Machine Learning with Big Data


