
Python Service

Location:
Raleigh, NC
Posted:
October 30, 2019

Profile

Over **+ years of total experience in the telecom domain as a developer/architect.

4+ years' experience in OpenStack, OpenContrail, and ONOS (SDN controllers) internals.

CMC certified (Configuring and Monitoring Contrail)

13+ years of experience in the telecom domain (order management, provisioning).

Worked extensively on automation of Palo Alto firewalls, Fortinet firewalls, and routers using Python.

Extensive experience with analysis, design, and development.

Contributed to OpenContrail (open source).

Proficient in analyzing and translating business requirements into functional requirements, design, and development.

Extensive development experience with OpenStack Mitaka.

Experience with the generateDS Python framework.

Good experience in developing REST APIs.

Good communication skills, interpersonal skills, self-motivated, quick learner, team player.

Skill Set

Extensive experience developing and consuming REST APIs using Python. 5+ years’ experience in Python programming

Deep knowledge of OpenContrail and ONOS (SDN controller) architecture.

Experience with the OpenFlow protocol.

Installed OpenContrail and integrated it with OpenStack using the Contrail Neutron plugin.

Open source contributor for Open Contrail.

Submitted a blueprint on OpenContrail packet loss.

Working on the Contrail Analytics alarm enhancement feature.

Fixed bugs in OpenContrail.

Experience with revision-control repositories (Git), code-review tools (Gerrit), and Jenkins.

Proficiency with Linux environments and utilities.

Extensive Python development with multiprocessing, named pipes, and remote execution.

Good knowledge of all OpenStack components.

Created a dev setup for OpenContrail and OpenStack.

Exposure to web services (REST). Good experience scripting against RESTful APIs.

Experience in software development with Scrum/Agile Methodology.

Knowledge of infrastructure automation tools such as Ansible, Vagrant, and Chef. Hands-on experience with RESTful APIs, Python, and Ansible.

Knowledge of service-chaining configuration in OpenContrail.

Knowledge of network routing and switching technologies

Strong experience with virtualization technologies, primarily KVM

Experience with container-based technologies such as Docker

Experience with or exposure to the OpenStack platform and its associated modules.

Good knowledge of CI/CD processes and tools such as Jenkins.

Good understanding of Kubernetes. Installed and integrated Kubernetes with OpenContrail.

Hands-on experience with microservices, Docker containers, and PaaS/IaaS-based cloud automation and orchestration.

Good understanding of L2/L3 networking protocols

Good working experience in understanding, designing and coding complex algorithms.

5+ years' experience in Python programming.

Education:

MCA with specialization in Computer Applications from National Institute of Technology, Warangal.

B.Sc. with specialization in Computer Science from Osmania University.

Professional Experience

October 2018 – present, Technical Lead, Customer: Verizon, Project: ME

Technical Skills: OpenStack, Python, Kubernetes, Docker, containers, ELK stack, Palo Alto firewall, Fortinet firewall, SD-WAN, REST API, XML.

My role is as follows:

Developed a Python framework to automate scripts that verify VNF (Hitachi, Palo Alto, Fortinet) functionality, using Python OOP concepts and handling XML and JSON data.
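
For illustration, a minimal sketch of the kind of class-based check such a framework might run is shown below; the payload layout and the "interfaces"/"status" fields are assumptions made for the example, not the actual framework.

    # Illustrative sketch only: a base check class plus JSON and XML variants.
    # The payload layout and the "interfaces"/"status" fields are made up.
    import json
    import xml.etree.ElementTree as ET

    class VNFCheck(object):
        """Base class: each VNF/payload type overrides verify()."""
        def __init__(self, name, raw_payload):
            self.name = name
            self.raw = raw_payload

        def verify(self):
            raise NotImplementedError

    class JSONInterfaceCheck(VNFCheck):
        """Pass if every interface reported in a JSON payload is 'up'."""
        def verify(self):
            data = json.loads(self.raw)
            return all(i.get("status") == "up" for i in data.get("interfaces", []))

    class XMLInterfaceCheck(VNFCheck):
        """Same check against an XML payload."""
        def verify(self):
            root = ET.fromstring(self.raw)
            return all(i.get("status") == "up" for i in root.findall(".//interface"))

    if __name__ == "__main__":
        sample = '{"interfaces": [{"name": "eth0", "status": "up"}]}'
        print(JSONInterfaceCheck("palo-alto-fw1", sample).verify())  # True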

Configured policies and objects and applied NAT and web filtering on Palo Alto and Checkpoint firewalls. Troubleshot protocol-based policies on Palo Alto and changed policies according to requirements and traffic flow. Configured IP, RIP, EIGRP, OSPF, and BGP on routers.

Hands-on experience managing security rules based on NAT/PAT, ACLs, and VPNs on Palo Alto firewalls. Configured VLANs and created zones on the Palo Alto firewalls and implemented Fortinet firewalls on the other side.

Implemented REST APIs using Python, with the urllib2 and json libraries. Worked on Python OpenStack APIs.
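
A small sketch of that urllib2/json pattern (Python 2 era) is below; the URL and the X-Auth-Token header are placeholders rather than a specific service.

    # Python 2 sketch of consuming a REST API with urllib2 + json; the URL and
    # header values are placeholders, not a real service.
    import json
    import urllib2

    def get_json(url, token=None):
        request = urllib2.Request(url)
        request.add_header("Accept", "application/json")
        if token:
            request.add_header("X-Auth-Token", token)  # placeholder auth scheme
        response = urllib2.urlopen(request, timeout=10)
        return json.loads(response.read())

    def post_json(url, payload, token=None):
        request = urllib2.Request(url, data=json.dumps(payload))
        request.add_header("Content-Type", "application/json")
        if token:
            request.add_header("X-Auth-Token", token)
        return json.loads(urllib2.urlopen(request, timeout=10).read())

    # Example usage (placeholder endpoint):
    # servers = get_json("http://controller:8774/v2.1/servers", token="...")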

Worked on integrating ITP with Marketplace. Developed a CI/CD pipeline using Jenkins, setting up new projects and debugging build problems.

Maintained Git repositories for the DevOps environment: version control and build automation, integrating Git into Jenkins.

Developed custom Ansible playbooks and integrated them into the Jenkins post-build configuration to set up the automated build pipeline for Git repository projects.

Worked on infrastructure with Docker containerization. Expertise in reporting tools such as Grafana and Kibana (ELK) for setting up charts and graphs for better visual representation of test results.

Expertise in Kubernetes to deploy, scale, load-balance, and manage Docker containers across multiple namespaces and versions.

Worked on creating custom Docker container images and tagging and pushing the images. Worked with Ansible, Vagrant, and Docker to manage application environments.

Utilized a coding and scripting background for network automation of routers, switches, load balancers, and firewalls.

Experience with Git, Jenkins, and continuous integration/continuous delivery (CI/CD); general understanding of networking concepts.

Worked on Python OpenStack APIs; used Python scripts to update content in the database and manipulate files.

Wrote Python scripts to parse XML documents and load the data into a database.
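
A short sketch of that parse-and-load pattern using only the standard library follows; the XML element names, table layout, and SQLite backend are illustrative assumptions.

    # Sketch of parsing an XML document and loading rows into a database.
    # The XML structure, table name and SQLite backend are illustrative only.
    import sqlite3
    import xml.etree.ElementTree as ET

    def load_orders(xml_path, db_path="orders.db"):
        tree = ET.parse(xml_path)
        rows = [(o.findtext("id"), o.findtext("status"), o.findtext("customer"))
                for o in tree.getroot().findall(".//order")]

        conn = sqlite3.connect(db_path)
        with conn:  # commits on success
            conn.execute("CREATE TABLE IF NOT EXISTS orders "
                         "(id TEXT PRIMARY KEY, status TEXT, customer TEXT)")
            conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
        conn.close()
        return len(rows)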

Used GitHub for Python source-code version control. Used Jenkins to automate Docker container builds. Consumed OpenStack REST APIs using Python.

Worked on requirements gathering, development, and deployment of VNF automation; ran vulnerability scans using the Nessus API.

Worked on creating an ELK stack and integrating it with Kubernetes.

Wrote Python routines to log into websites and fetch data for selected options.

Used Python modules such as requests, urllib, and urllib2 for web crawling.
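
A brief sketch of the login-then-fetch pattern with requests is shown below; the URLs and form-field names are placeholders for whichever site was being scraped.

    # Sketch of the login-then-fetch pattern with requests.Session; URLs and
    # form-field names are placeholders.
    import requests

    def fetch_report(base_url, username, password):
        session = requests.Session()
        # Persist the login cookie across subsequent requests.
        session.post(base_url + "/login",
                     data={"user": username, "password": password},
                     timeout=10).raise_for_status()
        response = session.get(base_url + "/reports/daily",
                               params={"format": "json"}, timeout=10)
        response.raise_for_status()
        return response.json()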

March 2018 – September 2018, Technical Lead, Customer: Comcast, Project: NGAN

Next Generation Access Networks (NGAN) is based on open standards and open cloud infrastructure (compute, storage, and switching) platforms, leveraging various open-source projects/components and customizing the software stack for the virtualized Cable Modem Termination System (vCMTS) architecture to launch virtualized residential and enterprise access services. The project consists of software components and platforms similar to Central Office Re-architected as a Datacenter (CORD), involving the Open Network Operating System (ONOS) software-defined network controller (SDN-C), white-box switching platforms based on Edgecore OFDPA, and compute based on Linux, a hypervisor, and Docker/Kubernetes, with telemetry and log management involving Prometheus/Kafka and the Elasticsearch (ELK) stack. The programming technologies involve Java, Python, Go, and C/C++ in a Linux environment. This project deals with development and operations work on a software version of the CMTS (Cable Modem Termination System) that can be hosted on a container-based virtualization platform and will replace the current traditional M-CMTS (Modular CMTS).

Technical Skills: ONOS, Python, Kafka, Kubernetes, OpenFlow, Docker, ELK stack, Networking

My role is as follows:

Worked on requirements gathering, development, and deployment of a flow-verification tool to verify the flows installed by ONOS in the white-box switches (leaf, spine, Daas), using Python, Ansible, Git, Docker, and Kubernetes.

Created alarms and alerts in Kibana and Prometheus (ELK stack) for any missing or incorrect flows reported by the tool.

Developed various tools to automate tasks.

Used a test-driven approach to develop the application and implemented unit tests using Python's unittest framework.
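
A tiny unittest sketch in the spirit of that flow-verification work is shown below; missing_flows() and the flow sets are hypothetical stand-ins, not the actual tool.

    # Hypothetical unittest sketch: verify that expected flows are installed.
    import unittest

    def missing_flows(expected, installed):
        """Return the flow IDs that should be present but the switch lacks."""
        return sorted(set(expected) - set(installed))

    class MissingFlowsTest(unittest.TestCase):
        def test_reports_missing_flow(self):
            self.assertEqual(missing_flows({"f1", "f2"}, {"f1"}), ["f2"])

        def test_all_flows_present(self):
            self.assertEqual(missing_flows({"f1"}, {"f1", "f2"}), [])

    if __name__ == "__main__":
        unittest.main()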

Performed Jenkins administration: updating plugins, setting up new projects, and debugging build problems.

Developed a Python-based RESTful web service API to track events.

Involved in developing a RESTful service using Python.

Jul 2016 – Feb 2018, Technical Lead, Customer: AT&T, Project: Open Contrail

The Open Contrail Controller is a logically centralized but physically distributed Software Defined Networking (SDN) controller that is responsible for providing the management, control, and analytics functions of the virtualized network.

The OpenContrail vRouter is a forwarding plane (of a distributed router) that runs in the hypervisor of a virtualized server. It extends the network from the physical routers and switches in a data center into a virtual overlay network hosted in the virtualized servers. The OpenContrail vRouter is conceptually similar to existing commercial and open-source vSwitches, such as Open vSwitch (OVS), but it also provides routing and higher-layer services (hence vRouter instead of vSwitch).

The OpenContrail Controller provides the logically centralized control plane and management plane of the system and orchestrates the vRouters.

Working on the packet-loss alarm feature.

Technical Skills: OpenContrail, OpenStack, Python (generateDS, multithreading, REST web services), C, C++, SCons, Networking

My role is as follows:

Working on Contrail packet-loss feature development (ContrailEN-190), using drop stats in the vRouter.

Creating alarms to alert users of packet loss (opserver).

Open-source contributor for Contrail 3.2; fixed bugs, with code reviews completed on Gerrit.

Created a blueprint for packet-loss alarms (Launchpad).

Created a dev setup; installed OpenContrail 4.0 with the OpenStack Mitaka release.

Created a fault-management tool that raises alerts based on Analytics REST API responses.
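
As an illustration of the poll-and-alert idea, a rough sketch follows; the analytics URL, the JSON shape, and the print-based alert sink are placeholders, not the real Contrail Analytics API.

    # Sketch of a poll-and-alert loop driven by a REST response; the URL,
    # JSON fields and alert sink are placeholders only.
    import time
    import requests

    ANALYTICS_URL = "http://analytics-node:8081/alarms"  # placeholder endpoint

    def poll_alarms(interval=30):
        seen = set()
        while True:
            for alarm in requests.get(ANALYTICS_URL, timeout=10).json():
                key = (alarm.get("node"), alarm.get("type"))
                if alarm.get("severity") == "critical" and key not in seen:
                    seen.add(key)
                    print("ALERT: %s on %s" % (alarm.get("type"), alarm.get("node")))
            time.sleep(interval)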

Configuration and Monitoring Contrail (CMC certified).

Fixed bugs and implemented changes/enhancements in the configuration.

Requirement analysis, design, and development.

Understood OpenContrail code through reverse engineering and documentation.

Added a new REST API using Python web services.

Prepared unit and integration tests in Python for the Tempest module.

Wrote Python routines to log into websites and fetch data for selected options.

Used Python modules such as requests, urllib, urllib2 for web crawling.

Web-services backend development using Python (Bottle framework).

Developed a Python-based RESTful web service API to track events and perform analysis.
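
A minimal Bottle sketch of an event-tracking endpoint along these lines is shown below; the routes, fields, and in-memory store are illustrative, not the production service.

    # Minimal Bottle sketch of an event-tracking REST service; routes and
    # event fields are illustrative only.
    from bottle import Bottle, request, response

    app = Bottle()
    events = []  # in-memory store for the sketch; the real service used a DB

    @app.post("/events")
    def add_event():
        events.append(request.json or {})
        response.status = 201
        return {"count": len(events)}

    @app.get("/events")
    def list_events():
        return {"events": events}

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)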

Created a Python-based web application using Python scripting for data processing.

Designed the Cassandra schema for the APIs and parsed XML files using Python to extract data from the database.
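
A short sketch of the Cassandra side using the DataStax cassandra-driver package follows; the keyspace, table, and column names are placeholders.

    # Sketch of writing parsed records to Cassandra with cassandra-driver;
    # keyspace, table and column names are placeholders.
    from cassandra.cluster import Cluster

    def store_event(host, event_id, payload):
        cluster = Cluster([host])
        session = cluster.connect()
        session.execute("CREATE KEYSPACE IF NOT EXISTS api_events WITH replication = "
                        "{'class': 'SimpleStrategy', 'replication_factor': 1}")
        session.set_keyspace("api_events")
        session.execute("CREATE TABLE IF NOT EXISTS events "
                        "(id text PRIMARY KEY, payload text)")
        session.execute("INSERT INTO events (id, payload) VALUES (%s, %s)",
                        (event_id, payload))
        cluster.shutdown()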

Worked on Python OpenStack APIs, used a NoSQL database, and followed Python test-driven development techniques.

Used Python for XML processing, data exchange, and business-logic implementation.

Developed Python scripts to parse XML documents and load the data into the database.

Responsible for the design and maintenance of databases using Python. Developed Python-based APIs (RESTful web services).

Web-services backend development using Python (Django framework), including RESTful APIs to track events and perform analysis.

Developed the entire frontend and backend modules in Python on the Django web framework.

Jan 2015 – Jun 2016, Technical Lead, Customer: AT&T, Project: Domain 2.0

Domain 2.0 is a transformative initiative, both internal and external, to enable AT&T network services and infrastructure to be used, provisioned, and orchestrated as is typical of cloud services in data centers. It is characterized by a rich set of APIs that manage, manipulate, and consume services on-demand and in near real time. Domain 2.0 seeks to transform AT&T's networking businesses from their current state to a future state where they are provided in a manner very similar to cloud computing services.

OpenStack is a cloud operating system that controls large pools of compute, storage, and networking resources throughout a datacenter, all managed through a dashboard that gives administrators control while empowering their users to provision resources through a web interface or via the OpenStack REST API. Worked on Network Functions Virtualization use cases.

Installed Kubernetes using Vagrant and Ansible playbooks; integrated it with OpenStack and OpenContrail; managed Docker clusters.

Microservice architecture is a method of developing software applications as a suite of independently deployable, small, modular services, in which each service runs a unique process and communicates through a well-defined, lightweight mechanism to serve a business goal. Seneca provides a toolkit for writing microservices in Node.js, leaving you free to focus on the real business code, with no need to worry about which database to use, how to structure your components, or how to manage dependencies.

Technical Skills: Python, Node.js, containers, Docker, Kubernetes, Vagrant, Ansible, Expect Js, REST web services, C++, OpenStack, OpenContrail, Chef, Puppet, Linux

My role is as follows:

Analyzed features given by the client and created detailed user stories from the requirements

Worked on an improved log format, adding detailed log information useful for debugging.

Identified Contrail REST APIs by reverse engineering and documented them.

Installed Kubernetes with Contrail and OpenStack. Managed clusters and containers.

Worked on service chaining and SSL for metadata-service use cases.

Executed all Contrail REST APIs manually and documented the requests and responses. Understood Contrail internals.

Compiled and installed Contrail 3.1 code from GitHub and integrated it with the DevStack Neutron plugin.

Created and managed meters and alarms using Ceilometer to check the health of resources. Managed and troubleshot the Ceilometer telemetry service to fetch OpenStack usage data for capacity planning.

Used Ansible playbooks to create flavors, Nova availability zones, and Nova host aggregates as part of post-deployment.

Configured subnet pools, networks, subnets, routers, and floating IPs to provide connectivity for OpenStack tenants.

Experience configuring ML2, L3, VLAN, VXLAN, and GRE to provide advanced networking features.

Managed and troubleshot the Heat orchestration service and wrote Heat templates to launch instances and applications on OpenStack.

Experience configuring and troubleshooting virtualization tools such as Linux KVM and Docker.

Configured and managed users and services with the Keystone identity service.

Familiar with the OpenStack Juno, Kilo, Mitaka, and Newton releases.

Worked on Contrail FWaaS.

Good understanding of OpenStack components: Neutron, Nova, and Ceilometer.

Executed OpenStack CLIs for all components.

Worked on Heat templates for service chaining.

Worked on customization of the Nova scheduler.
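
For context, a rough outline of a custom filter for Nova's filter scheduler is shown below; the base-class import path and the host_passes() signature vary across Nova releases, and the "rack" scheduler hint is made up, so treat this as a sketch rather than drop-in code.

    # Rough outline only: a custom Nova filter-scheduler filter. The base-class
    # path and host_passes() signature differ between Nova releases, and the
    # "rack" scheduler hint is a made-up example.
    from nova.scheduler import filters

    class RackAffinityFilter(filters.BaseHostFilter):
        """Only pass hosts whose hostname contains the rack tag requested
        via a (hypothetical) 'rack' scheduler hint."""

        def host_passes(self, host_state, spec_obj):
            wanted = spec_obj.get_scheduler_hint('rack')
            if not wanted:
                return True
            return wanted in host_state.host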

Executed Tempest test cases.

Worked on OpenStack use cases in compute.

Integrated microservices with OpenStack.

Executed OpenStack REST API calls.
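
A compact sketch of calling those APIs directly with requests follows: Keystone v3 password authentication, then a Nova request using the issued token. The controller URLs and credentials are environment-specific placeholders.

    # Sketch of direct OpenStack REST calls: Keystone v3 auth, then Nova.
    # URLs and credentials are placeholders.
    import requests

    def get_token(auth_url, user, password, project):
        body = {"auth": {
            "identity": {"methods": ["password"],
                         "password": {"user": {"name": user,
                                               "domain": {"id": "default"},
                                               "password": password}}},
            "scope": {"project": {"name": project,
                                  "domain": {"id": "default"}}}}}
        resp = requests.post(auth_url + "/auth/tokens", json=body, timeout=10)
        resp.raise_for_status()
        return resp.headers["X-Subject-Token"]

    def list_servers(compute_url, token):
        resp = requests.get(compute_url + "/servers",
                            headers={"X-Auth-Token": token}, timeout=10)
        resp.raise_for_status()
        return resp.json()["servers"]

    # token = get_token("http://controller:5000/v3", "admin", "secret", "admin")
    # print(list_servers("http://controller:8774/v2.1", token))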

Configuring OpenStack for various requirements.

Configuring OpenContrail for various requirements.

Understanding OpenStack and OpenContrail and customizing them.

Installed Docker and OpenContrail.

Integrated the OpenStack Neutron plugin with OpenContrail.

Worked on an ETAP POC for integrating the testing tool with OpenStack.

Understood OpenStack and OpenContrail code by checking bugs in Launchpad.

Used the Seneca toolkit to implement a microservices architecture integrating Node.js and OpenStack.

Developed a Ceilometer microservice using Seneca that continuously monitors CPU usage for a VM.

Developed a recoengine microservice in Seneca that interacts with the Ceilometer microservice and creates a new VM if CPU usage exceeds a threshold.

CPU usage is calculated using the Ceilometer cpu_util meter.
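
Those services were written in Node.js with Seneca; purely as an illustration of the same control loop, here is a Python-style sketch in which get_cpu_util() and launch_vm() are stubs standing in for the Ceilometer- and Nova-facing microservices.

    # Illustration only: the threshold/scale-out loop described above, with
    # stub functions in place of the real Ceilometer and Nova microservices.
    import random
    import time

    CPU_THRESHOLD = 80.0  # percent; illustrative value

    def get_cpu_util(vm_id):
        """Stub for the Ceilometer-facing service reading the cpu_util meter."""
        return random.uniform(0, 100)

    def launch_vm(like):
        """Stub for the Nova-facing service that boots a new instance."""
        return like + "-clone"

    def autoscale(vm_id, poll_seconds=5, max_cycles=3):
        for _ in range(max_cycles):
            utilisation = get_cpu_util(vm_id)
            if utilisation > CPU_THRESHOLD:
                print("cpu_util %.1f%% over threshold; launched %s"
                      % (utilisation, launch_vm(vm_id)))
            time.sleep(poll_seconds)

    if __name__ == "__main__":
        autoscale("vm-001")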

Trained the team

Jan 2014 – Dec 2014, Technical Lead, Customer: AT&T, Project: PST, MST

Production Support Tool (PST) is a GUI-based tool designed to make data access easier, mapping different levels of access to different groups based on the views they need. Before its creation, users would fire queries directly against the production DB and post XML straight to downstream queues, which reduced DB performance and increased the risk of human error, since no validation or tracking was performed. PST provides an easy-to-use GUI environment with access to the production MIRROR DB and a single location to perform resends, completions, and XML posts with all proper validations in place. The Production Support Tool is used to resolve the fallout of orders placed in the Order Management System.

MST is a data synchronization tool used to sync data between OMS and ENABLER by resolving OOS (out-of-sync) issues. OOS means some attributes are missing for an order between applications. MST runs reconciliation scripts to identify OOS data, collects all the data, and uploads it to the MST DB. MST has a set of rules, called identification rules, to identify the reason a particular order is OOS. MST picks each record, runs it against each identification rule, and groups the records by ERRORCODE, which is simply the category a record belongs to. Based on the ERRORCODE, a set of resolution rules is applied to the records, and the errors are resolved by performing updates in the corresponding databases. The project involves providing maintenance and enhancement support to the web-based applications PST and MST, which are internal to AT&T and used by the OMS Tier 2 Fallout and OMC teams. TCS has been doing enhancements to increase the stability and performance of the tools.

Technical Skills: Java, jQuery, JDBC, Struts 1.x, HTML, Tomcat server

My role is as follows:

Involved in the requirement analysis, development, integration, deployment and testing.

Managing, coordinating, and analyzing CRs (change requests).

Data requirements gathering, analysis, and documentation.

Coordination with the client on the application based on gathered requirements.

Providing technical documentation.

Conducting daily status calls and review meetings with the client.

Providing support for UAT, production deployment, and production issues.

Analyzing and providing quick solutions/workarounds for ad hoc/critical issues.

Configuring new environments on the existing testing server to accommodate business changes.

Creating a new testing environment by configuring a new application server and database for every release.

Identifying future enhancements to the product and providing effort estimates to clients.

Working with business scenarios to enhance existing functionality in the applications.

Giving accurate time estimates for assigned work.

Oct 2012 – Nov 2013, Technical Lead, Customer: AT&T, Project: EBP

The project is to develop and provide support for one of the AT&T reporting applications, EBP (Electronic Bill Presentment). The application is used by AT&T enterprise customers and small and medium-sized business customers. Customers are given various features they can use on a report and can customize reports to their choices. A customer can view a report online and can also generate offline reports in PDF, Excel, comma-delimited, and other formats. The application provides additional features whereby required columns can be added or deleted. Users can select reports and save them for future use. Reports can also be scheduled to run monthly.

Technical Skills: Java, jQuery, JDBC, Struts 1.x, HTML, Tomcat server

My role is as follows:

Involved in the requirement analysis, development, integration, deployment and testing.

Set up the application framework using JSF 2.0 with RichFaces integration.

Set up the Spring-JSF integration.

Worked extensively with Spring AOP using annotations and Spring dependency injection (IoC).

Implemented the Spring bean factory using AOP and IoC.

Developed Java classes and implemented the business logic.

To handle asynchronous requests, used Ajax and JavaScript.

Used J2EE design patterns such as Factory Method, MVC, and Singleton, making modules and code more organized, flexible, and readable for future upgrades.

Used Apache Tomcat for integrating the web application.

Implemented version control using SVN.

Participated in Code review of other team members.

Have conducted knowledge sharing sessions for new team members.

Ensured timely response to and resolution of problem tickets within the stipulated time, keeping customer satisfaction as the top priority.

Mar 2012 – Nov 2012, Technical Lead, Customer: Korea Telecom, Project: Fallout Management

Fallout Management Tool is an application for bulk processing of records through which we can identify fallouts and resolve them.

Technical Skills: Java, J2EE, Spring, web services, SQL Server, HTML, Maven, Tomcat server, JavaScript, SOAP, XML, Eclipse

My role is as follows:

Identifying functional requirements; coding in Java.

Delegating responsibilities to team members and monitoring their progress to ensure on-time delivery.

Representing the group in teleconference calls with key business stakeholders and senior management.

Reviewing code developed by team members.

Mar 2010 – Feb 2012, Customer: CenturyLink, Project: ESOWF

ESOWF is a data synchronization tool and reporting application used by CenturyLink customers to get usage data for the services they subscribe to.

Technical Skills: J2EE, Struts 2.x, SQL Server, HTML, Maven, Tomcat Server, JavaScript, SOAP, XML, Eclipse

My role is as follows:

Developed web interface with Struts framework.

Used different Struts features such as MVC, the validation framework, Tiles, and the tag library.

Development was done using the Struts 2.0 framework, based on the Model-View-Controller (MVC2) design paradigm.

Involved in configuring and deploying the application on Tomcat Server.

Involved in various design review and code review meetings.

Applied various design patterns such as Business Delegate, Singleton, and DAO.

Extensively used the Struts Validator for server-side validations and JavaScript for client-side validations.

Extensively used the Struts tag libraries (Bean tags, Logic tags, HTML tags, etc.) and custom tag libraries.

Involved in code reviews, debugging and testing.

Used J2EE design patterns such as Value Object, Data Access Object, and Singleton.

Sep 2006 – Feb 2010, Customer: Tata Teleservices Ltd, Project: Comptel

Technical Skills: Java 5, Comptel Instant Link, Expect, Tcl/Tk, web services, JSP, Servlets, JDBC, Log4j, SoapUI, Oracle 10, ANT, XML, PuTTY, J2EE, Struts 2.x, SQL Server, HTML, Maven, Tomcat server, JavaScript, SOAP, Eclipse

Designed and developed a macro set for the ZTE softswitch for wireline provisioning. SAS/IL receives the request format from upstream systems; all validations are included in the BST (Business Service Tool), and depending on the parameters, the macro set prepares and fires the respective MML command to the switch. After the MML command is fired, the response is updated back to the upstream systems.

The MDS/SAS receives service requests generated by the Service Management level applications, logs the requests, handles the request queues, translates the requests into a network element control language (MML: Man Machine Language) and delivers these commands automatically to the network elements.

The MDS/SAS receives service requests from client systems. A service request may concern, for example, creating a subscriber, provisioning a new service for an already existing subscriber, temporarily disconnecting a subscriber, and so on. The service request contains the necessary subscriber identification (directory number, IMSI number, etc.) and subscriber service information such as the client system serial number, requested execution date, priority, and so on. The output of this process is finally returned to the requesting application together with the necessary response status. The MDS/SAS provides high security and reliability while performing all these tasks.

TTSL uses the Lucent HLR for provisioning. The Lucent HLR has two interfaces, ESM and DG, understanding two protocols, CORBA and LDAP, respectively. To distribute the provisioning traffic evenly between the two interfaces, the creation and deletion work orders are sent through ESM, and the display and modification LDAP queries are sent through DG. The creation and deletion work orders understood by the ESM are provided by the vendor. Java code was built using CORBA to create the work orders and send them to the HLR. Similarly, the LDAP queries for display and modification provided by the vendor were built in Java using LDAP.

Gained experience in the software life cycle, including feasibility, requirements, specification, low-level design, coding, unit testing, integration testing, and system integration testing.

Conceptualized the solution framework and architecture of the macro set as part of the core architecture team.

System Design

Analyzing and designing requirements.

Creation of LLDs (low-level designs) for new requirements.

Unit and system testing.

Worked on change requests for Lucent, Alcatel, Ericsson, and NEC wireline products and Ericsson wireless.

Was responsible for handling the analysis of requirements.

Involved in programming with Tcl/Tk, Java, Expect, and PL/SQL, and in production support.

Resolved various issues in both testing and production to reduce defects.

Performed RCA for issues and proposed alternate solutions.

Documented various processes and procedures that are part of support activities and change requests.

Installed Instant Link.

Analyzed gaps and identified opportunities and risks.

Worked on change requests for both the ESM and DG interfaces in Java and received excellent appreciation from the client.

Worked on the GSM project that TTSL launched (Tata DOCOMO).

Participated in workshops for requirements gathering and identifying the various business scenarios. Went through the vendor documents and identified the commands for each business scenario.

Worked on NEI development. Installed Instant Link for GSM and participated in preparing the complete production environment for the GSM go-live.

Passport Details

Passport Number: M5377577
Date of Issue: 12-Jan-2015
Expiry Date: 11-Jan-2025
Place of Issue: Hyderabad, INDIA

Education Summary

Qualification: Master of Computer Applications; Subject: Computer Applications; Percentage/Grade: 7.4
Qualification: Bachelor of Science; Subject: Computer Science; Percentage/Grade: 78.22


