Hari Pulijala
************@*****.***
SUMMARY:
•13+ years of experience in the IT industry as an application developer, leading onsite and offshore projects for prestigious clients in India and the US. Excellent experience and understanding of infrastructure development, AI/ML platform development, network automation, AWS and Azure cloud, OpenStack cloud, IP networking, Python, Django, Flask, FastAPI, C++, REST API, Node.js, GraphQL, Kubernetes, Docker, Linux, MySQL, MongoDB, Oracle DB, Git, Jira, and Agile methodologies.
•Strong programming skills in Python with 10+ years of hands-on experience in Python, Django, Flask, FastAPI development, NumPy, Pandas, PySpark, and Databricks. Good experience working on AWS, Azure, and OpenStack cloud computing, Docker, and Kubernetes platforms.
•Good experience in application development and network automation using Python and Perl scripting.
•Experience working with design patterns; used Factory, Singleton, Adapter, Observer, Decorator, and many other patterns.
•Experience importing and exporting data using stream-processing platforms such as Kafka.
•Experience with cloud computing services such as AWS EC2, AWS Load Balancers, AWS S3, DynamoDB, SQS, API Gateway, and Lambda.
•Experience with Azure Key Vault, Azure DevOps pipelines, AKS, and ACR.
•Hands-on experience with the SCM tools Git/GitHub: pull, commit, push, merge, and revert.
•Expertise in developing applications in Docker environments.
•Good experience in network automation and infrastructure development.
•Implemented CI/CD pipelines for different clients, enabling application teams to achieve high productivity by cutting latency in building and shipping new features.
•Experienced in working with server-side technologies including databases, RESTful APIs, and MVC design patterns.
•Migrated applications to the AWS platform; used DynamoDB on AWS.
•Enthusiastic and eager to work on the latest cutting-edge technologies.
•Worked with Agile development methodology.
•Expertise in development tools such as Visual Studio Code and IntelliJ IDEA.
•Good experience interacting with clients; currently working directly with the application owner.
•Used the Python logging module to implement event logging in Python applications.
•Used the IDAnywhere authentication/authorization system to implement authentication.
•Excellent communication and interpersonal skills with a strong work ethic.
•Team player; highly productive, committed, and results-driven in both team and individual projects.
TECHNICAL SKILLS:
•Programming Languages & Frameworks: Python, Django, Flask, FastAPI, C, C++
•APIs: REST, GraphQL
•Data Engineering: NumPy, Pandas, PySpark, Databricks
•Scripting Languages: Python, Perl
•Logging/Monitoring Tools: AppDynamics
•Version Controlling Tools: Git, SVN
•Build/CI Tools: Jenkins, GitHub, Ansible, Docker, Node.js, Jules, Terraform
•Operating Systems: AIX, Windows, Linux (RedHat, SUSE, Ubuntu, CentOS, Fedora)
•Virtualization: KVM Virtualization
•Orchestration Platform (PaaS): Kubernetes
•Cloud Services: OpenStack, Amazon Web Services (AWS), Microsoft Azure
•Databases: MySQL, MariaDB, MongoDB, DynamoDB, RDS, Oracle DB
•Web/UI Frameworks: React
•Bug Tracking Tools: JIRA
•IDE: Visual Studio, VS Code, Eclipse, IntelliJ IDEA
•Domain Experience: AI/ML platform development, network automation, infrastructure development, IP networking, application development, cloud computing, L2/L3 protocol development, NMS/EMS.
EDUCATIONAL QUALIFICATION:
Bachelor of Technology (B.Tech) in Electronics & Communications, JNT University, Hyderabad, India, 1999-2003
PROFESSIONAL EXPERIENCE:
Albertsons, SFO Dec 20 to Present
Senior Python Developer
Responsibilities:
•Working on developing infrastructure for the AI/ML platform.
•Working on building a GraphQL API backend service for the PostgreSQL database using Hasura.
•Developed a FastAPI-based sidecar container application for user authentication and authorization using Azure Key Vault (a minimal sketch follows this list).
•Worked on a local setup of the production environment using minikube, kubectl, Kompose, and Docker Compose.
•Developed FastAPI-based applications for the onboarding, data loader, and data server services.
•Created Databricks jobs using PySpark to load data from Azure Data Lake into the PostgreSQL database (a job sketch follows the tools line below).
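A minimal sketch of the sidecar authentication pattern referenced above, assuming a placeholder vault URL, secret name, and token-matching rule; the production service's actual Key Vault names and validation logic are not shown here.

```python
# Minimal FastAPI sidecar sketch: validate a bearer token against a shared
# secret pulled from Azure Key Vault. Vault and secret names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

app = FastAPI()
bearer = HTTPBearer()

# DefaultAzureCredential picks up the pod's managed identity inside AKS.
_credential = DefaultAzureCredential()
_secrets = SecretClient(
    vault_url="https://example-vault.vault.azure.net",  # placeholder vault
    credential=_credential,
)
_API_TOKEN = _secrets.get_secret("api-token").value  # placeholder secret name


def authorize(creds: HTTPAuthorizationCredentials = Depends(bearer)) -> None:
    """Reject requests whose bearer token does not match the vault secret."""
    if creds.credentials != _API_TOKEN:
        raise HTTPException(status_code=401, detail="Invalid token")


@app.get("/healthz")
def healthz(_: None = Depends(authorize)) -> dict:
    return {"status": "ok"}
```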
Environment/Tools: Python, Flask, FastAPI, Node.js, AI/ML, Azure Cloud, AKS, ACR, Key Vault, Docker, Kubernetes, Hasura, PostgreSQL, GraphQL, GitLab, Azure DevOps, NumPy, Pandas, PySpark, Databricks.
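A sketch of the kind of Databricks job referenced above, assuming placeholder ADLS paths, JDBC URL, table, and credentials; the production jobs parameterized these values.

```python
# Sketch of a Databricks job that loads a Delta table from Azure Data Lake
# and writes it to PostgreSQL over JDBC. Paths, URLs, and credentials are
# placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls_to_postgres").getOrCreate()

# Read the curated dataset from ADLS Gen2 (abfss path is illustrative).
df = spark.read.format("delta").load(
    "abfss://curated@examplelake.dfs.core.windows.net/features/daily"
)

# Append the batch into PostgreSQL via the JDBC connector.
(
    df.write.format("jdbc")
    .option("url", "jdbc:postgresql://example-host:5432/appdb")
    .option("dbtable", "public.features_daily")
    .option("user", "loader")          # placeholder credentials
    .option("password", "<secret>")    # injected from a secret scope in practice
    .mode("append")
    .save()
)
```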
Cricket Wireless, GA Jul 20 to Nov 20
Technical Lead/ Senior Python Developer
Responsibilities:
•Worked on a Python-based microservices application that provisions requests to an F5 load balancer, providing load-balancing APIs with create, update, retrieve, and delete services to clients. Celery is used to process the requests asynchronously (a task sketch follows this list). Load balancer details are stored in a MySQL database and consumed by Django models to build APIs and expose them to end users.
•Worked on a Python-based ETL application that extracts load balancer details from another application, processes the data, and stores it in a database. This data is used by Django models to build APIs and expose them to end users.
•Sound knowledge of developing REST APIs and Node.js services using the Django, FastAPI, and Flask frameworks.
•Migrated a Python application to the AWS cloud. Designed, implemented, and maintained DynamoDB database solutions to support applications and services.
•Maintained data security and access control, configuring IAM policies and managing DynamoDB encryption at rest and in transit.
•Implemented an event-driven serverless architecture using AWS Lambda, API Gateway, SQS, and DynamoDB (a handler sketch follows the tools line below).
•Developed Ansible scripts for deployment of software on Virtual Server infrastructure.
•One-touch build and deployment for DNS and DHCP servers: the automation script installs the required software, configures it, and validates the server configuration. The entire workflow is automated using Python.
•Involved in creating ServiceNow (SNOW) change requests during production code deployments.
•Worked on CI/CD pipeline troubleshooting, addressing issues with Git, Jenkins, Node.js, Artifactory, JIRA, and SonarQube. Used Jules (Jenkins) to set up the continuous delivery pipeline.
•Worked in Agile methodology; also served as a Scrum Master.
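A sketch of the asynchronous provisioning flow referenced in the first bullet above; the broker URL and the F5 call are stand-ins, and the production task persisted results to MySQL through a Django model.

```python
# Sketch of the asynchronous provisioning flow: a Celery task receives a
# load-balancer request and applies it to the F5 device. Broker URL and the
# F5 client are placeholders.
from celery import Celery

app = Celery("lb_provisioning", broker="redis://localhost:6379/0")


def call_f5_api(payload: dict) -> dict:
    """Stub standing in for the real F5 REST client used in production."""
    return {"status": "COMPLETED", "virtual_server": payload.get("name")}


@app.task(bind=True, max_retries=3)
def provision_virtual_server(self, payload: dict) -> dict:
    """Apply a create/update/delete request on the F5 device asynchronously."""
    try:
        return call_f5_api(payload)
    except Exception as exc:
        # Back off and retry transient device/API failures.
        raise self.retry(exc=exc, countdown=30)
```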
Environment/Tools: Python, Flask, Django, FastAPI, React, Ansible, Azure, AKS, AWS, Linux, Docker, Kubernetes, Microservices, Design patterns, Node.js, Network Automation, Infrastructure development, DNS, DHCP, GLB, LLB, load balancer, Jules, Bitbucket, Jira, ServiceNow, RabbitMQ, Celery, Kafka, Redis, Git, Oracle DB, GitLab, Terraform.
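A sketch of the event-driven serverless path referenced above, assuming an SQS-triggered Lambda and placeholder DynamoDB table and key names.

```python
# Sketch of an SQS-triggered Lambda that writes records into DynamoDB with
# boto3. Table and field names are placeholders, not the production schema.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("lb-requests")  # placeholder table name


def handler(event, context):
    """Process SQS messages delivered by the event source mapping."""
    records = event.get("Records", [])
    for record in records:
        body = json.loads(record["body"])
        table.put_item(
            Item={
                "request_id": body["request_id"],  # partition key (assumed)
                "payload": body,
                "status": "RECEIVED",
            }
        )
    return {"processed": len(records)}
```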
Corteva, GA Dec 17 to Jun 20
Principal Engineer
Responsibilities:
•Developed a cloud orchestrator application using Python, which is used to provision resources in the cloud environment.
•Developed Django REST APIs to create, display, delete, and deploy Terraform plans; used MongoDB to store the plans (a storage-layer sketch follows this list).
•Integrated the orchestration service with OpenStack Keystone for authentication and policy management.
•Used the Celery module to process requests asynchronously.
•Worked on OpenStack deployment using the OpenStack Kolla module.
•Created Docker images for each OpenStack service.
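A sketch of the MongoDB plan-storage layer referenced above; the connection URI, database, collection, and field names are placeholders, and the Django REST views that call these helpers are omitted.

```python
# Sketch of storing Terraform plans as MongoDB documents via pymongo.
# Connection string, database, and field names are placeholders.
from datetime import datetime, timezone

from bson import ObjectId
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # placeholder URI
plans = client["orchestrator"]["terraform_plans"]   # placeholder db/collection


def create_plan(name: str, hcl_source: str) -> str:
    """Persist a new Terraform plan document and return its id."""
    result = plans.insert_one(
        {
            "name": name,
            "hcl_source": hcl_source,
            "status": "CREATED",
            "created_at": datetime.now(timezone.utc),
        }
    )
    return str(result.inserted_id)


def get_plan(plan_id: str):
    """Fetch a plan document for the 'display' endpoint."""
    return plans.find_one({"_id": ObjectId(plan_id)})


def delete_plan(plan_id: str) -> int:
    """Remove a plan; returns the number of deleted documents."""
    return plans.delete_one({"_id": ObjectId(plan_id)}).deleted_count
```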
Environment/Tools: Python, Django, Oracle Linux, node js, MongoDB, Openstack, Terraform, Orchestrator, Kolla, Jenkins.
S&P, NY Sep 15 to Nov 17
Technical Lead
Responsibilities:
•Worked on integration of the OpenStack Keystone (Identity) service with Active Directory (an authentication sketch follows this list).
•Worked on integration of the OpenStack Heat (Orchestration) service with Jenkins.
•Deployed OpenStack services using Puppet.
•Responsible for bug fixing and deployment of OpenStack services.
•Mentored internal project team members on their BAU/manual tasks.
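A sketch of authenticating against the AD-backed Keystone service referenced above using keystoneauth1; the endpoint, domain, and project names are placeholders.

```python
# Sketch: obtain a scoped Keystone token for a user resolved through the
# AD-backed identity domain. Endpoint and names are placeholders.
from keystoneauth1 import session
from keystoneauth1.identity import v3

auth = v3.Password(
    auth_url="https://keystone.example.com:5000/v3",  # placeholder endpoint
    username="svc-user",
    password="<secret>",
    user_domain_name="corp.example.com",  # AD-backed identity domain (assumed)
    project_name="platform",
    project_domain_name="Default",
)
sess = session.Session(auth=auth)

# A scoped token proves the AD credentials resolve through Keystone.
print("token:", sess.get_token()[:12], "...")
```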
Environment/Tools: Python, OpenStack cloud, Puppet, Node.js, Linux, Django.
HTI, Atlanta Mar 13 to Sep 15
Consultant
Responsibilities:
•Worked on integration of OpenStack components (Horizon, Keystone, Nova-API, Glance, Heat-API, RabbitMQ clustering, MariaDB Galera, and Cinder) with HAProxy for high availability; configured Keepalived for HAProxy high availability on controller nodes.
•Worked on integration of OpenStack Neutron with a new core plugin to handle network, port, and subnet creation, deletion, and updates. Worked on RESTful APIs to interact with the OpenStack components Nova and Neutron (a sketch follows this list). Created extension APIs in OpenStack Neutron to handle additional attributes not supported in OpenStack.
•Worked on the Virtual Hardware Manager (VHM), which is responsible for creating virtual machines based on the NE configuration. Added new OpenStack extension APIs to handle VHM requests from Horizon. Implemented changes in Horizon, the Nova client, and the Nova module to process VHM requests.
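A sketch of exercising the Neutron network and subnet APIs referenced above via the OpenStack SDK; the cloud profile name and CIDR are placeholders, and the custom core-plugin extension attributes are not shown.

```python
# Sketch: create a network and subnet through the Neutron API using the
# OpenStack SDK. Cloud profile name and CIDR are placeholders.
import openstack

conn = openstack.connect(cloud="example-cloud")  # entry in clouds.yaml (assumed)

network = conn.network.create_network(name="demo-net")
subnet = conn.network.create_subnet(
    network_id=network.id,
    name="demo-subnet",
    ip_version=4,
    cidr="10.10.0.0/24",
)
print(network.id, subnet.cidr)
```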
Environment/Tools: Python, OpenStack Cloud, HAProxy, Galera replication, Linux.
ALLTEL, GA Mar 10 to Mar 13
Senior Software Engineer
Responsibilities:
•Good experience in product development using C and C++.
•Worked on the BFD support for routing protocols feature, providing BFD support for the control plane protocols OSPFv2/v3 and BGP. BFD is a detection protocol designed to provide fast forwarding-path failure detection times for all media types, encapsulations, topologies, and routing protocols.
•Worked on the multi-protocol support for BFD feature. The MIB stub-side implementation was extended to handle the Rx interval, Tx interval, and detect multiplier parameters when multiple protocols subscribe to the same BFD session. Multiple protocols from source to destination are multiplexed onto a single BFD session; adding, modifying, and deleting protocols on a single BFD session are handled as part of this project (a conceptual timer sketch follows this list).
•Worked on the alarm support for BFD feature. BFD alarms are raised by the flexrd-agent with diagnostic codes when a session goes into the down state, and the alarm is cleared once the session comes back up.
•Worked on KBFD integration with the BFD stub feature, developing a framework to integrate the BFD stub with KBFD to form BFD sessions between peers/neighbours.
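The production implementation was in C/C++; the Python snippet below is only a conceptual illustration of the timer and multiplexing behaviour referenced above (RFC 5880 asynchronous-mode detection time), not the actual code.

```python
# Conceptual illustration (not the production C/C++ code): RFC 5880
# asynchronous-mode detection time, and a session that multiplexes several
# client protocols (OSPF, BGP) onto one peer pair.
from dataclasses import dataclass, field


def detection_time_usec(remote_detect_mult: int,
                        local_required_min_rx: int,
                        remote_desired_min_tx: int) -> int:
    """Detection time = remote Detect Mult * negotiated remote Tx interval."""
    return remote_detect_mult * max(local_required_min_rx, remote_desired_min_tx)


@dataclass
class BfdSession:
    peer: str
    clients: set = field(default_factory=set)  # protocols sharing this session

    def subscribe(self, protocol: str) -> None:
        self.clients.add(protocol)

    def unsubscribe(self, protocol: str) -> None:
        self.clients.discard(protocol)


# OSPFv2 and BGP share a single session to the same neighbour.
bfd = BfdSession(peer="192.0.2.1")
bfd.subscribe("OSPFv2")
bfd.subscribe("BGP")
print(detection_time_usec(3, 300_000, 250_000))  # 900000 usec with these timers
```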
Environment/Tools: C++, Linux, BFD Protocol, OSPF, BGP, IP Networking.