Priya Arora
Phone: +1-949-***-****
Sr. Python Developer
Email ID: *************@*****.***
PROFESSIONAL SUMMARY:
• 7+ years of professional IT experience in Analysis, Design, Development, Testing, Quality Analysis and Audits of Enterprise Applications and Database Development.
• Extensive knowledge and strong coding skills in Python, Shell, SQL and JavaScript.
• Experience in working with AWS Lambda, AWS Connect, Amazon Lex, AWS CLI, AWS SDK with Python-Boto3, Serverless and PaaS toolkit, EMR, Kinesis, Glue, S3, RDS and other core AWS services.
• Good exposure to UI design using Bootstrap, HTML, CSS, JavaScript.
• Web development using Python and the Django web framework; comfortable choosing between Django and Flask.
• Good experience of software development in Python (libraries used: Beautiful Soup, NumPy, SciPy, matplotlib, python-twitter, Pandas DataFrames, network, urllib2, MySQLdb for database connectivity) and IDEs - Sublime Text, Spyder, PyCharm, Emacs.
• Experience in implementation of MVC/MVW architecture using Django, RESTful and SOAP web services, and SoapUI.
• Experience in application security using OAuth, SAML and Okta.
• In-depth knowledge and expertise in Data Structures and Algorithms and Design Patterns; proficient in UNIX shell scripting, Python scripting, Bash scripting and SQL query building (joins, subqueries, correlated queries and analytical queries).
• Working experience in database design, data modelling, performance tuning and query optimization with large volumes of multi-dimensional data.
• Built CI/CD pipelines as part of automation setup using AWS CloudFormation and Elastic Beanstalk.
• Extensively worked on Push and Pull architectures, Ansible, Docker and Kubernetes.
• Comfortable choosing between Docker Swarm and Kubernetes.
• Worked on a project driven on AWS Connect where we enabled communication between Lex, Lambda and customer.
• Managed security in AWS infrastructure using AWS Secrets Manager and Key Management Service (CMKs).
• Worked extensively on DynamoDB, Redshift, Polly, Lex, Comprehend, S3, CodeStar, CodePipeline, CodeBuild, CodeDeploy, CodeCommit, CloudFormation, CloudWatch, CloudFront, WebSocket and AppSync.
• Implemented CI/CD pipelines using Jenkins with every step containerized.
• Integrated and worked extensively with Datadog, New Relic, Sumo Logic, Splunk, Raygun, etc., choosing among them depending on the use case.
• Extensive knowledge in RDBMS (MySQL, Oracle) & Big Data Databases.
• Hands-on experience in data extraction, data cleansing and automation of data pipelines using Python.
• Experienced in developing the data pipelines and managing the ETL end to end.
• Extensive experience in all stages of Data Engineering from loading data to different source systems, transforming and scheduling the data jobs.
• Strong knowledge of SQL concepts - CRUD operations and aggregation.
• Experience in the design of MongoDB database - Indexing and Sharding.
• Worked on Waterfall and Agile methodologies.
• Deep understanding of Big Data analytics and algorithms using Hadoop, MapReduce, NoSQL and distributed computing tools.
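As an illustration of the SQL query building mentioned above (joins, subqueries, correlated queries), a correlated subquery can be sketched with Python's stdlib sqlite3 module; the table and data here are hypothetical:

```python
import sqlite3

# Hypothetical orders table, used only for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 120.0), (2, 'acme', 80.0),
        (3, 'globex', 200.0), (4, 'globex', 50.0);
""")

# Correlated subquery: orders larger than their own customer's average.
rows = conn.execute("""
    SELECT o.id, o.customer, o.amount
    FROM orders AS o
    WHERE o.amount > (SELECT AVG(i.amount)
                      FROM orders AS i
                      WHERE i.customer = o.customer)
    ORDER BY o.id
""").fetchall()

print(rows)  # [(1, 'acme', 120.0), (3, 'globex', 200.0)]
```

The inner query re-evaluates per outer row, which is what distinguishes a correlated subquery from a plain subquery.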
EDUCATION:
Master of Computer Applications
Sikkim Manipal University, 2013
Bachelor of Computer Applications
Punjab University, 2011
PROFESSIONAL EXPERIENCE:
Anthem-Norfolk, VA (USA)
October 2019 - April 2020
Title: Managed Accounts
Team size: 16
Sr. Python Developer
Description: Anthem is a leading health benefits company dedicated to improving lives and communities, and making healthcare simpler. Its health plans have created a variety of PPOs, HMOs, and various hybrid and specialty products.
Roles & Responsibilities:
• Worked extensively with AWS services like S3, EC2, ELB, EBS, Lambda, Auto Scaling, Route 53, CloudFront, IAM, CloudWatch and RDS.
• Involved in infrastructure as code, execution plans, resource graphs and change automation using Terraform; managed AWS infrastructure as code with Terraform and CloudFormation.
• Developed Merge jobs in Python to extract and load data into MySQL database.
• Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources like S3 (ORC/Parquet/text files) into AWS Redshift.
• Created Terraform scripts for EC2 instances, Elastic Load balancers and S3 buckets.
• Implemented server-side encryption to encrypt objects in S3 buckets with Customer Managed Keys (CMKs) stored in AWS Key Management Service.
• Managed service account credentials, database credentials and passwords through AWS Secrets Manager.
• Created a continuous deployment pipeline using CodePipeline and AWS CodeDeploy, defining component processes, applications and environments for deployment scenarios such as Tomcat and multi-application deployments.
• Implemented Terraform to manage the AWS infrastructure and managed servers using configuration management tools like Chef and Ansible.
• Worked on MVC architecture using PHP and Python as controllers.
• Built numerous Lambda functions using Python and automated processes via the events created.
• Created an AWS Lambda architecture to monitor AWS S3 Buckets and triggers for processing source data.
• Developed a data science pipeline using AWS Sagemaker and scheduled it successfully in production.
• Coordinated with the DevOps team to deploy the application on AWS resources.
• Worked on packages like socket, REST API, Django.
• Built and maintained Docker container clusters managed by Kubernetes on AWS, using Linux, Bash, Git and Docker.
• Utilized Kubernetes as the runtime environment of the CI/CD system to build, test and deploy.
• Created S3 buckets, configured IAM role-based policies and MFA, and customized JSON policy templates.
• Automated various service and application deployments with Ansible on CentOS and RHEL in AWS.
• Wrote Ansible playbooks with Python and SSH as the wrapper to manage configurations of AWS nodes, tested playbooks on AWS instances using Python, and ran Ansible scripts to provision dev servers.
• Worked on MySQL, writing simple queries and stored procedures for normalization.
• Deployed the project through Jenkins using the Git version control system.
• Worked on advanced Python packages and modules like Requests, Scrapy, BeautifulSoup, multithreading, Pandas, NumPy, SciPy, Matplotlib, wxPython, Qt, regular expressions, SQLAlchemy, SQL database connections (JDBC/ODBC) and Python virtual environments.
• Understanding of secure cloud configuration, CloudTrail, cloud security technologies (VPC, Security Groups, etc.) and cloud permission systems (IAM).
• Used version control systems like Git and SVN.
• Loaded data into Spark RDDs and performed in-memory computation to generate the output response.
• Responsible for installing and administering SonarQube for code quality checks and the Nexus repository, generating reports for different projects, and integrating both into Jenkins.
• Consumed web services performing CRUD operations.
• Used the Python library Beautiful Soup 4 for web scraping to extract data for building graphs.
• Used AngularJS as the development framework to build a single-page application.
• Involved in unit testing and developed unit test cases using the PyUnit framework.
Environment: Amazon Web Services (AWS), Cloud Environment, Lambda, AJAX, Angular, DynamoDB, Python 3.7, Django, API Gateway, REST API, PySpark-Spark SQL, Spark Streaming, Amazon S3, CloudWatch, AWS Glue, PyCharm, MS SQL Server, Git, Jira, AWS Secrets Manager, KMS, Terraform, CloudFormation
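A minimal, self-contained sketch of the Lambda-monitoring-S3 pattern described above. The bucket and key names are illustrative, and the boto3 call is omitted so the sketch runs anywhere; a real function would fetch the object with boto3.client("s3").get_object(...):

```python
# Sketch of an AWS Lambda handler reacting to an S3 "ObjectCreated" event.
def handler(event, context=None):
    """Return the (bucket, key) pair for every record in an S3 event."""
    processed = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        processed.append((s3["bucket"]["name"], s3["object"]["key"]))
    return processed

# Minimal event shaped like the S3 notification payload (hypothetical names).
event = {"Records": [{"s3": {"bucket": {"name": "campaign-data"},
                             "object": {"key": "input/file1.csv"}}}]}
print(handler(event))  # [('campaign-data', 'input/file1.csv')]
```

In practice the handler is attached to the bucket via an S3 event notification, so it fires automatically on each upload.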
Nityam Software Solutions, India
Title: Commercial servicing (Client: Lloyds Banking) June 2017 – Apr 2019
Team size: 30
Sr. Python Developer
Description:
Lloyds Banking Group is a leading UK based financial services group providing a wide range of banking and financial services, focused on personal and commercial customers. The group’s main business activities are retail, commercial and corporate banking, general insurance, and life, pensions and investment provision.
Roles & Responsibilities:
• Developed views and templates with Python and Django's view controller and templating language to create a user-friendly interface that performs at a high level.
• Used the Django framework to develop the application and built all database mapping classes using Django models.
• Debugged the application by following messages in log files to identify errors.
• Assisted with development of web applications in Flask and Django.
• Implemented application security using Okta and enabled SSL for encryption in transit.
• Managed service account passwords and database credentials in CyberArk Vault.
• Built an Interface between Django and Salesforce with REST API.
• Worked on infrastructure with Docker containerization.
• Used Kubernetes to orchestrate the deployment, scaling and management of Docker containers.
• Worked on microservices deployments on AWS ECS and EC2 instances.
• Refactored existing batch jobs and migrated legacy extracts from Informatica to Python-based microservices, deployed in AWS with minimal downtime.
• Created AWS Security Groups for deploying and configuring AWS EC2 instances.
• Added support for Amazon AWS S3 and RDS to host files and the database into Amazon Cloud.
• Enabled CI/CD pipelines using CodePipeline, CodeStar, CodeBuild, CodeDeploy, CloudFormation, Elastic Beanstalk and Lambda for our DevOps pipelines; used CloudFormation templates to deploy these pipelines.
• Knowledge of AWS ML services (SageMaker) and algorithms such as K-Means and binary classification.
• Worked on migrating from Redshift to Snowflake to achieve a more centralized, optimized and scalable data warehousing solution.
• Created snapshots and Amazon Machine Images of instances for backup and for creating clone instances.
• Performed configuration, deployment and support of cloud services including Amazon Web Services.
• Established data pipelines using Redshift and scheduled ETLs using Apache Airflow in serverless mode.
• Involved in setting up microservices using API Gateway, Lambda and DynamoDB that connect to the UI.
• Designed and created backend data access modules using PL/SQL & PostgreSQL stored procedures.
• Wrote SQL queries and implemented stored procedures, functions, packages, tables, views, cursors and triggers.
• Designed ETL pipelines, performed data extraction and management, and scheduled ETL data jobs using Airflow.
• Used collections in Oracle for manipulating and looping through user-defined objects.
• Used GitHub for Python source code version control and Jenkins for automating Docker container builds and deploying to Mesos.
• Created applications for software packages, frameworks and hardware platforms using SDKs.
• Involved in service-based RESTful technologies and used Bootstrap and AngularJS for page design.
• Created a web service, provided its information to the service registry, and made the information regarding the web service available to any potential requester using SOA.
• Utilized PyUnit, the Python unit test framework, and used pytest for all Python applications.
• Creating unit test/regression test framework for working/new code.
• Assisted with writing effective user stories and dividing them into Scrum tasks.
• Developed Single Page Applications (SPA) using JavaScript frameworks like ReactJS and Angular2.
• Worked with Docker, deployment of applications inside software containers.
• Used NumPy and Pandas in python for Data Manipulation.
• Extensively used EC2, Auto Scaling and load balancing; enabled blue/green deployments, containerized various applications, and built and migrated applications into Elastic Beanstalk for better deployments.
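A minimal extract-transform-load sketch in the spirit of the pipeline work above. The account data is invented, and sqlite3 stands in for the warehouse (Redshift/Snowflake in practice):

```python
import csv
import io
import sqlite3

# Extract: an in-memory CSV stands in for a file landed in S3.
raw = "account,balance\nA-100,2500\nA-101,-40\nA-102,1300\n"

# Transform: parse, cast types, and drop rows with negative balances.
rows = [(r["account"], float(r["balance"]))
        for r in csv.DictReader(io.StringIO(raw))
        if float(r["balance"]) >= 0]

# Load: insert the cleaned rows into the target table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE balances (account TEXT, balance REAL)")
db.executemany("INSERT INTO balances VALUES (?, ?)", rows)

total = db.execute("SELECT SUM(balance) FROM balances").fetchone()[0]
print(total)  # 3800.0
```

In a scheduled setup, each of the three stages would typically be a separate Airflow task so failures can be retried independently.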
Environment: Python, MySQL, Django, Flask, Pyramid, AWS, Docker, SOA, REST, GitHub, OAuth, Okta, Linux, NumPy, Pandas, ReactJS, Angular2, Windows.
Nityam Software Solutions, India
Title: IBE (Client: Joob)
Sep 2015 - May 2017
Team size: 28
Python Developer
Description: Joob is an Internet booking engine (IBE) that facilitates online bookings for air, hotel and insurance. The Point of Sale (POS) enables Joob's sales staff to serve walk-in and call-center customers and to perform booking management functions such as search, booking creation and sending booking communication. It is specifically designed for sales staff/POS users, with features like multi-tasking capability and cryptic commands that let POS users serve customers faster and more efficiently. The Admin module governs both the B2C and POS applications and manages users, roles, affiliates, markups and commissions.
Roles & Responsibilities:
• Gathered requirements, system analysis, design, development, testing and deployment.
• The system is a full microservices architecture written in Python, utilizing distributed message passing via Kafka with JSON as the data exchange format.
• Created six microservices that transfer input files into a machine-readable format and pass them through the respective payment channels.
• Worked with models, views and templates in the Django framework.
• Designed and maintained databases using Python and developed a Python based API (RESTful Web Service) using Flask, SQLAlchemy, MongoDB, Redis Cache, PostgreSQL.
• Automated configuration management and maintained a CI/CD data pipeline using deployment tools such as Chef, Puppet and Ansible.
• Involved in analyzing the problems using transaction breakdown, network monitoring and resource monitoring to find the performance bottleneck.
• Developed a monitoring application which captures the error related data and stores it in the database.
• Involved in a scrubbing project, which updates the existing data with hashed values.
• Reverse engineered and re-implemented legacy back-end software in Python with minimal downtime.
• Wrote complex SQL queries for data validation based on ETL mapping specifications.
• Built servers using AWS: imported volumes, launched EC2 and RDS instances, and created security groups, auto scaling and load balancers (ELBs) in the defined virtual private cloud.
• Worked with project leads to learn about Azure services, environment deployment and integration with tooling.
• Optimized system performance and managed the API lifecycle.
• Extensively worked on writing UNIX shell scripts for scheduling sessions for the ETL testing process.
• Worked closely with leads on implementation of service-based SOAP and RESTful technologies.
• Strong experience in developing microservices using Dropwizard, Spring Boot and Lagom.
• Involved in development of web services using SOAP for sending and getting data from the external interface in XML format.
• Participated in the complete SDLC process & Worked in Test Driven Development.
• Created a Python-based GUI application for freight tracking and processing.
• Implemented the validation, error handling, and caching framework with Oracle Coherence cache.
• Created a database using MySQL and wrote several queries to extract data from it. Set up automated cron jobs to upload data into the database, generate graphs and bar charts, upload these charts to the wiki, and back up the database.
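The Kafka-plus-JSON message passing described above can be sketched with a stdlib queue standing in for the topic; the payload schema here is illustrative, not the production format:

```python
import json
import queue

# queue.Queue stands in for a Kafka topic so the sketch is self-contained.
topic = queue.Queue()

def produce(channel, amount):
    """Serialize a payment event as JSON and publish it to the topic."""
    topic.put(json.dumps({"channel": channel, "amount": amount}))

def consume():
    """Read one event off the topic, deserialize it, and route it."""
    event = json.loads(topic.get())
    return f"routed {event['amount']} via {event['channel']}"

produce("card", 49.99)
print(consume())  # routed 49.99 via card
```

With a real broker, produce/consume would go through a Kafka client's producer and consumer, but the JSON serialize-then-deserialize contract between services stays the same.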
Environment: Python, Dropwizard, Spring Boot, Lagom, Kafka, JSON, GitHub, Linux, Django, Flask, Varnish, Nginx, SOA, RESTful, ORM-SQLAlchemy, Redis Cache, MongoDB
Nityam Software Solutions, India
Title: Worbix (Generic Collaboration Tool)
June 2013 to Aug 2015
Team size: 18
Jr. Python Developer
Description:
Worbix provides an online collaboration platform where organized communities can set aspirations, explore alternative solutions and execute plans towards maximal value creation. It provides modules like Message, Task, Document, Meeting, etc., which are used to manage projects or set up an online collaboration community.
Roles & Responsibilities:
• Gathered requirements, system analysis, design, development, testing and deployment.
• Developed user interface using CSS, HTML, JavaScript and jQuery.
• Created a database using MySQL and wrote several queries to extract/store data.
• Wrote Python modules to extract/load asset data from the MySQL source database.
• Designed and implemented a dedicated MySQL database server to drive the web apps and report on daily progress.
• Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
• Used Django framework for application development.
• Created the entire application using Python, Django, MySQL and Linux.
• Enhanced existing automated solutions, such as the Inquiry Tool for automated Asset Department reporting and added new features and fixed bugs.
• Embedded AJAX in UI to update small portions of the web page avoiding the need to reload the entire page.
• Created the most important business rules, useful for the scope of the project and the needs of customers.
• Improved performance by using a more modularized approach and using more in-built methods.
• Involved in development of web services using SOAP for sending and getting data from the external interface in XML format.
• Integrated data quality plans as a part of ETL processes.
• Performed data manipulation/storage for incoming test data using the lxml/etree libraries.
• Developed APIs, modularizing existing Python modules with the help of the PyYAML library.
• Designed and configured database and back end applications and programs.
• Performed research to explore and identify new technological platforms.
• Collaborated with internal teams to convert end user feedback into meaningful and improved solutions.
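The lxml/etree data handling mentioned above can be illustrated with its stdlib counterpart, xml.etree.ElementTree; the test-result payload below is invented for the example:

```python
import xml.etree.ElementTree as ET

# Stand-in for an incoming test-data feed (hypothetical structure).
payload = """
<results>
  <test name="login" status="pass"/>
  <test name="search" status="fail"/>
  <test name="upload" status="pass"/>
</results>
"""

root = ET.fromstring(payload)

# Keep only the names of passing tests.
passed = [t.get("name") for t in root.findall("test")
          if t.get("status") == "pass"]
print(passed)  # ['login', 'upload']
```

lxml exposes a near-identical API, so this pattern transfers directly when the faster C-backed parser is needed.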
Environment: Python 2.7, Django 1.4, Puppet Rspec, Jenkins, Grafana/Graphite, MySQL, Linux, HTML, CSS, jQuery, JavaScript, Apache, Git, Perl, Cassandra.
Aptech Institute, India
Apr 2011 - Jan 2012
Trainer
Description:
Aptech Computer Education is a premier IT education Institute. Established in 1986, Aptech is a pioneer in IT software & hardware training.
Roles & Responsibilities:
• Traveled to client business worksites and provided easy to understand training in Java programming and Agile environments.
• Assisted business clients with implementing and using new applications.
• Incorporated the use of media to make training more effective.
• Worked with staff at all company levels.
• Coded test programs and evaluated existing engineering processes.
• Designed and configured database and back end applications and programs.
• Performed research to explore and identify new technological platforms.