Post Job Free

Sr Python AWS Developer

Location:
Dallas, TX
Salary:
60/HR
Posted:
June 12, 2023


Resume:

Kishore Kumar

Python AWS Developer

Contact: 609-***-**** Email: adxn7o@r.postjobfree.com

Professional Experience:

• 8+ years of experience in the design, development, management, and implementation of various stand-alone and client-server enterprise applications using Python scripting, Django, AngularJS, JavaScript, and Node.js.

• Good knowledge of maintaining version control systems such as SVN (a centralized version control system) and Git (a distributed version control system).

• Hands-on experience with the WAMP (Windows, Apache, MySQL, Python/PHP) and LAMP (Linux, Apache, MySQL, Python) architectures.

• Expertise in Docker and the AWS Cloud platform and its features, including EC2, AMI, EBS, CloudWatch, Glue jobs, AWS Config, Auto Scaling, IAM user management, and S3.

• Good experience in developing web applications implementing the Model-View-Controller (MVC) architecture using the Django and Flask web application frameworks.

• Good knowledge of using PySpark, the Python API for Apache Spark.

• Knowledge of the Software Development Life Cycle (SDLC) and Agile and Waterfall methodologies, with active participation in the full development life cycle (requirements, design, architecture, development, implementation, and testing).

• Experienced in NoSQL technologies such as MongoDB, Cassandra, and Redis, and relational databases such as Oracle, SQLite, PostgreSQL, and MySQL.

• Excellent experience with the Requests, NumPy, Matplotlib, SciPy, PySpark, and Pandas Python libraries during the development life cycle, and experience in developing APIs for applications using Python, Django, MongoDB, Express, ReactJS, and NodeJS.

• Strong ETL testing and verification experience utilizing SQL querying skills (PL/SQL, Toad, or other tools) and formal testing methodologies.

• Experience in Single Page Applications (SPA) using AngularJS; created multiple and nested views, routing, controllers, services, and custom directives.

• Experienced in working on Big Data integration and analytics based on Hadoop and Kafka.

• Good knowledge of using IDE Tools like IDLE, Eclipse, NetBeans, PyCharm, and Sublime Text.

• Experienced in using Python libraries like Beautiful Soup, NumPy, SciPy, Matplotlib, python-twitter, and urllib2, and MySQL for database connectivity.

• Working experience in UNIX and Linux shell environments using command line utilities.

• Experience with Unit Testing/ Test Driven Development (TDD), Load Testing and Integration Testing.
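
A minimal sketch of the unit-testing/TDD workflow mentioned above, using Python's built-in unittest module; the function under test is hypothetical and exists only to illustrate the test-first cycle:

```python
import unittest

def normalize_salary(raw):
    """Parse a salary string like '60/HR' into a float hourly rate.

    Hypothetical helper, written test-first to illustrate TDD.
    """
    value = raw.upper().replace("$", "").replace("/HR", "").strip()
    return float(value)

class NormalizeSalaryTest(unittest.TestCase):
    """Tests written before the implementation, then run until green."""

    def test_plain_rate(self):
        self.assertEqual(normalize_salary("60/HR"), 60.0)

    def test_rate_with_dollar_sign(self):
        self.assertEqual(normalize_salary("$72.50/hr"), 72.5)
```

Running the suite with `python -m unittest` exercises both cases.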

• Good experience with MongoDB scaling across data centers and an in-depth understanding of MongoDB HA strategies, including replica sets.

• Worked on development of various micro web applications using Flask and SQLAlchemy.

• Experience with Django, a high-level Python Web framework.

• Experience in writing subqueries, stored procedures, triggers, cursors, and functions on MySQL and PostgreSQL databases.

• Comfortable working with MEAN (MongoDB, Express, Angular, NodeJS) stack.

• Familiar with tools including Bugzilla, Jira, and Confluence.

• Good Experience in Linux Bash scripting and following PEP Guidelines in Python.

• Familiarity with front-end development using ReactJS, a JavaScript library for user interface components.

• Implemented machine learning algorithms in Spark and Python, for segmentation of data.

• Experience in implementing MVC/MVW architectures using Django, and RESTful and SOAP web services with SoapUI.

• Well versed in design and development of the presentation layer for web applications using technologies like HTML5, CSS3, JavaScript, jQuery, AJAX, AngularJS, Bootstrap, XML, Backbone.js, and JSON.

• Proficiency in Agile methodologies, with Scrum stories and sprints experience in a Python-based environment, along with data analytics, data wrangling, and Excel data extracts.

Education:

• Master of Science in Computer Science, University of South Dakota, Vermillion, SD

• Bachelor's in Computer Science and Engineering, Lovely Professional University, India

Technical Skills:

Languages: Python 3.8/3.9, 2.7/2.4, C++, Java, Shell Script, Perl, SQL

Python Frameworks: Django, Flask

IDE Tools: Eclipse, PyCharm, RAD

Build Tools: Ant, Maven, Gradle

Continuous Integration Tools: Jenkins

Processes: Agile-Scrum, Waterfall

Cloud Technologies: AWS, OpenStack, GCP

Source/Version Control Tools: Subversion (SVN), Git

Databases: MySQL 5.1, SQL Server 2008, Oracle 10g, Siebel

Virtualization Skills: ESX 4.1/4.0, vSphere 3/4, VMware Workstation 10/11, Oracle VirtualBox

Operating Systems: Windows (XP, Vista, 7, 8), Linux, UNIX, Ubuntu

Professional Experience:

Client: Pacific Gas and Electric, Dallas, TX Nov 2021 – Present

Role: Sr Python AWS Developer

Description: Worked on a web-based intranet system that supports the organization across the country. The system provides functionality for the recruit, select, and hire process that brings applicants into the organization. The project comprises different processes, including applicant, admin, HR, vacancy management, position management, and ambassador management systems. Each process contains its own logging system and reporting services.

Responsibilities:

• Build high-performance data pipelines and prototypes that enable business use of the data.
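
As a sketch of the kind of data pipeline described above: a generator-based Python pipeline streams records through cleaning and filtering stages without holding the whole file in memory. The field names (meter_id, usage_kwh) are illustrative, not taken from the actual project:

```python
import csv
import io

def read_rows(fileobj):
    # Stream rows from a CSV file as dictionaries.
    yield from csv.DictReader(fileobj)

def clean(rows):
    # Coerce the usage column from string to float.
    for row in rows:
        row["usage_kwh"] = float(row["usage_kwh"])
        yield row

def filter_active(rows):
    # Keep only records whose status is 'active'.
    return (r for r in rows if r["status"] == "active")

def run_pipeline(fileobj):
    # Compose the stages; nothing is evaluated until list() drains the chain.
    return list(filter_active(clean(read_rows(fileobj))))

sample = io.StringIO("meter_id,status,usage_kwh\nm1,active,12.5\nm2,closed,3.0\n")
records = run_pipeline(sample)
```

Because each stage is a generator, the same composition scales from the in-memory sample above to large files streamed from disk.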

• Understands business requirements and applies them to complex software engineering and analysis.

• Perform debugging using the logging system on AWS and application.

• Handles Spark sessions and Docker environment for the application.

• Analyzed the performance of the Cassandra cluster using nodetool tpstats and cfstats for thread analysis and latency analysis.

• Improved the application performance by about 40%.

• Partners with team members to understand and incorporate standards information and requirements into work procedures.

• Identifies, analyzes, and provides feedback on departmental standards, norms, and new goals/objectives.

• Hands-on experience with Apache Spark using Scala; implemented a Spark solution to enable real-time reports from Cassandra data.

• Created documentation for benchmarking the Cassandra cluster for the designed tables.

• Analyzes existing applications and systems and formulates logic for new systems, devises logic procedures, logical database design, performs coding and tests/debugs programs with an operational mindset.

• Works on complex data- and analytics-centric problems having broad impact that require in-depth analysis and judgment to obtain results.

• Designs and deploys new complex Enterprise systems and enhancements to existing systems ensuring compatibility and interoperability.

• Resolves application programming analysis problems of broad scope within procedural guidelines. May seek assistance from the supervisor or more skilled programmers/analysts on various problems that cross multiple functional/technology areas.

• Understands the infrastructure that allows big data to be accessed and analyzed.

• Utilizes department standard issue tracking, source control, and documentation tools.

Environment: Python, AWS, EC2, EBS, S3, VPC, PyCharm, jQuery, MySQL, HTML, CSS, JavaScript, Ajax, Web Services, JSON, Angular.js, MongoDB, SQL Workbench.

Client: National Grid, Boston, MA Nov 2019 – Oct 2021

Role: Python AWS Developer

Responsibilities:

• Developed, tested, and deployed Business feature set in Node.js with Express and MongoDB backend, incorporating APIs.

• Developed applications using Amazon Web Services (AWS) such as EC2, CloudSearch, and Elastic Load Balancer (ELB); deployed and monitored scalable infrastructure on AWS with configuration management using Puppet.

• Rewrote one of the key pages, which allows users to manage their content. The task involved investigation of the AngularJS UI-Grid as well as refactoring of several backend methods.

• Migrated an existing on-premises application to AWS; used AWS services like EC2 and S3 for small data set processing and storage, and maintained the Hadoop cluster on AWS EMR.

• Maintained and updated a GraphQL layer to allow retrieval and updates of user interactions with PostgreSQL database.

• Collaborated with the backend team to design, define, and implement GraphQL types and resolvers to provide the necessary data for frontend development while maintaining minimal calls to the database.

• Maintained and updated unit and integration tests at both the GraphQL and library level to validate behavior.

• Monitored and administered automated and manual data integration and ETL jobs to verify execution and measure performance.

• Handled escalated support tickets to closure for the MS Azure IaaS platform.

• Used the NoSQL database Amazon DynamoDB to store data for the reporting application.

• Automated analytic platform solutions hosted in AWS, leveraging the AWS managed services EMR, S3, and Lambda.

• Used Lambda functions to automate data ingestion and manipulation.

• Interacted with 3rd party APIs and built RESTful APIs using NodeJS.

• Experience in using microservices for scalability and testing.

• Created, executed, and documented unit test plans for ETL and data integration processes and programs.

• Experience in using PySpark, a Python API for Spark, to integrate and work with RDDs.

• Experience in using API security frameworks to monitor traffic, IP addresses, and endpoints of the source.

• Worked on React JS to develop and build user interface components.

• Experience in using PySpark to process structured and semi-structured datasets, providing an optimized API to read data from multiple sources in different file formats.

• Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB.
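
A minimal sketch of that serverless pattern: an API Gateway event handled by a Lambda function that writes to DynamoDB. The table object is injected so the handler logic can be exercised locally; in production it would come from boto3.resource("dynamodb").Table(...). All table and field names here are hypothetical:

```python
import json

def make_handler(table):
    """Build a Lambda-style handler bound to a DynamoDB-like table."""
    def handler(event, context=None):
        # API Gateway delivers the request payload as a JSON string in 'body'.
        item = json.loads(event["body"])
        table.put_item(Item=item)
        return {"statusCode": 201, "body": json.dumps({"saved": item["id"]})}
    return handler

class FakeTable:
    """In-memory stand-in for a DynamoDB table, for local testing only."""
    def __init__(self):
        self.items = []

    def put_item(self, Item):
        self.items.append(Item)

table = FakeTable()
handler = make_handler(table)
response = handler({"body": json.dumps({"id": "a1", "name": "demo"})})
```

Injecting the table keeps the handler unit-testable without AWS credentials, while the deployed function receives the real boto3 table instead.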

• Configured Spark Streaming to receive real-time data from Kafka and store the streamed data to HDFS using Scala.

• Implemented machine learning algorithms and models, data presentation, visualization, etc.

• Involved in writing SQL queries and implementing functions, triggers, cursors, object types, sequences, indexes, etc.
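
The kinds of SQL objects listed above (triggers, indexes) can be sketched with Python's built-in sqlite3 so the example is self-contained; the projects themselves used MySQL and other engines, and the table names here are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL);
CREATE TABLE audit (order_id INTEGER, note TEXT);

-- Index to speed up range queries on the total column.
CREATE INDEX idx_orders_total ON orders(total);

-- Trigger: every insert into orders leaves an audit trail row.
CREATE TRIGGER log_insert AFTER INSERT ON orders
BEGIN
    INSERT INTO audit VALUES (NEW.id, 'inserted');
END;
""")

conn.execute("INSERT INTO orders (total) VALUES (99.5)")
rows = conn.execute("SELECT order_id, note FROM audit").fetchall()
```

After the insert, the trigger has written one audit row without any application-side code.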

• Created data tables utilizing MySQL and used Jinja to access the data and display it in the front end.

• Worked on automation of data pulls from SQL Server to the Hadoop ecosystem via Sqoop.

• Contributed to the design and creation of RESTful APIs using Python/Django/Django Rest Framework.

• Strong knowledge of all phases of the SDLC and strong working knowledge of software testing (functional, regression, and load testing).

• Successfully implemented Apache Spark and Spark Streaming applications for large-scale data.

• Wrote API documentation for onboarding developers on the microservices platform of Autodesk.

Environment: Python, AWS, EC2, EBS, S3, VPC, PyCharm, jQuery, MySQL, HTML, CSS, JavaScript, Ajax, Web Services, JSON, Angular.js, MongoDB, SQL Workbench.

Client: Prescient Pvt Ltd, India Nov 2016 – Dec 2018

Role: Python AWS Developer

Responsibilities:

• Used Python-based GUI components for the front-end functionality such as selection criteria.

• Developed monitoring and notification tools using Python.

• Implemented user interface guidelines and standards throughout the development and maintenance of the website using HTML and JavaScript.

• Developed Business Logic using Python on Flask Web Framework.

• Involved in developing and analyzing embedded software in C, C++, and Python.

• Used HTML/CSS and JavaScript for UI development.

• Implemented the presentation layer with HTML.

• Created Data tables utilizing PyQt to display customer and policy information and add, delete, and update customer records.

• Worked on Elasticsearch to convert raw data such as log files or message files into internal documents and stored them in a basic data structure like a JSON object.
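
A sketch of that log-to-document step: a raw log line is parsed into a dict and serialized as the kind of JSON body indexed into Elasticsearch. The log format and field names are assumptions, and the actual indexing call is omitted so the example runs standalone:

```python
import json
import re

# Assumed log format: "<date> <time> <LEVEL> <message>".
LOG_PATTERN = re.compile(r"(?P<ts>\S+ \S+) (?P<level>[A-Z]+) (?P<message>.*)")

def to_document(line):
    """Convert one raw log line into an indexable dict, or None if unparseable."""
    match = LOG_PATTERN.match(line)
    if match is None:
        return None
    return match.groupdict()

doc = to_document("2023-06-12 10:01:02 ERROR disk quota exceeded")
payload = json.dumps(doc)  # body that would be sent to the Elasticsearch index
```

Unparseable lines return None so they can be routed to a dead-letter file instead of polluting the index.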

• Used MongoDB database concepts such as locking, transactions, indexes, replication, and schema design to streamline application design.

• Knowledge of advanced reporting using the ELK (Elasticsearch, Logstash, Kibana) stack.

• Assisted in installing a registry, provisioned with Ansible, for local upload and download of Docker images, as well as pulls from Docker Hub.

• Developed the required XML Schema documents and implemented the framework for parsing XML documents.
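
A minimal sketch of the XML-parsing side of such a framework, using the standard-library ElementTree; the element names are illustrative, and validating documents against the XSD itself would need a third-party library such as lxml:

```python
import xml.etree.ElementTree as ET

# Illustrative document shape; the real schema elements are not shown here.
XML = """
<customers>
  <customer id="c1"><name>Asha</name></customer>
  <customer id="c2"><name>Ravi</name></customer>
</customers>
"""

def parse_customers(text):
    """Parse the XML into a list of plain dicts for downstream processing."""
    root = ET.fromstring(text)
    return [
        {"id": node.get("id"), "name": node.findtext("name")}
        for node in root.findall("customer")
    ]

customers = parse_customers(XML)
```

Keeping the parse step separate from downstream logic makes it easy to swap in schema validation later.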

• Used Python scripts to update content in the database and manipulate files.

• Managed, developed, and designed a dashboard control panel for customers and Administrators using Django, HTML, Bootstrap, and REST API calls using JSON.

• Integrated the application with the Django framework for building the API.

• Developed scripts for build, deployment, maintenance, and related tasks using Python and Bash.

• Integrated Python with data sources and delivery systems.

• Automated tasks with tools like Puppet and Ansible.

• Managed code versioning with GitHub and Bitbucket, and deployment to staging and production servers.

• Involved in the entire lifecycle of the projects, including design, development, deployment, testing, implementation, and support.

Environment: Python, HTML, JavaScript, Flask, C, C++, CSS, UI, PyQt, XML, Django, Bootstrap, REST API, JSON, Bash, Puppet, Ansible, GitHub, Bitbucket, Elasticsearch, MongoDB.

Client: WebCraft IT, India Aug 2014 to Oct 2016

Role: Python AWS Developer

Responsibilities:

• Developed a fully automated continuous integration system using Git, Gerrit, Jenkins, MySQL and custom tools developed in Python and Bash.

• Knowledge of AWS Lambda, Auto Scaling, CloudFront, RDS, Route 53, AWS SNS, SQS, and SES.

• Gained knowledge of deploying apps using AWS CloudFormation.

• Wrote CloudFormation templates and deployed AWS resources.

• Developed and executed white-box test cases using Python, unittest/pytest/Robot Framework, and PyCharm/RIDE.

• Extended unittest/pytest/Robot Framework by adding helper classes and methods.

• Built application and database servers using AWS EC2, created AMIs, and used RDS for Oracle DB.

• Managed, developed, and designed a dashboard control panel for customers and Administrators using Django, Oracle DB, PostgreSQL and VMWare API calls.

• Developed Spark applications in Python (PySpark) in a distributed environment to load large numbers of CSV files.

• Deployed AWS Lambda code from Amazon S3 buckets; created a Lambda deployment function and configured it to receive events from an S3 bucket.

• Developed the required XML Schema documents and implemented the framework for parsing XML documents.

• Experience in data driven, keyword driven and hybrid test automation frameworks.

• Implemented PL/SQL to write programmable functions, blocks, procedures, triggers, and packages.

• Experience in using version control systems such as Git, CVS, and GitHub, and deployment platforms such as Heroku and Amazon EC2.

• Worked on Vue.js to build web interfaces and single-page applications.

• ETL/Data Integration experience using the following: IBM Data Stage, Shell scripting, Oracle PL/SQL, and Microsoft SSIS.

• Developed the UI using HTML, AJAX, JavaScript, jQuery, and jQuery UI; used MySQL as the backend database and Python's MySQLdb as the database connector to interact with the MySQL server.

• Used GCP as the cloud platform to perform model building and web development.

• Used microservices as loosely coupled services, since there is no requirement to rewrite the whole codebase.

• Utilized Python to handle all hits on Django, Redis, and other applications.

• Built all database mapping classes using Django models.

• Developed object-oriented programming to enhance company product management.

• Was involved in environment code installation as well as the SVN implementation.

• Responsible for debugging and troubleshooting the web application.

• Created a unit test/regression test framework for working and new code.

Environment: Python, JSON, REST, AWS, Hadoop framework, HTML, MVT, Django, Ajax, PyQt, PyUnit, JavaScript, PL/SQL, Oracle, MongoDB, SQL Developer.


