Python Developer Client Server

Location: Charlotte, NC
Posted: March 03, 2024

Manchi Reddy

Python Developer

Email: ad32mo@r.postjobfree.com

Phone: +1-910-***-****

Professional Summary:

5 years of IT experience in the analysis, design, development, testing, and implementation of stand-alone and client-server enterprise application software using various technologies, including analyzing business requirements and mapping them to system specifications.

Experienced in developing web applications implementing the Model-View-Template (MVT) architecture using the Django web framework.

Experienced in developing web-based applications following the Model-View-Controller (MVC) architecture.

Hands-on experience in streaming live data from DB2 to HDFS using Spark Streaming and Apache Kafka.

Experience working with relational databases such as Oracle, SQLite, PostgreSQL, MySQL, and DB2, and with NoSQL databases such as Apache Cassandra and MongoDB.

Working knowledge of CCAR and FR Y-14Q regulatory reporting and hands-on knowledge of Bank of America’s proprietary models such as CAQF and CSM.

Designed and developed APIs to share data with cross-functional teams using the Hug and FastAPI frameworks.

Handled business logic in backend Python programming to achieve optimal results. Wrote Python scripts to parse XML, CSV, and text files, load the data into AWS S3 buckets, and expose the S3 files to Tableau using the AWS Athena service.
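
A minimal sketch of this kind of parse-and-load script, assuming boto3 credentials are configured; the bucket and file names are hypothetical, not the project's actual code:

    import csv
    import json
    import boto3

    def csv_to_jsonl(csv_path, jsonl_path):
        """Parse a CSV file and rewrite it as JSON Lines for downstream loading."""
        with open(csv_path, newline="") as src, open(jsonl_path, "w") as dst:
            for row in csv.DictReader(src):
                dst.write(json.dumps(row) + "\n")

    def upload_to_s3(local_path, bucket, key):
        """Upload the parsed file to S3 so Athena/Tableau can query it."""
        s3 = boto3.client("s3")
        s3.upload_file(local_path, bucket, key)

    if __name__ == "__main__":
        csv_to_jsonl("input.csv", "input.jsonl")
        upload_to_s3("input.jsonl", "example-reporting-bucket", "staging/input.jsonl")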

Good experience using object-oriented design patterns, multithreading, multiprocessing, and exception handling, and knowledge of client-server environments.

Designed interactive web pages and templates for the front end of applications using technologies such as HTML, CSS, JavaScript, jQuery, and JSON, and implemented the Bootstrap framework for a better user experience.

Experience developing applications using Amazon Web Services such as EC2, CloudSearch, Elastic Load Balancer (ELB), S3, CloudFront, and Route 53.

Experience in working with continuous deployment using Heroku and Jenkins.

Well versed with Agile, SCRUM and Test-driven development methodologies.

Hands-on experience using the version control systems Git, GitHub, and GitLab.

Worked with Terraform to create AWS components such as EC2, IAM, VPC, ELB, and security groups.

Hands-on experience with the bug-tracking tool JIRA.

Excellent interpersonal and communication skills, efficient time management and organization skills, ability to handle multiple tasks and work well in a team environment.

Education:

Master of Science in Computer and Information Systems Security

Wilmington University, DE 2021

Bachelor's – CVR College of Engineering – 2018

Professional Experience:

CLIENT: Bank of America

Application Architect/Python Developer Feb 2023 – Present

Project Description:

The Global Risk Analytics platform empowers analysts by providing them with a user-friendly interface to execute models and conduct risk assessments tailored specifically for Federal reporting. These reports encompass a wide array of requirements, including CCAR, IFRS9, CECL, 14Q, CRELL, and ABL, allowing for comprehensive comparison with Challenger models. The platform's model delivers a granular, quarter-by-quarter forecast of critical metrics, such as Net Credit Losses, non-performing loans, and Asset Quality indicators, extending up to nine quarters into the future. Additionally, it offers quarter-by-quarter forecasts for Asset Quality migrations. These forecasts aren't restricted to a single scenario; instead, they span across baseline, internal stress scenarios, and regulatory stress scenarios, ensuring thorough analysis and preparedness for various potential outcomes.

Developed and maintained microservices using Python FastAPI and tested them using Swagger.
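
A minimal FastAPI sketch of a microservice of this shape (the endpoint and model names are hypothetical); FastAPI serves the interactive Swagger UI at /docs out of the box, which is what the testing refers to:

    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="risk-metrics-service")

    class Metric(BaseModel):
        name: str
        value: float

    # In-memory store standing in for a real database.
    _store: dict[str, Metric] = {}

    @app.post("/metrics", status_code=201)
    def create_metric(metric: Metric) -> Metric:
        _store[metric.name] = metric
        return metric

    @app.get("/metrics/{name}")
    def read_metric(name: str) -> Metric:
        if name not in _store:
            raise HTTPException(status_code=404, detail="metric not found")
        return _store[name]

Run locally with uvicorn (for example, uvicorn main:app --reload if the file is main.py) and open /docs for the Swagger UI.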

Onboarded the Gbreg (Global Banking Regulatory System) application to the non-prod and prod environments.

Developed UI code using ReactJS, showcasing proficiency in creating dynamic and responsive user interfaces.

Developed JIL jobs for Autosys, which involved validating script syntax, simulating job executions, and verifying that dependencies were properly configured.

Deployed the JIL scripts to the Autosys environment, where they are loaded into the scheduler's database.

Achieved a minimum unit test coverage of 80%, ensuring robust testing practices and validating the functionality of services across diverse scenarios.

Worked on test cases using the PyTest framework – test cases, test suites, and bug fixing.
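
Representative PyTest style for such test cases; the function under test here is a hypothetical example, not code from the project:

    import pytest

    def parse_amount(raw: str) -> float:
        """Example function under test: convert '1,234.50' to a float."""
        return float(raw.replace(",", ""))

    @pytest.mark.parametrize("raw,expected", [
        ("100", 100.0),
        ("1,234.50", 1234.5),
    ])
    def test_parse_amount_valid(raw, expected):
        assert parse_amount(raw) == expected

    def test_parse_amount_invalid():
        with pytest.raises(ValueError):
            parse_amount("not-a-number")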

Developed and implemented data quality checks as per the use-case design.

Developed the stored procedures and complex views on Oracle Exadata.

Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface. Integrated FastAPI with various data storage systems, including MySQL and PostgreSQL, and retrieved data from within the microservices.

Proficient in writing SQL queries, stored procedures, functions, packages, tables, views, and triggers using relational databases such as Oracle and MySQL Server.

Wrote Python scripts to parse XML documents and load the data into the database.
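
Illustrative shape of such a script, with sqlite3 standing in for the actual database driver and hypothetical element names:

    import sqlite3  # stand-in for the project's actual database driver
    import xml.etree.ElementTree as ET

    def load_orders(xml_path, db_path):
        """Parse <order> elements and insert them into a relational table."""
        tree = ET.parse(xml_path)
        rows = [
            (o.findtext("id"), o.findtext("amount"))
            for o in tree.getroot().iter("order")
        ]
        with sqlite3.connect(db_path) as conn:
            conn.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT, amount TEXT)")
            conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

    if __name__ == "__main__":
        load_orders("orders.xml", "orders.db")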

Experience setting up the CI/CD pipeline for the Python and React apps using Ansible, Jenkins, and OpenShift.

Onboarded all the configuration and setup for the enterprise application.

Managed on-site and offshore team members for code reviews and production activities.

Performed efficient delivery of code based on principles of Test-Driven Development and continuous integration to keep in line with Agile Software Methodology Principles.

Responsible for database, Python, and React app deployments in the lower-lane environments and for creating the production templates for each monthly release.

Worked on monthly release activities, such as the task list for the release management team, the Jira extract, design details, and the release itself.

Environment: Python, PL/SQL, Oracle, XLR, Ansible Tower, Bitbucket, OpenShift, ReactJS, JavaScript, CSS, HTML, Agile, Jira, CI/CD, Jenkins, Autosys, MySQL.

CLIENT: Union Pacific March 2022 – Jan 2023

Python Developer

Project Description:

Developed software solutions aimed at enhancing the efficiency and safety of railroad operations. Our primary goal was to ensure the safe and timely transportation of locomotives and freight across the system, from Point A to Point B. Railroad operations present unique challenges due to their complexity, requiring a diverse team of hard-working individuals dedicated to maintaining operational excellence. Work included designing algorithms to optimize train routing, implementing predictive maintenance systems to prevent downtime, and developing real-time monitoring solutions to ensure safety compliance.

Responsibilities:

Developed a web-based reporting system with Java, J2EE, Servlets, EJB, and JSP using the Spring framework, HTML, and JavaScript.

Developed Python code to retrieve data from an Oracle database; also retrieved data from different data models and passed the data through other data models.

Wrote and executed various MySQL database queries from Python using the Python MySQL connector and the MySQLdb package.
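
Typical pattern with the mysql-connector-python package; the connection details and table are placeholders:

    import mysql.connector

    conn = mysql.connector.connect(
        host="localhost", user="app_user", password="secret", database="reporting"
    )
    try:
        cur = conn.cursor()
        # Parameterized query to avoid SQL injection.
        cur.execute("SELECT id, status FROM shipments WHERE status = %s", ("DELAYED",))
        for shipment_id, status in cur.fetchall():
            print(shipment_id, status)
    finally:
        conn.close()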

Developed a framework for converting existing PowerCenter mappings to PySpark (Python and Spark) jobs. Good understanding of Spark architecture with Databricks and Structured Streaming.
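
A sketch of what a converted PowerCenter-style mapping can look like as a PySpark job; the paths and column names are illustrative only:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("convert-mapping-example").getOrCreate()

    # Source qualifier -> DataFrame read
    orders = spark.read.parquet("s3://example-bucket/raw/orders/")

    # Filter/expression transformations -> filter / withColumn
    enriched = (
        orders
        .filter(F.col("order_status") == "SHIPPED")
        .withColumn("order_year", F.year("order_date"))
    )

    # Target -> partitioned write
    enriched.write.mode("overwrite").partitionBy("order_year").parquet(
        "s3://example-bucket/curated/orders/"
    )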

Worked on HTML5, CSS3, JavaScript, AngularJS, Node.JS, Git, REST API, MongoDB.

Created a PySpark data frame to bring data from DB2 to Amazon S3.

Built numerous Lambda functions using Python and automated the process using the events created.
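
A minimal Lambda handler of that kind, assuming it is wired to an S3 object-created event trigger; the bucket and key come from the event payload:

    import json
    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        """Log basic metadata for each object referenced in the S3 event."""
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            head = s3.head_object(Bucket=bucket, Key=key)
            print(json.dumps({"bucket": bucket, "key": key, "size": head["ContentLength"]}))
        return {"statusCode": 200}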

Developed tools using Python, Shell scripting, XML to automate some of the menial tasks.

Used the Pandas API to put the data into time-series and tabular formats for easy timestamp data manipulation and retrieval.
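
Typical Pandas usage for that kind of timestamp-indexed manipulation; the column names are illustrative:

    import pandas as pd

    # Tabular data with a timestamp column, e.g. loaded from CSV.
    df = pd.DataFrame({
        "event_time": pd.to_datetime(
            ["2022-05-01 10:00", "2022-05-01 10:30", "2022-05-02 09:15"]
        ),
        "delay_minutes": [5, 12, 3],
    })

    # Index by timestamp to enable time-series operations.
    ts = df.set_index("event_time").sort_index()

    # Resample to daily totals and look up a specific day.
    daily = ts["delay_minutes"].resample("D").sum()
    print(daily.loc["2022-05-01"])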

Set up the CI/CD pipeline using GitHub, Jenkins, Maven, Chef, Terraform, and AWS.

Experienced in NoSQL technologies like MongoDB, CouchDB, Cassandra, Redis and relational databases like Oracle, SQLite, PostgreSQL, and MySQL databases.

Updated the test automation suite regularly to ensure its accuracy and usefulness.

Designed and created backend data access modules using PL/SQL stored procedures and Oracle.

Experience creating Kafka producers and consumers for Spark Streaming.
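
Illustrative Kafka producer (kafka-python) paired with a Spark Structured Streaming consumer; the broker address, topic, and HDFS paths are placeholders, and the Spark side assumes the spark-sql-kafka connector package is on the classpath:

    from kafka import KafkaProducer
    from pyspark.sql import SparkSession

    # Producer: publish a small JSON event to the topic.
    producer = KafkaProducer(bootstrap_servers="broker:9092")
    producer.send("train-events", b'{"train_id": "UP1234", "status": "DEPARTED"}')
    producer.flush()

    # Consumer: Spark Structured Streaming reads the same topic and lands it in HDFS.
    spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()
    stream = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "train-events")
        .load()
    )
    query = (
        stream.selectExpr("CAST(value AS STRING) AS payload")
        .writeStream.format("parquet")
        .option("path", "hdfs:///data/train-events/")
        .option("checkpointLocation", "hdfs:///checkpoints/train-events/")
        .start()
    )
    query.awaitTermination()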

Developed an information pipeline utilizing Kafka and Storm to store data into HDFS.

Maintained and developed Docker images for a tech stack including Cassandra, Kafka, and Apache.

Designed some of the SAS data models using Base SAS and SAS Macros.

Modified and created SAS datasets from various input sources like flat files, CSV, and other formats, created reports and tables from existing SAS datasets.

Worked on AWS EC2/VPC/S3/SQS/SNS automation using Terraform, Ansible, Python, and Bash scripts.

Designed the Analytical application using Python, Spark, HDFS, AWS EMR.

Extracted Data from Multiple Systems and Sources using Python and Loaded the Data into AWS EMR.

Worked closely with the QA Manager, Team lead and developers to evaluate and enhance automation script to cover test area and test cases.

Good experience in database backup and recovery strategies and expert experience in hot and cold backups of databases.

Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load (ETL) data into Data marts, Enterprise Data warehouse (EDW) and Operational data store (ODS).

Strong knowledge of Oracle utilities like SQL*Loader, Export/Import, Data Pump and External Table.

Environment: Python 2.7, Base SAS, SAS Macros, CI/CD, Flask, Oracle Database, SAS Enterprise Guide, PuTTY, jQuery, WinSCP, Cognos, MySQL, HTML5, CSS3, Impala, Hive, JavaScript, Toad, XML, RESTful Web Services, JSON, Terraform, IBM Sterling, EMR, Bootstrap, PL/SQL, SQL, Jenkins, Jira, Confluence, Eclipse, IntelliJ, Spark, Linux.

CLIENT: Capital One Feb 2021 – Feb 2022

Python Developer

Project Description:

Reports provide insights into performance trends, highlighting areas of growth and identifying potential challenges. To enhance usability, the system incorporates features for data validation to ensure accuracy and completeness of the entered metrics. Additionally, users have the flexibility to customize reports based on specific criteria such as product categories, sales regions, or customer segments.

Responsibilities:

Developed entire frontend and backend modules using Python on Django Web Framework.

Worked on designing, coding, and developing the application in Python using Django MVC.

Experience in working with Python ORM Libraries including Django ORM.

Worked on integrating Python with web development tools and web services.

Responsible for writing code in Object Oriented Programming supported by Ruby on Rails in Agile SCRUM environment.

Developed tools using Python, shell scripting, and XML to automate some of the menial tasks. Interfaced with supervisors, artists, systems administrators, and production to ensure production deadlines were met.

Wrote REST APIs, as part of developing web-based applications for insurance premium calculations, using the Django REST framework.

Also involved in writing REST APIs using the Django framework for data exchange and business logic implementation.
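
A representative Django REST framework pattern for this kind of API; the model and field names are hypothetical:

    # serializers/views sketch using Django REST framework
    from rest_framework import serializers, viewsets
    from rest_framework.routers import DefaultRouter
    from myapp.models import Policy  # hypothetical Django model

    class PolicySerializer(serializers.ModelSerializer):
        class Meta:
            model = Policy
            fields = ["id", "holder_name", "premium"]

    class PolicyViewSet(viewsets.ModelViewSet):
        """CRUD endpoints for policies; premium logic stays in the model layer."""
        queryset = Policy.objects.all()
        serializer_class = PolicySerializer

    # urls.py
    router = DefaultRouter()
    router.register(r"policies", PolicyViewSet)
    urlpatterns = router.urls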

Developed REST microservices (APIs) used for home automation; they also keep data synchronized between two database services.

Configured an auto-scalable and highly available microservices set with monitoring and logging using AWS, Docker, Jenkins, and Splunk.

Developed Spark applications in Python (PySpark) on a distributed environment to load huge numbers of CSV files with different schemas into Hive ORC tables.
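
A sketch of that PySpark load path; the landing path and Hive table names are placeholders:

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder.appName("csv-to-hive-orc")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Each source feed (with its own schema) is loaded from its own path and
    # written into the corresponding Hive-managed ORC table.
    df = spark.read.option("header", True).option("inferSchema", True).csv(
        "hdfs:///landing/sales_feed_a/*.csv"
    )
    df.write.mode("append").format("orc").saveAsTable("analytics.sales_feed_a")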

Designed web UI components for various modules and used JavaScript for client-side validation.

Involved in building database Model, APIs and Views utilizing Python, to build an interactive web-based solution.

Monitored Spark jobs using YARN applications.

Developed Spark/Scala code to ingest data leveraging memory and optimizing performance.

Assisted in the migration of existing SAS programs from SAS 9.2 to SAS 9.4 and validated the resultant datasets.

Used Golang to log host system event and alert information to a Cassandra database.

Deployed core Kubernetes clusters to manage Docker containers in the production environment with lightweight Docker images as base files.

Modified and created SAS datasets from various input sources like flat files, CSV, and other formats, created reports and tables from existing SAS datasets.

Worked on different data formats such as JSON, XML and performed machine learning algorithms in Python.

Used Amazon Web Services (AWS) for improved efficiency of storage and fast access.

Built a scalable, cost effective, and fault tolerant data warehouse system on Amazon EC2 Cloud.

Developed a functional design of AWS Elastic Map Reduce (EMR) specifications with respect to business requirements and technology alternatives.

Configuration of AWS EC2 Auto Scaling groups and auto scaling policies.

Involved in developing a RESTful API service using the Python Flask framework.
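
A minimal Flask sketch of this kind of RESTful service; the route names and payload shape are illustrative:

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # In-memory store standing in for a real backend.
    _quotes = {}

    @app.route("/quotes", methods=["POST"])
    def create_quote():
        payload = request.get_json(force=True)
        quote_id = len(_quotes) + 1
        _quotes[quote_id] = payload
        return jsonify({"id": quote_id, **payload}), 201

    @app.route("/quotes/<int:quote_id>", methods=["GET"])
    def get_quote(quote_id):
        if quote_id not in _quotes:
            return jsonify({"error": "not found"}), 404
        return jsonify(_quotes[quote_id])

    if __name__ == "__main__":
        app.run(debug=True)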

Created data tables utilizing PyQt to display customer and policy information and to add, delete, and update customer records.
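
A sketch of such a data table, assuming PyQt5; the columns and rows are placeholder values:

    import sys
    from PyQt5.QtWidgets import QApplication, QTableWidget, QTableWidgetItem

    app = QApplication(sys.argv)

    # Simple customer/policy grid; real rows would come from the database layer.
    rows = [("1001", "Jane Doe", "Auto"), ("1002", "John Roe", "Home")]
    table = QTableWidget(len(rows), 3)
    table.setHorizontalHeaderLabels(["Policy #", "Customer", "Product"])
    for r, row in enumerate(rows):
        for c, value in enumerate(row):
            table.setItem(r, c, QTableWidgetItem(value))

    table.setWindowTitle("Policy records")
    table.show()
    sys.exit(app.exec_())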

Used Scala to convert Hive/SQL queries into RDD transformations in Apache Spark.

Environment: Python 3, PyCharm, Django, Docker, Amazon Web Services, AWS Lambda, AWS S3, jQuery, PyQuery, MySQL, HTML5, CSS3, JavaScript, Ajax, XML, RESTful Web Services, JSON, EMR, Bootstrap, AngularJS, NodeJS, Flask, SQL, Jenkins, Ansible, Git, GitHub, Linux.

CLIENT: Dell Technologies, India Jan 2018 – Dec 2018

Python Developer

Responsibilities:

Followed the SDLC process and used PHP to develop website functionality.

Designed and developed the UI of the website using HTML, NodeJS, XHTML, AJAX, CSS, and JavaScript.

Developed entire frontend and backend modules using Python on the Django Web Framework on MySQL.

Used Django API for database access.

Designed and developed data management system using MySQL. Built application logic using Python 2.7.

Developed embedded firmware in C to create custom images on an oscilloscope (DSP: digital communication, adaptive), targeted to pique beginners' interest in DSP and embedded systems.

Worked on development of SQL stored procedures, triggers, and functions on MySQL.

Developed a shopping cart for the library and integrated web services to access payments (e-commerce).

Designed and developed horizontally scalable APIs using Python Flask.

Designed Cassandra schema for the APIs.

Enabled embedded Linux systems to support the development of enterprise network Unified Threat Management (UTM) products.

Used the PHP language on a LAMP server to develop pages.

Developed a dynamic interaction page on .NET with MS Visual Basic 2014, using SQL Developer tools.

Environment: Python 2.6/2.7, JavaScript, Django Framework 1.3, SQL, MySQL, LAMP, jQuery, Adobe Dreamweaver, Apache web server, PHP, Underscore.js, SQL Developer tool.


