
Python Developer Web Services

Location:
Farmington Hills, MI
Salary:
$60 per hour
Posted:
April 16, 2025


Resume:

Name: Vishnu Priya

Phone: 248-***-****

Email: *************************@*****.***

Sr. Python Developer

SUMMARY

● Have 10+ years of experience across the SDLC, designing and developing web-based, multi-tier distributed applications using Python, Django, Flask, Java, J2EE and Spring technologies.

● Experience with all aspects of technology projects including Business Requirements, Technical Architecture, Design Specification, Development and Deployment.

● Experience in Amazon Web Services (Amazon EC2, Amazon S3, AWS Lambda, Amazon SimpleDB, Amazon RDS, Elastic Load Balancing, Amazon SQS, Redshift, and AWS Identity and Access Management).

● Hands-on experience working with WAMP (Windows, Apache, MySQL and Python/PHP) and LAMP (Linux, Apache, MySQL and Python/PHP) architectures.

● Experienced in developing web-based applications using Python, Django, PHP, XML, CSS, HTML, JavaScript and jQuery.

● Performed unit testing and integration testing and generated test cases for web applications using JUnit and the Python unittest framework.

● Experience with technologies and platforms including Java, Node.js, Jenkins, Subversion, Git, Unix/Linux, Windows Server, VMware and Docker.

● Expertise in practicing SDLC models and Agile methods, including Scrum, Extreme Programming and TDD, with ticketing managed in JIRA.

● Performed automation tasks on various Docker components like Docker Hub, Docker Engine, Docker Machine, Compose and Docker Registry.

● Highly experienced with generator functions, generator expressions, object-oriented programming and package development (a brief generator sketch follows this summary).

● Experience in Microsoft Azure/Cloud Services like SQL Data Warehouse, Azure SQL Server, Azure Databricks, Azure Data Lake, Azure Blob Storage, Azure Data Factory.

● Good experience in developing web applications and implementing Model View Controller (MVC) architecture using server-side frameworks such as Django, Flask and Pyramid.

● Expert in using JSON based REST Web services and Amazon Web services.

● Experienced with NoSQL technologies such as MongoDB, Cassandra and DynamoDB, and with relational databases such as Oracle, SQLite, PostgreSQL and MySQL.

● Experienced in Application Development using Python, RDBMS and Linux shell scripting and performance tuning.
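
A brief sketch of the generator-based style referenced above; the file-processing task and names are illustrative only:

```python
def read_large_file(path):
    """Yield one stripped line at a time instead of loading the whole file into memory."""
    with open(path) as handle:
        for line in handle:
            yield line.strip()


def count_error_lines(path):
    # Generator expression: lazily filter lines without building an intermediate list.
    return sum(1 for line in read_large_file(path) if line.startswith("ERROR"))
```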

TECHNICAL SKILLS

Frameworks: Django, web2py, Pyramid, Flask, Struts, Pylons, CSS Bootstrap

Technologies: HTML, CSS, DOM, SAX, JavaScript, jQuery, AJAX, XML, AngularJS, SAP, Gen AI

Programming Languages: C, C++, Java, SQL, Python 3.5 & 2.7, SAS/SQL, PL/SQL

Python Libraries: Python, Django, R language, Flask, Beautiful Soup, HTML/CSS Bootstrap, jQuery, NumPy, Matplotlib, Pickle, PySide, SciPy, wxPython, PyTables, pdb, Ruby, Vertex AI, PaLM API

Build Tools: CI/CD, Jenkins, Bamboo, Docker, Anthill Pro, Deploy, Maven, ANT, Gradle

Version Control: Git (GitHub), SVN, CVS, Bitbucket, Subversion, TFS, Artifactory

Databases: Oracle, MySQL, PostgreSQL, MongoDB, Spark, R, BigQuery

IDEs / Development Tools: PyCharm, Sublime Text

Web Services: AWS, Rackspace Cloud, Amazon S3, EC2, Data Pipeline, Elasticsearch, Redshift, Google Cloud Platform (GCP), Google AI Tools, Google Cloud Functions

Operating Systems: Windows, RedHat Linux, Mac OS X, Microsoft Azure

Protocols: TCP/IP, FTP, FTPS, HTTP/HTTPS, SOAP, SNMP, SFTP, SMTP, REST

Tracking Tools: Bugzilla, JIRA

Methodologies: Agile, Scrum, Waterfall

PROFESSIONAL EXPERIENCE:

Client: Amway – Ada, MI. Jan 2024 – Present

Role: Sr. Full Stack Python Engineer

Responsibilities:

● Developed web applications using the Django framework's Model View Controller (MVC) architecture (a brief Django sketch follows this project's environment line).

● Performed efficient delivery of code based on principles of Test-Driven Development (TDD) and continuous integration to keep in line with Agile Software Methodology principles.

● Designed and implemented Infrastructure as Code using Terraform, ensuring consistent and reproducible deployment of cloud resources across Azure, AWS, and GCP.

● Integrated GCP services, including BigQuery and Vertex AI, to process large-scale data analytics and AI model training.

● Implemented continuous integration and deployment (CI/CD) pipelines using Azure DevOps and Google Cloud Build for .NET applications.

● Provisioned multi-cloud infrastructure (Azure, AWS, and GCP) using Terraform, providing flexibility and avoiding vendor lock-in.

● Developed and utilized Terraform modules for modular and reusable infrastructure components, optimizing code organization and maintenance.

● Implemented various deployment strategies on Kubernetes, including rolling updates, blue-green deployments and canary releases, utilizing Google Kubernetes Engine (GKE) for AI-based microservices.

● Utilized Kubernetes manifests and Helm charts to define and manage infrastructure as code, ensuring consistency across environments.

● Created and managed Kubernetes pods and services, facilitating the communication and scaling of containerized applications.

● Also created AngularJS controllers, directives and models for different modules of the application.

● Worked on Jenkins for continuous integration and continuous delivery (CI/CD).

● Worked with PySpark DataFrames, which are similar to tables in relational databases and provide a high-level API for structured and semi-structured data (see the PySpark sketch after this project's environment line).

● Designed and implemented distributed database systems using Comdb2 and Redis for scalable and high-availability data storage solutions integrated with SAP HANA for enhanced data management.

● Built and optimized data ingestion pipelines using Kafka and RabbitMQ, ensuring reliable and high-throughput message processing for microservices architectures.

● Implemented message queuing systems with RabbitMQ and Kafka, facilitating efficient asynchronous communication between microservices and enhancing system scalability.

● Dockerized applications by creating Docker images from Dockerfiles.

● Managed version control and collaboration on code repositories using GitHub, ensuring efficient team workflows.

● Implemented GitHub Actions for automated testing, building, and deployment of applications.

● Utilized GitHub for code reviews, pull requests, and merge conflict resolution.

● Configured GitHub Pages for hosting static websites and project documentation.

● Collaborated with DevOps teams to integrate CI/CD pipelines across Azure DevOps, AWS CodePipeline, and Google Cloud Build in Agile workflows.

● Deployed Generative AI models on GCP Vertex AI and AWS Bedrock, enabling real-time AI-driven automation solutions.

● Monitored Agile metrics, such as velocity and cycle time, to track team performance.

Environment: Python 2.7 and 3.5, Django 1.6.1, R, Selenium, Pytest, HTML5, CSS3, XML, MySQL, JIRA, JavaScript, AngularJS, Backbone.js, jQuery, PyQt, CSS Bootstrap, Apache, Spark, MongoDB 3.2, Pandas, Elastic Load Balancer, Elasticsearch, MS SQL Server 2014, SAS, T-SQL, Eclipse, Git, GitHub, Azure, AWS, Google Cloud Platform (GCP), Redshift, BigQuery, Vertex AI, Jenkins, PySpark, Docker, Bitbucket, Linux, Shell Scripting
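
A condensed Django sketch of the MVC pattern mentioned in the first bullet of this project; the Order model, view and route are hypothetical, and a real project would split them across models.py, views.py and urls.py:

```python
# models.py -- a minimal model ("M" in MVC/MVT).
from django.db import models

class Order(models.Model):
    customer = models.CharField(max_length=100)
    total = models.DecimalField(max_digits=10, decimal_places=2)
    created = models.DateTimeField(auto_now_add=True)

# views.py -- a view returning the model as JSON.
from django.http import JsonResponse

def order_list(request):
    data = list(Order.objects.values("id", "customer", "total"))
    return JsonResponse(data, safe=False)

# urls.py -- route the view.
from django.urls import path

urlpatterns = [
    path("orders/", order_list, name="order-list"),
]
```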
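
A short PySpark DataFrame sketch illustrating the table-like API noted above; the input file and column names are assumptions:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-demo").getOrCreate()

# Read semi-structured JSON into a DataFrame (the path is a placeholder).
orders = spark.read.json("orders.json")

# Table-like operations: filter, group and aggregate.
daily_totals = (
    orders.filter(F.col("status") == "COMPLETE")
          .groupBy("order_date")
          .agg(F.sum("amount").alias("total_amount"))
)
daily_totals.show()
```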

Client: US Bank – Minneapolis, MN. May 2022 – Dec 2023

Role: Full Stack Python Engineer

Responsibilities:

● Designed the front end of the application using Python, HTML, CSS, AJAX, JSON and jQuery.

● Used the Django Framework to develop the application.

● Designed and developed Azure DevOps pipelines to manage resources across multiple Azure subscriptions.

● Integrated BRS Aladdin with internal systems to streamline investment data flow and enhance portfolio management.

● Developed and maintained custom reports in BRS Aladdin for portfolio analysis and risk management.

● Configured and managed Aladdin Workbench for real-time portfolio monitoring and analytics.

● Automated data ingestion from multiple sources into BRS Aladdin using custom ETL scripts.

● Developed real-time data streaming pipelines using Apache Kafka, ensuring reliable and scalable data ingestion and processing.

● Implemented Kafka Connect to integrate Kafka with various data sources and sinks, facilitating seamless data integration across systems.

● Utilized Kafka Streams API for real-time stream processing, enabling complex event processing and analytics on data streams.

● Integrated Redis as a caching layer in JavaScript and TypeScript applications, reducing database load and improving application performance, with seamless data synchronization to SAP ERP.

● Designed and implemented Kafka consumer applications in Python, leveraging Kafka consumer APIs for consuming and processing data from Kafka topics (a consumer sketch follows this project's environment line).

● Automated infrastructure provisioning and orchestration using Terraform, streamlining the deployment process and minimizing manual intervention.

● Utilized Terraform workspaces for managing multiple environments, such as development, testing, and production, with isolated configurations.

● Implemented secure data sharing and collaboration using Snowflake's data sharing capabilities, enabling seamless data exchange across organizational boundaries.

● Conducted performance tuning in Snowflake, optimizing query execution times and resource utilization for enhanced data processing efficiency.

● Implemented robust security measures in Snowflake, including access controls, encryption, and auditing, ensuring data integrity and compliance.

● Integrated monitoring and logging solutions into Kubernetes clusters, using tools like Prometheus, Grafana, and ELK stack.

● Implemented web applications with the Flask framework following MVC architecture (a Flask sketch follows this project's environment line).

● Worked on front end frameworks like CSS Bootstrap for development of Web applications.

● Created XML with Django for consumption by Flash components. Involved in deploying projects on AWS, with a focus on building user-friendly, simple yet effective web-based applications.

● Employed API Gateway caching to enhance the performance of Java-based AWS Lambda functions, reducing response times for frequently requested data.

● Experienced in setting up EC2 instances and security groups, and setting up databases in AWS using S3 buckets (a boto3 sketch follows this project's environment line).

● Imported data from SQL Server and Excel into SAS datasets and performed data manipulation by merging several datasets and extracting relevant information.

Environment: Python 2.7 & 3.5, Django 1.6.1, R, Selenium, Pandas, HTML5, CSS3, XML, MySQL, JIRA, JavaScript, AngularJS, Backbone.js, jQuery, RDBMS, CSS Bootstrap, Apache, Spark, MongoDB 3.2, MS SQL Server 2014, T-SQL, SAS, Eclipse, Git, GitHub, AWS, Jenkins, Power BI, Bitbucket, Linux, Shell Scripting
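
A simplified Kafka consumer sketch in Python; the kafka-python client, topic name and consumer group are assumptions rather than details of this engagement:

```python
import json

from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "transactions",                          # topic name is a placeholder
    bootstrap_servers=["localhost:9092"],
    group_id="risk-processor",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    record = message.value
    # Downstream processing would go here; printing keeps the sketch self-contained.
    print(message.topic, message.partition, message.offset, record)
```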
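
A minimal Flask sketch of the MVC-style layout mentioned above; the routes, template and in-memory data are placeholders:

```python
from flask import Flask, jsonify, render_template

app = Flask(__name__)

# "Model": a stand-in data source; a real app would query MySQL or MongoDB here.
ACCOUNTS = [{"id": 1, "name": "Checking"}, {"id": 2, "name": "Savings"}]

# Controller returning JSON for the front end.
@app.route("/api/accounts")
def list_accounts():
    return jsonify(ACCOUNTS)

# "View": render an HTML template (templates/accounts.html is assumed to exist).
@app.route("/accounts")
def accounts_page():
    return render_template("accounts.html", accounts=ACCOUNTS)

if __name__ == "__main__":
    app.run(debug=True)
```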
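
A boto3 sketch of the EC2/S3 setup described above; the AMI ID, key pair, security group and bucket name are placeholders, and AWS credentials are assumed to be configured in the environment:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
s3 = boto3.client("s3", region_name="us-east-1")

# Launch a small EC2 instance (all identifiers are placeholders).
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    KeyName="example-key",
    SecurityGroupIds=["sg-0123456789abcdef0"],
)

# Create an S3 bucket and upload a database export to it.
s3.create_bucket(Bucket="example-app-backups")
s3.upload_file("backup.sql", "example-app-backups", "backups/backup.sql")
```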

Client: Merck & Co – Rahway, NJ. Jun 2021 – Apr 2022

Role: Full Stack Python Engineer

Responsibilities:

● Used Rational Rose to develop UML use-case, class and object diagrams applying OOA/OOD techniques.

● Built database models, views and APIs using Python for interactive web-based solutions.

● Placed data into JSON files using Python to test Django websites. Used Python scripts to update the content in the database and manipulate files.

● Integrated Snowflake with various data sources and analytics tools, facilitating a unified and centralized approach to data management and analysis.

● Implemented effective data loading strategies in Snowflake, utilizing features like Snowpipe and Snowflake connectors for efficient data ingestion.

● Configured and managed data sharing and replication in Snowflake to enable real-time data synchronization and collaboration.

● Developed web-based application using Django framework with python concepts.

● Generated Python Django forms to maintain the record of online users.

● Used Django APIs to access the database.

● Wrote object-oriented Python code with attention to quality, logging, monitoring, debugging and code optimization.

● Wrote Python modules to view and connect the Apache Cassandra instance.

● Proficient in Python, shell scripting, SQL and build utilities such as OpenMake, ANT and CruiseControl.

● Implemented techniques in Pandas to optimize memory usage during data operations, ensuring efficient handling of large datasets within resource constraints.

● Created real-time streaming applications using Kafka and JavaScript, handling large volumes of data with minimal latency and high reliability.

● Employed Redis Pub/Sub and Streams for building real-time features such as live notifications and activity feeds in web applications.

● Improved the speed, efficiency and scalability of the continuous integration environment, automating wherever possible using Python, Ruby, Shell and PowerShell scripts.

● Developed an application that used IP-based protocols such as TCP, UDP, HTTP and SSL for analyzing L2/L3 and Wi-Fi data during QA and testing.

● Extensively worked with the networking layer (L3) using IPv4/IPv6 and the transport layer (L4) using TCP. Responsible for managing large databases using MySQL.

● Wrote and executed various MySQL database queries from Python using the Python-MySQL connector and the MySQLdb package.

● Used Python modules such as NumPy and matplotlib to generate complex graphical data, including histograms. Developed and designed an automation framework using Python and shell scripting.

● Involved in debugging and troubleshooting issues and fixed many bugs in two of the main applications which are the main source of data for customers and the internal customer service team.

● Implemented SOAP/RESTful web services in JSON format.

● Involved in debugging applications tracked in JIRA as part of the Agile methodology.

● Managed Terraform upgrades and migrations, ensuring compatibility with new features and maintaining infrastructure stability.

● Collaborated with cross-functional teams in collaborative Terraform development, incorporating feedback and ensuring alignment with project goals.

● Implemented testing strategies for Terraform code, including unit testing, integration testing, and validation against different cloud environments.

● Attended many day-to-day meetings with developers and users and performed QA testing on the application. Experienced in using containers like Docker.

Environment: Python 2.7, 3.3, Django 1.4, HTML, CSS, AJAX, Ruby, Tomcat, Apache HTTP, Angular.js, JSON, RESTful, XML, JavaScript, OOD, Shell Scripting, GitHub, Docker, MySQL, Cassandra, JIRA

Client: Kellton Tech – Hyderabad, India. Jan 2017 – Dec 2020

Role: Python Engineer

Responsibilities:

● Used MySQL as the backend database and the Python MySQL connector to interact with the MySQL server.

● Developed a Python/Django application for analytics aggregation and reporting. Used Django configuration to manage URLs and application parameters.

● Rewrote existing Python/Django modules to deliver a certain format of data.

● Managed datasets using pandas DataFrames and MySQL, running MySQL database queries from Python using the Python-MySQL connector and the MySQLdb package.

● Development experience in Python, Django and Flask; have used MySQL, SQLAlchemy and sqlite3.

● Designed and developed a Mark-to-Market application to mark the securities with market prices to evaluate the collateral margin exposure. Utilized C++ templates, OO Design Patterns.

● Implemented SOAP/RESTful web services in JSON format.

● Published and shared Power BI reports and dashboards via Power BI Service, enabling collaboration and accessibility.

● Utilized GitHub Gists for sharing code snippets and scripts.

● Integrated GitHub with project management tools like Trello and JIRA for seamless workflow management.

● Utilized GitHub’s security features, such as Dependabot, to identify and fix vulnerabilities.

● Automated data refresh and scheduled data updates in Power BI to ensure up-to-date information.

● Utilized Power Query to transform and clean data before visualization in Power BI.

Environment: MySQL, Python, Django, GitHub, Flask, HTML, Oracle, SQLAlchemy, sqlite3, CSS, Excel, SQL Server, MS Office, SQL, Linux, Windows Server

Client: Single Point Solutions – Hyderabad, India. Jan 2014 – Dec 2016

Role: Software Developer

Responsibilities:

● Proposed a breakpoint and array-dump tool for debugging, which reduced the time needed to debug.

● Developed an understanding of the various processor components.

● Ran and debugged Python test harnesses in the Linux environment.

● Debugged failure issues by capturing array and register dumps using Python scripts and traces, and performed several experiments in collaboration with the design team.

● Rewrote existing Python/Django modules to deliver a certain format of data.

● Developed scripts in Python and Excel VBA to automate data analysis, generating statistics and isolating trends in memory failures.

● Developed data structures and XML parsing using Python.

● Developed user-defined data structures and library functions using the C and C++ languages.

● Automated the hardware flow using batch files and shell scripting.

● Maintained versions using Git and sent release notes for each release.

● Maintained user data using Microsoft Excel.

● Implemented code in Python to retrieve and manipulate data.

● Developed a wrapper in Python for instantiating multi-threaded applications (a simplified sketch follows this project's environment line).

● Managed datasets using pandas DataFrames and MySQL, running MySQL database queries from Python using the Python-MySQL connector and the MySQLdb package to retrieve information (see the sketch after this project's environment line).

● Developed the breakpoint tool using Perl and a Java user interface.

● Created unit test/regression test framework for working/new code.

● Development experience in Python, Django and Flask; have used MySQL, SQLAlchemy and sqlite3.

Environment: MySQL, Python, Django, GitHub, Flask, HTML, Oracle, SQLAlchemy, sqlite3, CSS, Excel, SQL Server, MS Office, SQL, Linux, Windows Server
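
A simplified sketch of a multi-threading wrapper along the lines described above; the helper name and worker function are hypothetical:

```python
import threading


def run_in_threads(target, items, max_threads=4):
    """Run target(item) for each item across worker threads, limiting concurrency."""
    semaphore = threading.Semaphore(max_threads)

    def worker(item):
        with semaphore:
            target(item)

    threads = [threading.Thread(target=worker, args=(item,)) for item in items]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()


def process(device_id):
    # Placeholder for the real per-device work (e.g. collecting a dump).
    print(f"collecting dump from device {device_id}")


run_in_threads(process, ["dev-a", "dev-b", "dev-c"])
```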
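
A brief sketch of pulling MySQL query results into pandas, as in the dataset-management bullet above; the connection details, table and column names are placeholders:

```python
import mysql.connector  # mysql-connector-python
import pandas as pd

# Connection parameters are placeholders.
conn = mysql.connector.connect(
    host="localhost", user="app_user", password="secret", database="inventory"
)

# Load query results into a pandas DataFrame for analysis.
frame = pd.read_sql("SELECT part_id, failures, tested_on FROM memory_tests", conn)

# Simple aggregation of failure trends, mirroring the reporting described above.
summary = frame.groupby("part_id")["failures"].sum().sort_values(ascending=False)
print(summary.head(10))

conn.close()
```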


