
Software Developer Python

Location:
Jersey City, NJ
Salary:
75/hr
Posted:
March 18, 2025

Contact this candidate

Resume:

Pranay Sadh

SUPPLIER COMMENTS

●Pranay is an experienced Sr. Software Developer specializing in Python and cloud services, with a focus on developing software solutions.

●Has strong experience building serverless solutions that expose API endpoints through functions as a service (AWS Lambda).

●Has deployed Python applications on AWS ECS, ensuring high availability and seamless scalability.

●Worked extensively with the Amazon AWS API, designing and developing new functionality to deploy, monitor, and govern ELBs, EC2 instances, security groups, auto scaling groups, rules, Amazon RDS, and S3 buckets.

●Conversant with the core Python data structures (list, dictionary, set, and tuple); skilled in database connectivity from Python; has created web application components in Python.

AVAILABILITY/LOCATION

●Phone/Video Interview: Anytime within 24–48 hours' notice

●To Start: Immediately

●Location: Jersey City, NJ. Comfortable working remotely.

SUMMARY:

●Senior Python Developer with a strong focus on backend development.

●Python expert with recent exposure to PySpark, AWS, microservices, Poetry, Docker, Kubernetes, SQL databases, and ETL tools.

●Subject Matter Expertise in designing scalable and automated infrastructure.

●Utilized AWS and in-house products.

●Effective as an individual contributor as well as a team member.

EDUCATION

●Bachelor of Computer Science, RGPV, 2012

PROFESSIONAL EXPERIENCE:

Corteva Agriscience/Consultadd Inc, Remote (March 2023 – PRESENT)

Sr Software Developer (Python)

●Develop and maintain Python scripts for data processing and analysis on EMR (Elastic MapReduce) clusters.

●Design and implement data pipelines using EMR technologies to ingest, transform, and store large datasets.

●Create and optimize SQL queries to extract relevant information from databases and data lakes.

●Utilized Poetry as the primary package manager for dependency management and project building, ensuring consistent and reproducible environments.
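The Poetry setup described above centers on a pyproject.toml; a minimal sketch of one is shown below. The project name, description, and dependency list are illustrative, not taken from the actual project.

```toml
[tool.poetry]
name = "data-pipeline"                # hypothetical project name
version = "0.1.0"
description = "EMR data-processing jobs"

[tool.poetry.dependencies]
python = "^3.10"
pandas = "^2.0"
boto3 = "^1.28"

[tool.poetry.group.dev.dependencies]
pytest = "^7.4"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

With a file like this, `poetry install` resolves and installs the dependencies, and the generated poetry.lock pins exact versions so every environment is reproducible.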

●Implemented Kubernetes pods for scalable deployment of data processing tasks, ensuring seamless execution and resource utilization.

●Led the migration of an existing project from traditional package management to Poetry, resulting in improved dependency management and version control.

●Utilized Python and Pandas to perform data cleansing, transformation, and aggregation operations on large datasets.
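The bullet above names pandas; the same cleanse-transform-aggregate pattern can be sketched with only the standard library, as below. The field names ("region", "sales") are made up for illustration.

```python
from collections import defaultdict

def clean_and_aggregate(rows):
    """Drop records with missing keys, normalize values, and sum per group.

    `rows` is a list of dicts such as {"region": "NJ", "sales": "100"}.
    (Field names are illustrative, not from the actual project.)
    """
    totals = defaultdict(float)
    for row in rows:
        region = (row.get("region") or "").strip().upper()
        if not region:          # cleansing: skip records with no group key
            continue
        try:
            amount = float(row.get("sales", 0) or 0)  # transformation: coerce to number
        except ValueError:
            continue            # cleansing: skip malformed numeric fields
        totals[region] += amount  # aggregation: sum per normalized group
    return dict(totals)

print(clean_and_aggregate([
    {"region": "nj", "sales": "100"},
    {"region": "NJ", "sales": "50"},
    {"region": "", "sales": "10"},      # dropped: missing group key
    {"region": "NY", "sales": "bad"},   # dropped: malformed value
]))  # → {'NJ': 150.0}
```

In pandas the same steps would map to `dropna`, `astype`, and `groupby(...).sum()`.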

●Created a RESTful API using Python and Flask to provide access to a machine learning model for predictive analysis.

●Utilized Kubernetes to deploy the API in a containerized environment, allowing for easy scaling and management.

●Created a Python helper function for connecting to AWS, HBase, and EMR clusters.

●Utilize HBase, a distributed NoSQL database, to design and manage data storage systems for efficient data retrieval.

●Collaborate with data scientists and analysts to understand data requirements and translate them into technical specifications. Perform data quality assessments and implement data validation techniques to ensure accuracy and integrity of the data.

●Monitor and troubleshoot data processing jobs and performance issues in EMR and HBase environments.

●Implement data security and access controls to protect sensitive information stored in EMR and HBase.

●Conduct performance tuning and optimization activities to improve the efficiency of data processing workflows.

●Weather Dashboard:

●Developed a weather dashboard using JavaScript and external APIs.

●Enabled users to search for and display current weather conditions and forecasts for specified locations.

●Implemented responsive design for seamless use across different devices.

FannieMae/Consultadd Inc (Remote) Oct 2022 – Jan 2023

Python Developer

●Created a task management application using Express.js and MongoDB.

●Designed RESTful APIs for CRUD operations on tasks, supporting user authentication.

●Implemented real-time updates using WebSockets for collaborative task tracking.

●Developed and maintained Python-based applications leveraging AWS Glue for data extraction, transformation, and loading (ETL) processes from various data sources into AWS S3.

●Implemented AWS SQS for message queuing to enable decoupled and scalable communication between microservices.

●Utilized AWS Step Functions to design and orchestrate complex workflows, improving the overall efficiency and reliability of application processes.
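A Step Functions workflow of the kind described above is defined in Amazon States Language; the sketch below shows the general shape. The state names, the Glue job name, and the choice on the job-run state are illustrative assumptions, not the actual workflow.

```json
{
  "Comment": "Illustrative state machine: run a Glue ETL job, then branch on the result",
  "StartAt": "RunEtlJob",
  "States": {
    "RunEtlJob": {
      "Type": "Task",
      "Resource": "arn:aws:states:::glue:startJobRun.sync",
      "Parameters": { "JobName": "example-etl-job" },
      "Retry": [ { "ErrorEquals": ["States.ALL"], "MaxAttempts": 2 } ],
      "Next": "CheckResult"
    },
    "CheckResult": {
      "Type": "Choice",
      "Choices": [
        { "Variable": "$.JobRunState", "StringEquals": "SUCCEEDED", "Next": "Done" }
      ],
      "Default": "Failed"
    },
    "Done": { "Type": "Succeed" },
    "Failed": { "Type": "Fail" }
  }
}
```

The `.sync` integration makes the state wait for the Glue job to finish before moving on, which is what lets Step Functions orchestrate multi-step workflows reliably.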

●Worked on the TIBCO Refactor 2.0 project and contributed to architecture diagrams.

●Deployed Python applications on AWS ECS, ensuring high availability and seamless scalability.

●Automated the build and deployment process using Jenkins, reducing deployment time and enhancing team productivity.

●Utilized Boto3 library for interacting with various AWS services programmatically, ensuring seamless integration of Python applications with AWS services.

●Leveraged pandas for data manipulation, analysis, and visualization, streamlining data processing and enhancing data-driven decision-making.

●Worked with psycopg2 to connect and interact with PostgreSQL databases, optimizing database queries and ensuring data integrity.
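The psycopg2 pattern above (parameterized queries so the driver binds values, preserving data integrity and avoiding SQL injection) can be sketched with sqlite3 as a stand-in, since both follow the Python DB-API; psycopg2 uses %s placeholders where sqlite3 uses ?. The table and values are made up for illustration.

```python
import sqlite3

# sqlite3 stands in for psycopg2 here; with psycopg2 you would write
#   conn = psycopg2.connect(host=..., dbname=..., user=..., password=...)
# and use %s placeholders instead of ?.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

# Parameterized insert: the value is bound by the driver, never interpolated
cur.execute("INSERT INTO customers (name) VALUES (?)", ("Alice",))
conn.commit()

# Parameterized select: same placeholder mechanism for query parameters
cur.execute("SELECT name FROM customers WHERE id = ?", (1,))
print(cur.fetchone()[0])  # → Alice
```

The key point is that user-supplied values travel as parameters, not as concatenated SQL text.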

Village MD/Consultadd Inc, Chicago, IL March 2022 - Oct 2022

Sr. Python/AWS Engineer

●Designed, developed, and deployed data pipelines, including ETL processes for acquiring, processing, and delivering data using the Apache Spark framework. Worked on ETL/data application development and version control systems such as Git. Wrote and tested Spark jobs for ETL.

●Utilized Kubernetes to deploy and manage containerized applications, ensuring scalability and high availability of services.

●Involved in development of Python-based source code, including wheel and setup files. Enabled Glue jobs, crawlers, and catalogs, and configured and ran the jobs in Glue Studio. Learned and implemented data flow into Snowflake; loaded data from Postgres to Snowflake.

●Utilized Poetry to create and manage virtual environments, simplifying the installation of project dependencies.

●Wrote, configured, and tested Step Functions to run Glue jobs in parallel. Configured a Python-based environment to test source code locally.

●Wrote unit tests against Python-based functionality using the pytest framework. Performed data validation, pre-processing, and post-processing.
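A pytest-style unit test of a small data-validation helper might look like the sketch below; the validation rule and test cases are illustrative, not the actual project code. pytest collects functions named test_*, so `pytest -q` would run these.

```python
def validate_record(record):
    """Return True if a record has a non-empty id and a numeric amount.

    (Illustrative validation rule, not the actual project code.)
    """
    if not record.get("id"):
        return False
    try:
        float(record["amount"])
    except (KeyError, TypeError, ValueError):
        return False
    return True

def test_valid_record():
    assert validate_record({"id": "a1", "amount": "10.5"})

def test_missing_id_rejected():
    assert not validate_record({"id": "", "amount": "10"})

def test_bad_amount_rejected():
    assert not validate_record({"id": "a1", "amount": "ten"})
```

Each test exercises one branch of the validator, which is what makes failures easy to localize when a pipeline change breaks validation.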

●Used the boto3 SDK to configure AWS resources locally. Created and managed multiple environment-based Git branches (dev/test/prod). Actively participated in code reviews of PRs.

●Coordinated with users to understand data needs and deliver data with a focus on data quality, data reuse, consistency, security, and regulatory compliance. Monitored, managed, validated, and tested data extraction, movement, transformation, loading, normalization, cleansing, and update processes.

●Picked up and implemented JIRA tickets and performed end-to-end testing according to the acceptance criteria.

●Proposed design implementation for better execution of data mapping using platforms like AWS.

●Collaborated with team-members on Data Models, schema in our data warehouse, and documenting source-to-target mapping.

Charles Schwab/Consultadd Inc, Westlake, TX March 2020 – February 2022

Sr. Python Developer

Charles Schwab's Trading & Investment Team

●As a Sr. Python Engineer, developed an online trading application with a rich user experience and robust features, built to grow the user base and enable users to trade on the go across various stock exchanges in the equity, commodities, and currency markets. Incorporated unique value-added features such as a Mock Trading Championship, pay-in/pay-out of funds, performance of key market indices (Sensex, Nifty, SGX, and USD), personalized watch lists/market watch, and much more.

●Provided the highest levels of user security in the application through encryption and physical security. The application merges real-time financial information with data visualization in the form of graphs and charts of current share prices, with the flexibility to view the history of a company's performance in the share market.

●Utilized Poetry extensively for dependency management, simplifying package installation, and project management, resulting in streamlined development processes and improved team collaboration.

●Involved in designing, developing, and deploying Python APIs, including supporting infrastructure. Executed the full product life cycle (inception through deprecation) to create highly scalable and flexible APIs that enable any number of digital products, and was the primary point of contact for transitioning around 35 Python APIs to a decoupled serverless architecture on AWS.

●Designed AWS Lambda functions in Python for triggering the shell script to ingest the data into MongoDB and exporting data from MongoDB to consumers and created an AWS Lambda architecture to monitor AWS S3 Buckets.
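The shape of an S3-triggered Lambda such as the one above can be sketched as follows. The bucket/key parsing follows the standard S3 event notification format, while the ingest step is a hypothetical stand-in for the shell script that loaded the objects into MongoDB.

```python
import json

def lambda_handler(event, context):
    """Entry point AWS Lambda invokes for each S3 event notification.

    Extracts bucket/key pairs from the event and hands them to an
    ingest step (stubbed here; the real code triggered a shell script
    that ingested the objects into MongoDB).
    """
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        processed.append(f"s3://{bucket}/{key}")  # real code would ingest here
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}

# A minimal S3 event of the shape AWS delivers on an object-created notification:
event = {"Records": [{"s3": {"bucket": {"name": "my-bucket"},
                             "object": {"key": "data/file.csv"}}}]}
print(lambda_handler(event, None))
```

Because the handler is a plain function taking an event dict, it can be unit-tested locally with synthetic events before being wired to an S3 bucket notification.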

●Assisted in creating technical documentation and guidelines for using Poetry in the development workflow.

●Designed multiple components and services around a REST API to fetch only the data needed, which was being filtered out in one of the microservices. Automated data-processing CI/CD pipelines so that customers can use the service to create EC2-based lab environments to facilitate product evaluations or internal training.

●Led the team and facilitated a smoother transition toward modernization of Schwab's backends, including building streamlined CI/CD pipelines, moving Python applications to Kubernetes, and designing better monitoring strategies for visibility into system health. Used Python 3 to build interactive web-based solutions, developed back-end components to improve responsiveness and overall performance, and integrated user-facing elements into applications.

●Converted a monolithic application to a microservices architecture using Django, breaking large projects into loosely coupled modules to obtain an independent and flexible system. Used the microservice architecture to keep the application highly maintainable and testable, and utilized a Microservices Assessment Platform to improve the built code.

●Worked with Docker and Kubernetes on multiple cloud providers, from helping developers build and containerize their application (CI/ CD) pipelines to deploying either on public or private cloud.

Full Stack Python Developer March 2018 – Feb 2020

Charter Communications, Denver, CO

●Senior Python Engineer in Charter’s Cellular Network Automation Project:

●Responsible for developing, testing, and deploying wireless access network automation applications in Python. The application under development detects and classifies RAN network interference and mitigates it in an automated way. The solution helps recover cell coverage lost to uplink interference and maintain service.

●Championed the integration of unit tests into our development and deployment process to ensure successful deployments.

●Built a new application to handle millions of calls on our system, for which the on-premises solution was slow and had problems returning responses on time. Built a new serverless solution that exposes our API endpoint through functions as a service (AWS Lambda) to handle the event-driven architecture, and built later stages communicating with messaging services such as SQS and SNS to achieve a fan-out pattern.
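The SNS-to-SQS fan-out described above (one published event copied to several consumer queues) can be simulated in-process, as sketched below. In the real system the publish call would be boto3's `sns_client.publish(...)` with SQS queues subscribed to the topic; the class, queue names, and event shape here are illustrative.

```python
import json
import queue

class FanOutTopic:
    """Toy stand-in for an SNS topic fanning out to subscribed SQS queues."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, q):
        self.subscribers.append(q)

    def publish(self, message):
        body = json.dumps(message)
        for q in self.subscribers:   # SNS copies the message to every queue
            q.put(body)

topic = FanOutTopic()
billing_q, audit_q = queue.Queue(), queue.Queue()
topic.subscribe(billing_q)
topic.subscribe(audit_q)

topic.publish({"call_id": 42, "event": "completed"})  # hypothetical event shape
print(billing_q.get() == audit_q.get())  # → True: both consumers see the same message
```

The design point is decoupling: the publisher never knows who consumes, so new consumers can be added by subscribing another queue without touching the producer.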

●Managed containerized workloads using Kubernetes and then, to deliver the software quickly, separated the application from infrastructure by utilizing Docker (PaaS).

●Implemented Jenkins for continuous integration, automating all builds and deployments. Built Jenkins jobs to create AWS infrastructure from GitHub repos containing Terraform code, and installed and administered Jenkins CI for Maven builds.

●Member of Mobile Development Team

●Developed a Python-based application for Charter, used to keep a record of all customers, potential leads, and the schemes introduced to those leads. Responsible for writing microservices using Django to implement all the CRUD operations on the database to handle and retrieve each customer's records. Also involved in deploying AWS DynamoDB data via CodePipeline, and implemented security using OAuth2 based on numerical analysis of each customer's transaction data accessed through the API.

●Created and contributed to various microservices to personalize the customer experience on Charter's primary customer-facing website. Involved in developing new REST APIs and enhancing existing APIs to deliver innovative functionality to customers. Used Maven as the build tool to automate the build process for the entire application.

●Created custom directives in AngularJS and used modules and filter objects according to application requirements.

●Created an application platform interface for AWS Lambda using API Gateway, enabled authentication and authorization for IAM users by configuring IAM policies, and created an S3 bucket for backups of DynamoDB tables via Lambda scripts triggered regularly for updated backups.

CareDx, Brisbane, CA May 2016 – Feb 2018

Python Developer

●Python Engineer for the CareDx Labs team.

●Responsible for designing and building a Python-based platform of services for the broader CareDx technology community of software developers. Worked collaboratively with Cloud, Security, Network, and Identity teams to provide customized solutions to enterprise-level applications as well as to emerging customers within Vanderbilt University Medical Center.

●Supported and developed Python-based software applications for dispensing and distribution automation at large commercial pharmacies. Responsibilities included analyzing system failures, locating the troublesome parts of the code, and developing new, reliable solutions.

●Used Kubernetes to deploy, scale, and manage Docker containers with multiple namespaced versions.

●Worked extensively with the Amazon AWS API, designing and developing new functionality to deploy, monitor, and govern ELBs, EC2 instances, security groups, auto scaling groups, rules, Amazon RDS, and S3 buckets; modified the CareDx application code and extended the CareDx platform to give customers a better experience.

●Designed and implemented the key components of the platform to perform real time ingestion of the events from over 75 data sources both in structured and unstructured form.

●Developed REST microservices (similar to APIs) to keep data synchronized between two database services.

●Maintained complex PL/SQL packages, functions, and procedures in a multi-environment Oracle database, resulting in stable business processes. Utilized CI/CD technologies such as Git and Jenkins for developing and deploying web services; used REST APIs for logging and debugging, and deployed code on AWS.
