Data Management Python Developer

Location:
Ashburn, VA
Posted:
July 04, 2023

Resume:

Saraswathi Moram

adx3bm@r.postjobfree.com

Ph: 276-***-****/ 703-***-****

https://www.linkedin.com/in/saraswathi-moram-20a221114/

Professional Summary

10+ years of IT experience with deep involvement in all aspects of Development, Design, Analysis, Testing, and Deployment.

7+ years of experience analyzing massive data sets and programming for data management, microservices (REST APIs), and object-oriented development.

Subject matter expert with hands-on experience in AWS cloud services and packages, including but not limited to S3, Lambda, API Gateway, Step Functions, EKS, SNS, SQS, Secrets Manager, CloudFormation, VPC, Security Groups, Redshift, Athena, CloudWatch, Kinesis, and RDS.

Data management with scripts on Oracle, PostgreSQL, MySQL, and Snowflake databases; created application log dashboards using Splunk and Kibana; container experience with Docker.

Expert in Python modules including FastAPI, Flask, PySpark, SQLAlchemy, Boto3, Pandas, NumPy, multithreading, unittest, and pytest; code coverage; and AWS deployment scripts using Terraform and CloudFormation.

Experience building ETL processes with Oracle.

Code management and integration with continuous integration/continuous deployment (CI/CD) tools such as GitHub, Jenkins, and Concourse.

Worked with tracking tools such as JIRA, Rally, V1 (VersionOne), and ServiceNow, as well as team collaboration software such as Confluence.

Good team player, quick learner, and skilled performer delivering timely, high-quality results.

Skills Matrix:

Languages: Python 3.9, Serverless, Terraform, CSS, VBA

Databases: Oracle, MySQL, PostgreSQL, DynamoDB, Redshift, Snowflake

Tools: Eclipse, Visual Studio, PyCharm, GitHub, Jenkins, Concourse

Platforms: Windows, UNIX/LINUX, AWS

Methodologies: Agile/Scrum, Waterfall

Cloud Competencies: AWS (Lambda, API Gateway, EC2, VPC, Redshift, S3, CloudWatch, SNS, SQS, Kinesis, Athena, CloudFormation), Serverless, Terraform

Test Frameworks: unittest, pytest, OpenERP

Professional Experience

Client: LFG (Lincoln Financial Group) – (Mindtree) Aug 2022 to Mar 2023

Role: Technical Lead

Project Description:

Lincoln Financial Group is a financial services firm offering a variety of financial and retirement planning services, as well as life insurance, annuities, and investment management solutions. The application was "Plan Conversion", which is utilized for eligibility verification and data analysis in retirement planning services. As part of the project, I was responsible for migrating the application to the cloud and for data management on the Plan Conversion API, which business clients used to change products from one to another in the Retirement Plan portfolio.

Roles and Responsibilities:

As team lead, I oversaw the application from design through development and ensured that the requirements for migrating the on-premises application to the cloud were well documented.

Streamlined database connectivity and data query optimization by developing microservices using fine-grain and coarse-grain architecture with the FastAPI framework.

Implemented Python SQLAlchemy, a SQL toolkit, to retrieve and store data in the Aurora PostgreSQL database.

Created parallel query execution on the API using AWS Step Functions, orchestrating the coarse-grain Lambdas.
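For illustration, a Step Functions fan-out of this shape can be sketched in the Amazon States Language; all state names and Lambda ARNs below are invented, not the project's actual definition:

```python
import json

# Hypothetical state machine: a Parallel state fans out to two coarse-grain
# Lambda branches, then a Pass state joins the results. Names/ARNs are made up.
state_machine = {
    "StartAt": "RunQueriesInParallel",
    "States": {
        "RunQueriesInParallel": {
            "Type": "Parallel",
            "Branches": [
                {
                    "StartAt": "QueryPlans",
                    "States": {
                        "QueryPlans": {
                            "Type": "Task",
                            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:query-plans",
                            "End": True,
                        }
                    },
                },
                {
                    "StartAt": "QueryParticipants",
                    "States": {
                        "QueryParticipants": {
                            "Type": "Task",
                            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:query-participants",
                            "End": True,
                        }
                    },
                },
            ],
            "Next": "MergeResults",
        },
        "MergeResults": {"Type": "Pass", "End": True},
    },
}

# Step Functions consumes the definition as JSON (Amazon States Language).
definition_json = json.dumps(state_machine, indent=2)
```

The Parallel state runs both branches concurrently and hands their combined output to the next state, which is what makes the API-side queries execute in parallel.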

Implemented CloudFormation templates to automate AWS component deployment (Lambda, API Gateway, CloudWatch, Step Functions).

Updated the GitLab deployment scripts to integrate with Serverless and built Lambda layers.

Prototyped data and components to validate the code and verify business functionality by writing mock unit tests.

Supported the UI team in integrating the microservices and played a key role in production deployment.

Updated stories and project status in the project management tool V1 (VersionOne) in a timely manner.

Used Splunk to create dashboards from the application log data that CloudWatch forwarded.

Environment: Python 3.9, FastAPI, AWS (Lambda, Lambda Layers, API Gateway, Step Functions, CloudWatch, CloudFormation, Secrets Manager, RDS), SQLAlchemy, GitLab

Client: Comcast – (Cognizant) Aug 2018 to Aug 2022

Role: Python Full Stack Developer

Project Description:

Xfinity, a brand of Comcast, markets consumer cable television, internet, telephone, and wireless services provided by the company. Our application was a "Consent Approval System" to consolidate and track the service orders for products that customers would submit via different APIs. My role was to work on the data integration for the Ecommerce Consent UI, where I managed data from different sources and implemented business rules determining what was sent for approval or consent. We also automated all the payments, and the Billing UI service was likewise automated using REST APIs.

Roles and Responsibilities:

Implemented REST APIs using the Flask/Chalice frameworks and AWS Lambda.

Implemented Terraform/Serverless scripts to automate and deploy AWS components (Lambda, API Gateway, CloudWatch, SNS, Kinesis, DynamoDB) and integrated them with the Jenkins/Concourse deployment process.

Prototyped mock unit tests to examine code coverage and validate the code.

Involved in the deployment and testing of the complete solution with a monthly cadence.

Created Kibana dashboards to analyze the application's data flow.

Wore different hats across requirements, design, development, and testing as needed.

Developed components in both Python and Scala.

Tracked production issues and maintained documentation in Rally and Confluence accordingly.

Environment: Python 3.6, Flask, PySpark, Oracle, AWS (Lambda, DynamoDB, Lambda Layers, API Gateway, CloudWatch, CloudFormation, Secrets Manager, Kinesis, SNS, SQS), Serverless, Terraform, Concourse, Git, Kibana.

Client: Amazon - (Capgemini) Oct 2016 to Aug 2018

Role: Functional Automation and Data Reconciliation

Project Description:

This project involved data mining and finding anomalies in transaction payments caused by customer refunds, overcharges, and shipment fees, then rectifying and correcting the issues for claims or collections. Our team was responsible for automating the data sources and analyzing the anomalies. We developed Python scripts to implement the business rules and automate loading the data into Redshift.

Roles and Responsibilities:

Developed and mined the data to uncover different scenarios (erroneous payments, payment failures, shipment fees) involving data anomalies in affected online transactions.

Created files from the respective databases for analytics and research using AWK commands.

Led and mentored a small team of software/data engineers.

Read various files available on S3 with Python and loaded them back into PostgreSQL.

Built transformation and automation scripts using Python; the analytic datasets were later loaded into the Redshift databases.
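A stdlib-only sketch of that transform-then-load shape, with the S3 read and Redshift COPY reduced to plain strings; the bucket, table, and column names are hypothetical, not the project's actual schema:

```python
import csv
import io

# Raw CSV as it might arrive from an S3 object body (hypothetical columns).
raw_csv = "txn_id,amount,fee_type\n1001,19.99,shipment\n1002,-5.00,refund\n"

def transform(body: str):
    """Apply a simple business rule: keep only negative-amount anomalies."""
    rows = csv.DictReader(io.StringIO(body))
    return [r for r in rows if float(r["amount"]) < 0]

anomalies = transform(raw_csv)

# In the real pipeline the cleaned file would be written back to S3 and
# bulk-loaded into Redshift with a COPY command like this one (names made up):
copy_sql = (
    "COPY analytics.payment_anomalies "
    "FROM 's3://example-bucket/clean/anomalies.csv' "
    "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load' "
    "FORMAT AS CSV IGNOREHEADER 1;"
)
```

Using COPY from S3 rather than row-by-row inserts is the idiomatic way to load analytic datasets into Redshift at volume.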

Developed Git workflows to process and deploy the Python scripts.

Environment: Python 2.7, ETL, AWS (S3, Athena, Redshift (PostgreSQL), Oracle), Git, Git workflows.

Client: TransUnion - Wells Fargo (Capgemini) Mar 2014 to Sep 2016

Role: Python Developer

Project Description: TransUnion is a global information and insights company that provides data solutions and credit reporting services to businesses and consumers. The Envision application is a decision system used by TransUnion to filter a subject's credit report based on criteria. Wells Fargo is one of the lines of business (LOBs) using it, applying decision sets to the same end. Rules and matrices are applied to filter offers for cards, loans, or other products based on the LOB. The project's goal was to extract user information from SOAP data in order to evaluate card eligibility.

Roles and Responsibilities:

Studied the existing application and implemented changes to it.

Worked in the banking domain to check customer eligibility for cards.

Used SOAP API calls to retrieve and process the data.

Worked with the xmllib2 Python package to extract the SOAP data and send it to the decision system for decisioning.
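The extraction step can be sketched with the standard library's xml.etree in place of the package named above; the envelope fields and the credit namespace here are invented for illustration:

```python
import xml.etree.ElementTree as ET

# A toy SOAP response; the "cr" namespace and element names are hypothetical.
soap_response = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:cr="http://example.com/credit">
  <soap:Body>
    <cr:Subject>
      <cr:Name>Jane Doe</cr:Name>
      <cr:Score>712</cr:Score>
    </cr:Subject>
  </soap:Body>
</soap:Envelope>"""

NS = {
    "soap": "http://schemas.xmlsoap.org/soap/envelope/",
    "cr": "http://example.com/credit",
}

def extract_subject(xml_text: str) -> dict:
    """Pull the fields the decision system needs out of the SOAP body."""
    root = ET.fromstring(xml_text)
    subject = root.find("./soap:Body/cr:Subject", NS)
    return {
        "name": subject.findtext("cr:Name", namespaces=NS),
        "score": int(subject.findtext("cr:Score", namespaces=NS)),
    }

result = extract_subject(soap_response)
```

Declaring the namespace map once and passing it to find/findtext keeps the XPath-style queries readable despite SOAP's verbose namespacing.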

Updated system roles as needed based on the requirements.

Ensure the best possible performance, quality, and responsiveness of applications.

Created Python unit test scripts and helped maintain code quality, organization, and automation.

Environment: Python 2.7, Decision System application (TransUnion internal), SVN.

Client: Castle – (Techhighway) April 2013 to Sep 2014

Role: Python Developer

Project Description: Castle is a wholesale distributor of building materials and construction supplies. The project was sales order management, with the ability to take a deposit against an order and convert the quote to an order once the deposit was completed. The order progresses through the workflow based on a variety of parameters, including whether the product is available and can therefore be reserved.

Responsibilities:

Participated in the system study to understand the sales functionality.

Developed the application in the Django framework and deployed it to the client location.

Involved in the design and development of UI components.

Implemented imports of huge data files (products, chart of accounts) into PostgreSQL.

Created sales reports.

Environment: Python 2.6, Django, PostgreSQL.

Client: Wholesale Pendant (Ktree) Dec 2011 to Mar 2013

Role: Python OpenERP Developer

Project Description:

Integrated Ecommerce with OpenERP to import products, product categories, and sale orders from the Ecommerce system. Integrated OpenERP with DHL to import customers' shipping information and generate airway bill reports. Created sales analytics XLS and PDF reports on a weekly, monthly, and yearly basis. From delivery orders, created dummy invoice forms based on alternate products configured in the product form.

Responsibilities:

Developed the application using the OpenERP MVC framework.

Involved in the design and development of UI components, including XML-RPC integration.

Implemented data storage and generated reports on a daily, weekly, and monthly basis.

Integrated with the DHL system to track shipment data and created sales reports.

Configured LDAP to fetch all emails into the OpenERP database to store the orders.

Environment: Python 2.6, OpenERP, PostgreSQL, RML.

Client: Infronics Jul 2010 to Dec 2011

Role: Python OpenERP Developer

Project Description:

Infronics Systems Limited is an Indian company that provides technology solutions in the field of embedded systems. The project managed the workflow of complete purchase orders: a buy indent is created and sent for approvals before being converted into a purchase order.

Responsibilities:

Understood the existing OpenERP MVC framework concepts and created new modules.

Implemented imports of goods/products and customers from Excel files into PostgreSQL, integrated with the OpenERP framework.

Configured a printer module with OpenERP for printing the PDF reports.

Environment: Python, OpenERP (XML-RPC) framework, PostgreSQL, RML.


