Taniya Mathur
Software Developer
Email: **************@*****.***
LinkedIn: https://www.linkedin.com/in/taniyamathur/ | Phone: 240-***-****
Professional Summary
7.5 years of experience in Analysis, Design, Development, and Implementation of microservices, web applications, B2B integration, and client-server applications, with a strong focus on Big Data solutions and AWS technologies.
Developed high-performance, scalable data products end-to-end, utilizing PySpark, AWS Glue, S3, and Lambda to process and analyze large datasets for multiple projects at Amazon.
Designed, built, and optimized ETL pipelines for data ingestion, storage, processing, and analysis using tools like AWS Glue, EMR, PySpark, Lambda, and Athena with various data formats, including Parquet, CSV, and JSON.
Expertise in data modeling and schema design for both SQL and NoSQL databases, enhancing storage and query performance for large-scale datasets.
Hands-on experience with AWS services, including S3, Lambda, CloudWatch, CloudFormation, IAM, SNS, SQS, Redshift, Glue Catalog, Step Functions, and Athena, to architect and manage data-driven solutions.
Skilled in AWS infrastructure as code using AWS CDK with TypeScript, Terraform, and CloudFormation to automate the deployment and management of cloud resources.
Experienced in implementing performance tuning through caching, scaling strategies, and optimizing data flows to enhance system performance.
Proficient in Go, building RESTful microservices with dependency injection and unit testing to deliver maintainable and scalable backend solutions.
Extensive experience in Test-Driven Development (TDD) and working with Agile and Scrum methodologies, contributing to efficient and reliable code delivery.
Hands-on experience with B2B EDI integration using Sterling Integrator, handling transactions in ANSI X12 formats and providing production support for integration solutions.
Adept in Object-Oriented Programming (OOP) and core language features such as multi-threading, exception handling, and collections to deliver high-quality software solutions.
Proficient in writing and integrating unit tests using unittest/PyTest and Go's testing framework, ensuring high code quality and seamless integration with CI/CD pipelines.
Strong communication, time management, and team collaboration skills, ensuring efficient delivery of solutions while maintaining high standards.
Education:
PGBDM in Computer Applications, Pune University, India (2010-2012)
Diploma in Electronics and Telecom Engineering, Maharashtra Technical Board, India (2004-2008)
Technical Skills:
Languages: Python, Go, PySpark, Java, JavaScript, TypeScript, Perl
Web Technologies: React, Django, HTML5, CSS3, XHTML, jQuery, Flask
Cloud: AWS Serverless (Lambda, ECS, CDK, Glue, S3, SQS, SNS, Athena, Step Functions, Route 53, Load Balancer, Auto Scaling, EMR)
Microservices: SOAP, RESTful
Data Engineering: PySpark, Pandas, NumPy, AWS Glue, Athena
Databases: MySQL, Redis, DynamoDB, SQLite
Defect Tracking: JIRA
Version Control: GitHub
Operating Systems: UNIX, Linux, Windows
B2B Integration: IBM Sterling Integrator (Linux), ANSI X12, AS2
Infrastructure as Code: AWS CDK, Terraform, CloudFormation
Professional Experience:
Client: Amazon Web Services (AWS), Remote
Oct 2022 - Dec 2024
Role: Software Developer
Responsibilities:
Developed a highly available, scalable, and efficient data product using AWS Step Functions to process data requests through multiple Lambdas calling downstream APIs, with data integration across S3, SQS, DynamoDB, and SNS topics.
Implemented a serverless batch orchestrator using AWS Step Functions, Lambda, SQS, DynamoDB, and S3; the state machine uses SNS-based wait logic to pause until orchestration components complete.
Designed the orchestrator workflow using the workflow design inspector and documented the high-level and low-level design (HLD/LLD) on a wiki page.
Designed and orchestrated AWS Glue jobs for multiple data pipelines in Python and Spark to process Parquet, JSON, and CSV files from AWS S3, applying transformations through UDFs backed by AWS Lambda functions written in Python, Go, and Java (see the illustrative sketch at the end of this section). The output Parquet dataset was cataloged in Glue and exposed to consumers through IAM roles.
Implemented several Lambdas in Python, Java, and Go to consume REST endpoints and return responses to UDF calls from AWS Glue jobs.
Implemented infrastructure code in TypeScript using AWS CDK to deploy Glue jobs, Lambdas, Glue triggers, crawlers, and an event-processing framework that uses Lambda to trigger Glue jobs.
Monitored Glue job runs and debugged errors by investigating CloudWatch logs.
Created detailed pull requests with an explanation of the requested changes, a description of the changes made, evidence of testing, and the extent of code coverage.
Performed code reviews for team members and monitored the deployment pipeline.
Tuned performance by configuring auto scaling, optimizing Lambda concurrency, and implementing caching.
Reviewed ticket requirements and coordinated with stakeholders to clarify open questions.
Created detailed wiki pages capturing design changes and new implementations for review by other team members.
Onboarded projects onto other teams' products, reviewed onboarding documentation, and coordinated on project specifications as needed to complete onboarding in all environments.
Created metrics, alarms, and dashboards in CloudWatch for monitoring and alerting on each application deployed in production.
Created deployment release notes for the deployment team to deploy and validate the application in pre-prod and prod environments.
Environment: Python, Java, Go, PySpark, AWS Glue, Athena, Step Functions, Crawler, Catalog, S3, DynamoDB, TypeScript, AWS CDK, PyTest, Agile, Linux, Pandas, NumPy
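Illustrative sketch (hypothetical, not taken from the project code above): a minimal AWS Glue PySpark job showing the general pattern described in this role of reading CSV from S3, enriching rows through a UDF that invokes a Lambda function, and writing Parquet output for cataloging. The bucket paths, the "enrich-record" function name, and the payload fields are placeholder assumptions.

```python
import sys
import json
import boto3
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from pyspark.context import SparkContext
from pyspark.sql.functions import udf, col
from pyspark.sql.types import StringType

# Job parameters; SOURCE_PATH and TARGET_PATH are hypothetical S3 URIs.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "SOURCE_PATH", "TARGET_PATH"])

glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session


def enrich_record(record_id: str) -> str:
    """Call a (hypothetical) Lambda REST wrapper to enrich a single record."""
    # For brevity the client is created per call; in practice it would be
    # reused per partition (e.g., via mapPartitions) for performance.
    client = boto3.client("lambda")
    response = client.invoke(
        FunctionName="enrich-record",  # placeholder function name
        Payload=json.dumps({"id": record_id}),
    )
    return json.loads(response["Payload"].read()).get("category", "unknown")


enrich_udf = udf(enrich_record, StringType())

# Read raw CSV, apply the Lambda-backed UDF, and write Parquet output.
df = spark.read.option("header", "true").csv(args["SOURCE_PATH"])
df.withColumn("category", enrich_udf(col("record_id"))) \
  .write.mode("overwrite").parquet(args["TARGET_PATH"])
```

In the projects above, the resulting Parquet dataset would then be registered in the Glue Data Catalog by a crawler and exposed to consumers through IAM roles.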
Professional Experience:
Client: Axiom Financial Services, Remote
Sep 2019 – Oct 2022
Role: Software Developer
Responsibilities:
Developed a backend application in Python using the Django framework and deployed it on AWS EC2 with an AWS-managed MySQL database.
Implemented RESTful microservices in Go and the Django REST Framework for different websites.
Implemented front ends in HTML, CSS, JavaScript (jQuery), and React.
Consumed REST endpoints exposed by Django REST Framework and Go Gin servers from the React front end (a representative endpoint sketch appears at the end of this section).
Deployed websites on AWS using EC2, Route 53, Nginx, and S3.
Developed infrastructure code in AWS CDK using TypeScript.
Practiced test-driven development with unit tests in pytest and Go's testing framework.
Created DNS configuration and routing using Route 53.
Developed admin.py for different components to define and customize Django admin.
Configured an Apache web server on an EC2 instance to route requests to the Django backend.
Used Python-based GUI components for front-end functionality such as selection criteria, and created a test harness in Python to enable exhaustive testing.
Used React to consume REST API web services via HTTP/AJAX libraries.
Built various graphs for business decision-making using Python libraries.
Performed data analysis and visualization using the pandas framework, along with data enrichment and curation.
Used Amazon Web Services (AWS) for improved storage capacity and fast access.
Followed the PEP 8 coding standard and validated programs by running them across test cases using PyChecker and Pylint to ensure code validity and effectiveness.
Developed tools using Python, Shell scripting, and XML to automate some of the menial tasks.
Participated in design discussions, performed code reviews, and monitored deployments in non-prod and prod environments.
Incorporated Agile, Jira, and Scrum practices to manage requirements and enhance the application.
Environment: Python, Django, React, Go, MySQL, AWS EC2, Route 53, S3, CDK, CloudWatch, HTML, CSS, Shell Scripting, pytest, Agile, Linux.
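Illustrative sketch (hypothetical, not taken from the client's codebase): a minimal Django REST Framework endpoint of the kind described above, returning JSON that the React front end can fetch over HTTP/AJAX. The OrderStatusView name, URL pattern, and payload fields are assumptions for illustration.

```python
# views.py - minimal Django REST Framework endpoint (illustrative only)
from rest_framework.views import APIView
from rest_framework.response import Response
from rest_framework import status


class OrderStatusView(APIView):
    """Return the status of an order; consumed by the React front end."""

    def get(self, request, order_id):
        # In the real service this would query the MySQL-backed Django models.
        order = {"id": order_id, "status": "SHIPPED"}
        return Response(order, status=status.HTTP_200_OK)


# urls.py - route the endpoint so React can call /api/orders/<id>/status/
from django.urls import path

urlpatterns = [
    path("api/orders/<int:order_id>/status/", OrderStatusView.as_view()),
]
```

On the React side, such an endpoint would typically be consumed with fetch or axios and the returned JSON rendered into components.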
Professional Experience:
Client: SAS Group Of Companies, Remote
Aug 2017 – Sep 2019
Role: EDI Developer
Responsibilities:
Worked on supply-chain B2B projects, executing each step of the software development life cycle including requirement analysis, design, development, unit testing, integration testing, deployment, and production support.
Created EDI implementation guides utilizing the EDISIM tool and participated in project workshops with business partners during different stages of project development.
Set up and onboarded external B2B partners in Sterling Integrator for different EDI transactions and performed the necessary connectivity setup using AS2 and SFTP.
Developed complex business processes and maps in Sterling Integrator to process B2B transactions: EDI-to-EDI and EDI-to-application (flat files for mainframes and IDocs for SAP) for inbound flows, and backend application formats to EDI for outbound transactions.
Experience with file transfer protocols (FTPS, SFTP, AS2).
Provided production support and participated in the on-call rotation.
Implemented custom Java code in business processes using the Java Task service.
Invoked Python and Perl scripts from business processes using the Command Line Adapter; these scripts primarily implemented pre-processing logic and housekeeping (a representative housekeeping sketch appears at the end of this section).
Documented support procedures, outage procedures, gap analyses, and technical, functional, and architectural specifications.
Identified system improvements and implemented solutions to scale system performance, including optimizing and tuning business processes and maps and balancing thread loads.
Improved system performance by optimizing core inbound and outbound business processes in Sterling Integrator.
Upgraded Sterling Integrator versions, installed patches, and performed disaster recovery testing.
Environment: Sterling Integrator, Java, ANSI X12 (214, 204, 850, 810, 856, 820, 997), EDI, Perl, Python, XML, Shell Script, Linux.
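Illustrative sketch (hypothetical): the kind of small Python housekeeping script invoked from a Sterling Integrator business process via the Command Line Adapter, archiving processed EDI files older than a configurable number of days. The directory paths and retention window are placeholder assumptions.

```python
#!/usr/bin/env python
"""Archive processed EDI files older than RETENTION_DAYS (illustrative)."""
import os
import shutil
import sys
import time

PROCESSED_DIR = "/data/edi/processed"   # placeholder path
ARCHIVE_DIR = "/data/edi/archive"       # placeholder path
RETENTION_DAYS = 7                      # placeholder retention window


def archive_old_files():
    cutoff = time.time() - RETENTION_DAYS * 24 * 60 * 60
    if not os.path.isdir(ARCHIVE_DIR):
        os.makedirs(ARCHIVE_DIR)
    moved = 0
    for name in os.listdir(PROCESSED_DIR):
        path = os.path.join(PROCESSED_DIR, name)
        # Move regular files whose last modification time is older than the cutoff.
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            shutil.move(path, os.path.join(ARCHIVE_DIR, name))
            moved += 1
    return moved


if __name__ == "__main__":
    count = archive_old_files()
    print("archived %d file(s)" % count)
    sys.exit(0)
```

A script like this would be wired into a business process as a Command Line Adapter step, with its exit code checked to decide the next step in the flow.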