Suhasini Athimamula
Mail ID – ***********@*****.***
Phone: 1-201-***-****
Data Engineer / Snowflake Developer / ETL Informatica Developer
Profile Summary
8+ years of IT experience in Data Warehousing and all phases of the Software Development Life Cycle (SDLC), including business requirement analysis, application design, development, implementation and testing of data warehouse and database systems for the Banking, Financial, Insurance and Retail domains.
Excellent experience in the design and development of ETL solutions using Informatica PowerCenter and Talend.
Experience working with Snowflake, Oracle, Teradata, Netezza and SQL Server databases.
Extensive experience with data Extraction, Transformation and Loading (ETL) from multiple sources.
Worked extensively in requirements analysis and design, development, testing and production deployment.
Prepared documentation such as ETL specifications, data mappings, test cases and data dictionaries.
Extensive knowledge of Informatica tuning and SQL tuning.
Designed and implemented data models using DBT.
Executed AWS DMS jobs for data migration, ensuring seamless transfer and validation. Ran IICS jobs in development and QA environments to enable unit testing and data integration.
Technical
Data warehousing, Data modeling (E-R and Dimensional modeling), Star schema, Snowflake schema, Data Analytics, Data quality, Data Extraction, Code conversion, Migration, Data visualization and reporting, Unix scripting and Performance Tuning. Code deployment using GitLab CI/CD pipelines.
Tools, Languages, RDBMS, and OS
Snowflake, DBT, SQL, AWS, Airflow, IICS, Talend Open Studio, Informatica PowerCenter, SQL Server Management Studio, Microsoft Azure Storage Explorer, Netezza, Teradata Studio, Rundeck, Control-M, Chronicler, GitLab, Unix, Tableau, Data visualization and reporting, DB2, Mainframe and Windows OS.
Cloud Technologies
Experience in Snowflake
Exposure to AWS (Amazon Web Services) and Azure
Skills Summary
Solutioning
Client Relation
Team Building
Data Integration
Data Extraction
CI/CD Integration
Data Modeling
Unit Testing
Data warehousing
Process Compliance
Performance Tuning
Production Support
Data testing
Data Migration
Scheduling / Automation
Risk Management
Unix/SQL Scripting
Programming Skills
Major Demonstrated Strengths
Results-oriented, analytical and self-driven professional seasoned by years of working in multiple roles as a
Snowflake Developer and ETL Developer: designing, developing, implementing and maintaining ETL applications, data warehouses, and Informatica ETL workflows for enterprise applications.
Courses and Certifications:
AWS Certified Cloud Practitioner
Snowflake –The Complete Master Class
Career Essentials in Business Analysis
Certified Tester Foundation Level (CTFL) - ISTQB
Work History
May 2023 – Present – Data Engineer at Optum, USA
Snowflake JIM and DWaaS Project: JIM is an enterprise-level data management capability built to create an interoperable data stream that supports businesses by connecting issue data across the enterprise, providing common end-to-end visibility across UHC and Optum. JIM is a streaming platform designed on the principle of having the full payload available at any point in time, and it leverages Kafka streaming (Kafka as a Service) to connect and share data in near real time.
Responsibilities:
Good Understanding of Snowflake architecture.
Hands-on experience with the Snowflake data loading process.
Developed ELT process to ingest source data.
Loaded UDW claims data to Snowflake Environment.
Automated the monthly file download process from a REST API to Snowflake using a Python executable (a minimal sketch of this pattern appears after this list).
Created Talend packages to extract large volumes of data from UDW to Snowflake.
Performed the initial Technical Analysis and project feasibility for new business requirements.
Understanding of row-level and column-level security provided by Snowflake.
Tested the code and deployed it to the QA environment for UAT.
Translated business rules and functional requirements into ETL procedures.
Validated source data from ORS and cross-validated it in JIM on Snowflake.
Reviewed the code changes that were done by the team members of the project.
Provided the review comments for changes/improvements identified in the existing/new code.
Handled all production support related activities like job monitoring.
Verified production email alerts and data quality checks for all components involved.
Maintained a positive working relationship with all levels of the organization.
Knowledge of User and Access management in Snowflake.
Developed CI/CD pipeline using GitHub Cloud.
Involved in Data Validations of different Integrations in Production.
Worked with the ServiceNow tool (incident management system) and made sure incidents created during the production cycle were reported.
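The REST API automation noted above can be outlined with a minimal Python sketch. This is illustrative only, not the project code: the endpoint, file path, table and connection parameters are hypothetical placeholders, and it simply shows the pattern of downloading a file with requests, staging it with PUT, and loading it with COPY INTO.

# Minimal sketch (assumed names and credentials): download a monthly extract
# from a REST API and load it into Snowflake via the table stage.
import requests
import snowflake.connector

API_URL = "https://api.example.com/monthly-extract"  # hypothetical endpoint
LOCAL_FILE = "/tmp/monthly_extract.csv"              # hypothetical local path

def download_file():
    # Pull the monthly extract from the REST API and save it locally.
    resp = requests.get(API_URL, timeout=300)
    resp.raise_for_status()
    with open(LOCAL_FILE, "wb") as f:
        f.write(resp.content)

def load_to_snowflake():
    # Stage the local file and COPY it into the target table.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",    # placeholders
        warehouse="ETL_WH", database="JIM_DB", schema="STAGING",  # assumed objects
    )
    try:
        cur = conn.cursor()
        cur.execute(f"PUT file://{LOCAL_FILE} @%MONTHLY_EXTRACT OVERWRITE = TRUE")
        cur.execute(
            "COPY INTO MONTHLY_EXTRACT FROM @%MONTHLY_EXTRACT "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) PURGE = TRUE"
        )
    finally:
        conn.close()

if __name__ == "__main__":
    download_file()
    load_to_snowflake()

A script like this can be packaged as a single executable (for example with PyInstaller) and run on a monthly schedule.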
Technologies Used – Snowflake, DBT, Talend, GitHub, SnowSQL, AWS DMS, Apache Airflow, Bitbucket, SQL Server Management Studio, Microsoft Azure Storage Explorer, Teradata Studio, Rundeck, Chronicler, IntelliJ, Open Lens, Tableau, SQL, Oracle, Rally etc.
Nov 2021 – Apr 2023 – Sr. Project Engineer at USAA, USA
Netezza-to-Snowflake Replatform Project: United Services Automobile Association (USAA) is a Fortune 500 financial services company offering banking, investment and insurance products to people and families who serve, or have served, in the United States military.
USAA is a product-based company with offerings in areas such as Automobile, Insurance, P&C and Banking, along with a wide range of other products.
Environment: Informatica PowerCenter 10.5/10.2/9.6, Snowflake, Oracle 11g/10g, MySQL, SQL Server 2012/2014, Netezza, Control-M, GitLab, Unix
Responsibilities:
Worked with Informatica, Snowflake and Netezza on the Netezza-to-Snowflake migration project.
Performed the initial Technical Analysis and project feasibility for new business requirements.
Recreated the Informatica jobs, repointing them from Netezza to Snowflake.
Tested the code and deployed it to the QA environment for UAT.
Created Control-M jobs to point to Snowflake.
Reviewed the code changes made by the team members of the project.
Fixed Informatica code that was converted for Snowflake.
Developed mappings, sessions and workflows, and enhanced existing mappings, using Informatica PowerCenter 10.5.
Expertise in Snowflake data modeling, ELT using Snowflake SQL, and Snowflake task orchestration, implementing complex stored procedures and standard DWH and ETL concepts (a sketch of this pattern follows this list).
Created and updated Control-M schedules, working with the application team and project manager to design complex batch automation.
Scheduled jobs in Control-M and integrated multiple applications in Control-M.
Monitored and optimized existing Control-M batch jobs.
Made extensive use of Quantitative and Control resources for workload balancing and to hold jobs during server/application maintenance.
Provided plans to carry out major migrations and upgrades.
Created standard requirements for testing jobs prior to moving them to production and for the migration to production.
Performed data validation after the successful End to End tests and appropriate error handling in ETL processes.
Responsible for interaction with the client on a day-to-day basis, attending status meetings.
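The Snowflake ELT and task orchestration work above can be sketched as follows. This is an illustrative example with assumed object names (staging and target tables, warehouse and schedule), not the production code; it shows a SQL stored procedure that merges staged rows into a target table and a scheduled task that calls it, deployed through the Snowflake Python connector.

# Illustrative sketch (assumed object names): a Snowflake stored procedure that
# merges staged claims into a target table, plus a nightly task that runs it.
import snowflake.connector

DDL_PROCEDURE = """
CREATE OR REPLACE PROCEDURE LOAD_CLAIMS()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    MERGE INTO DW.CLAIMS t
    USING STG.CLAIMS s
      ON t.CLAIM_ID = s.CLAIM_ID
    WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS, t.AMOUNT = s.AMOUNT
    WHEN NOT MATCHED THEN INSERT (CLAIM_ID, STATUS, AMOUNT)
                          VALUES (s.CLAIM_ID, s.STATUS, s.AMOUNT);
    RETURN 'OK';
END;
$$
"""

DDL_TASK = """
CREATE OR REPLACE TASK LOAD_CLAIMS_NIGHTLY
  WAREHOUSE = ETL_WH
  SCHEDULE = 'USING CRON 0 2 * * * UTC'
AS
  CALL LOAD_CLAIMS()
"""

def deploy():
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",  # placeholders
        warehouse="ETL_WH", database="DW", schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(DDL_PROCEDURE)
        cur.execute(DDL_TASK)
        cur.execute("ALTER TASK LOAD_CLAIMS_NIGHTLY RESUME")  # tasks start suspended
    finally:
        conn.close()

if __name__ == "__main__":
    deploy()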
Apr 2019 – Oct 2021 – Commonwealth of Pennsylvania (worked through Deloitte), Pennsylvania, USA
Text Messaging System – The Commonwealth of Pennsylvania employs thousands of people across the state to deliver programs and services to the citizens of Pennsylvania. This project involved building a central data warehouse by consolidating data from a variety of sources to support multiple use cases and downstream applications.
Responsibilities:
Involved in all phases of the SDLC, from requirement gathering, design, development and testing to production user training and support of the production environment.
Actively interacted with business users to capture user requirements and perform business analysis.
Outlined the complete process flow and documented the data conversion, integration and load mechanisms to verify specifications for the project.
Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
Wrote hundreds of SQL queries for data analysis to validate the input data against business rules.
Parsed high-level design specifications into simple ETL coding and mapping standards.
Created and maintained source-to-target mapping documents for the ETL development team.
Worked with PowerCenter Designer tools in developing mappings to extract and load the data from flat files and SQL server database.
Maintained warehouse metadata, naming standards and warehouse standards for future application development.
Created the design and technical specifications for the ETL process of the project.
Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load into staging tables from various sources.
Responsible for mapping and transforming existing feeds into the new data structures and standards using Router, Connected and Unconnected Lookup, Expression, Aggregator, Update Strategy and Stored Procedure transformations.
Worked with various complex mappings and designed slowly changing dimensions.
Performed performance tuning of the process at the mapping, session, source and target levels.
Created workflows containing Command, Email, Session, Decision and a wide variety of other tasks.
Tuned mappings based on criteria and created partitions in case of performance issues.
Performed data validation after successful end-to-end tests and implemented appropriate error handling in ETL processes.
Developed Parameter files for passing values to the mappings as per requirement.
Scheduled batches and sessions within Informatica using the Informatica scheduler and customized pre-written shell scripts for job scheduling.
Technologies Used – Informatica PowerCenter 10.4, Oracle 11g, Tableau 2020.3, Cognos 11, SQL, Oracle 19, Opkon etc.
Feb 2019 – Apr 2021 – Horizon BCBSNJ, Pennington, NJ
Claim Data Mart – Horizon Blue Cross Blue Shield of New Jersey provides health insurance coverage to more than 3.8 million people throughout North, Central and South Jersey. Horizon Blue Cross Blue Shield of New Jersey is best known for its managed care and traditional indemnity plans for individuals and employers, the cornerstone of its health care business. This project integrates data from multiple applications into an enterprise data warehouse.
The goal of the project was to build analytics for the Medicaid and Medicare government programs, providing the capability to identify past improper payments, prevent future improper payments and strengthen the existing anti-fraud strategies. Relevant data was integrated from various source systems, including QNXT, Leon and EZCAP, using Informatica ETL tools. Reports and interactive dashboards were developed that allowed the company to track payments for better reporting.
Responsibilities:
Worked with business users to gather requirements and was involved in the complete SDLC of this project.
Analyzed source data and gathered business requirements from the business users and the source data team.
Created the Technical Design Document from the Business Requirements Document (BRD).
Analyzed business and system requirements to identify system impacts.
Created the Detail Technical Design Documents which have ETL technical specifications for the given functionality.
Prepared source-to-target mappings and conducted meetings with the business to understand the data and transformation logic.
Analyzed the existing mapping logic to determine the reusability of the code.
Created Mapping Parameters, Session parameters, Mapping variables and Session variables.
Performed extensive performance tuning by identifying bottlenecks at targets, sources, mappings, sessions and the system, which led to better session performance.
Created Unit test plans and did unit testing using different scenarios separately for every process. Involved in System test, Regression test and supported the UAT for the client.
Performed ETL and database code migrations across environments using deployment groups.
Populated target tables by applying business rules in mappings.
Involved in end-to-end system testing, performance and regression testing and data validations.
Managed performance and tuning of SQL queries and fixed the slow running queries in production.
Created batch scripts for automated database build deployment.
Modified UNIX scripts to run Informatica workflows.
Technologies Used – Informatica PowerCenter 9.6.1, SQL Server 2012, Shell Scripts, ORACLE 11g, SQL, PL/SQL, UNIX, Toad, SQL Developer, Cognos 9, Autosys, JIRA
Mar 2017 – Feb 2019 – AT&T, Inc., Bedminster, NJ
Customer Data Mart – AT&T Inc. provides telecommunication, media and technology services worldwide. The company provides wireless and wireline telecom, video, broadband and Internet services, as well as satellite video entertainment services. This project falls under the order management system. AT&T sells its products through an e-commerce channel; this project integrates data from multiple applications, from the time an order is created to the time the goods and services are delivered and billed. Informatica is used to integrate data into the e-commerce order data mart. The reporting tools are Cognos and Tableau. The objective of this data mart is to enable improved decision-making, strategic plans, support and solutions that favorably impact costs, quality, outcomes and customer satisfaction through an information-driven environment that leverages integrated data assets for competitive advantage.
Responsibilities:
Coordinated with various business users, stakeholders and SMEs for functional expertise, design and business test scenario reviews, UAT participation, and validation of data from multiple sources.
Worked on Power Center Designer tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
Used Debugger within the Mapping Designer to test the data flow between source and target and to troubleshoot the invalid mappings.
Created Mappings using Mapping Designer to load the data from various sources, using different transformations like Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter transformations.
Used SQL to analyze source data, perform data analysis and validate the data.
Scheduled Informatica Jobs through Autosys scheduling tool.
Studied the existing system and conducted reviews to provide a unified view of the program.
Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
Involved in creating Informatica mappings, mapplets, worklets and workflows to populate the warehouse with data from different sources.
Responsible for facilitating load testing and benchmarking the developed product with the set performance standards.
Involved in testing the database using complex SQL scripts and handled the performance issues effectively.
Involved in Onsite & Offshore coordination to ensure the completeness of Deliverables.
Technologies Used – Informatica PowerCenter 8.6, Oracle 10g, SQL*Plus, SQL Server 2012, Teradata 14, Cognos, JIRA
Education
2006, Bachelor of Science – Kakatiya University, India