Resume


Data Analyst/ETL Lead

Location:
Apex, NC
Posted:
May 14, 2020



SRILAXMI NEMANI

Email: adc7l7@r.postjobfree.com

Mobile no: +1-919-***-****

LinkedIn: https://www.linkedin.com/in/srilaxmi-nemani-293089162/

Professional Summary

Innovative professional with 9+ years of experience in the IT industry and the skills to manage all aspects of IT Application Services for Banking, Insurance and Retail clients.

Proficient in Informatica PowerCenter, SQL, Oracle, Unix/Linux shell scripting.

Extensive experience in Data Warehouse applications using Informatica PowerCenter on Windows and UNIX platforms.

Expertise in working with relational databases such as Oracle 11g/10g and Teradata, and with distributed file systems such as Hadoop HDFS.

Strong experience in Extraction, Transformation and Loading (ETL) data from various sources such as Hadoop files, Oracle, Teradata and formats like flat-files, COBOL files, XML files using Informatica Power Center (Repository Manager, Designer, Workflow Manager and Workflow Monitor) and load into targets such as Oracle and Teradata.

Expertise in understanding source data and devising the business logic needed to create Business/Functional Requirements documents.

Experienced in documenting High Level Design, Low level Design, Unit test plan, Unit test case and post development documents.

Developed tactical and strategic plans to implement technology solutions and effectively manage client expectations.

Developed effective working relationships with client team to understand support requirements.

Excellent interpersonal and communication skills, technically competent and result-oriented with problem solving skills and ability to work independently and use sound judgment.

Ability to handle multiple tasks simultaneously.

Good interpersonal and team co-operation skills.

Proactive leader with refined acumen and exemplary people skills.

Facilitate a team approach to achieve organizational objectives, increase productivity and enhance employee morale.

Quick study, with an ability to easily grasp new ideas, concepts, methods and technologies and apply them to the application. Dedicated, innovative and self-motivated team player/builder.

Exceptional leadership, organizational, oral/written communication, interpersonal, analytical, and problem resolution skills. Thrive in both independent and collaborative work environments.

Avid learner and adaptable to any new technology.

Technical Skills

Tools : Informatica Power Center 10.2/9.6/9.1, Teradata SQL Assistant 13.0

Databases : Oracle 11g/10g, DB2, Teradata 13.0

File System : Big Data Hadoop

Languages : SQL, PL/SQL, Python, Hive, Spark, Unix/Linux shell scripting, JCL

Database Tools : HUE, SQL Developer, Advanced Query Tool

Operating System : Windows 7/8/10, OS 390, Unix, Linux

Other Tools : Quality Center, JIRA, Tivoli, Maestro

Education

Bachelor of Technology (Electronics and Communications Engineering) from Prasad V. Potluri Siddhartha Institute of Technology (JNTU Affiliated), India

Professional Experience

Client: CareFirst BlueCross BlueShield, Maryland Oct 2019 – Present

Role: Technical Business Analyst/ETL Lead

Description:

CareFirst is a non-profit healthcare organization, part of the BlueCross BlueShield network, that generates invoices for enrolled members who submit claims. As part of this project, data is sourced from flat files and Oracle marts and loaded into Oracle marts from which invoices are generated.

Responsibilities:

Involved in gathering and analyzing the business requirements provided by Data Science team.

Analyzed the source data in Oracle marts to create the Business/Functional Requirements Documents to extract the data.

Worked with Informatica PowerCenter and created Spark SQL queries to extract and load data as per the requirements.

Extracted data from Oracle marts and flat files and transformed it in accordance with the Business logic and loaded data into Oracle data marts using ETL tool.

Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.

Created Unix shell scripts to execute the Informatica Workflows.

Involved in Performance Tuning of mappings in Informatica.

Performed unit testing at various levels of the ETL and actively involved in team code reviews.

Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.

Environment and Tools: Informatica PowerCenter 10.2, Oracle 11g, SQL, SQL Developer, UNIX, Flat files.
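The shell-driven workflow execution described above can be sketched roughly as follows. This is illustrative only: the Integration Service, domain, folder and workflow names (IS_DEV, Domain_DEV, INVOICING, wf_LOAD_INVOICES) are invented placeholders, and the pmcmd flags assume a standard PowerCenter command-line installation. A dry-run switch lets the sketch be exercised without a live Integration Service.

```shell
#!/bin/sh
# Hypothetical launcher for an Informatica workflow via pmcmd.
# IS_DEV, Domain_DEV and INVOICING are assumed names, not project values.
run_workflow() {
  wf="$1"
  cmd="pmcmd startworkflow -sv IS_DEV -d Domain_DEV -f INVOICING -wait $wf"
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "$cmd"   # dry run: show the command instead of executing it
  else
    # -wait blocks until the workflow finishes, so the exit status
    # reflects the workflow result and the scheduler can react to it.
    $cmd || { echo "workflow $wf failed" >&2; return 1; }
  fi
}

DRY_RUN=1
run_workflow wf_LOAD_INVOICES
```

In a real deployment the credentials would come from a secured variable file rather than the command line, and a scheduler such as Maestro/Tivoli would invoke the wrapper.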

Client: MetLife, Cary NC Jun 2018 – Sep 2019

Role: Data Analyst/ETL Lead

Description:

The Project FIRST refers to the long-term Financial Information and Reporting Standardization Transformation program, which is aimed at providing an integrated, standardized financial reporting solution to produce both internal management reports and external regulatory filings. The FIRST program focused on establishing an environment and process to capture Policy Administrative System data from the applications supporting the MetLife block of business from different regions such as Japan, Korea, Poland and Chile.

Responsibilities:

Gathered requirements from the business owner and performed requirements analysis.

Understood mapping documents and non-functional requirements to create detailed design specifications.

Used Informatica PowerCenter Designer to analyze the source data and to extract and transform it from source systems, incorporating business rules using the different objects and functions the tool supports.

Created mappings, mapplets and User defined functions to transform the data according to the business rules.

Used various transformations like Source Qualifier, Joiner, Lookup, Normalizer, SQL, Router, Filter, Expression, Aggregator and Update Strategy.

Created and Configured Workflows and Sessions to load the data from Big Data source to Oracle staging tables.

Created shell scripts as per the project requirement to schedule workflows.

Used Maestro as a scheduler to run the UNIX scripts.

Constantly interacted with business users to discuss and clarify the requirements.

Assigned and monitored work allocated to the team.

Attended daily status meetings and ensured work was delivered per the agreed schedule.

Oversaw quality procedures for the project and ensured process adherence.

Reviewed deliverables and implementations and verified final results.

Prepared work-status and process-improvement reports.

Worked with dependent teams to make sure required data has been extracted and loaded and performed the Unit Testing and fixed the errors to meet the requirements.

Copied, exported and imported mappings, sessions, worklets and workflows from development to test repositories and promoted them to production.

Used workflow variables, session parameters and mapping parameters, and created parameter files to enable flexible workflow runs based on file-specific values.

Performed Unit testing and Integration testing. Resolved issues identified in QA, UAT and promoted the code to Production.

Prepared required documents for handling post development issues.

Environment and Tools: Informatica Power Center 9.6, Oracle 11g, Hive, SQL, SQL Developer, Unix, Linux, Flat files, Maestro, Advanced Query tool, Shell scripting.
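The parameter-file approach described above can be illustrated with a sketch. The folder, workflow, session and parameter names below are invented for illustration, but the section-header and prefix conventions follow PowerCenter's parameter-file format: mapping parameters use `$$`, while built-in session parameters such as source files and connections use `$`.

```
[FIRST_FOLDER.WF:wf_policy_load.ST:s_m_load_policy]
$$REGION_CODE=JPN
$$LOAD_DATE=2019-06-30
$InputFile_policy=/data/inbound/policy_JPN.dat
$DBConnection_stg=ORA_STG
```

Pointing different parameter files at the same workflow (for example, one per region) lets a single set of mappings serve the Japan, Korea, Poland and Chile feeds without code changes.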

Client: AVIVA, Norwich UK Aug 2014 – Sep 2015

Tata Consultancy Services, India

Role: Informatica Power Center Developer

Description:

AVIVA is the UK's largest multinational insurer and one of Europe's leading providers of life and general insurance. UPP (Underwriting, Pricing and Product) is an analytics database into which data from various sources is loaded for further analysis by the business, to decide the pricing of various insurance products in a competitive market. As part of this UPP re-engineering project, data that was previously loaded into UPP (an Oracle database) through a legacy mainframe application is migrated to an Informatica ETL process loading Teradata marts.

Responsibilities:

Followed Agile-Scrum methodology to design, develop, test and deliver the code.

Designed ETL flows to implement using Informatica Power Center as per the mapping sheets provided.

Involved in creating databases, views of required Teradata datamarts to represent the source tables & staging data marts.

Involved in ETL development, creating required mappings for the data flow using Informatica.

Used transformations like Filter, Aggregator, Joiner, Expression, Lookup, Router, Sorter and Union.

Involved in developing the mapplet, reusable transformations and User defined functions according to the data flow requirements.

Performed performance tuning by implementing mappings with Pushdown Optimization (PDO).

Used BTEQ to create and edit Teradata table/view definitions.

Used Fast Export to export data from Teradata tables to Flat files.

Used Fast Load to load data from Flat files to Teradata tables.

Teradata Parallel transporter connection was used for loading data into Teradata data marts.

Developed UNIX scripts to schedule the workflows.

Used mapping & workflow parameters to make the ETL process more flexible.

Validated the output according to the mapping sheet specifications.

Developed Mainframe jobs to schedule the UNIX scripts which run the Informatica workflows as per the business requirements.

Enhancements in existing mappings according to the business logic.

Attended daily scrum calls to post the required status update.

Testing the data against the existing logic and defect fixing by correcting the ETL logic.

Testing activities involve Unit Test Plan Preparation, Unit Testing, Sprint Testing, UAT and performance testing.

Environment and Tools: Informatica PowerCenter 9.1, Teradata SQL Assistant, SQL, JCL, Oracle 11g, Teradata 13.0, UNIX, SQL Developer
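As a sketch of the FastLoad usage described above, a control script along these lines loads a pipe-delimited flat file into an empty staging table. The TDPID, database, table, column and file names are placeholders, not project values; the statement structure (LOGON, BEGIN/END LOADING, ERRORFILES, SET RECORD, DEFINE, INSERT with `:field` references) is standard FastLoad.

```
LOGON tdprod/etl_user,password;
DATABASE upp_stg;
SET RECORD VARTEXT "|";
BEGIN LOADING stg_policy ERRORFILES stg_policy_e1, stg_policy_e2;
DEFINE policy_no (VARCHAR(12)),
       premium   (VARCHAR(18))
FILE = /data/inbound/policy.dat;
INSERT INTO stg_policy (policy_no, premium)
VALUES (:policy_no, :premium);
END LOADING;
LOGOFF;
```

FastLoad requires an empty target table, which suits truncate-and-reload staging patterns; incremental loads would typically go through the Teradata Parallel Transporter connection mentioned above instead.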

Client: AVIVA, Norwich UK June 2013 – Aug 2014

Tata Consultancy Services, India

Role: Informatica Power Center Developer

Description:

AVIVA is the UK's largest multinational insurer and one of Europe's leading providers of life and general insurance. HRP (High Risk Products) is a monthly process that loads data for 6 high-risk products, defined by the business, into Teradata datamarts. Data is received from the business in the form of Excel sheets. As part of this project, source files are transformed to XML using B2B Data Transformation and then FTP'd to UNIX servers, where they are extracted, transformed according to the business logic and loaded into Teradata marts for further analysis by the business.

Responsibilities:

Followed Agile-Scrum methodology to design, develop, test and deliver the code.

Designed ETL flows, mapping sheets by analyzing the XML data to implement the same in Informatica.

Involved in creating equivalent Teradata datamarts to load the data of each product.

Involved in ETL developing, creating required mappings for the data flow using Informatica.

Used transformations like XML source qualifier, Source qualifier, Filter, Aggregator, Joiner, Expression, Lookup, Router, Sorter and Union.

Involved in developing the mapplet, reusable transformations and User defined functions according to the data flow requirements.

Developed UNIX scripts to schedule the workflows.

Developed Mainframe jobs to schedule the UNIX scripts which run the Informatica workflows as per the business requirements.

Validated the output according to the specifications.

Attended daily scrum calls to post the required status update.

Testing activities involve Unit Test Plan Preparation, Unit Testing, Sprint Testing, UAT and performance testing.

Environment and Tools: Informatica PowerCenter 9.1, Teradata SQL Assistant, SQL, JCL, Teradata 13.0, Unix, JIRA
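Once the B2B-transformed XML files land on the UNIX servers, a small sanity check before kicking off the ETL is a common guard. The sketch below is illustrative only (the function name and the demo file are invented): it verifies that a landed file is non-empty and begins with an XML declaration before the workflow consumes it.

```shell
#!/bin/sh
# Hypothetical landing-zone check for FTP'd XML feeds.
check_landed_xml() {
  f="$1"
  # File must exist and be non-empty.
  [ -s "$f" ] || { echo "missing or empty: $f" >&2; return 1; }
  # First bytes must be an XML declaration.
  head -c 5 "$f" | grep -q '^<?xml' || { echo "not XML: $f" >&2; return 1; }
  echo "ok: $f"
}

# Demo on a throwaway file standing in for a transformed HRP feed.
tmp=$(mktemp)
printf '<?xml version="1.0"?><products/>' > "$tmp"
check_landed_xml "$tmp"
rm -f "$tmp"
```

A guard like this keeps malformed or truncated FTP transfers from reaching the XML source qualifier, where failures are harder to diagnose.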

Client: Aviva, Edinburgh UK April 2012 – May 2013

Tata Consultancy Services, India

Role: IBM Content Manager Developer/Support

Description:

AVIVA is the UK's largest insurer and one of Europe's leading providers of life and general insurance. Content Manager is an IBM tool used to archive, retrieve and perform many other user-required functions on individual policy documents, which are archived as digital images. As part of this project we maintain the application as an admin team, create new tables and enhance existing tables to store data as per the business requirements.

Responsibilities:

Responsible for the management of Development and Maintenance related projects directed towards strategic business and other organizational objectives.

Interacted with the client on new requirements.

Analyzed client specifications and designed business artifacts.

Involved in Development project with emphasis on Requirements gathering, Design specifications, estimations and supporting in all phases of SDLC.

Allocated work to team members and guided them.

Held net meetings and calls with users to resolve requests.

Handled incident management and on-call support, adhering to SLAs.

Coach, mentor and lead personnel within a technical team environment.

Oversaw quality procedures for the project and ensured process adherence.

Reviewed deliverables and implementations and verified final results.

Prepared work-status and process-improvement reports.

Testing activities involve Unit Test Plan Preparation, Unit Testing and UAT.

Environment and Tools: IBM Content Manager, File Aid, Endevor, TWS-OPC, JIRA, COBOL, JCL, DB2

Client: Scottish Widows, Edinburgh UK June 2009 – July 2011

Tata Consultancy Services India

Role: Mainframes Developer/Support

Description:

Scottish Widows is an insurance provider taken over by Lloyds TSB. The Policy Administration System (PAS) is one of Scottish Widows' systems for administering policies. PAS administers policies from the completion of new business until the beginning of the claims process for life insurance policies. Administration tasks carried out by PAS include increments, decrements, regular withdrawals, switches, illustrations, etc.

Responsibilities:

Enhancements to the existing functionality.

Handled support tasks, answering user queries and providing the required information to users in their requested format.

Analyzed issues encountered in the live environment, took responsibility for resolving them with the best solutions, whether code changes or database changes, and tracked them to closure.

Solving Batch incidents.

Analyzing and solving user & problem tickets within the described SLA.

Preparing monthly reports on tickets resolved and tracking the progress of all the team members.

Environment and Tools: File Aid, Easytrieve, CA-7 Scheduler, JCL, COBOL, IMS

Awards

Received “On the Spot” awards for resolving issues at project-critical times.

Appreciated by Client and project teams for providing simple solutions for complex business requirements.

Received the “Smiles” award from the Talent Acquisition group for actively participating in recruitment activities as a technical interviewer.


