RENUKA MULA

+1-651-***-**** adfxb5@r.postjobfree.com

Dallas, TX

Objective:

Dedicated, efficient and experienced ETL test lead with hands-on experience working with varied tools, changing methodologies (waterfall, agile) and teams across the world. Excels at delivering projects on time and working closely with the business on all testing activities: requirement gathering, defining strategies, writing test scripts and providing QA sign-off.

9+ years of experience with a focus on the financial domain, working with Tier 1 and Tier 2 banks in the USA and playing key roles in the successful implementation of critical projects such as Know Your Customer (KYC).

Core Skills:

ETL Testing:

Experienced working with varied source and target systems, including files (mainframe, SAS) and databases (SQL Developer, TOAD, Teradata). Implemented source-to-target testing using QuerySurge, TOAD Automation, Ab Initio GDE and DQE.

Developed Python scripts of moderate to high complexity for testing systems.

Experienced working with reports developed using BI tools like OBIEE.

Expertise in assessing data volumes and researching suitable tools based on project requirements, budget and available resources.

Expertise in data conditioning through front-end mainframe source systems and web applications, which reduces dependency on other teams and increases coverage of the scenarios required for data transformations as data flows across intermediary systems.

Used UNIX extensively for storing source files, reading files and creating new directories.

Hands-on experience reviewing mapping documents, writing test scenarios and test scripts, raising defects and retesting.

Experience in big data testing using Hive queries through Beeline, Spark commands and Hue. Data from source files is loaded into Hive systems, which requires intensive testing of Hive data against the source, including understanding Hive default values and matching them against source values.

Implemented Agile methodology using JIRA.

Project Lead and Coordinator:

More than 5 years of onsite experience coordinating with teams across different time zones, ensuring a smooth handoff every day by conducting daily status calls.

Open to ideas from team members that result in quicker, more efficient ways of testing large volumes of data.

Prepare a daily status report during the execution phase and share it with stakeholders, development and other testing teams, covering execution percentage, counts of passed/failed test scripts, roadblocks, and the status and description of defects.

Ab Initio Development:

1 year of experience developing graphs in Ab Initio GDE to validate rules for each field and generate a report with the source value, target value and status for each field.

Graphs developed in Ab Initio are used in the Data Quality Environment (DQE) for extensive testing of all data rules: validation of null values for primary key fields, counts of distinct values for each field, and percentages of failed and passed rows.

Presentation and Communication Skills:

Effectively communicate with developers through emails and defect descriptions about issues encountered during scenario discussions or test execution.

Present to clients and stakeholders on the team's automation and time-saving efforts and lay out a roadmap for future initiatives.

Acted as a bridge between client and vendor to meet expectations from both ends.

Report Testing:

Extensive experience in testing report migration from SSRS 2005 to SSRS 2008.

Conduct continuous review meetings to ensure the requirements developed meet the analytical needs of report users.

Skill Set:

ETL/BI: QuerySurge, Ab Initio GDE, DQE, OBIEE reporting, Hadoop, Hive

Databases: Oracle 11g, SQL Server 2005, TOAD, SQL Developer, Teradata

Automation: Quality Center (ALM), TOAD, TFS, JIRA

Web Services: SoapUI

Environments: UNIX, J2EE, XML, Web 2.0, AJAX, .NET, Python

Domain Knowledge: Finance, Insurance

PROFESSIONAL EXPERIENCE:

Apple, TX (Cognizant Technology Solutions), July 2019 – Present

Project details cannot be revealed.

Tools used: GitHub, Jenkins, Python, Teradata, Oracle

Citibank, TX, June 2018 – July 2019

Data Management Analytical Reporting (DMAR): ETL Test Lead

This agile project focuses on gathering data from different sources, transforming each source's data and loading it into target Hive systems. Data is stored as tables in lower environments (SIT) and as views in UAT; data from the views is used for reporting and analytics. Each sprint comprises 2 weeks of testing and covers the transformation of one source to the target.

Responsible for understanding Hive systems and how input from source systems (mainframe/SAS files, Teradata, Oracle) is converted when loaded into Hive on UNIX. Beeline commands are used to export table data that can then be used as input for automation tools, as sketched below.
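
For illustration, a Beeline export of the kind described above could be scripted as follows; this is a minimal sketch, and the JDBC URL, table name and output path are hypothetical:

    import subprocess

    # Export a Hive table as CSV through Beeline so the file can feed automation tools.
    # The JDBC URL, table and output file are placeholders, not the project's actual values.
    jdbc_url = "jdbc:hive2://hive-gateway:10000/sit_db"
    query = "SELECT * FROM customer_accounts"

    with open("customer_accounts.csv", "w") as out:
        subprocess.run(
            ["beeline", "-u", jdbc_url, "--outputformat=csv2", "--silent=true", "-e", query],
            stdout=out,
            check=True,
        )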

Transformation logic is tested by defining rules for source and target fields in DQE and running the appconfigs created using DQE.

Establishing connectivity between Hive tables and Ab Initio GDE/DQE enabled thorough validation of 20+ million records.

Developed Python scripts to validate NULL and DISTINCT counts for all columns in each table.
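
A minimal sketch of such a validation script, assuming a PyHive connection (the host, database and table names are illustrative):

    from pyhive import hive  # PyHive client, assumed available in the test environment

    conn = hive.connect(host="hive-gateway", port=10000, database="sit_db")
    cur = conn.cursor()

    table = "customer_accounts"  # illustrative table name
    cur.execute("DESCRIBE " + table)
    # DESCRIBE returns (col_name, data_type, comment); skip blank and partition-info rows.
    columns = [row[0] for row in cur.fetchall() if row[0] and not row[0].startswith("#")]

    for col in columns:
        cur.execute(
            "SELECT COUNT(*), COUNT(DISTINCT {c}), "
            "SUM(CASE WHEN {c} IS NULL THEN 1 ELSE 0 END) FROM {t}".format(c=col, t=table)
        )
        total, distinct, nulls = cur.fetchone()
        print("{}.{}: rows={} distinct={} nulls={}".format(table, col, total, distinct, nulls))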

Accurately articulated detailed data issues to upper management, along with the team's ability to handle those issues using automation.

Resolved issues that arose when testing data across Hive and SQL, since the Hive version used did not support some SQL functionality, such as UNION and time conversions using unix_time.

Responsible for testing CCID merge/split and the Master Customer Repository (MCR) hierarchy for customer accounts loaded into Hive systems.

Monitored file generation from file systems and ensured all generated files are loaded into target systems according to the Citi cycle calendar.

Developed Ab Initio graphs using Rollup, Normalize, Reformat and Sort components to test the full-file reconciliation process and loading.

The mainframe is used as the front-end application to create test data. Trained onsite and offshore teams in using mainframe systems to create, modify, remove and swap users on accounts, and to add balances to single and joint accounts.

Test cases are designed in ALM covering all user stories, and tasks are created in JIRA for every testing activity. Scrum calls are conducted every day to discuss issues, roadblocks and execution.

Highlighted risks, issues and dependencies in the daily status report sent during onsite hours and discussed them in the weekly meeting. Worked with the Data Governance team to update user stories in JIRA after every review meeting.

Worked extensively with the reporting team to understand report requirements, highlight data rejection reasons and define a mitigation plan for rejected records.

Environment: Hue, UNIX, JIRA, Ab Initio GDE, DQE, Application Lifecycle Management (ALM), SharePoint, MS SQL, SQL Server 2000, Windows Server, Mainframes, HTTP, web services, MS Excel, Teradata

US Bank, MN, April 2016 – May 2018

Customer View Analytics, ETM Financial Crimes and SAS: ETL Test Lead

This project is part of a Know Your Customer implementation to detect fraudulent data and suspicious activity within the bank. Data is cleansed, transformations are applied, and an MDM process forms clusters and survivorship records; the data is stored in core tables once the MDM process is complete. The project is agile, and a group of multiple data sources is handled in each sprint.

Responsible for requirements gathering, arranging daily stand-up meetings with management and development teams, and maintaining all documentation, such as source-to-target mappings, design specification documents and interactions with the SAS team.

Developed Python scripts to kick off jobs for source-to-target data loading.

Responsible for writing transformation logic in the source query to match the target query, e.g., incorporating address or phone standardization into the source query so it matches the cleansed data in the next stage.
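
For example, one common standardization rule reduces phone numbers to digits only; below is a sketch of how such a rule might be mirrored on the source side (the rule, table and column names are hypothetical, standing in for the project's mapping rules):

    # Source-side phone standardization expressed in the comparison query
    # (hypothetical schema and column names):
    SOURCE_QUERY = """
    SELECT cust_id, REGEXP_REPLACE(phone_raw, '[^0-9]', '') AS phone_std
    FROM src.customer_contact
    """

    def standardize_phone(raw):
        """Mirror the digits-only rule in Python for spot checks (assumes 10-digit US numbers)."""
        digits = "".join(ch for ch in raw if ch.isdigit())
        return digits[-10:]  # drop a leading country code if present

    assert standardize_phone("+1 (651) 555-0199") == "6515550199"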

Data in the load-ready region is modified based on transformation and standardization rules and is queried using Hive. All source-to-target comparison is performed using Hive and SQL.

Resolved issues that arose when testing data across Hive and SQL, since the Hive version used did not support some SQL functionality, such as UNION and time conversions using unix_time; one such workaround is sketched below.
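
As a sketch of one such workaround (a common pattern, not necessarily the exact one used here): older Hive versions accept UNION ALL but not UNION, so a distinct union can be emulated by wrapping UNION ALL in SELECT DISTINCT. Table and column names are hypothetical:

    # Emulate UNION (distinct) in Hive with UNION ALL plus SELECT DISTINCT.
    HIVE_UNION_WORKAROUND = """
    SELECT DISTINCT cust_id, acct_no FROM (
        SELECT cust_id, acct_no FROM src_accounts_a
        UNION ALL
        SELECT cust_id, acct_no FROM src_accounts_b
    ) u
    """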

Responsible for testing the MDM process from the MDM stage through to the core tables, which involves party information, matching, clustering and forming survivorship data using SAS.

Worked extensively on converting Oracle scripts into Teradata scripts.

Developed numerous Teradata SQL queries: creating SET and MULTISET tables, views and volatile tables; using inner and outer joins and date and string functions; and applying advanced techniques such as the RANK and ROW_NUMBER functions.
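
A sketch of this style of query using the teradatasql driver, where ROW_NUMBER keeps the latest record per customer; the host, credentials and table names are placeholders:

    import teradatasql  # Teradata's Python driver, assumed installed

    QUERY = """
    SELECT cust_id, acct_balance, load_dt
    FROM (
        SELECT cust_id, acct_balance, load_dt,
               ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY load_dt DESC) AS rn
        FROM stage_db.customer_daily
    ) t
    WHERE rn = 1
    """

    # Placeholder connection details; real values would come from the test environment.
    with teradatasql.connect(host="tdhost", user="qa_user", password="***") as con:
        with con.cursor() as cur:
            cur.execute(QUERY)
            for row in cur.fetchall():
                print(row)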

Hive tables are queried to verify the data present against the send maps document.

QuerySurge is the tool used to compare data from source to target; it provides the flexibility to compare data across different data sources, such as Hive tables and Oracle tables.

TOAD is used as a tool for data retention.

TOAD automation is used to run distinct, null and data-type test cases that can be reused across stages and across different sources of records.

SAS Studio is used to verify views, and SAS programming is used for querying.

Responsible for Actimize testing: verifying that data, risk scores and transaction details flow to the Simple Query Tool and Actimize. Integration testing is done to ensure the correct flow.

Responsible for working with the Data Steward team to change risk scores and threshold levels and to perform testing.

Environment: TOAD, QuerySurge, Application Lifecycle Management (ALM), SharePoint, MS SQL, SQL Server 2000, Windows Server, Mainframes, HTTP, web services, MS Excel, Teradata.

Wells Fargo, MN, March 2013 – August 2014 and March 2015 – March 2016

Nexsure, Salesforce, Data Integration and Maintenance, Report Migration (Canned to Custom); Senior ETL Tester and Data Analyst

This project deals with migrating data from an existing SSRS 2005 server to a 2008 server, involving source-to-target testing, application upgrades, data completeness testing, data accuracy testing and incremental ETL testing.

Responsible for leading upgrade activities, providing test timelines, arranging meetings with CET and development teams, and initiating coordination calls with offshore.

Expert in preparing the Master Test Plan (MTP), Requirements Traceability Matrix, QA sign-offs and daily execution reports, and distributing them across teams.

Expert in writing SQL queries that fetch data from different tables to verify the data in report columns, as illustrated below.
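
As a minimal illustration (the DSN, tables and columns are hypothetical), a query of this kind recomputes a report total from the base tables so it can be compared against the rendered report:

    import pyodbc  # assumes a SQL Server ODBC data source is configured

    # Recompute per-region order totals from the base tables; the results are then
    # checked against the corresponding column of the SSRS report.
    QUERY = """
    SELECT r.region_name, SUM(o.order_amount) AS expected_total
    FROM dbo.orders AS o
    JOIN dbo.regions AS r ON r.region_id = o.region_id
    GROUP BY r.region_name
    """

    conn = pyodbc.connect("DSN=reporting_qa")  # illustrative DSN
    for region, total in conn.cursor().execute(QUERY):
        print(region, total)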

Data cubes are validated for joins and primary and foreign key constraints.

Gathered data from the OLTP database and from non-OLTP systems such as text files and spreadsheets, and transformed it to match the data warehouse schema.

Involved from the initial stages of ETL testing: identifying data sources and requirements, acquiring data from different sources, preparing data mapping sheets, implementing business logic, populating data and reporting.

Verified that all data is transformed to the data warehouse format and that keys such as primary, foreign and composite keys are defined correctly.

Data cleaning is done to remove unwanted data, identify null values and resolve incompatible data issues.

Used TOAD and SQL*Plus to test the execution of the ETL processes' PL/SQL procedures and packages for business rules.

Tested reports (developed using SSRS) after the ETL process is performed for every cycle. Migration testing of reports from the SSRS 2005 server to the 2008 server is also part of the testing.

Expert in handling quarterly refreshes of lower environments with production data, maintaining the production environment, and backing up data in a mock production environment.

Worked extensively on verifying report data in Excel, HTML, XML and CSV formats.

TFS is used extensively to track daily activities and project status from the QA perspective.

Supported UAT by gathering test data, providing documentation of requirements introduced during testing, and ensuring 100% coverage.

Worked extensively with business/BTS/CET teams and developers to discuss QA scope, end dates and deferred defects.

Environment: QTP, OBIEE 11g, SQL, Informatica, DataStage, Application Lifecycle Management (ALM), SharePoint, MS SQL, SQL Server 2000, Windows Server, Mainframes, HTTP, web services, MS Excel, TFS, PHP, TOAD, UNIX shell scripts.

Crystal Cabinets, MN, October 2012 – March 2013

New Harmony; Quality Assurance Architect and Analyst

Involved in requirements gathering and prepared the User Requirements Document explaining the functionality and expectations for each screen.

Validated reports developed using OBIEE in BI Analytics.

As part of the QA team, was involved in implementing ETL best practices.

Created connection pools and physical tables, defined joins and implemented authorizations.

Worked closely with clients and developers on phased code deployments and provided the Master Test Plan.

Performed verification, validation and transformation of input data (text files, XML files) before loading it into the target database.

Enhanced test cases by unit testing the scripts before creating scenario-based tests in the Test Lab module of Quality Center.

Executed UNIX shell scripts to run various SQL statements to test the sanity and integrity of the tables.

Ab Initio is used to extract data from different sources; the data extracted and transformed by Ab Initio is loaded into Oracle tables.

Verified rule behavior under the imposed restrictions and implemented different rules for different functionalities.

Created user templates, security restrictions and customer records in the Salesforce application, which syncs data to users with access in the Harmony application.

Responsible for migrating Salesforce data to the new application.

Thoroughly compared the existing Harmony against the expectations for New Harmony, suggested implementations that were accepted by the client, and performed a complete regression to ensure no loss of data or functionality.

Delivered thorough QA testing reports that determined product quality and release readiness.

Mentored a group in planning quality assurance tasks and provided QA estimates using MS Excel and TFS.

Provided estimates, timelines and resource allocations to clients and obtained approvals. SharePoint was used to maintain documentation such as the BRD, enhancements, release notes and QA sign-offs.

MS Excel is used to write test cases, run test sets and track execution status and percentage.

Submitted test execution status to the client on a daily basis. Created various fields and drop-downs in Excel, which enhanced my proficiency with MS Excel.

All defects are logged on a day-to-day basis and monitored regularly.

Assisted clients in working out the new pattern implementation and suggested implementation plans.

Estimates were submitted in accordance with the requirements and their criticality, and execution was completed within the specified estimates with user acceptance.

Environment: Quality Center 10.0, Application Lifecycle Management (ALM), SharePoint, MS SQL, SQL Server 2000, Windows Server, Ab Initio, Mainframes, HTTP, web services (Java), MS Excel.

Deluxe Corporation, MN, May 2009 – May 2012

Orderpro Waive Reduction; Senior Software Quality Assurance Analyst

Supervised a five-member software QA testing team in developing and implementing quality assurance and quality control methodologies to ensure compliance with QA standards. Orderpro application data is stored in a back-end mainframe application called Oneplus; order processing and data verification are performed in the mainframe region.

Created and executed automated software test plans, cases and scripts to uncover, identify and document software problems and their causes. Led QA testing that:

o Assured that all newly introduced, complicated waive reductions, in combination with the subscription type chosen by privileged customers, were implemented correctly, helping Deluxe Corporation retain its customers

o Pinpointed a previously undiscovered flaw in the waive reductions offered to customers that had been causing losses to Deluxe Corporation

o Exhaustively analyzed each previously implemented waive reduction process and prepared reports on the basis of which new waive reduction processes were introduced

Transitioned software development efforts to a test-driven development (TDD) process, which brought QA testing in at the front end of the development cycle for gains in code quality, software functionality and programmer productivity.

Responsible for analyzing data in legacy systems and integrating it with Salesforce.

Salesforce login creation, and its access from legacy systems based on user rights, are extensively tested.

Used TOAD for SQL Server to write SQL queries for validating constraints and indexes, as sketched below.
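
For instance, such a check can read the SQL Server catalog views; this is a sketch, the DSN is a placeholder, and the sys.* views assume SQL Server 2005 or later:

    import pyodbc  # assumes a SQL Server ODBC data source is configured

    # List key constraints per table from the catalog views as a quick structural check.
    QUERY = """
    SELECT t.name AS table_name, kc.name AS constraint_name, kc.type_desc
    FROM sys.key_constraints AS kc
    JOIN sys.tables AS t ON t.object_id = kc.parent_object_id
    ORDER BY t.name
    """

    cur = pyodbc.connect("DSN=qa_sqlserver").cursor()
    for table, constraint, kind in cur.execute(QUERY):
        print(table, constraint, kind)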

Maintained, backed up and restored the Informatica ETL repositories for the QA team.

Tested data folders, sources, global targets, ODBC connections and data profiling, and managed metadata.

New accounts were created to group different users with different access levels in Salesforce.

The mainframe was used to test the back-end application.

MS Excel was used to write test cases, report status and maintain execution status.

Quality Center was used to report defects and to track their status.

Worked closely with the clients and development team in preparing requirements and helped them with analysis, reports and results.

Conducted formal and informal product design reviews throughout the software development lifecycle to provide input on functional requirements, product designs, schedules and potential issues. Leveraged a developer background to communicate effectively with the software design team, quickly gaining their respect and becoming a valued, "go-to" team member on challenging test cases.

Helped testing team members understand the existing scenarios and the expected new scenarios before execution started, which resulted in flawless execution and defect-free delivery for this highly prestigious and complicated Deluxe Corporation project.

Environment: LoadRunner 9.5, QTP 10.0, Quality Center 10.0, Application Lifecycle Management (ALM), SharePoint, MS SQL, SQL Server 2000, Windows Server, Mainframes, HTTP, web services (Java), MS Excel

Education:

Bachelor of Technology in Computer Science Engineering, with Distinction (85.4%, equivalent to an 8.83 GPA).

Achievements:

1) Best Performer award for 2017-18 at US Bank, from the client side.

2) Client managers offering opportunities to work with them for longer tenures, reflecting trust in my implementation techniques.

3) School and college topper with percentages of 88.2 and 97.2 respectively (equivalent to 3.8 and 4.0 GPAs).


