Data ETL

Location:
Hyderabad, Telangana, India
Posted:
April 03, 2020


Pind Archana

ETL Tester

adcl2g@r.postjobfree.com

Phone: 949-***-****

Summary

6+ years of extensive experience in the IT industry across all phases of software development, with emphasis on Quality Assurance as a SAP Tester/Data Analyst.

Involved in full life cycle development (Waterfall and Agile) of data warehouses on Windows and UNIX platforms for the investment banking, financial, and health care industries.

Expert knowledge in working with Data Warehousing tools (ETL tools) like Informatica Power Center 9.1/8.6.1/8.1.1/7.1/6.1, Power Mart, Power Exchange and Informatica MDM.

Work experience reviewing and testing data maps between various legacy systems and relational databases.

Worked with Business Analysts and functional team members to translate business requirements into ETL technical specifications.

Tested ETL applications in relational databases and flat files using Informatica and tested the reports generated.

Excellent experience in Data mining with querying and mining large datasets to discover transition patterns and examine financial data.

Extensive data cleansing and analysis, using pivot tables, formulas (v-lookup and others), data validation, conditional formatting, and graph and chart manipulation.

Extensively used Informatica Warehouse Designer to create and manipulate Source and Target definitions, Mappings, Mapplets, and Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, XML, Sorter, and Sequence Generator.

Proficient with Microsoft Excel (e.g., filtering data, sorting columns, hiding columns, pivot tables, and basic functions like COUNT, SUM, and AVERAGE).

Hands-on experience and strong understanding of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).

Expert user of Microsoft Office (Word, Excel, Access, Project, and PowerPoint) and Visio for preparing documents, presentations, tables, briefings, and worksheets; expertise in Microsoft Access programming and report development.

Experience in UNIX working environment, writing UNIX shell scripts for Informatica pre & post session operations.

Expert in different types of testing, including Black Box testing, Smoke testing, Functional testing, System Integration testing, End-to-End testing, Regression testing, and User Acceptance testing (UAT).

Good understanding of XML schema/XML manipulation and validation

Proficiency in Back-End Testing/Database Testing specifically in developing and executing SQL queries to interact with databases.

Expertise in use of HP Quality Center for test execution, defect management, defect tracking, traceability matrix and Bug Reporting

Knowledge of Agile project management tools like Version One

Good working knowledge of major operating systems; tested applications in Windows 98/NT/2000/XP, Linux, and UNIX environments.

Involved in Troubleshooting, resolving and escalating data related issues and validating data to improve data quality.

An effective communicator with strong analytical abilities combined with the skills to plan, implement, and present projects.

Excellent team player and self-starter with the ability to work independently; good analytical, problem-solving, and logical skills; manages business expectations with a delivery-focused approach.

Technical Skills:

ETL Tools: Informatica Power Center 9.0.1/8.6/8.1, Power Exchange, Informatica MDM, Informatica Data Quality (IDQ), DVO

Tools: DOORS, Rational ClearCase/ClearQuest/Rose/RequisitePro, ERwin 4.0, MS Visio, MS Excel, MS PowerPoint, MS Project

Reporting Tools: Cognos 8.1, Business Objects XI 1, MS Excel, Word, Visio, MS Project

Databases: Oracle 11g/10g/9i/8i, Teradata V6/V12, DB2, SQL Server 2005/2008, Netezza TF24

Big Data: Hive, Impala, HDFS, MapReduce, Sqoop

Data Modeling: Erwin 4.0/4.1, Visio 2007

Programming Skills: SQL, PL/SQL, Java, JavaScript, C, C++, Cucumber

Scripting Languages: UNIX shell/Korn shell scripting

Testing Tools: HP Quality Center, HP ALM 11/12, DOORS, Version One

MS Suite/Project Tools: MS Office (Word, Excel, PowerPoint, Outlook), MS Project, MS SharePoint

Operating Systems: UNIX Sun Solaris 2.x, AIX 6, Windows 95/98/NT/2000/XP

Utilities: SQL*Plus, Toad 10/7.4, Oracle Designer, Visio

Education:

Master’s in Computer Applications, JNTU, India

Bachelor’s Degree in Electronics and communications engineering, India

Quest Diagnostics, PA Jan 2018 – Present

ETL/Big Data Tester

Responsibilities:

Reviewed and analyzed functional requirements and mapping documents

Involved in the data analysis for source and target systems

Provided insight and analytics (especially graphing and exception reporting) on demand

Performed data cleansing and built an Excel user interface to assist in data collection and metrics.

Used Excel pivot tables to manipulate large amounts of data for analysis; the position involved extensive routine operational reporting, ad-hoc reporting, and data manipulation to produce routine metrics and dashboards for management.

Responsible for the development and preparation of a broad range of reports and complex analysis focused on program performance and project deliverables.

Worked on several palettes including HTTP, SOAP, WSDL, File, FTP, and General.

Used XPath for the data validation and XMLSPY for defining schemas and instances used in the business processes
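The XPath-based data validation described above can be sketched in Python using the standard library's limited XPath support; the document shape, field names, and business rule here are hypothetical illustrations, not the actual schemas tested.

```python
import xml.etree.ElementTree as ET

# Hypothetical claims document; element names are illustrative only.
doc = ET.fromstring("""
<claims>
  <claim id="C100"><amount>250.00</amount><status>PAID</status></claim>
  <claim id="C101"><amount>-10.00</amount><status>PAID</status></claim>
</claims>
""")

# XPath predicate selects every PAID claim; the rule checked is that
# a paid claim must have a non-negative amount.
bad = [c.get("id") for c in doc.findall(".//claim[status='PAID']")
       if float(c.findtext("amount")) < 0]

print(bad)  # → ['C101']
```

The same predicate query would run unchanged in XMLSPY's XPath evaluator; ElementTree is used here only so the sketch is self-contained.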

Worked on XML validations, Flat File validations

Developed reports based on user-entered data parameters along with the associated parameter input screens; the entered parameters are used to filter records from the data sources through SQL views defined for this purpose.

Analyzed data to determine whether there were outliers or data accuracy issues and worked with related departments to make corrections

Executed test cases to verify accuracy and completeness of ETL process

Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mapping

Wrote SQL scripts to test the mappings and developed a Traceability Matrix of business requirements mapped to test scripts, ensuring any change control in requirements leads to a test case update.

Used Informatica Analyst tool to create and validate the test data

Involved in validation of business rules for large datasets.

Created complex Excel formulas to validate fixed-width COBOL source files against the data loaded into the database.
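The fixed-width file validation described above can be sketched in Python rather than Excel; the record layout (field names, column positions, implied decimals) is entirely hypothetical.

```python
# Hypothetical layout: member_id in cols 0-5, amount in cols 6-13
# as COBOL-style digits with two implied decimal places.
LAYOUT = [("member_id", 0, 6), ("amount", 6, 14)]

def parse_record(line):
    """Slice one fixed-width record into a dict, converting the
    implied-decimal amount field to a float."""
    rec = {name: line[start:end].strip() for name, start, end in LAYOUT}
    rec["amount"] = int(rec["amount"]) / 100.0
    return rec

# One source record vs. the row supposedly loaded into the database.
source_line = "M00001" + "00012550"
loaded_row = {"member_id": "M00001", "amount": 125.50}

parsed = parse_record(source_line)
mismatches = {k: (parsed[k], loaded_row[k])
              for k in loaded_row if parsed[k] != loaded_row[k]}
print(mismatches)  # an empty dict means source and target agree
```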

Validated data in the Hive warehouse using Hive Query Language (HQL) queries.

Ran the scripts on the edge node server to land the files into HDFS and ran the loop wrapper scripts to validate and populate the data into Hive/Impala.

Involved in writing Hive Queries for analyzing data in Hive warehouse using Hive Query Language (HQL).

Involved in Map Reduce programming model for analyzing the data stored in HDFS.

Involved in validation of Map Reduce codes as per business requirements.
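The reconciliation implied by the bullets above, comparing the landed data against the Hive/Impala table by row count plus an aggregate checksum, can be sketched generically. SQLite stands in for Hive here so the example is runnable; the table and column names are hypothetical, and the same query shape is valid HQL.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_stage (id INTEGER, amt REAL);  -- stand-in for landed HDFS data
CREATE TABLE tgt_hive  (id INTEGER, amt REAL);  -- stand-in for the Hive table
INSERT INTO src_stage VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt_hive  VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def profile(table):
    # Row count plus a simple SUM checksum over the measure column.
    return con.execute(f"SELECT COUNT(*), SUM(amt) FROM {table}").fetchone()

src, tgt = profile("src_stage"), profile("tgt_hive")
print("match" if src == tgt else f"MISMATCH: {src} vs {tgt}")
```

A real run would add per-partition counts and more robust checksums, but the count-plus-aggregate pattern is the core of the check.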

Involved in user interface testing and web services testing using SOAPscope.

Built complex SQL queries as per the STM.

Developed ETL mappings to test some complex ETL logic which involves multiple joins to tables from other environments/layers (DA, EDM and TDS).

Fetched data from different sources like mainframes and Teradata and queried using Teradata SQL.

Performed Verification, Validation, and Transformations on the Input data (Text files, XML files) before loading into target database.

Experience in creating UNIX scripts for file transfer and file manipulation.

Communicated with business customers to discuss the issues and requirements.

Created, optimized, reviewed, and executed SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables
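A common shape for the source-to-target validation queries described above is the "minus" query: apply the mapping rule to the source, subtract the target, and expect an empty result. A sketch with SQLite (the real sources were Oracle/Teradata; the tables, columns, and upper-casing rule are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src (cust_id INTEGER, name TEXT);
CREATE TABLE tgt (cust_id INTEGER, name TEXT);
INSERT INTO src VALUES (1, 'ann'), (2, 'bob');
-- Mapping rule under test: target name should be the upper-cased source name.
INSERT INTO tgt VALUES (1, 'ANN'), (2, 'Bob');   -- row 2 was loaded wrong
""")

# Transformed source EXCEPT target: any surviving row is a defect candidate.
leaked = con.execute("""
    SELECT cust_id, UPPER(name) FROM src
    EXCEPT
    SELECT cust_id, name FROM tgt
""").fetchall()
print(leaked)  # → [(2, 'BOB')]
```

Oracle spells the set operator `MINUS` rather than `EXCEPT`, but the technique is identical.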

Participated in User Acceptance Testing (UAT): wrote and executed UAT test cases, documented and resolved defects, and obtained sign-off.

Built test cases and test scripts to test the Purge process, restartability and file load order.

Developed ETL code to schedule the test scripts (UNIX/SQL) for daily load testing.

Updated the test status in ALM and generated open/closed defect reports daily.

Environment: Informatica Power Center 9.X/8.X, Informatica Analyst, Informatica MDM, AbInitio, Oracle 11g/10g, SQL Developer, SQL Data Modeler, Teradata, Hive, Impala, DVO, PL/SQL, Windows 7, UNIX, Agile Methodologies, HP QC ALM, SharePoint, Cognos, SSRS, Postman, SoapUI.

Evolent Health, Arlington VA Sep 2015 – Dec 2017

ETL/Big Data Tester

Responsibilities:

Analyzed the business and wrote Business Rules Document.

Implemented and followed a Scrum Agile development methodology within the cross functional team and acted as a liaison between the business user group and the technical team.

Gathered requirements and created Use Cases, Use Case Diagrams, Activity Diagrams using MS Visio.

Created management reports in MS Excel that became company standard for recording financial information.

Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirements.

Assisted with HBX enrollment and claims processes; details captured in the HBX status report

Tested various tools such as Facets and the ICD converter; involved in testing Facets application modules including Provider, Enrollment, Membership, ICD-10, and Claims.

Checked the HIPAA compliance of the manually created 27X transactions using the Edifecs Analyzer tool

Tested master-detail, summary, ad-hoc, and on-demand reports using Cognos Report Studio.

Expert in the general FACETS application; worked on FACETS 5.01 for the claims, pricing, provider, and member modules.

Worked on several palettes including HTTP, SOAP, WSDL, File, FTP, and General.

Used XPath for the data validation and XMLSPY for defining schemas and instances used in the business processes

Worked on XML validations, Flat File validations.

Validated data in the Hive warehouse using Hive Query Language (HQL) queries.

Ran the scripts on the edge node server to land the files into HDFS and ran the loop wrapper scripts to validate and populate the data into Hive/Impala.

Involved in writing Hive Queries for analyzing data in Hive warehouse using Hive Query Language (HQL).

Involved in Map Reduce programming model for analyzing the data stored in HDFS.

Involved in validation of Map Reduce codes as per business requirements.

Facilitated (JAD) Joint Application Development sessions to identify business rules and requirements and documented them in a format that can be reviewed and understood by both business people and technical people.

Very strong business knowledge of Claims, Membership, Enrollment, Eligibility, Benefits, General Provisions, Cost Sharing, Services, Authorizations, Billing, Revenue, Pharmacy, Vision, Dental, Medical, etc.

Experienced in manually creating multiple XML files for 5010 & ICD-10 837 and 835 transactions using XMLSPY.

Tested total life cycle modules of Health Care Solutions (Enrolling the Providers, Approve Providers in Process Manager, Member enrollment, claims submissions, Flexi Financial Services).

Validated provider certifications according to HIPAA standards.

Worked with EDI transactions (270, 271, 276, 277, 837, 835, 997) and interface testing.

Created provider authorizations, submitted claims (Professional, Institutional, and Dental), and validated claim adjudication in QNXT.

Created Claims Flexi Financial payment cycles and validated the Claims payments in the SQL Server Database.

Used MS Visio for business flow diagrams and defined the workflows.

Performed data analysis for the existing data warehouse and changed the internal schema for performance.

Participated in Designing and development of the application.

Used user stories approach for describing the requirements using the user story template.

Wrote technical papers, gathered technical requirements, and compiled them to help the design system.

Created Use Case specifications, business flow diagrams and sequence diagrams to facilitate the developers and other stakeholders to understand the business process according to their perspective with possible alternate scenarios.

Created visually impactful dashboards in Excel for data reporting by using pivot tables and VLOOKUP. Extracted, interpreted and analyzed data to identify key metrics and transform raw data into meaningful, actionable information.
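The pivot-table style summarization described above can be sketched in pure Python; the dimensions (region, product) and the revenue metric are made up for illustration.

```python
from collections import defaultdict

# Raw rows as they might come out of an extract: (region, product, revenue).
rows = [
    ("East", "A", 100), ("East", "B", 50),
    ("West", "A", 70),  ("East", "A", 30),
]

# Pivot: regions down the side, products across the top, SUM(revenue) in cells,
# which is what the Excel pivot with a VLOOKUP-fed source range produces.
pivot = defaultdict(lambda: defaultdict(int))
for region, product, revenue in rows:
    pivot[region][product] += revenue

for region in sorted(pivot):
    print(region, dict(pivot[region]))
# East {'A': 130, 'B': 50}
# West {'A': 70}
```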

Interacted with the software development team and executive team to liaison the business requirements to ensure that the application under development confines to the business requirement.

Performed data mapping from source to target.

Extensively used SQL for accessing and manipulating database systems

Participated in Design walk-through with SMEs to baseline the business architecture.

Facilitated the overall management of the project, including risk mitigation, status reports, client presentations, defining milestones and deliverables, and establishing critical success factors.

Was jointly responsible for monitoring the progress of the development and QA team.

Worked closely with QA team and developers to clarify/understand functionality, resolve issues and provided feedback to nail down the bugs.

Created Use Case Diagrams using UML to define the Functional requirements of the application.

Performed User Acceptance Testing (UAT), unit testing, and documentation.

Maintained a close and strong working relationship with team mates and management staff to achieve expected results for the project team.

Organized meetings with the development teams to elicit use cases and document them.

Wrote manual Test Cases for checking the application.

Used Agile methodology for repeated testing.

Involved in Manual Testing by checking of various validations.

Created, maintained, and executed manual test scripts.

Developed Functional Specifications and Testing Requirements using Test Director to conform to user needs.

Environment: Informatica Power Center 7.1/8.6 (Power Center Designer, workflow manager, workflow monitor), UNIX, Oracle8i, DVO, TOAD, Teradata, Hive, Impala, Sqoop, HDFS, IBM DOORS, HP QC ALM, Rapid SQL, Version One, SharePoint, Cognos, SSRS, SAS.

Volkswagen Group of America, Auburn Hills, MI Sep 2013 – Aug 2015

ETL QA Tester

Responsibilities:

Responsible for Business analysis and requirements gathering.

Responsible for coordinating testing across both systems, ODS and EDW, which were handled by two different teams.

Worked as a Data Quality and MDM Analyst; involved in the development, testing, and implementation of various projects.

Tested several complex reports generated by Cognos including Dashboard, Summary Reports, Master Detailed, Drill Down and Score Cards

Promoted Unix/Informatica application releases from development to QA and to UAT environments

Proficient in quality assurance testing, both manually and using automation tools (QTP, ALM/Quality Center)

Involved in backend testing for the front end of the application using SQL queries against a Teradata database.

Used Agile Test Methods to provide rapid feedback to the developers significantly helping them uncover important risks.

Performed data analysis and data profiling using SQL and Informatica Data Explorer on various source systems, including Oracle and Teradata

Used Control-M for job scheduling.

Experienced working with the Customer Information Data Mart for business reporting through MDM and reference data

Experience leading the offsite project, managing a team of 8 offshore and 4 onsite consultants.

Responsible for leading the team and coordinating with the offshore team.

Developed data quality test plans and manually executed ETL and BI test cases.

Worked in AGILE Methodology.

Designed and kept track of Requirement Traceability Matrix

Updated Quality Center, loaded test cases, wrote the Test Plan, executed test cases, and printed status reports for team meetings.

Created and Developed Reports and Graphs using ALM

Involved in integration and functional system testing for the entire data warehousing application.

Used Quality Center for defect tracking.

Wrote Teradata SQL queries and created tables and views following Teradata best practices.

Performed data quality analysis using advanced SQL skills.

Tested slides for data flow and process flows using PowerPoint and Microsoft Visio

Strong ability to develop advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted

Extensively used Informatica power center for extraction, transformation and loading process.

Tuned ETL jobs/procedures/scripts, SQL queries, PL/SQL procedures to improve the system performance.

Worked as Onsite Coordinator for getting the work done from offshore team.

Tested and maintained daily, monthly, and yearly reports.

Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetic issues

Involved in writing detailed-level test documentation for report and Universe testing.

Involved in data warehouse testing by checking ETL procedures/mappings

Implemented and maintained tools required to support data warehouse testing.

Performed the tests in the FIT, QA, and contingency/backup environments

Performed all aspects of verification and validation, including functional, structural, regression, load, and system testing

Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.

Worked on test data and completed unit testing to verify that all business rules and requirements were met; also tested with negative data to confirm the job fails on any critical error.

Tested several data migration applications for security, data protection, and data corruption during transfer

Helped the testing team create test cases to make sure data originating from the source makes it into the target in the right format.

Functioned as the Onsite / Offshore coordinator and Team Lead

Tested Cognos reports and wrote test cases using HP Quality Center.

Wrote SQL and PL/SQL scripts to perform database testing and to verify data integrity.
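One concrete form of the data-integrity SQL described above is an orphan check: child rows whose foreign key points at no parent. Sketched with SQLite so it is self-contained (the actual work used Oracle and PL/SQL; the dealer/sale tables are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dealer (dealer_id INTEGER PRIMARY KEY);
CREATE TABLE sale   (sale_id INTEGER, dealer_id INTEGER);
INSERT INTO dealer VALUES (10), (20);
INSERT INTO sale VALUES (1, 10), (2, 20), (3, 99);  -- 99 has no parent dealer
""")

# Anti-join: sales pointing at a dealer that does not exist.
orphans = con.execute("""
    SELECT s.sale_id FROM sale s
    LEFT JOIN dealer d ON d.dealer_id = s.dealer_id
    WHERE d.dealer_id IS NULL
""").fetchall()
print(orphans)  # each row returned is a referential-integrity defect
```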

Wrote several complex SQL queries for validating Cognos reports.

Created different user defined functions for applying appropriate business rules

Environment: Informatica 9.1, Teradata, Quality Center & ALM 11.0, SQL, PL/SQL, Cognos 8.0 Series, Mainframe Flat Files, Agile, COBOL II, IBM InfoSphere MDM, UNIX, Control-M, Korn Shell Scripting, Oracle 10g, DB2, TOAD 9.7


