POONAM PREET KOUR
*******@*****.***
[Visa: EAD (GC in a few months)]
SUMMARY:
Highly skilled professional with 7+ years of experience in software design, development, analysis, testing, data warehousing, and Business Intelligence tools.
Experience in EDW with Master Data Management (MDM), data mining, data flow analysis, data quality, data validation, star and snowflake schemas, fact and dimension tables, writing test cases and executing them in QC, and logging and tracking defects.
Experience with Slowly Changing Dimension (SCD) methodologies: Type 1, Type 2, and Type 3.
Solid experience with database migration and data cleansing.
Analytical, methodical, and resourceful approach to problem solving: identifying and documenting root causes and corrective actions to meet short- and long-term business and system requirements.
Experience in creating source-to-target mapping documents for data integrity.
Experience working in both Waterfall and Agile processes; participated in iteration planning, retrospectives, demos, and refinement meetings.
Experience in solution assessment and validation, and in black-box testing techniques such as integration testing, system testing, smoke testing, functional testing, regression testing, and user acceptance testing.
Experience in creating test plans, test strategies, and test scenarios, in addition to functional testing.
Experience in estimating QA efforts at various phases of testing
Experience in ETL testing and end-to-end testing, from source to target (ODS staging to EDW).
Environment:
ETL Tools: Informatica, DataStage, Ab Initio
BI Tools: Cognos 8 BI Suite, BO XI/6.5, OBIEE
Databases: MS SQL Server 2005/2008, Oracle 8i/9i /10g/11g, Teradata, IBM DB2, Sybase
Client Side Skills: SQL, T-SQL, PL/SQL, UNIX shell scripting, Java, HTML, XML, SSRS, API
OS: Windows 2000/NT, UNIX/Solaris, Red Hat Linux, AIX
Tools: Toad, Erwin, MS Visio, Microsoft Office, Microsoft SharePoint, IDQ
PROFESSIONAL EXPERIENCE:
Southwest Airlines, Dallas, TX May’16- Present
Position – Sr. ETL Tester
Description – This is a data conversion/migration project from MXI to the Southwest Airlines database. I am working as a Sr. ETL Tester/Business Analyst.
The project is specific to this conversion and focuses on aircraft maintenance. MXI is a vendor that provides the MOSAIC flight maintenance tool. All maintenance and flight-part data arrives in flat files from external vendors and is loaded into the Southwest Airlines database. I am responsible for validating the usage and location features.
Job Responsibilities –
Responsible for communicating with the business product owner and understanding business requirements.
Creating test queries to validate table structure, data, and constraints in staging and operational tables; checking for duplicates of an identifier; comparing record counts between source and target tables; and verifying data types and sizes against the source-to-target mapping (STM).
Working in the HP ALM Quality Center tool to run test cases and compare expected and actual results, raising defects for any failed test cases.
Documenting test plans and test strategies for end-to-end testing, and test scenarios based on business requirements.
Writing SQL scripts for manual testing.
Involved in data validation, data scrubbing, data cleansing for data coming from sources during data migration/conversion.
Documenting test results with supporting screenshots.
Testing the front-end data against the staging and operational tables.
Participating in defect triage meetings and working closely with developers on defects. Experienced in the defect management cycle.
Participating in daily stand-ups/scrum meetings to discuss daily tasks and status.
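As an illustration of the source-to-target validation queries described above, the sketch below runs count and duplicate-identifier checks; the staging/target tables, columns, and data are invented for the example, and sqlite3 stands in for the actual Oracle database.

```python
# Hypothetical sketch of source-to-target ETL checks: record counts on both
# sides and duplicates of an identifier in the staging table.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_usage (usage_id INTEGER, hours REAL);
    CREATE TABLE edw_usage (usage_id INTEGER, hours REAL);
    INSERT INTO stg_usage VALUES (1, 10.5), (2, 7.0), (2, 7.0), (3, 4.2);
    INSERT INTO edw_usage VALUES (1, 10.5), (2, 7.0), (3, 4.2);
""")

# 1. Record counts on source (staging) vs. target (EDW).
src_count = cur.execute("SELECT COUNT(*) FROM stg_usage").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM edw_usage").fetchone()[0]

# 2. Duplicates of the identifier in the source.
dupes = cur.execute("""
    SELECT usage_id, COUNT(*) FROM stg_usage
    GROUP BY usage_id HAVING COUNT(*) > 1
""").fetchall()

print(src_count, tgt_count, dupes)  # → 4 3 [(2, 2)]
```

A count mismatch plus a duplicate hit like this would be logged as a defect against the load.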
Environment – Oracle, SharePoint, HP ALM Quality Center, PL/SQL Developer, JIRA
Frontier Communications, Richardson, TX Feb’16-Apr’16
Position – Sr. Data warehouse Tester/Sr Data Quality Analyst
Description – This is a data conversion/migration project from Verizon to the Frontier database. I am working as a Sr. Data Quality/QA Analyst.
Frontier is migrating and converting relevant Customer Care and Billing (CCB) data from Verizon’s billing, customer care, trouble ticketing, service order, and inside plant application systems to the corresponding support systems within the Frontier application environment for residential, business, and wholesale customers. Frontier is also converting network Operational Support Systems (OSS) data from Verizon to corresponding Frontier systems, as well as Human Resources, Compensation, and other corporate systems.
Job Responsibilities-
Writing functional test cases based on technical requirement (TR) documents.
Working in the HP ALM Quality Center tool to run test cases and compare expected and actual results, raising defects for any failed test cases.
Performing different data quality checks (null, length, validity, alphanumeric) on all fields and columns, and creating a dashboard showing profiling statistics of defects per field. Verifying data integrity, i.e., data validation.
Researching bad data and categorizing it as a source issue, ETL issue, or historical issue, then working toward resolution with the business contact (source issues) or the development team (ETL issues and failed data loads).
Involved in unit, system, and regression testing, which included table testing (table structure, source and target counts, duplicates), metrics testing (e.g., orders received on time, cancelled, disconnected, preorders, exclusions), and report testing (comparing results from Cognos reports against queries in Oracle and Teradata).
Involved in end to end testing and user acceptance testing.
Analyzing the files that land from sources into their respective folders in the UNIX environment; SQL*Loader then maps the file data to columns, and the files move to the ODS (staging area).
Responsible for QA sign-offs for projects involving enrollments, billing, migrations, and several other aspects of the customer life cycle.
Involved in data validation, data scrubbing, data cleansing for data coming from sources during data migration/conversion.
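The field-level quality checks listed above (null, length, alphanumeric) can be sketched roughly as follows; the records, column names, and thresholds are invented for illustration.

```python
# Minimal sketch of per-field data quality profiling with a defect percentage,
# the kind of statistic surfaced on a profiling dashboard.
records = [
    {"account_id": "AB1234", "phone": "9725550101"},
    {"account_id": None,     "phone": "972555010"},
    {"account_id": "CD56!8", "phone": "9725550103"},
]

def profile(rows, field, max_len=10, alnum=True):
    """Return (defect count, defect percentage) for one field."""
    defects = 0
    for r in rows:
        v = r[field]
        if v is None:                      # null check
            defects += 1
        elif len(v) > max_len:             # length check
            defects += 1
        elif alnum and not v.isalnum():    # alphanumeric check
            defects += 1
    return defects, round(100.0 * defects / len(rows), 1)

print(profile(records, "account_id"))  # → (2, 66.7)
```

Running the same profile over every field yields the per-column defect statistics used to categorize bad data before triage.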
Environment – Oracle, SharePoint, HP ALM Quality Center, Teradata SQL Assistant
GM Financial, Arlington, TX Oct’15-Feb’16
Position –Sr. QA Analyst
Description – This project primarily focuses on building the interface between GM and GMF, where GMF sends an invoice or vehicle identification number (VIN) request and retrieves the corresponding invoice from GM. The retrieved GM invoice data is then placed in the GMF data warehouse. The project scope includes securing a data-sharing agreement between GM and GMF; building a common interface between GM and GMF to send/receive VIS data (GM development and system components are already in place); building an “interactive” process that gives a GMF team member the ability to retrieve invoice information from the VIS service; and building a “batch” process where invoice requests can be processed systematically with minimal manual intervention. In the inbound flow, once an invoice number or VIN is entered on the VIS home input page, an XML file is generated and sent to the middleware and then to the ODS. In the outbound ETL, GMF receives invoice details from GM; an XML file is generated for each invoice number or VIN, which flows to the GM server, then to the middleware, and finally to the ODS (VIS tables).
Job Responsibilities –
Responsible for communicating with the business product owner and gathering and understanding business requirements.
Communicating business requirements to the team and making sure they are well understood by everyone, including developers.
I am responsible for front-end (UI) and back-end (ETL) testing, which involves data integrity checks.
Involved in creating test plans, test scenarios, test strategies.
Involved in functional, smoke, and regression testing, and making sure the database and front-end implementations conform to the business requirements documents.
Documenting test plans and test strategies for end-to-end testing, writing test scenarios based on business requirements, and creating a traceability matrix.
Writing SQL scripts for manual testing.
Writing functional test cases based on the business requirements documents and the use cases in the technical requirement document.
Executing test cases, logging defects in Rally and HP QC, and tracking them.
Participating in defect triage meetings and working closely with developers on defects. Experienced in the defect management cycle.
For front-end (web services) testing, exercising the UI by entering valid VINs and invoice numbers.
Performing data integrity checks on invoice details and making sure integrity is maintained from the UI to the generated XML, and from the XML to the table.
Experience in data validation of invoice details for all invoice numbers and VINs.
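The XML-to-table integrity check described above can be sketched as follows; the XML layout, table, and values are invented for the example, and sqlite3 stands in for the actual ODS database.

```python
# Hypothetical sketch of a UI -> XML -> table integrity check: parse the
# generated invoice XML and verify every field matches the loaded table row.
import sqlite3
import xml.etree.ElementTree as ET

xml_doc = "<invoice><vin>1G1FP22P9S2100001</vin><amount>18500.00</amount></invoice>"
root = ET.fromstring(xml_doc)
xml_vin = root.findtext("vin")
xml_amount = float(root.findtext("amount"))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vis_invoice (vin TEXT, amount REAL)")
conn.execute("INSERT INTO vis_invoice VALUES (?, ?)", (xml_vin, xml_amount))

db_vin, db_amount = conn.execute(
    "SELECT vin, amount FROM vis_invoice WHERE vin = ?", (xml_vin,)
).fetchone()

# The integrity check: each field in the XML must match the loaded row.
assert db_vin == xml_vin and db_amount == xml_amount
print("integrity check passed")
```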
Environment – Oracle, SharePoint, HP ALM Quality Center, XML, SOAP UI, Ab Initio
Pier1 Imports, Fort Worth, TX Jan’15-Sep’15
Position- Sr. ETL QA Tester/Sr. Business Analyst
Description – Pier 1 follows the Agile approach, which involves daily stand-ups, creating user stories and tasks in Rally, refining stories, and iteration planning. The architecture consists of transactional data coming from different sources such as Promo Master (promotions, coupons), Customer Master, Responsys (emails), Adobe Campaign, and ADS (reward cards and credit cards). This transactional data goes to EDStaging in SQL Server. From EDStaging, data flows to EDHub, which holds data such as location where the customer shopped (online or in store), product (SKU number), fiscal calendar, promotions (items on promotion and customers who shopped during promotions), and campaign data. From EDStaging, data also goes to the EDW, which holds campaign execution (direct mail, email), customer loyalty (competitor research, loyalty accounts), sales, orders, and customer segmentation. Data from EDHub and the EDW then flows back to vendors and systems such as Epiphany, Responsys, ADS, ChannelView, Customer Master, Customer Foundation, and RFM. Finally, the data flows to the campaign data mart and marketing data mart, which the business uses for reporting.
The project is centered on Master Data Management (MDM).
I own customer sales matching and customer segmentation. Customer data from the Epiphany source comes in through the Trillium application and is cleansed. For example, one customer may shop in different stores and online (ECOM) with different addresses but the same phone number; Trillium applies business rules to identify this as one customer, so one golden record is captured per customer and assigned a Global Customer Identifier. These golden records are matched to their transaction IDs based on different match types (EpicorID, AssociateID, reward card, credit card, reactivation coupon, COT number). This in turn identifies the segment each customer falls into: the business defined segments 0-9, calculated from Recency (most recent transaction), Frequency (number of transactions), and Monetary (spending amount). In total, marketing identified 96 attributes.
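The Recency/Frequency/Monetary calculation above can be sketched roughly as follows; the transaction data and reference date are invented for illustration, and the actual segment mapping (0-9) is not shown.

```python
# Sketch of per-customer RFM scoring over golden-record transactions:
# recency = days since last purchase, frequency = transaction count,
# monetary = total spend.
from datetime import date

transactions = [  # (customer_id, transaction_date, amount) -- hypothetical
    ("C1", date(2015, 8, 1), 120.0),
    ("C1", date(2015, 8, 20), 60.0),
    ("C2", date(2015, 2, 10), 15.0),
]

def rfm(customer_id, txns, as_of=date(2015, 9, 1)):
    mine = [t for t in txns if t[0] == customer_id]
    recency = min((as_of - t[1]).days for t in mine)  # days since last purchase
    frequency = len(mine)                             # number of transactions
    monetary = sum(t[2] for t in mine)                # total spend
    return recency, frequency, monetary

print(rfm("C1", transactions))  # → (12, 2, 180.0)
```

Each customer's (R, F, M) triple is then bucketed into one of the business segments, which is what the segmentation test cases validate.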
Job Responsibilities –
I am responsible for communicating with the business product owner, and for gathering and understanding business requirements.
Experience in data validation, data quality checks, and data profiling.
Creating test cases based on business rules for functional testing, and maintaining a traceability matrix.
Executing test cases, logging defects in Rally and HP QC, and tracking them.
Documenting test plan, test strategy for end to end testing and test scenarios based on business requirements.
Writing SQL scripts to do manual testing
Involved in data reconciliation to ensure quality and consistency of data.
Experience in solution assessment and validation and black box testing techniques like integration testing, system testing, smoke testing and user acceptance testing.
Performing data analysis, including data profiling, data cleansing, and data governance.
Participating in daily standups and updating team on work status
Updating and creating new tasks related to user story features in Rally tool.
Communicating with the scrum master, developers, and other team members to stay up to date on other stories as well.
Participating in defect triage meetings and working closely with developers on defects. Experienced in the defect management cycle.
Experience with testing scenarios in UAT.
Environment – Oracle, Teradata SQL Assistant, SharePoint, Roadmap, HP ALM Quality Center, Rally, SQL Server 2014, Tableau, ad hoc queries
Frontier Communications, Addison, TX May’14-Dec’14
Position – Sr. Functional ETL QA Tester/Data warehouse Tester
Description – I worked as a Sr. Data Warehouse Analyst/Tester on the Connecticut Conversion/Migration project.
Frontier is migrating and converting relevant Customer Care and Billing (CCB) data from AT&T’s billing, customer care, trouble ticketing, service order, and inside plant application systems to the corresponding support systems within the Frontier application environment for residential, business, and wholesale customers. Frontier is also converting network Operational Support Systems (OSS) data from AT&T to corresponding Frontier systems, as well as Human Resources, Compensation, and other corporate systems.
Roles and Responsibilities –
Creating test queries from the technical requirement documents and interacting with developers to review them. Worked on functional and non-functional requirements.
Writing test cases based on TR documents.
Working in the HP ALM Quality Center tool to run test cases and compare expected and actual results, raising defects for any failed test cases.
Performing different data quality checks (null, length, validity, alphanumeric) on all fields and columns.
Researching more on bad data/defects and categorizing them as source issue, ETL issue or historical issue.
Involved in unit, system, and regression testing, which included table testing.
Involved in end to end testing and user acceptance testing.
Responsible for QA sign-offs for projects involving enrollments, billing, migrations, and several other aspects of the customer life cycle.
Working in an Agile methodology and updating the QA status in daily scrums.
Ensure traceability of test cases to requirements, working with the project Business Analyst to ensure all requirements are tested.
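The traceability check above, verifying that every requirement is covered by at least one test case, can be sketched as a simple coverage comparison; the requirement and test-case IDs are invented for illustration.

```python
# Tiny sketch of a requirements-to-test-case traceability check: any
# requirement not referenced by some test case shows up as untested.
requirements = {"REQ-1", "REQ-2", "REQ-3"}
test_cases = {  # test case ID -> requirements it covers
    "TC-01": {"REQ-1"},
    "TC-02": {"REQ-1", "REQ-2"},
}

covered = set().union(*test_cases.values())
untested = requirements - covered
print(sorted(untested))  # → ['REQ-3']
```

In practice the same mapping lives in the traceability matrix maintained with the project Business Analyst.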
Environment – Oracle, Teradata SQL Assistant, UNIX, SQL*Loader, SharePoint, Roadmap, HP ALM Quality Center, VBA, SSRS
Capital One, Plano, TX Oct’13 – Apr’14
Position – Sr. Data Quality Analyst/QA/Test Analyst
Description:
I worked as a QA Analyst on the Capital One Auto Finance (COAF) team. I was responsible for performing data quality checks on Information Data Management (IDM). I also played a major role in resolving tickets/issues raised by customers/clients regarding auto loans.
The COAF architecture consists of CLO (Customer Loan Origination, in an Oracle database) and CS (Customer Servicing, in the Best application on SQL Server). Both are source systems that get data from different external vendors (Shaw-funded loans, MoneyGram, Western Union, National Bankruptcy Services, etc.). Both send data to the ADW (Analytical Data Warehouse, in Teradata OneView/SQL Assistant) and the ORE (Operational Reporting Environment, in SQL Server). The ORE stores operational or current data, i.e., a snapshot as of today (source-centric data). The ADW stores history data for reporting and decisioning purposes. The subject areas include: Account Transaction – auto loan accounts; Application – loan applications; Broker – dealerships; Campaign – marketing and sales events; Collateral – vehicle details; Collections and Recoveries – bankruptcy, hardship; Customer Contact – IVR (Interactive Voice Response), mail, dialing; Sales and Service – territory, sales zone; Score – application decisions.
Job description:
Prepared test plan and test scenarios and integration test cases.
Authoring functionality/compatibility and regression test cases, executing them, and posting issues in the bug-tracking tool.
Authored various functional scenarios, along with performing gap analysis and ad hoc testing.
Responsible for handling full implementation cycle for multiple change requests (CR).
Strong expertise in building test strategies, plans, and test scenarios based on business requirements.
Performed data quality checks on all internal and external vendor files by identifying the NPI (Nonpublic Personal Information) columns (e.g., loan ID, application identifier, customer first name, customer last name, phone number) and building queries for different checks (null check, length check, validity check, numeric and alphanumeric check).
Created automation tools and dashboards that summarize defect counts and the percentage of defects out of total counts. These statistics helped in analyzing data quality and researching bad data accordingly.
Experience in end-to-end integration tests that also cover APIs, enabling teams to validate GUI items in the context of larger transactions.
Experienced with process flow and data flow diagrams.
Identified the root cause of bad data, i.e., source issue, history issue, or ETL issue.
Environment: Teradata, Teradata SQL Assistant, Oracle 10g, Best, PuTTY, SQL Server Management Studio 2010, SharePoint, Roadmap, SAS, ad hoc queries
Walmart, Bentonville, AR May’13- Oct ’13
Position: Sr.Data Quality Analyst
Description: I worked as a Data Quality Analyst on Project Eureka/Data Café, which focused on Sam’s Club data quality. The project covered subject areas such as Sam’s membership, membership card holder, visit member, visit, scan, item, member category, member type, member status text, calendar, and member type text.
Job responsibilities:
Experience in developing data quality scorecard which included quality of information in the system.
Experience in performing quality checks on data to investigate bad data.
Experience in data governance, data mining, data cleansing, and production report generation in order to increase consistency and confidence in decision making.
Environment: Teradata SQL Assistant, Microsoft Visio, SharePoint, Erwin
Verizon Fios, Irving, TX
Position: Sr. Data Quality Analyst/Sr. ETL Tester Jun’12 – Apr’13
Description: The project focused on process improvement by collecting data from different metrics in order to analyze it and improve productivity. This included data mining, data quality checks, data cleansing, and data validation. My work centered on performing data quality checks and building SQL queries to pull data for analysis from the Metrics Data Repository (MDR) for the VLSS (Verizon Lean Six Sigma) program. Based on the data provided, analysis was done for process improvements to improve productivity. The subject areas included billing, sales, cancels, and disconnects.
Job Responsibilities:
Most of the analysis was driven by what each new request was about and how to obtain the required fields for it from our database system.
Created test cases for different subject areas (billing, service ordering systems, sales, cancels) by analyzing tables in the source and target databases, then verifying them with business stakeholders.
Performed different data quality checks (null, length, validity, alphanumeric) on all fields and columns, and created a dashboard showing profiling statistics of defects per field.
Involved in testing subject areas and in unit testing of ETL procedures/mappings/jobs and the developed reports. Performed integration testing, which involved test planning, test cases, and test execution. Wrote test scripts.
Environment: Toad, Oracle, Teradata, SQL Navigator, SOA environment, Erwin, Teradata SQL Assistant, EDW environment, DataStage, SSAS
United Air Express, Jamshedpur, India Jan’09 – Nov’09
Position: QA Analyst
Job Responsibilities:
Involved in Business Requirement walkthroughs right from the beginning of the project
Involved in Project status and tracking meetings with clients
Followed testing processes and approaches according to client Standards
Prepared resource and execution plans.
Created and executed business and technical test scenarios and test cases.
Tracked test execution progress and defect status.
Keane India Ltd., Gurgaon, India
Position: Software Engineer/Data Quality Analyst/Test Analyst Nov’07 – Sep’08
Job Responsibilities:
Served on team that developed a radiation badge monitoring and tracking site for Landauer, the world leader in personnel radiation monitoring.
Met with clients to assist them in identifying and communicating all their requirements so a complete and detailed plan could be developed.
Effectively identified and resolved problems to keep project on track and ensure client satisfaction.
Interacted with business users as well as other members of IT outside the team. Involved in preparing specifications and in developing and testing scripts.
Involved in performing extensive back-end testing by writing SQL queries to extract data from the Oracle database.
Environment: C, C++, C#, Agresso, Visual Basic, Oracle, TOAD, SQL
EDUCATION: B.Tech, SIT University, Bangalore (2007).
Master of Business Administration, Hamline University, Minnesota (2012). Worked part-time as an office assistant.