
Data Analyst Test Cases

Location: Everett, WA
Posted: July 18, 2018


Highlights:

R Programming

AGILE Methodology

Data Science Methodology

Data Analyst

Technical design

Technical Lead

Test Cases

Vendor Testing

SDLC Expert

Banking and Financial Services

UAT Support

Personal Lines Insurance

Data Migration

Code Review

Coding and Modularization

Professional Summary:

Highly skilled software development professional with more than 12 years of experience in software design, development, and integration. Advanced knowledge of Oracle, PL/SQL, and data analysis.

Technical Lead with expertise in all aspects of the software development life cycle, including requirements analysis, design, development and production support.

AGILE Methodology – Scrum framework – attending daily Scrum meetings and SPRINT review meetings.

Strong Communication and Presentation Skills substantiated in past assignments with developers, project managers, subject-matter experts, stakeholders, system implementers, and application end users.

Worked in Data Migration Projects.

Worked on Banking and Financial Services projects and Life Sciences projects, with exposure to the BFS and Life Sciences domains.

Self-directed and motivated Software Engineer who works effectively in a dynamic environment. Fluent in Oracle, PL/SQL, RTC/CVS, SVN, TOAD, TDP, PL/SQL Developer, and UNIX.

Software Engineer well-versed in creating test cases that cover all test conditions while eliminating redundancy and duplication.

Expert Software Developer dedicated to constantly improving tools and infrastructure to maximize productivity, minimize system downtime, and respond quickly to the changing needs of the business.

Expertise in Property and Casualty Insurance, specifically Personal Automobile and Homeowners. Good knowledge of insurance processes and platforms, including technical underwriting, rating, claims management, treasury, and interfaces.

Involved in New Project Proposals, Project Planning, Project Scoping, Estimations and Resource Planning.

Active member of Code Review team and involved in reviewing the design documents and code deliverables.

Worked as a key resource in data analysis and performed data migration.

Assisted in the maintenance of the risk/issues logs, change control process, traceability matrix, business process documentation, interface with testing team for defect management, etc.

Experience in preparing test scenarios, test cases, and test plans based on business and functional requirements.

Actively involved in successful Business Application deployment, production support and UAT support.

Build and maintain strong relationships with business partners, customers, technology teams, and management.

Ability to work efficiently both independently and as part of a team.

Critical thinker with excellent negotiating, writing, and communication skills. Great team member and leader.

Aspiring to work in a professional atmosphere that enables me to keep pace with the latest emerging technologies, widening the spectrum of my knowledge while contributing to organizational growth.

Project Experience:

Cognizant Technology Solutions

AT&T Feb’17 – Present

Bothell, WA

Database Analyst

ORACLE, PLSQL, UNIX

Project Scope

The Common Product Catalog (CPC) is an interface that delivers rich product information to clients to facilitate their application flows. The system was first created in 2005 to supply market rate plans/features and equipment reference data; it was later extended to become the product catalog, providing product data customized at the FAN level to business customers. Since early 2008, it has been extended again to provide product information for other channels and applications.

ROLES AND RESPONSIBILITIES

Analyzing requirements and discussing with the Technical Lead (@Client) to gather all related information.

Working closely with the Lead (@Client) and developing code based on the requirements.

Performing ETL for each sprint based on the requirements analyzed and discussed with the Technical Lead (@Client).

Creating metadata and stored procedures/batch jobs to perform data loads into the CPC database.

Creating test cases and performing unit testing to validate the code against the requirements.

Delivering the code for the deployment.

Cognizant Technology Solutions

ADP June ’16 – Jan ’17

Alpharetta, GA

Data Analyst

ORACLE, PLSQL

Project Scope

ADP National Payroll Services has been running the Payforce application since 1999. The application is used to process payroll for hundreds of customers. ADP decided to move away from Payforce and eventually migrate all of its clients to its newly designed product, Vantage. The present analysis covers 750 clients selected by ADP for data migration. Although the critical path of this migration involves Security, Portal, Interfaces, and Reports, the heart of the migration is moving data from the Payforce database to the Vantage database. Analysis is performed to derive a wave-based iterative approach for transferring clients batch by batch to Vantage. Another objective of the analysis is to find the most efficient data migration solution with as much automation as possible.

ROLES AND RESPONSIBILITIES

Onshore Data Analyst responsible for data migration from the legacy Payforce source system to the target Vantage database. Analyzed data against the functional requirements and migrated data without issues for 10 clients (Simpson Thacher, Rose International, Tillys, Triple Canopy, Goodwill of AZ, Garmin, Salvation Army, Headwaters, Planet Hollywood, Martinrea) in both UAT and Production.

Executed various jobs to load table data from source to target, analyzed the data, and fixed any issues encountered.

Responsible for triaging issues and performing corrections where necessary to migrate data to the target without rejections.

Worked with the offshore team to load data and clarify any technical questions they raised.

Worked with the ADP team to clarify queries and obtain technical assistance when necessary.

Created snapshot and staging databases from database dumps, which were used as the source.

Reviewed the BRD and client-specific documentation and loaded data accordingly.

Communicated with stakeholders such as Business Analysts, Upgrade Consultants, Program Managers, and the security testing, data testing, and functional QA teams whenever there were issues or questions.

Fixed defects raised by the QA team after the data load.

MAPFRE Tronweb Jan ’11 – Jan ’16

Webster, MA

Technical Lead

ORACLE, PLSQL, UNIX, Jasper Reports

TRONWeb is MAPFRE's core insurance processing system. TRONWeb is the enterprise-wide project that implements a single insurance processing system across all MAPFRE USA locations. It is a fully integrated insurance processing system that uses a client-server architecture. The graphical user interface (GUI) is built with the Java programming language (AWT). Business logic objects are implemented as PL/SQL procedures running on an Oracle 11g Release 2 database. Many MAPFRE projects currently use the TRONWeb database, which is hosted on an AIX server. TRONWeb usage is confined to internal users; external users, such as third parties, use WebTRONWeb (WTW). TRONWeb handles modules such as Issues, Claims, Treasury, Accounting, and AFE.

Project Scope

Implementation of New Business of CA Home Insurance.

Implementation of New Business of AZ, FL, PA Auto Insurance.

OFAC – Office of Foreign Assets Control.

CarFax – connected to THP Web to gather vehicle history details.

ROLES AND RESPONSIBILITIES

Analyzed and scoped product requirements, analyzed input content provided by the product managers, and clarified requirement and content issues with MAPFRE product managers, drawing on MAPFRE-specific domain knowledge.

Discussed with the users/BAs to gather all information related to the requirements whenever there was a gap.

Allocating tasks to offshore team based on client’s requirements and coordinating with the team on a daily basis

Reviewed the technical design prepared by the offshore team to ensure it met the requirements.

Performed code reviews using Cognizant's customized SONAR tool to deliver quality code to the client.

Coordinated with the Cognizant offshore team to fix code review comments whenever there was a deviation.

Validated the Code for business logic before delivering

Reviewed Unit Test Cases prepared by offshore team

Resolved defects raised in HP ALM

Involved in vendor testing and worked on vendor server-side changes to accept the files.

Conducted knowledge transfer sessions to key resources for smooth implementation

Acted as SME for the Claims, Interfaces & Reports Module

Cognizant Technology Solutions

WellPoint April’10 – Dec ‘10

Mason, OH

Technical Analyst

ORACLE, PLSQL, UNIX

Project Scope

There are two main SIT environments in which the batch needs to run: 4.41 and 4.71. Environment 4.41 has AIMS_MAINTSYS and AIMS_MEDD, and 4.71 has AIMS_SYS and WCRMEDD. Testers place job requests through the OWEN (Online Work Entry Network) tool in each environment, and the jobs must run in the specified order.

ROLES AND RESPONSIBILITIES

Scheduled and ran the jobs in the WLM Scheduler in Unit Test environment

Monitored the job runs and provided status to the testers (who requested the jobs) and the client on a daily basis.

Prepared the failure tracker and shared it with everyone.

Provided Assistance and Training to the Offshore Team

Handed over job monitoring to the offshore team to continue.

Cognizant Technology Solutions

The Northern Trust May ’08 – April ‘10

Chicago, IL

Team Lead

ORACLE, PLSQL

Project Scope

General Ledger Reporting (GLR) is one of the projects in Wealth Passport, allowing clients to view their Balance Sheet, Income Statement, Transaction Activity, Interface Listing, and Trial Balance through various reports. The project deals mainly with the report generation process based on client requests; these reports are very useful for tracking the performance of clients' investments. All of this information is saved in the GLR database. In addition, GLR supports a unique concept called 'CUSTOMIZATION', through which users can select their own preferred columns for display, grouping, or sorting in the reports. A number of batch processes run daily to get data from external applications and update that information in GLR.

ROLES AND RESPONSIBILITIES

Coordinated with systems partners to finalize designs and confirm requirements.

Implemented technical procedures and standards for preserving the integrity and security of data, reports and access.

Implemented designs, including experimentation and multiple iterations.

Modified existing software to correct errors, upgrade interfaces and improve performance.

Prepared detailed reports concerning project specifications and activities.

Consulted regularly with customers on project status, proposals and technical issues.

Worked closely with other team members to plan, design and develop robust solutions in a timely manner.

Interfaced with business analysts, developers and technical support to determine the best requirement specifications.

Prepared test cases and test data based on functional and business requirements.

Performed design and code reviews of deliverables developed by the team members.

Prepared status reports and shared them with the client.

Cognizant Technology Solutions

IMS Europe June ’06 – Dec ‘07

Chennai, India

Team Member

ORACLE, PLSQL, UNIX

IMS Health (Intercontinental Marketing Services) (NYSE: RX) is an international consulting and data services company that supplies the pharmaceutical industry with sales data and consulting services. IMS Health was founded in 1954 by Bill Frohlich and David Dubow. Today IMS has operations in more than 100 countries, a global workforce of 7,600+ employees, and revenues of $2bn. It recently moved its headquarters from Fairfield, CT to Norwalk, CT.

Project Scope

LPIN has a GUI built in PowerBuilder that provides access to an Oracle database containing all LPIN data, stored procedures, and triggers. A combination of UNIX shell scripts and Oracle scripts is employed to generate the LPIN extracts. The files extracted from the LPIN system and transferred to various IMS systems are the NDF (National Description File), which contains product and pack details; the CHRON (pack/product changes files); and the CPI (Common Product Interface), which contains both product and pack details in a single CPI file.

Implementation of Italy, Algeria, and Tunisia Rollouts

ROLES AND RESPONSIBILITIES

Gathered requirement information from the user, prepared the design document, and sent it to the team lead for review.

Performed coding, prepared test cases, and sent them to the team lead for review.

Involved in Integration / Regression Testing Support and resolved defects

Prepared the daily status report and shared it with the team lead.

Conducted knowledge transfer sessions to key resources for smooth implementation

Polaris Software Lab Ltd

Citibank Oct ’03 – Jun ‘06

Chennai, India

Team Member

ORACLE, PRO*C, PLSQL, UNIX

Project Scope

Email ID Search – Look for the Customer details by passing Email ID.

KYC – report to track customer inflow/outflow to/from the bank.

ROLES AND RESPONSIBILITIES

Gathered requirement information from the user, prepared the design document, and sent it to the team lead for review.

Performed coding, prepared test cases, and sent them to the team lead for review.

Involved in Integration / Regression Testing Support and resolved defects

Prepared the daily status report and shared it with the team lead.

Conducted knowledge transfer sessions to key resources for smooth implementation

Education:

Master of Computer Applications, University of Madras, India May 2003

Bachelor of Computer Science, University of Madras, India May 2000

Certification: Introduction to R Programming, Big Data Fundamentals, Data Science Methodology, Oracle Certified Associate, Brainbench – Oracle, PL/SQL, Completed L0 – Life Sciences
