Data Test

Location: Irving, TX
Posted: June 16, 2017


VAMSI PAPPALA

Performance Test/Test Automation Expert/Experienced Testing Professional

Cell: 248-***-****

Email: ac0vre@r.postjobfree.com

PROFESSIONAL SUMMARY:

** ***** ** ********** ** Information Technology with 13 years of experience in Quality Assurance Testing.

Experienced in the full Software Development Life Cycle (SDLC) and various methodologies such as Waterfall and Agile, with validations to ensure quality assurance and control.

Worked in various industry domains: Travel, Retail, Health Insurance, Healthcare Services, and Telecommunications.

Technically proficient in creating advanced LoadRunner and JMeter scripts and error-handling function libraries, and in analyzing test results to support performance-testing efforts.

Well versed in LoadRunner protocols such as PeopleSoft and Web (HTTP/HTML).

Extensive experience mentoring team members as a Mercury tools domain expert.

Experienced in determining the test strategy, designing test scenarios based on client’s requirements and system architecture.

Experienced in benchmark testing, interpreting test results, and working with developers, database administrators, system administrators, and system architects for capacity planning and tuning purposes.

Extensively involved in executing scenarios and monitoring server response times, throughput, hits/sec, and transactions/sec.

Experienced in creating functions with modularity and reusability in mind. Created error-handling routines to handle exceptions and events that may occur during test execution, as sketched below.
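
A minimal sketch of such an error-handling routine, written as QTP-style VBScript; the object description and step names are hypothetical:

    ' Hypothetical reusable wrapper: navigates to a URL and reports a warning instead of aborting on failure.
    Public Function SafeNavigate(url)
        SafeNavigate = True
        On Error Resume Next
        Browser("title:=.*").Navigate url
        If Err.Number <> 0 Then
            Reporter.ReportEvent micWarning, "SafeNavigate", "Navigation failed: " & Err.Description
            Err.Clear
            SafeNavigate = False
        End If
        On Error GoTo 0
    End Function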

Strong working knowledge in RDBMS (SQL, PL/SQL), Data warehousing and Data modeling.

Strong Technical knowledge of PeopleSoft Architecture and components involved.

Excellent communication and problem-solving skills; strong team player; quick learner; organized and self-motivated.

Technical Skills

Testing Tools

WinRunner 6.2/7.5/8.0, LoadRunner 7.5/7.6/8.0/8.1/9.5, Performance Center, SiteScope 11, QuickTest Professional 9.0/9.2/10/11

Test Data Tools

Grid Tools - Flat File Data Masking, Data Maker

Test Management Tools

Test Director, Rational Clear Quest, HP Quality Center, Rally, Jira

Service Desk Tools

Unicenter

Languages

C, C++, SQL, PL/SQL, VBScript, UNIX Shell Scripting

RDBMS

Oracle9i/10g/11g, SQL Server, MS Access

Web Technologies

HTML, DHTML, XML, XSLT

Web application Servers

IIS 4.0, WebLogic 6.0, WebSphere, Tomcat

ERP & Functional Knowledge

PeopleSoft 8.9 HRMS (Payroll/E-Performance), PeopleSoft Financials (PO/GL/AP), PeopleTools 8.4

ETL & BI Tools

Cognos Impromptu, Cognos PowerPlay, Cognos ReportNet, Framework Manager, Informatica Power Center 7.1

Other Tools

Eclipse IDE, Jmeter, SOAP UI, Edifecs Specbuilder, Edifecs Analyzer, Edifecs Migrator

Professional Training:

PeopleSoft 8.7

Load Runner 7.8

Quick Test Professional 11

Cognos 7.3 & Cognos ReportNet 1.1 MR1

EDUCATION: Bachelor of Engineering in Mechanical Engineering and Post Graduate Diploma in Computer Sciences

PROFESSIONAL EXPERIENCE:

TEK Systems Global Services – T. Rowe Price Dec 2016 to present

Lead Test Analyst

The project is a revamp of the UX/UI of the TRP website’s Mutual Fund journey; my role is to lead the cross-browser/device testing effort.

Recommended tools for cloud-based cross-browser and device testing at the program level.

Excentus/Fuel Rewards Network, Dallas, TX Jan 2016 thru Nov 2016

Senior QA Engineer

Created an automation framework for web services using SOAPUI and Postman.

Part of SLA negotiation and adherence process for Performance Testing needs.

Provided recommendations on the System behavior after Performance Test.

Member of the Product Strategy and Operational Excellence group, responsible for performance testing across multiple projects.

Converted Postman's functional test collections to Load test scenarios using Load Impact.

Created best practices and processes for test data creation, usage, and storage to handle sensitive data and remain compliant.

Created load-testing automation by integrating with the Continuous Integration & Deployment process early in the deployment cycle.

Created Jenkins CI/CD jobs for Postman/Newman REST API tests.

Attended scrum ceremonies and worked Rally tasks.

Worked on a POC using the Protractor framework for an AngularJS application.

Organized the work among team members and coordinated based on areas of expertise.

Environment/Tools: JMeter, Postman, Jenkins.

TEK Systems Global Services – Worked with different clients Oct 2014 thru Jan 2016

Senior Automation QA

Client: TASC online

Worked for Customer TASC Online on a Document Dissemination application.

Worked with the TestComplete automation tool, creating automated test scripts using JScript.

Built a data-driven framework by creating the proper file/folder structure along with the required functions.

Updated the function files written in JScript and VBScript.

Added generic functions and application-level functions using JScript; one such helper is sketched below.
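
A minimal sketch of one such generic helper; the framework itself used JScript, but the sketch is in VBScript (which TestComplete also supports) for consistency with the other examples here, and the helper name and timeout handling are hypothetical:

    ' Hypothetical generic wait helper for the data-driven framework (TestComplete VBScript).
    Function WaitForEnabled(testObj, timeoutSec)
        Dim waited
        waited = 0
        WaitForEnabled = False
        Do While waited < timeoutSec
            If testObj.Exists Then
                If testObj.Enabled Then
                    WaitForEnabled = True
                    Exit Do
                End If
            End If
            aqUtils.Delay 500          ' TestComplete delay, in milliseconds
            waited = waited + 0.5
        Loop
    End Function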

Handled negative scenarios from the Excel sheets and wrote data back to Excel for use later in the script.

Understood the functional requirements and created positive and negative scenarios to cover the functionality through automated testing.

Wrote Windows batch files for clearing browser caches and for handling updates to execution.txt used by the HTML doc.

Identified all objects dynamically using functions, without using Name Mapping.

Created the Script Extensions by bundling the xml files with function files.

Responsible for creating all the documentation, such as the automation test plan and training docs, and for presenting the demo.

Used GitHub to deliver the project.

Environment: TestComplete 11, JScript, TASC Online

Senior QA Analyst

Client: VSP-Vision Service Plan (Provides Vision Care Insurance Plans)

Project: Worked on the Web services which supply the content for Benefits Inquiry for Member policy at a Doctor’s office via a report.

Setup Jbehave with Eclipse IDE

Responsible for writing JBehave scenarios in Eclipse to test the verbiage and content per the business needs and user stories.

Deploy and run the Continuous Integration builds as part of the Code deployment using Jenkins.

Participate in Daily Standup meetings, Iteration Retrospective meetings, Iteration Review meetings, Story Estimation sessions, Planning Sessions as part of the agile process.

Participated in Business requirement meeting to understand the test conditions and the data requirements.

Responsible for validating that the data meets the test conditions, or requesting the required data.

Prepare weekly Project Updates for management and Jira reporting on Iterations.

Environment: Java, J2EE, Eclipse, JBehave, Postman/browser REST client, Jira, JSON Viewer, Jenkins, Git

Excentus/Fuel Rewards Network, Dallas, TX Feb 2013 thru Oct 2014

Senior QA Engineer

Excentus develops various loyalty marketing programs that earn rewards for consumers to save on gasoline at Shell gas stations.

Created test cases using descriptive programming in HP UFT to check the functionality of the application; a sketch of the approach follows.
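
A minimal sketch of that descriptive-programming style, with hypothetical page titles, field names, and data-table columns:

    ' Hypothetical login flow driven purely by property:=value descriptions (no object repository).
    Browser("title:=.*Rewards.*").Page("title:=.*Login.*").WebEdit("name:=username").Set DataTable("User", dtGlobalSheet)
    Browser("title:=.*Rewards.*").Page("title:=.*Login.*").WebEdit("name:=password").SetSecure DataTable("Pwd", dtGlobalSheet)   ' encrypted value stored in the data table
    Browser("title:=.*Rewards.*").Page("title:=.*Login.*").WebButton("name:=Sign In").Click
    If Browser("title:=.*Rewards.*").Page("title:=.*Home.*").Exist(10) Then
        Reporter.ReportEvent micPass, "Login", "Home page displayed"
    Else
        Reporter.ReportEvent micFail, "Login", "Home page not displayed"
    End If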

Created recovery files using Recovery Scenario Manager and associated the recovery scenarios with tests to instruct QTP to take a specified action on an unexpected event or object state.

Validated web services using Groovy Scripts in SOAPUI.

Ran the tests on different test environments and coordinated with the development team in fixing bugs.

Responsible for representing the module in the end-of-sprint retrospective meetings.

Attend Daily standup meetings, sprint level planning sessions, User story design sessions.

Provide inputs during sprint estimation sessions, to estimate the User stories for every release.

Added value to the QA team by creating various Excel sheets that generate specific logic-driven numbers used across the QA team.

Worked with MasterCard and Payment Card Liaisons for card Tokenization on PCI standards.

Worked on data scrubbing tasks for various personal information, adhering to PCI compliance.

Responsible for validating the data by writing the SQL queries needed for testing.

Worked on Admin Portal & Batch Module to test various Module level functionality.

Tested the iOS & Android mobile app (Shell Motorist) using JSON Viewer and cURL requests to verify the functionality.

Wrote tests on Cucumber as part of the BDD.

Worked with Visa, MasterCard Batch transactions which are PCI Compliant.

Environment: QTP/UFT, SQL Developer, Putty Connection Manager, Putty, Rally, SOAP UI, Winscp, JSON Viewer, Cucumber, Ruby

Texas Health Resources, Arlington, TX Sep 2012 thru Jan 2013

Test Lead

Project: Implemented HP QC/ALM, QC report development, and automation frameworks using QTP as part of the CSC contract.

Worked on workflow to customize QC fields and pages in the Test Plan, Defects, and Business Components modules.

Developed various customizations by writing field-level code in the Script Editor for all modules; a sketch follows.
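
A minimal sketch of the kind of field-level workflow code this involves (Defects module shown); the user-field names and the closing rule are hypothetical:

    ' Runs when a new defect is created: expose and require a custom field.
    Sub Bug_New
        Bug_Fields("BG_USER_01").IsVisible = True      ' e.g. "Detected Environment"
        Bug_Fields("BG_USER_01").IsRequired = True
    End Sub

    ' Blocks closing a defect until a root cause has been entered.
    Function Bug_FieldCanChange(FieldName, NewValue)
        Bug_FieldCanChange = True
        If FieldName = "BG_STATUS" And NewValue = "Closed" Then
            If Bug_Fields("BG_USER_02").Value = "" Then
                MsgBox "Enter a root cause before closing the defect."
                Bug_FieldCanChange = False
            End If
        End If
    End Function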

Worked on Groups and Permissions to create users in different user groups to maintain the project level access, created domains & projects for different work groups.

Worked on Selenium as part of the Tool feasibility study for recommendation.

Customized various system fields and created new user fields for various customizations.

Customized Auto mail for sending email about the Defect updates per the requirements.

Worked with THR end users for requirements gathering needed for Automation Framework.

Responsible to conduct requirements gathering meetings for application level business flow for Automation Framework.

Prepared Visio diagrams of the business flow and sent them for approval by the business users and CSC management.

Streamlined the onshore/offshore handshake process and provided timely support to meet the deliverables.

Developed new and modified the existing functions from the Framework developed by the CSC TCOE to fit the client needs.

Worked with the HP support team on new HP licenses and QTP issues.

Responsible for setting up the QTP/automation lab and installing the license server for the client.

Prepared various training documentation for QC and QTP users about the framework and running the scripts.

Worked with THR management on QC report requirements sessions.

Developed various reports using SQL and post-processing (VB code), including graphs that populate in Excel; a sketch of the post-processing step follows.
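
A minimal sketch of the VB post-processing step, charting data already written to a worksheet; the workbook path, source range, and chart title are hypothetical:

    Dim xl, wb, ws, ch
    Set xl = CreateObject("Excel.Application")
    xl.Visible = False
    Set wb = xl.Workbooks.Open("C:\Reports\WeeklyStatus.xlsx")   ' hypothetical report workbook
    Set ws = wb.Worksheets(1)
    Set ch = wb.Charts.Add()                                     ' add a chart sheet
    ch.ChartType = 51                                            ' xlColumnClustered
    ch.SetSourceData ws.Range("A1:B20")                          ' counts written by the SQL step
    ch.HasTitle = True
    ch.ChartTitle.Text = "Defects by Status"
    wb.Save
    wb.Close
    xl.Quit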

Used Data security tools like Grid Tools - Flat File Data Masking, Data Maker

Developed Dashboard reports (graphs, data).

Responsible for creating the test data for reports that fits the entire release.

Managed an offshore team of 4 members and assigned work on a daily basis.

Environment: Quick Test Professional 11, HP QC/ALM 11, Grid Tools (Data Maker & Data Masking), MS Visio, Invision, Epic, Selenium.

BCBS/Health Care Service Corporation (HCSC), Richardson, TX Feb 2011 thru Aug 2012

UNT (University of North Texas), Denton, TX May 2011 thru June 2011

SIT QA Analyst

Project: Worked on the Rating Consolidation & Factor Maintenance Application, an ACA project used to rate Individual and Group plans considering various factors.

Worked on the FMA, RFRF and SRE application rating Engines.

Completed testing various web services for SRE/URE application using SOAP UI tool.

Responsible for reporting the test execution statistics on a daily basis.

Responsible for weekly defect review meeting and end of sprint reports.

Worked with various data teams to set up the data needed.

Attend daily standup scrums as part of the agile process.

Responsible for creating data and running the regression test suites using the QTP scripts in the automation lab.

Debugged and fixed the scripts/BPT components as needed when they failed.

Project: This is a Federally mandated update to upgrade to ANSI 5010 from the existing ANSI 4010A1 EDI (Electronic Data Interchange) transaction sets that HCSC plans use to communicate with hospitals, physicians and other health care providers and business associates about payments, procedures, diagnosis and other health care information. The ANSI 5010 version will support transmission of significantly more information than 4010A1, and will handle greater amounts of detail much more efficiently. Upgrade to ANSI 5010 is a requirement for the processing of the federally mandated ICD-10 procedure and diagnosis codes.

Worked on ANSI 5010 Project with 276/277, 270/271 transactions.

Created data files for various scenarios.

Designed, Developed and executed over 500 Test cases and worked on defect module to raise defects in HP Quality Center.

Worked on database ER diagrams to understand the data and created the data flows using open-source tools like Lucid Chart.

Worked on QC Dashboard for standard reporting and also used Excel reporting for specific needs.

Worked with Edifecs Specbuilder Analyzer to check different data files.

Worked with Edifecs Migrator to upgrade the files from 4010 format to 5010 files.

Ran various 276, 270 files in Jmeter and validated the required loops and elements of 277, 271 responses.

Created various assertions to reduce the manual work of validation in JMX threads.

Created macros in textpad to format 271 files.

Environment: Quality Center 10, Bluestar, Member Provider/NPI, Premier Provider, Bluechip, Edifecs Specbuilder 7.0, Jmeter, Textpad, SoapUI, Agile Methodology.

University of North Texas (UNT), Denton TX (During break at BCBS)

Performance Testing Consultant

Project: Worked with the UNT infrastructure team on comparison load testing of two different environments with new hardware.

Responsibilities:

Developed Test plan, Data Strategy, Identified Resources and Point of contact.

Created VuGen scripts for different business processes of student administration in the PeopleSoft portal.

Installed Sitescope on an external network.

Configured various monitors in SiteScope, such as CPU and memory.

Created the Scenario in controller for the 2 environments with 1000 Vusers.

Worked on reporting and merging various metrics graphs based on the requirement of DBAs and Admins.

Worked closely with Business Analyst to create the data needed for various business processes.

Worked on Analysis and recommended necessary changes like increasing the Java heap size to handle larger cache.

Environment: Loadrunner 11, Sitescope11, PeopleSoft Student Admin

AT&T - Automation Test Team July 2007 to Feb 2011

Worked with the AT&T ETQC Automation Team to develop automation scripts and support the Phoenix and OPUS applications (retail applications).

Responsibilities:

Developed 200 QTP Scripts using Automation Frameworks and Business Process Components to maintain the modularity, reusability and maintainability.

Worked on associating the object repositories and function libraries with action-based scripts.

Presented automation ROI to leadership.

Designed and Developed Data-Driven Automation Framework and later Hybrid frameworks.

Read Excel data into custom QTP data tables using a custom function; a sketch follows.
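
A minimal sketch of such a custom import function; the workbook path and sheet name in the call are hypothetical:

    ' Imports an external Excel sheet into the QTP global data table, with a basic existence check.
    Public Function ImportSheetToGlobal(workbookPath, sheetName)
        Dim fso
        Set fso = CreateObject("Scripting.FileSystemObject")
        If Not fso.FileExists(workbookPath) Then
            Reporter.ReportEvent micFail, "ImportSheetToGlobal", "Workbook not found: " & workbookPath
            ImportSheetToGlobal = False
            Exit Function
        End If
        DataTable.ImportSheet workbookPath, sheetName, dtGlobalSheet
        ImportSheetToGlobal = True
    End Function

    ImportSheetToGlobal "C:\TestData\OPUS_Activation.xls", "Subscribers"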

Built a QTP function library (QFL) to handle the application workflow.

Used For loops, Conditional statements, Select-case statements to handle application logic.

Used descriptive programming, ordinal identifiers, and regular expressions to identify dynamic AUT objects (see the sketch below); for other standard web objects, used a shared object repository.
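
A minimal sketch of resolving dynamic objects with a description object, a regular expression, and an index ordinal; the link text and page titles are hypothetical:

    Dim oDesc, links
    Set oDesc = Description.Create()
    oDesc("micclass").Value = "Link"
    oDesc("innertext").Value = "View Order.*"          ' property values are treated as regular expressions
    Set links = Browser("title:=.*Orders.*").Page("title:=.*Orders.*").ChildObjects(oDesc)
    If links.Count > 0 Then
        links(0).Click                                 ' index picks the first matching link
    Else
        Reporter.ReportEvent micFail, "Dynamic object", "No matching 'View Order' links found"
    End If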

Used user-defined Environment variables as global data to exchange values between the main test action and the QTP function library, as sketched below.
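
A minimal sketch of that pattern; the variable, field, and function names are hypothetical:

    ' In the main test action: publish the billing account number for downstream functions.
    Environment.Value("BAN") = DataTable("BAN", dtGlobalSheet)

    ' In the QTP function library: consume the same value without passing parameters around.
    Public Function SearchAccount()
        Dim ban
        ban = Environment.Value("BAN")
        Browser("title:=.*Phoenix.*").Page("title:=.*Search.*").WebEdit("name:=accountNumber").Set ban
        Browser("title:=.*Phoenix.*").Page("title:=.*Search.*").WebButton("name:=Search").Click
        SearchAccount = True
    End Function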

Used built-in VBScript methods: Timer, Date, Random, CreateObject, Repository Collections, On Error, etc.

Used the VBScript Timer function to time every test data execution; a sketch follows.
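
A minimal sketch of that timing pattern:

    Dim tStart, tElapsed
    tStart = Timer                         ' seconds since midnight
    ' ... run the business flow for the current data row ...
    tElapsed = Timer - tStart
    Reporter.ReportEvent micDone, "Iteration timing", _
        "Row " & DataTable.GetCurrentRow & " completed in " & Round(tElapsed, 2) & " seconds"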

Test cases and test scripts were stored in Excel.

Highly Proficient in using Quick Test Professional in conjunction with QC.

Proficient in identifying validity checks and working with different kinds of checkpoints at different stages of the scripts.

Created BANs and moved subscriber from Local to NBI markets in Telegence AMDOCS system that supports the PHOENIX application.

Worked on Object Identification problems and merging of local and global repositories.

Identified glitches in the Mercury tools and worked closely with Mercury/HP support personnel.

Created groups with multiple subscribers in Telegence AMDOCS system in need of data.

Opened tickets on issues such as QC connectivity with QTP and corruption of the object repository.

Identified the process to locate the Business Components by ID in QC by querying the COMPONENT_TBL.

Well Versed in using the delivered functions and developing the Functional Libraries when necessary.

Developed Common Functions that can be used for any application by Automation Team and Functions specific for AUT.

Followed CMM level standards while developing scripts.

Developed training documents on how to run the Automation scripts and set up input parameters.

Created the default parameters and set up the run-time parameters for all the tests in the test plan.

Supported users by attending the Jacktrack queues and used Rational Clear Quest Tool for requesting access to different URLs at Enterprise level.

Changed the status of each subscriber in the Telegence system when a script demanded it.

Identified a methodology to create/copy test sets from other projects in Quality Center.

Responsible for managing licenses, upgrading all the machines from 9.0 to 9.2, and running the scripts for regression testing in the automation lab.

Environment: Quick Test Professional 8.2/9.0/9.2/10, Quality Center 9.0/9.2/10, Telegence – AMDOCS, Phoenix, OPUS (ATT Retail Applications)

AT&T, Richardson, TX

Performance Test Team - Worked on various applications

Created virtual users in LoadRunner to test the application under load and stress.

Designed manual scenarios in LoadRunner and scheduled the scenarios for user ramp-up and ramp-down in the Controller.

Helped the architecture team with test plan activities by comparing the production environment’s network bandwidth, server configurations, and resources (hardware, software, and tools) to those of the test environment.

Accelerated the delivery of high-quality applications, helping establish service level agreements before go-live.

Effectively tested the applications to Pinpoint end-user, system-level and code-level bottlenecks

Analyzed various graphs generated by Load Runner Analyzer and documented the reports

Created and Enhanced Vuser Scripts and executed in VuGen for Point of Sale Web applications.

Enhanced Vuser scripts with programming logic, parameterization, correlations, and checkpoints.

Performed Baseline testing, Stress Testing, Load Testing with performance testing.

Developed global functions for use across different applications by defining DLLs (Dynamic Link Libraries) in C.

Worked on garbage collection and memory-related problems using heap size tuning, verbose garbage collection, and vmstat.

Involved in batch process testing and front end applications.

Configured various monitors and found performance bottlenecks from web server, application server, database server, and network monitors in the Controller’s metrics by examining transaction response time, throughput, hits per second, etc.

Analyzed various graphs generated by Load Runner Analyzer and documented the reports, working on cross-results and extracting the reports to 3rd party applications.

Used HP’s Business Technology Optimization methodology and worked with the RDP, Citrix, HTTP, and Web protocols.

Used BAC (Business Availability Center) to monitor the Solaris boxes with the “uptime” command on the Unix box, with output to a log file periodically.

Worked on various retail applications and integrated applications like Telegence, 3PP, GCP, EUAM, CSI and all front end applications like OPUS, PDC1, PDC2, Phoenix, Premier and System X.

Identified SQL bottlenecks and used extensive SQL scripts as part of testing.

Integral part of Joint testing effort with wireline and mobility teams.

Citi, Tampa, FL Nov 2006 to May 2007

PeopleSoft QA Analyst

The Best in Class project was to create a financial reporting warehouse system for Citi based on EPM 9.0.

Responsibilities:

Worked with PeopleSoft Business Analysts in identification, prioritization and resolution of issues.

Documented the processes used for implementing new releases and resolving upgrade problems.

Tested the EERS process to send the file from BIC to the EERS system, an end-user entitlement reporting application.

Responsible for testing all the PeopleCode, pages, components, and navigation developed for the Average Pool process.

Created the load scenario and scheduled the Vusers to generate realistic load on the server.

Inserted Rendezvous points to calculate the Transaction Response time under load

Performed Data Driven Test Using QTP, Conducted Regression Testing as and when the new builds were developed.

Used Test Director for bug tracking, reporting and analysis

Tested the Data migration files, SQRs and Reports Home Page run control page.

Participated in Load test reviews, specification documents and validation of requirements for load test.

Performed Manual Testing prior to Automated Testing of the Application.

Extensively involved in Functional testing of the Country Core module doing SIT and CIT.

Collaborated with Programmers on application specifications and system enhancements to ensure appropriateness of design.

Tested the Application Engines and Online panels with complex PeopleCode and responsible for the development of the Application Packages and required PSQuery.

Responsible for database testing using SQL queries and database functions, and for escalating possible performance issues.

Environment: Load Runner Performance Center 8.1, Test Director/Quality Center 8.2, EPM 9.0, SQR, PeopleCode, Application Engine, PSQuery, Oracle.

Cingular Wireless, Richardson, TX Aug 2006 to Nov 2006

Load Runner Performance Analyst

Project: Test the performance of the PeopleSoft 8.9 HRMS and E-Performance applications using LoadRunner.

Defined the strategy, planning and execution of quality assurance performance testing for the various components.

Analyzed the requirements and translated them into measurable non-functional requirements.

Identified performance issues with Load Balancer configuration and Settings, memory leaks, Web Sphere configuration, Web Server session management, deadlock conditions, database connectivity and hardware profiling.

Measured the System, Application Server, Web Server and Database Server resources with the help of LoadRunner Online Monitors. Analyzed the Resource graphs. Performed Statistical analysis of testing attributes/results.

Responsible for creating performance test plans, detailing requirements for Benchmark, Load, Stress, and Failover testing.

Customized the scripts with parameterization, correlations, and error handling using the Virtual User Generator.

Responsible for gathering baseline statistics for each individual component by testing them separately.

Discussed Vtiers with the PeopleSoft architect and parameterized the URLs.

Worked closely with developers in developing scripts for batch processes like Create Paysheets, Calculate Payroll, Confirm Payroll, and Print Paychecks.

Participated in walkthroughs and technical reviews.

Responsible for scheduling, monitoring scenarios/ mixed scenarios and analyzing results for identifying performance bottlenecks.

Responsible for creating a baseline and executing performance tests on all databases.

Performed screen navigation test to ensure that the links are not broken and are navigating to the correct pages as per the functional specs provided.

Determined the scalability of the clustered configuration and benchmarked the cluster's performance under load.

Identified performance issues with the customized PeopleSoft application, and escalated and demonstrated them to SMEs to resolve the issues by applying patches.

Point of contact and subject-matter expert on different components of Performance Center.

Used Quality Center as a report management tool and also wrote test cases in Quality Center.

Created documentation of the workflows for all the scripts.

Environment: LoadRunner Performance Center 8.1, LoadRunner Vugen 8.1, Test Director 8.0, HTML, Java Script, Web Sphere, Windows NT

Norwegian Cruise Lines, Miami, FL Jan 2006 - Aug 2006

PeopleSoft Developer

Norwegian Cruise Lines is one of the leading cruise vacation providers in the industry.

The project was to customize PeopleSoft to serve the client’s functional needs: upgrading the PeopleSoft Financials modules from version 8.8 to version 8.9 and applying the monthly fixes delivered by the PeopleSoft group, along with the customizations required by the client. Some modifications were also made to the SQRs as needed by the customized system. Also supported the production environment.

Developed Data Mover scripts to import and export between different databases.

Involved in Unit test plans and system test plans to test customized online panels and SQR programs, tracking errors in database.

Modified purchase order panels including line, schedule and distribution panels.

Created an SQR program for PO called NCL Cut & Paste, which enables purchasing agents to include item data from Excel sheets on the Create Purchase Order page.

Production support team member, using the Unicenter ticketing system, for the PO and GL modules.

Developed online pages for user access, and developed an SQR program and shell script to read the data file sent by an external system called the Fidelio file system.

Designed and customized various Pages and Components for Accounts Receivables, General Ledger and Purchasing using Application Designer.

Customized POPO005.SQR dispatch purchase order process to include the newly developed custom records.

Developed a Shell and a SQR to Archive all the purchase orders based on the Business Units.

Designed and developed Interface programs necessary to move data from the legacy System into PeopleSoft tables using SQR as the primary tool.

Created panels, records, menus, interface/conversion programs, and reports using PeopleTools, PeopleCode, and SQR.

Created SQL Scripts to update flags and correct problems for recurring processes

Created additional customizations for new business acquisitions configuring setup panels, adding additional fields, PeopleCode and SQR interfaces.

Environment: PeopleSoft 8.8/8.9, PeopleTools 8.4 (PO, GL, AP), PeopleCode, SQR, PS/Query, SQL, Oracle 9i, Toad 6.x, Unicenter, Ultra Edit, Windows NT/UNIX

EMC (formerly BusinessEdge Solutions), NJ Aug 2005 – Dec 2005

Cognos Developer & Performance Analyst

BusinessEdge Solutions is a technology firm delivering information technology solutions for industries such as Communications, Financial Services, and Life Services & Insurance.

I was involved in the MCI DW project, building and testing the OLAP data warehouse on Oracle 10g that supports all the financial, sales, and marketing data.

Developed High Level Performance Test Plan and Test Cases. Created Virtual User scripts and enhanced them with Transactions, Wait Times and Rendezvous points using LoadRunner. Parameterized the scripts with test data. Created Scenarios for Load/Stress Testing.

Checked the Design and Creation of Data Cleansing, Validation and Loading scripts for Oracle Data Warehouse using Informatica. Tested the Star Schemas & Data Mappings that are used to extract data from different source files, transform the data using filters, expressions and stored procedures then load to Oracle Data Warehouse.

Created & Tested Reports using Queries, Slice and Dice, Drill down, Functions, Formulae etc.

Developed mappings in Informatica to load the data from various sources using different transformations like Source Qualifier, Expression, Aggregator, Update Strategy, Joiner, Filter, Router, and Lookup. Designed a multi-dimensional model and created PowerCubes using Cognos PowerPlay Transformer, and catalogs using Cognos Impromptu, organizing catalog folders and setting up user classes and security.

Developed Impromptu and ReportNet reports including the Framework Manager models for creating metadata packages.

Created various Prompts, calculations, filters.

Developed multi page reports, Tabular Sets and Master detail reports.

Migrated Impromptu catalogs to Reportnet using Impcat2xml Migration utility.

Query Optimization by implementing Table weights, Governor Settings and Hot Files.

Creating Calculations, Query Prompts, Conditions & filters.

Scheduling and Distributing reports through Schedule Management in Cognos connection.

Performed unit testing on the reports created before giving to Testing.

Data modeling experience in Dimensional Data Modeling like Star Schema, Snow Flake Schema, Fact and Dimensional Tables.

Environment: LoadRunner 7.8, HTML, Cognos Impromptu, Cognos ReportNet, Informatica Power center 7.1, SQL, Oracle 10g, Toad, Unicenter, Windows NT/UNIX
