Jutikadevi Sivaraja
Tel: 808-***-**** E-Mail: **********@*****.***
Summary of Career:
• Over 12 years of experience in software and hardware quality assurance.
• 10 years as Lead Performance Test Engineer at Centers for Disease Control and Prevention.
• Set up and ran load testing from global remote locations in India, Germany, Korea, Japan, and Singapore.
• Proven expertise in test planning, execution, analysis, troubleshooting, debugging of existing test scripts, troubleshooting of environment setup, results data collection, and results analysis across more than 12 projects at CDC.
• Designed and implemented automation standards and best practices to guide testers in using automated tools to write scripts for business applications.
• Expert in using QuickTest Pro 9.0, LoadRunner 9.0, and Rational Robot to test hosted websites for UPS, GE, Merial, Johnson Controls, and CDC servers and databases with up to 5,000 virtual users.
• Managed and set up a quality assurance laboratory with 200 simulated user environments.
• Background in medical, scientific resource allocation, financial, and human resources applications.
• Strong technical and testing skills in SAS, web, Java, SQL, HTML, TSL, Oracle Financials, SQL Server, VB, ASP, Siebel, IIS, ColdFusion, Perl, ActiveX, UNIX, C++, XML, AWT, Swing, servlets, and CORBA. Java certification; UNIX scripting.
• Responsible for analyzing test data and results and making recommendations on system software and hardware to improve overall system quality and performance.
• Experienced in full software life cycle management tools (Rational Unified Process) to configure and develop QA processes, standards, and procedures for the department.
• Strong knowledge of back-end database testing of Oracle 9 and DB2 databases, with SQL procedures and scripts, and configuration testing of JRun, Apache, WebSphere, and WebLogic.
• Performed functional and design specification reviews; developed test strategies and test plans; performed black-box, system (acceptance), and regression testing; verified results on supported platforms; ran database queries to validate test data; entered defects; worked closely with developers on defect resolution; assisted the NOC (Network Operations Center) in troubleshooting issues; and trained customer support and end users/customers in business applications.
Computer Skills:
Operating Systems: MS-DOS, Windows 2000 Pro/Server, ME, XP, NT, 98, 95, Mac OS X, Mac OS 9, UNIX
Databases/Servers: Oracle 8i/9, SQL Server 9, MS Access, Apache, Tomcat, IBM Mainframe, IIS.
Testing Tools: Rational Enterprise 2003.06 (Robot, TestManager, ClearQuest, ClearCase, Purify, Quantify, RequisitePro), Rational Performance Tester 8.1, HP Mercury (QuickTest Pro, WinRunner 9.0, LoadRunner 9.0, TestDirector), SilkTest, SilkPerformer, Parasoft SOAtest 5.1.1, Microsoft Application Center Test (.NET Studio 2005)
Networking: Windows 2000/Server/NT, TCP/IP, UNIX, Novell, JRun.
Windows Application Development Tools: VB 6.0, .NET Studio 2005, Java (Eclipse)
Languages: ASP, Java, JavaScript, C++, ColdFusion, HTML, SQL, Visual Basic, TSL, JSP, J2EE, XML, shell scripting, Perl, DB2.
Experience:
September 2002- Current
Lightning Tech Solution, Inc.
Contract for Centers for Disease Control and Prevention Atlanta, GA
Lead Functional & Performance Software Test Engineer
Performed performance testing of more than 12 projects at CDC to ensure scalability and quality of system software and hardware, which ultimately contributed to successful launches of websites including the redesigned cdc.gov website. Considered an authority in performance testing at the Centers for Disease Control and Prevention.
PROJECTS
1. Web Redesign of the CDC.gov website. CDC.gov ranks among top performers in government sites: on June 19, ForeSee Results released the second-quarter 2007 satisfaction scores for e-government websites, as measured by the American Customer Satisfaction Index (ACSI). CDC.gov was part of the top-performing group (sites with a score of 80 or higher), which included about 18% of the sites.
2. EDN (Electronic Disease Notification System) notifies state departments of health using data collected from arriving refugees and immigrants.
3. QARS (Quarantine Activity Reporting System) provides the Quarantine and Border Health Services Branch (QBHSB) a system for collecting border patient data for arriving travelers with illnesses, especially flu-like symptoms such as H1N1.
4. PHINMS (Public Health Information Network Messaging System) is key to assisting CDC applications, such as the National Electronic Disease Surveillance System (NEDSS), in accomplishing syndromic surveillance. As NEDSS quickly collects and monitors disease and outbreak data, such as pandemic flu virus data, the PHIN Messaging System quickly transmits the data to the CDC, where epidemiologists analyze it and identify trends so they can protect the nation's health.
5. CRA (Countermeasure and Response Administration) Inventory Tracking is a national program to help federal and state emergency response authorities locate critical medical countermeasures during public health emergencies.
6. BT Project developed the capacity of state and local public health systems to prepare for and respond to an epidemic caused by a bioterrorist act or an attack involving smallpox.
7. STARRS (Specimen Tracking and Results Reporting System): the major goal of this system is to provide a central portal for CDC investigators to link specimen data received or generated from multiple sites, including, but not limited to, field investigations, internal laboratories, state health departments, contract laboratories, and other public health partners.
8. LUNA (Laboratory User Network Application) provides public health partners an easy, effective way to track the whereabouts of a specimen-related shipment, send laboratory requests made in conjunction with a specimen submission, and access laboratory results as CDC personnel enter those results into the STARRS system.
9. Web Publishing is a web application that enables CDC authors, reviewers, and other individuals associated with the publishing process to manage their content effectively and efficiently through version control, content repurposing, workflow supporting the CDC scientific clearance process, and other content management functionality.
10. DCTM (Documentum Scientific Clearance) is the automated workflow process by which information products are approved by the appropriate Centers for Disease Control and Prevention (CDC) staff members before being released to the public, as stated in the CDC Scientific Clearance Policy. This process is intended to facilitate the efficient and rapid clearance of routine, urgent, and emergency information.
11. TAI (Travel Approval Interface) is a system for reviewing and processing travel authorization requests (e.g., travel orders, quarterly orders, amendments, travel vouchers, 348 travel); approving officials or their delegates access the CDC/IS and navigate to the CDC/IS Approval System.
12. MACCS® (Managing Accounting Credit Card System) is a downstream process that uses transaction data from the credit card processing center to ensure that each transaction is valid, reviewed by an authorizing official, assigned to a proper budgetary fund, paid in a timely manner, and transmitted for posting to the general ledger.
13. TASNet, the CDC Time and Attendance System, is designed to enhance employees’ ability to manage their personal leave.
14. WIZ (Workforce Information Zone) is an online reporting tool built on a collection of CDC/ATSDR personnel databases.
Accomplishments, Responsibilities, and Work Performed on the above projects as Lead Software Test Engineer:
Analysis / Requirements Analysis / Test Planning Phase
• Worked with minimum supervision to start the process and complete the full performance testing cycle.
• Completed Software Quality Assurance and Methodologies course.
• Evaluated different tool sets suitable for performance testing of the back-end database that stores all the data interfacing with different applications at CDC.
• Developed comprehensive test strategies, test plans and QA schedule for all tests to be performed, including improvement goals, practical approaches to meet established budgeting and planning goals.
• Responsible for test planning, execution, analysis, troubleshooting, debugging of existing test scripts, troubleshooting of environment setup, results data collection, and statistical results analysis.
• Researched and developed technical specifications, business requirements, and ICD documents; created data sets with data analysis scenarios.
• Gathered related documentation and business-rule requirements to compose the performance test plan for CDC.
• Participated in continuous improvement of performance testing process or test strategy.
• Quickly learned new applications and always met target release dates for projects.
• Created metrics for monitoring and controlling the performance of all software applications.
Script Development Phase
• Expert in using QuickTest Pro 6.0-9.0 to write automated regression test scripts.
• Worked with MS SQL Manager to perform Data Validation with QTP.
• Used Team Foundation Server to report, and track defects found during testing.
• Used Rational Suite 2007, 2009 tools to transform data and to perform analyses.
• Used Parasoft SOAtest to create performance unit test scripts for SOAP messages.
• Ensured that the SOAP messages created performed within acceptable response times.
• Worked with complex script and code scenarios to create and successfully execute performance and functional test scripts with Rational Robot.
• Configured Rational Robot/ Performance Tester to perform smoke, bounce, stress, integration, architecture and functionality specific performance monitoring of a highly complex medical and scientific web application.
Execution and Test Results Gathering Phase
• Ran functional and performance tests in the mainframe environment, analyzed the resulting output and findings into system reports, and made recommendations to improve application code quality and system hardware, along with needed configuration and tuning changes to the web and application servers.
• Worked well with all teams to improve application and server performance, as shown by the quantifiable improvements in transaction times across the projects that were successfully tuned and debugged.
• Gathered results on performance, demographics, traffic, and system bottlenecks at low, mid, and high user volumes.
• Determined system stability, found issues and reported on system bottlenecks.
• Implemented and maintained test hardware and software systems for the testing team.
• Created mainframe monitoring tools to track the performance of program modules while simulating hundreds of users via Rational.
• Worked with network and hardware analysts to debug and find performance-related bottlenecks.
• Successful execution of all planned cdc.gov performance testing.
Test Results Analysis and Performance Debugging Phase
• Analyzed statistical data from Rational Test Manager for Performance and Functional Test Reports to create Test Analysis Summary Report.
• Obtained debugging information for failed tests; maintained test programs and scripts; evaluated testing coverage; participated in the documentation process; and provided feedback to developers.
• Used regression test results to determine the best hardware and software configurations for CDC systems, which were successfully implemented to improve the performance of system hardware and software.
• Reported all performance issues of Web Services to Development Team for debugging and regression testing.
• Worked with mainframe DBA to write Natural code to monitor performance of TASNet module in mainframe.
• Worked with developers to functionally test the application during critical roll-outs.
• Used regression analysis and simulations, converting data and findings into applied information to produce reports for management.
• Collaborated with operations management, client management, and the development team at CDC to gather testing and analytics/reporting distribution requirements.
Environment: QuickTest Pro 9.0, SQL Query, MS SQL, Team Foundation Server, Rational TestManager, Rational Performance Tester 8.1, Rational Robot, RUP, ClearCase, ClearQuest, Rational Rose, SQL, Web Service Studio, Tomcat, Apache, WebSphere, AS/400, UNIX, BEA WebLogic 9.1, Oracle, PC servers. Software: CrossWorlds, Windows 2000/NT, Oracle, Java, MQ Series, .NET, PowerBuilder, Oracle Financials (11i), UNIX, and mainframe.
Silverpop Systems, Atlanta, GA February 2001- September 2002
Enterprise Level Software/ Client- Server Application
Software Engineer/ Systems Analyst
Silverpop Systems provides enterprise-level marketing, consumer research, and forecasting software with electronic messaging systems.
As a software engineer and systems analyst, I managed, tested, automated, and analyzed all aspects of the applications and software produced by the company and made recommendations to improve the overall quality and performance of the product.
Systems Analysis: Full Life Cycle Planning for UPS/GE/ISS/Merial Projects; Manual & Automation Testing of Client-Server Applications
• Participated in the definition of specifications, architecture and design of marketing and consumer research application in design phase.
• Reviewed and analyzed functional and design specifications, made recommendation for system improvements before development.
• Determined appropriate marketing research methodologies and techniques (e.g., quantitative surveys, marketing mix, qualitative) by leveraging input from research suppliers and internal experts to develop optimal research protocols that could be applied to the company application.
• Ensured test plans and test cases were highly efficient, made the best use of allocated resources, and reduced test cycle times.
• Worked with development team to analyze marketing campaign management functionality to help customers drive more effective, efficient marketing campaigns, with company applications.
• Used test data from functional and performance instances to perform application systems analysis and make the appropriate systems recommendations.
• Proven expertise in web testing of 20 different client and browser environments (IE, AOL, Netscape) and the marketing research application.
• Billing systems: ensured the enterprise-level software billed customers accurately each month and generated billing trend reports for management based on the statistical data collected.
• Daily manual execution of functional, integration, security, and system tests to ensure timely defect reporting.
• Utilized Astra Quick Test, to perform functional and object level regression testing of hosted Web sites.
• Unit testing and white-box testing of Java code in individual application modules (Send Engine, Bounce, Tracking, List Management, and Content Management) with JUnit test harnesses in a UNIX environment.
• Programmed WinRunner 7.2 to run daily integration, regression, and database testing on the application.
• Successfully tested network configuration and security on Apache and IBM AIX servers, and made setup recommendations to lower known security vulnerabilities.
• Black-box testing with WinRunner 7.2 of all consumer research content management and data transfer, ensuring that multimedia and HTML content in tables was saved properly to the content server and rendered correctly when recalled to the GUI, both on receipt and in the Visual Basic 6.0 client-server application.
• Used WinRunner 7.2 to perform negative and positive regression tests in over 200 end-consumer user environments.
• Programmed LoadRunner 7.5 to load test hosted websites, Oracle Financials, IBM AIX servers, and Oracle 9 databases with 500 concurrent virtual users to ensure the production system could handle a high volume of users without downtime.
• Performed database testing with SQL queries on the Oracle 9 database to verify database transactions, new records, correct display of records in the application, and data integrity.
• Data migration testing of data exports from the Oracle 9 database and correct execution of reports, with end-to-end and regression testing of SQL stored procedures.
• Evaluated test results to determine compliance with test plans and established business practices.
• Contributed to continuous process improvements in QA and engineering; set up a separate test network.
• Installed, set up, administered, and managed the server and database for TestDirector, Astra QuickTest, and WinRunner.
• Worked closely with developers, assisted the NOC (Network Operations Center) in troubleshooting issues, and trained customer support and end-user customers in the product.
Environment: Windows 2000 Pro/Server, ME, XP, NT, 98, 95, Mac OS X/OS 9, UNIX/Linux, Oracle 9, SQL, WinRunner 7.2, LoadRunner 7.5, Astra Professional, TestDirector, Microsoft Exchange Server, Novell, GroupWise 5, Lotus Notes 4-6, Pegasus 3.1, Eudora 4.3, AOL 4.0-7.0, IE 4.5-6.0, Netscape 4.0-6.75, TeamShare, Outlook 2000, Claris Emailer 2.0v1, Java, Visual Basic 6.0, JSP, XML, AIX, Tomcat, Apache.
Interliant, Atlanta, GA March 2000- February 2001
QA Tester/Lead/Analyst
Interliant is a web hosting company that provided ASP services and supply chain processes through the web to leading companies such as Dell. As QA Lead and Tester, I mainly ensured full integration and back-end testing of the websites that used the supply chain process.
Bandwidth Billing
• Created and customized the master test plan, integration test plan, user acceptance test plan; and guided QA Testers in the practice of QA. Communicated QA processes to all levels of management in company.
• Utilized Rational Unified Process in Software Full Life Cycle, especially in terms of QA processes.
• Responsible for integration and unit testing of invoice billing for Bandwidth usages using Microsoft SQL, and Rodopi, manually and using Rational Robot, to perform automated database testing.
• Manual testing of domain names to ensure that the sales department was provided with updated and accurate information in the database, Rodopi, and Access 2000.
• Kept track of domain name, billing date, bandwidth usage, next billing date, invoice, default bandwidth usage and additional data transfer usage according to billing plan.
Dell Host Billing
• Website testing: domain registration on Dell Host, integration testing of Dell account setup with the database, customer notification, and the supply chain process.
• GUI testing to ensure the accuracy and overall functionality of the website with Rational Robot in Internet Explorer 5.0 and Netscape Navigator 6.0.
• Regression testing of the website code conversion from ColdFusion to ASP using Rational Robot, following the Rational Unified Process in software test approaches.
• Load testing on Dell Host, measuring server response time with 510 virtual users; reported and analyzed results at different capacity levels using SQA LoadTest.
• Website checking on Dell Host, measuring and reporting broken links, slow-loading pages, and inactive multimedia, and mapping all links using SQA SiteCheck.
• Used Clear Quest to manage and report defects found.
• Responsible for overall testing of website, in Dell Host using Rational Performance Studio.
Crystal Reports
• Used Microsoft SQL, and executed reports in Windows 2000/95/98 environment.
• Designed sales, marketing, inventory control, and production reports in Seagate Info 7.
• Exported reports from the Access database into either Excel 2000 spreadsheets or HTML, made appropriate changes, and e-mailed the reports to management throughout the company.
Environment: Rational Robot, SQA LoadTest, SQA SiteCheck (Rational Suite Performance Studio), ColdFusion, ASP, Internet Explorer 5.0, Netscape Navigator 6.1/3.0, Windows 2000, Access 2000, Rodopi, Microsoft SQL Server, Seagate Info 7, Excel 2000, HTML 3.0-5.0.
Data Net Systems, Toledo, OH April 1997-March 2000
Project Control System Programmer/ QA Analyst/ Tester
• Implemented ‘Project Control System’(PCS) for an engineering consultancy company.
• Project kept track of resource allocation, activity information, status, employee information, and machinery information at different branches. Used MS Access 7.0 Forms as a front-end tool and Access for report generation.
• Created new reports, maintained existing reports, and managed customer support. Managed quality assurance white-box/black-box testing, process control, and coordination of testing and QA representative teams; performed load testing using LoadRunner.
• Also responsible for creating test plans and test cases and testing the entire project, including regression and load testing using automated tools such as SQA Robot.
Hospital Management System Tester/Programmer
• Designed, developed, and implemented a Hospital Management System involving a Front Office Automation module, a Pharmacy module, and a Medical Services module using MS Access 2.0 and Visual Basic 3.0.
• Responsible for planning and developing prototypes after interfacing with hospital management, and for testing the front office automation module of this system using WinRunner.
Environment: Oracle 7.0, SQL, Forms and Pro*C, MicroStation, SQA Suite, IBM Mainframe, TSO/ISPF, COBOL, MVS/OS/ESA, MS Access 2.0, Visual Basic, Windows 3.1, WinRunner
Education: University of Toledo, Master of Arts, 1999; University of Toledo, Bachelor of Arts, 1998
Advanced Training in Computer Information Systems, InfoTech Incorporation, Kuala Lumpur, Malaysia
Graduate Diploma in Computer Science, Institute of Information and Technology, Kuala Lumpur, Malaysia.