
Test Cases Engineer

Location:
Dumfries, VA
Posted:
December 19, 2017

Resume:

RESUME

Name: Sharada Rao

Telephone: ******-7220

Email ID: ac3rgc@r.postjobfree.com

Position: Tester/QA Analyst

Visa Status: Permanent Resident/Green Card Holder

Location: VA, Willing to Relocate

Skype ID: raoskype

EXPERIENCE SUMMARY:

Over 10 years of extensive experience in software QA, testing different types of applications using procedural and object-oriented techniques. Well versed in testing Internet/intranet-based web applications using Java, Selenium, Cucumber, Ruby, Eclipse, Jenkins, Bamboo, J2EE, ASP.NET, VB.NET, C#, JavaScript, VB Script, Python scripting, Maven, XML, DHTML, HTML, CSS, MS SQL Server 200X, Oracle 11g, HP Quality Center 11.0, QTP 12.0, LoadRunner 12.2, Apache JMeter 3.2, BlazeMeter 1.2.1 (cloud-based testing), RESTful web services, SOAP, and MS Access as databases, on Windows XP/7 and Red Hat Linux.

Expertise in automation testing using Java, Selenium, Cucumber, Ruby, Eclipse, Maven, Jenkins, Bamboo, Python, and Quick Test Professional 12.0 to automate web application modules.

Expertise in performance testing using LoadRunner 12.2, Apache JMeter 3.2, and BlazeMeter 1.2.1: created virtual users with calculated threshold values and generated load on the servers to measure the performance of the web application under multiple concurrent users against those thresholds. Generated and provided the test results and reports in PDF.

Experience in writing SQL scripts, executing them against the database, and verifying test data in the database using Oracle 11g, MS SQL Server 2008, jQuery, and MS Access. Good knowledge of stored procedures, triggers, cursors, and indexes.

Experience in ETL performance testing: created files with voluminous data to support stress and space estimates for ETL (data warehouse) loads, extracting data from one database and transforming and loading it into another type of database using SQL queries or database tools that support multiple databases and perform ETL functions.

Hands-on experience in different types of testing, such as System Testing, Functionality Testing, Regression Testing, Black Box Testing, White Box Testing, Database Testing, and Reports Testing using both PANORAMA and SSRS.

Experience in networking protocols such as TCP/IP, SOAP, TruClient, SMTP, DHCP, and LDAP, as well as network routers, gateways, SCSI, and iSCSI, in the same and different network domains.

Experience implementing Active Directory for data replication and deduplication purposes. Experience in several types of backup and recovery for terabytes of data using different tools for SAN disks, RAID configuration, data replication, and data mirroring.

Excellent communication and interpersonal skills; motivated self-starter with exceptional team-building and leadership skills. Good team player with the ability to work in time-sensitive environments.

Experience with the ALM tool for test preparation, execution, and defect management.

Knowledge of Waterfall, Lean, and Agile methodologies; experience conducting SCRUM and PMAP meetings; and strong knowledge of the Software Development Life Cycle (SDLC).

Hands-on experience in the software testing SDLC process: writing Test Plans, Test Scenarios, and Test Cases; performing and maintaining RTMs using Quality Center and BOXI; and maintaining gap analyses. Executed Test Cases and generated Test Results and reports. Recorded defects using defect-tracking tools such as Jira and Quality Center. Expertise in QA methodology and QA validations to ensure the best quality assurance for applications.

PROFESSIONAL EXPERIENCE:

Employer: NCI Inc., VA

Client: Federal Communications Commission

Date: Aug 2013 to Oct 2017

Project: ULS/Auction/GIS

Role: Software Test Engineer

Roles and Responsibilities:

Analyzed the design and requirements, developed the Test Scenarios and Test Cases, and uploaded them to JIRA for development and O&M work requests on the following projects: ULS, ULS: HAC, ULS: PND, ASR, ASR Applet Removal, TCNS: PTC, TCNS: PTCE, TCNS: Section 106, TCNS, Pleadings, Genesis, and Spectrum Dashboard.

Developed the positive, negative, and regression Test Cases in Quality Center and spreadsheets after analyzing the requirements and design specifications.

Expertise in developing automation frameworks with UFT and Selenium.

Expertise in creating automated test scripts using Ruby, Selenium, Cucumber, Java, Eclipse, Maven, and the UFT/QTP tools with VB Script.
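
For illustration, a minimal Cucumber step-definition class in Java, in the style of the Selenium/Cucumber scripts described above, might look like the sketch below. The URL, element IDs, and step wording are hypothetical placeholders, not the actual FCC application under test.

    import io.cucumber.java.en.Given;
    import io.cucumber.java.en.Then;
    import io.cucumber.java.en.When;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    public class LicenseSearchSteps {

        private WebDriver driver;

        @Given("the license search page is open")
        public void openSearchPage() {
            driver = new ChromeDriver();                       // assumes chromedriver is on the PATH
            driver.get("https://example.gov/uls/search");      // placeholder URL
        }

        @When("I search for call sign {string}")
        public void searchForCallSign(String callSign) {
            driver.findElement(By.id("callSign")).sendKeys(callSign);   // hypothetical element IDs
            driver.findElement(By.id("searchButton")).click();
        }

        @Then("the results table lists {string}")
        public void resultsListCallSign(String callSign) {
            String results = driver.findElement(By.id("resultsTable")).getText();
            driver.quit();
            if (!results.contains(callSign)) {
                throw new AssertionError("Expected call sign not found in results: " + callSign);
            }
        }
    }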

Executed automated scripts in Jenkins for continuous integration. Used GIT and SVN as project repositories. Used Maven for tagging the scripts and maintaining versions of the automated test scripts in Subversion. Logged any defects found during test script execution in Jira.

Expertise in creating automated database test scripts for validating data in the database. Expertise in writing complex queries, using a query dictionary file as a repository. Used the POM file for the project configuration details that Maven uses to build the project. Executed automated test scripts in Jenkins for continuous integration.
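
A minimal sketch of such a database-validation check, using plain JDBC, is shown below; the connection details, table, column, and expected count are invented placeholders, and the Oracle JDBC driver would need to be on the classpath. In practice the query itself would come from the query dictionary file mentioned above.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class DbValidationCheck {

        public static void main(String[] args) throws Exception {
            // Placeholder connection details; real values would be read from a properties file.
            String url = "jdbc:oracle:thin:@dbhost:1521:ORCL";
            String sql = "SELECT COUNT(*) FROM licenses WHERE status = ?";   // hypothetical table and column

            try (Connection con = DriverManager.getConnection(url, "qa_user", "qa_pass");
                 PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setString(1, "ACTIVE");
                try (ResultSet rs = ps.executeQuery()) {
                    rs.next();
                    int actual = rs.getInt(1);
                    int expected = 125;   // expected value would come from the prepared test data set
                    System.out.println(actual == expected ? "PASS" : "FAIL: found " + actual);
                }
            }
        }
    }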

Created test data for the automated scripts for the Development, System Test, and User Acceptance Test environments, executed the automated scripts in those environments, and provided the results.

Expertise in creating load test scripts for performance testing using LoadRunner 12.2, Apache JMeter 3.2, and BlazeMeter 1.2.1. Executed these performance test scripts by creating virtual users with calculated threshold values and generating load on the servers to measure the performance of the web application under multiple concurrent users against those thresholds. Generated and provided the test results and reports in PDF for the ULS, ULS: HAC, ULS: PND, and ASR projects.
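
The actual load scenarios were built in LoadRunner, JMeter, and BlazeMeter rather than hand-coded, but the underlying idea (a fixed number of virtual users measured against a response-time threshold) can be sketched in plain Java as below; the URL, user count, and threshold are assumptions for illustration only.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class MiniLoadCheck {

        public static void main(String[] args) throws Exception {
            int virtualUsers = 50;        // hypothetical concurrency level
            long thresholdMs = 2000;      // hypothetical response-time threshold
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(
                    URI.create("https://example.gov/uls/search")).build();   // placeholder URL

            // Fire the requests concurrently and record the elapsed time of each one.
            ExecutorService pool = Executors.newFixedThreadPool(virtualUsers);
            List<Future<Long>> timings = new ArrayList<>();
            for (int i = 0; i < virtualUsers; i++) {
                timings.add(pool.submit(() -> {
                    long start = System.nanoTime();
                    client.send(request, HttpResponse.BodyHandlers.discarding());
                    return (System.nanoTime() - start) / 1_000_000;   // elapsed milliseconds
                }));
            }

            long slow = 0;
            for (Future<Long> timing : timings) {
                if (timing.get() > thresholdMs) slow++;
            }
            pool.shutdown();
            System.out.println(slow + " of " + virtualUsers + " requests exceeded " + thresholdMs + " ms");
        }
    }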

Created test data for the load test scripts for the different environments (Development, System Test, and User Acceptance Test), executed the scripts, and provided the test results and reports.

Executed these LoadRunner test scripts in the System Test, Development, and User Acceptance Test environments and provided the results.

Created test scripts for RESTful web services, executed them, and generated the test results report.
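
A minimal RESTful check of this kind, written with the JDK's built-in HTTP client (Java 11+), could look like the following; the endpoint and the JSON field being asserted are hypothetical.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class LicenseSearchApiTest {

        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(
                    URI.create("https://example.gov/api/licenses/12345"))   // placeholder endpoint
                    .header("Accept", "application/json")
                    .GET()
                    .build();

            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

            // Basic checks: status code and a field expected in the payload.
            boolean pass = response.statusCode() == 200
                    && response.body().contains("\"licenseId\"");            // hypothetical field name
            System.out.println(pass ? "PASS" : "FAIL: status " + response.statusCode());
        }
    }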

Compared these web service test scripts against other web service tools such as SOAP and Load, and provided the best results.

Hands-on experience with core AWS features, which provide an elastic data center on demand along with packaged application development and test frameworks. Used AWS to set up development and test environments, taking full control of complex environments to meet developer needs directly from IDEs and building free-tier environments in the AWS cloud. Performed the complete testing in this AWS cloud as per the AWS requirements.

Performed load testing using BlazeMeter for performance testing purposes.

Extensively involved in Section 508 testing for the ASR Applet Removal and TCNS projects.

Experience with the ALM tool for test preparation, execution, and defect management.

Documented the defects identified during testing in the System, Development, and AT environments.

Extensively involved in the CMMI Level 3 process for Computech Inc. and provided all the artifacts from the testing perspective.

Currently involved in the CMMI Level 3 process for NCI and in the preparation of artifacts from the testing perspective.

Extensively involved in system testing and regression testing in AT for shutdown processes.

Conducted and Participated in the peer review of QA artifacts such as Test Plan, Test Scenario, Test Cases and Test Results.

Collaborated with the BAs on testing activities such as analyzing the requirements, functional specifications, Test Scripts, and Test Results.

Involved in reproducing the defects found in AT, retesting those defects in the System Test and Development environments, and providing workarounds when necessary.

Attended Agile training provided by Computech and also participated in in-house training.

Environment:

Java, Selenium, Cucumber, Ruby, Maven, Eclipse, Jenkins, Bamboo, GIT, UFT, Quick Test Professional 11.0, LoadRunner 12.2, Apache JMeter 3.2, RESTful Web Services, BlazeMeter 1.2.1, SQL Server 12.0, Oracle 11g, PostgreSQL, Toad, Big Data, Quality Center 11.0, XML, HTML, VMware, ILOG, JRules, Valtool, 508 Compliance, ALM, Serena Change Version Manager, AWS, Cloud, etc.

Employer: Hexaware Technologies, VA

Client: Freddie Mac

Date: Mar 2013 to Jul 2013

Project: Portals

Role: QA Test Engineer

Roles and Responsibilities:

Analyzed the requirements, developed the Test Scenarios, and uploaded them to the Wiki.

Developed the Positive, Negative, Boundary, Verification and Validation Test Cases in Quality Center.

Created the test data for the Test Cases in all test environments, e.g., the System Test and User Acceptance Test environments.

Expertise in developing automation frameworks with UFT and Selenium.

Expertise in creating automated test scripts using Ruby, Selenium, Cucumber, Java, Eclipse, Maven, and QTP tools.

Executed automated scripts in Jenkins for continuous integration. Used GIT and SVN as project repositories. Used Maven for tagging the scripts and maintaining versions of the automated test scripts in Subversion. Logged any defects found during test script execution in Jira.

Expertise in creating automated database test scripts for validating data in the database. Expertise in writing complex queries, using a query dictionary file as a repository. Used the POM file for the project configuration details that Maven uses to build the project. Executed automated test scripts in Jenkins for continuous integration.

Created test data for the automated scripts for the Development, System Test, and User Acceptance Test environments, executed the automated scripts in those environments, and provided the results.

Expertise in creating load test scripts for performance testing using LoadRunner 12.2, Apache JMeter 3.2, and BlazeMeter 1.2.1. Executed these performance test scripts by creating virtual users with calculated threshold values and generating load on the servers to measure the performance of the web application under multiple concurrent users against those thresholds. Generated and provided the test results and reports in PDF for the Portal 1.0 and Portal 1.1 projects.

Integrated the required test scripts into Quality Center and generated the test results reports using Quality Center.

Extensively worked on SQL: wrote and executed SQL queries per the requirements, exported the data from SQL into spreadsheets, and performed data validation using Excel Compare.

Validated the data between SQL and the portal website to make sure the data was accurate across the database, the web portal, and the third-party test tools used.

Extensively worked on Linux: generated and executed Bash scripts for the input and output files used for data validation.

Involved in reproducing the defects found in UAT, retesting those defects, and providing solutions or workarounds.

Extensively involved in generating the Automation Script using Quick Test Professional.

Executed Automated scripts in different Environments

Experience with the ALM tool for test preparation, execution, and defect management.

Generated the automated test results for each Test Case using Quality Center in a Word document and uploaded it to the Wiki.

Extensively involved in testing AWS Cloud features. Used AWS to set up development and test environments, taking full control of complex environments to meet developer needs directly from IDEs and building free-tier environments in the AWS cloud. Performed the complete testing in this AWS cloud as per the AWS requirements.

Extensively involved in generating scripts for RESTful web services for web service testing.

Executed the web service test scripts using RESTful web services and SOAP, and compared the data with the web service logs, the database, and the portals.

Generated the web service test results using HP Web Services and maintained them in the Wiki.

Conducted and Participated in the peer review of QA artifacts such as Test Plan, Test Scenario, Test Cases and Test Results.

Knowledge of Cucumber and Ruby for generating the automated scripts.

Extensively involved in the deployment of O&M releases and Portals projects, and verified the deployment test results/status whenever necessary.

Environment:

Java, Selenium, Cucumber, Eclipse, Maven, Jenkins, Bamboo, UFT, Quick Test Professional 11.0, LoadRunner 12.2, RESTful Web Services, Apache JMeter 3.2, BlazeMeter 1.2.1, SQL Server 12.0, Oracle 11g, Toad, Big Data, Quality Center 11.0, XML, HTML, VMware, ILOG, JRules, Valtool, 508 Compliance, ALM, Serena Change Version Manager, AWS, Cloud, etc.

Employer: Aderas, VA

Client: Pension Benefit Guaranty Corporation

Date: Nov 2011 to Feb 2013

Project: BCVS

Role: QA Test Engineer

Roles and Responsibilities:

Thoroughly analyzed the requirements and design defined for BCVS and O&M in the JRules template, developed the Test Scenarios, and uploaded them to Serena Change Manager Version Control and Rule Studio for the following projects: ACT Archive 6.2.1, ADT 6.2/6.3, EOSL Release 2012, Web Recalculation, End-to-End Test Cases, UPT Functional Test Cases, and 508 Compliance.

Developed the positive, negative, and regression Test Cases in Quality Center after analyzing the requirements and design specifications.

Created the Test Environment for testing from within Rule Studio.

Documented the test results in Rule Studio.

Expertise in developing automation frameworks with UFT and Selenium.

Expertise in creating automated test scripts using Ruby, Selenium, Cucumber, Java, Eclipse, Maven, and QTP tools.

Executed automated scripts in Jenkins for continuous integration. Used GIT and SVN as project repositories. Used Maven for tagging the scripts and maintaining versions of the automated test scripts in Subversion. Logged any defects found during test script execution in Jira.

Expertise in creating automated database test scripts for validating data in the database. Expertise in writing complex queries, using a query dictionary file as a repository. Used the POM file for the project configuration details that Maven uses to build the project. Executed automated test scripts in Jenkins for continuous integration.

Expertise in creating load test scripts for performance testing using LoadRunner 12.0 and Apache JMeter 3.2. Executed these performance test scripts by creating virtual users with calculated threshold values and generating load on the servers to measure the performance of the web application under multiple concurrent users against those thresholds. Generated and provided the test results and reports in PDF for the ACT Archive 6.2.1, ADT 6.2/6.3, and Web Recalculation projects.

Created test data for the load test scripts for the different environments (Development, System Test, and User Acceptance Test), executed the scripts, and provided the test results and reports.

Extensively involved in creating web service test scripts for RESTful web services and comparing the test results against other tools such as SOAP and Load.

Used ILOG for data validation and for monitoring the data in terms of component graphs.

Extensively involved in testing AWS Cloud features. Used AWS to set up development and test environments, taking full control of complex environments to meet developer needs directly from IDEs and building free-tier environments in the AWS cloud. Performed the complete testing in this AWS cloud as per the AWS requirements.

Documented the defects identified during testing in the System, CDE-I, and ITC environments.

Conducted and Participated in the peer review of QA artifacts such as Test Plan, Test Scenario, Test Cases and Test Results.

Wrote and executed test scripts for automation testing using Quick Test Professional, developed the test suite, and ran it in the CDE-I and ITC test environments.

Wrote and executed test scripts for performance testing using LoadRunner, developed the test suite, ran it in the CDE-I and ITC environments, and generated the test results.

Collaborated with the BAs on testing activities such as analyzing the requirements, functional design, Test Scripts, and Test Results.

Involved in reproducing the defects found in UAT, retesting those defects, and providing solutions or workarounds.

Extensively involved in the deployment of O&M releases and BCVS projects, and verified the deployment test results/status whenever necessary.

Extensively involved in Reproducing and analyzing the defects found in all the testing environments: System, CDE-I and ITC.

Developed an automation suite using QTP and ran the suite for all the different stages of each new build generated.

Developed the performance test suite using LoadRunner and ran the suite in the CDE-I and ITC environments.

Environment:

Quick Test Professional 11.0, LoadRunner 12.0, Apache JMeter 3.2, RESTful Web Services, SQL Server 12.0, Oracle 11g, PostgreSQL, Toad, Big Data, MS Access, Quality Center 11.0, XML, HTML, VMware, ILOG, JRules, Valtool, 508 Compliance, Serena Change Manager Version Control, AWS, Cloud.

Employer: Technology Ventures, VA

Client: Fannie Mae

Date: Feb 2010 to Oct 2011

Project: HP-EPPM (Enterprise Project Portfolio Management)/ Interim Resource Management Tool (iRMT)/ Central Programme Office (CPO)

Role: QA Test Engineer

Roles and Responsibilities:

Analysis of Business/Functional requirements and uploading the requirements to Requisite Pro and tagging them in Requisite Pro as Functional and Business requirements.

Pushing the requirements from Requisite Pro to Quality Center using ICART – RMSync.

Writing the Test Cases/Test Scripts in Quality Center and mapping them to the respective requirements.

Created the Test Data for each Test Case and Executed the Test Cases in Quality Center.

Generating the RTM by running SQL Scripts/BOXI.

Executed the Test Cases/ Test Scripts in Quality Center for all the New Releases.

Creating automated test scripts using VB Script and QTP tools.

Created test data for the automated scripts for the Development, System Test, and User Acceptance Test environments, executed the automated scripts in those environments, and provided the results.

Expertise in creating load test scripts using LoadRunner for performance testing purposes.

Created test data for the load test scripts for the different environments (Development, System Test, and User Acceptance Test), executed the scripts, and provided the test results and reports.

Generated the Test Plan and Test Result Summary documents.

Wrote SQL queries with respect to the business requirements and executed them in SQL Server 2005.

Maintaining the admin role for Requisite Pro, Quality Center, ICART, RMSync, and Clear Quest.

Analyzed the data and different report formats (bar, pie, etc.) in Panorama.

Developed an automation suite using QTP and ran the suite for all the different stages of each new build generated.

Developed the Polygraphs in Excel Sheet for the Test Results for all types of Testing.

Worked extensively with SQL queries.

Involved in the review process for the Test Cases.

Identification and notification of Showstopper issues, escalation if and when required during testing. Proper communication with management/offshore team to relay details of issues and their impact on the application.

Environment:

Quick Test Professional 11.0, LoadRunner 11.0, SQL Server 12.0, Oracle 11g, Big Data, Toad, Quality Center 11.0, XML, HTML, VMware, ILOG, JRules, Valtool, 508 Compliance, Serena Change Version Manager.

Employer: TCS, VA

Client: Freddie Mac

Date: Jan 2009 to Jan 2010

Project: Adv Service Portal 1.0 / Adv SBO

Role: QA Test Engineer

Roles and Responsibilities:

Analyzed the business requirements from DOORS and developed the Test Plan and Test Cases in HP Quality Center.

Developed the Test Cases for SIT, End-to-End Testing, Services Testing, and Back-End Testing in Quality Center for the respective business functions.

Created the traceability matrix mapping the SIT, E2E, and Service Test Cases to the respective business requirements.

Wrote SQL queries based on the business requirements in DOORS and the tables in the database.

Analyzed gaps between the requirements and the Test Cases developed.

Maintaining the RTM and generating the SDLC report for the test cases for all the business functionalities.

Raising the defects in Clear Quest and maintaining the Requirement Traceability Matrix for Clear Quest and Quality Center

Created the test data for each Test Case and executed the Test Cases in Quality Center at the different stages: Preliminary, System, Integration, and Regression.

Created the test data for each Test Case and executed all the SQL queries at all stages: Preliminary, System, Integration, and Regression.

Executed all the Service Test Cases using the SOAP Sonar tool.

Generated request XML files, ran them against the respective business functionalities for the web services, generated response XML files, and compared the request and response files.

Captured the XML file differences and screenshots during execution.
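
A simple line-by-line comparison of a request/response XML pair, of the kind described above, could be sketched in Java as follows; the file names are placeholders, and a real XML-aware comparison would also tolerate formatting differences that a plain text diff does not.

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.List;

    public class XmlCompare {

        public static void main(String[] args) throws Exception {
            // File names are placeholders for a per-transaction request/response pair.
            List<String> request  = Files.readAllLines(Path.of("request_12345.xml"));
            List<String> response = Files.readAllLines(Path.of("response_12345.xml"));

            int max = Math.max(request.size(), response.size());
            for (int i = 0; i < max; i++) {
                String left  = i < request.size()  ? request.get(i)  : "<missing>";
                String right = i < response.size() ? response.get(i) : "<missing>";
                if (!left.equals(right)) {
                    System.out.printf("Line %d differs:%n  request : %s%n  response: %s%n",
                            i + 1, left, right);
                }
            }
        }
    }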

Automated the scripts using VB Script and ran the automated scripts for the respective business functionalities.

Developed an automation suite using QTP and ran the suite for all the different stages of each new build generated.

Developed the Test cases for Performance Testing with respect to the Product features.

Executed the test cases to check the Performance by using Performance Testing methodologies.

Developed the Polygraphs in Excel to compare the Performance of the product.

Responsible for GUI Testing, System Testing, Integration Testing, E2E Testing, Regression Testing, Service Testing and Back End Testing.

Worked extensively with SQL queries.

Generated SDLC report for all the modules for all the different stages.

Involved in the review process for the Test Cases and the RTM process.

Identification and notification of Showstopper issues, escalation if and when required during testing. Proper communication with management/offshore team to relay details of issues and their impact on the application.

Environment:

QTP, SOAP Sonar, Service Testing, Performance Testing, Manual Testing, GUI Testing, E2E Testing, Regression Testing, Black Box Testing, Integration Testing, PVCS, Java, J2EE, VB Script, C#, HTML, XML, Oracle 9i, Web Services, HP Quality Center, Clear Quest.

Employer: Wipro Technologies, India

Date: Jan 2005 to Jan 2008

Project: Galileo/ Supply Chain Management Services/ Auto Banker Banking System

Role: QA Test Engineer

Roles and Responsibilities:

Configured the Quantum storage devices DXI3500, DXI4500, DXI5500, and DXI7500 using the LUN 5 mechanism.

Configured the DX100 machines using Solstice DiskSuite and shared Quantum storage over Ultra Wide Differential SCSI interfaces; striped volumes were carefully configured to work optimally with Oracle 7.3.3 and 8.

Created both RAID 5 and RAID 0+1 striped LUNs for use with DXI3500, DXI5500, and DXI7500 for terabyte-scale disk configurations.

Configured the SCSI and iSCSI senders and receivers for backup and restore of terabytes of data storage.

Generated Test Suite by using Rational Robot and Rational Unified Process.

Installed Oracle 7.3.3 and created a basic database. Configured the SCSI options for optimal disk I/O, UFS and VXFS file systems with appropriate block sizes.

Installed, debugged, and configured UNIX-based subsystems. Installed and configured the Solaris OS 9/10, TCP/IP network configuration, RAID volumes on storage array disks using DiskSuite, file systems, NAS and NFS services, user accounts, automatic installation via JumpStart, and network terminal servers; configured departmental intranets with HTTP and FTP services, a basic firewall with FireWall-1, and high-availability servers for NFS.

Analyzed the business requirements and developed the Test Plan and Test Cases.

Created Test Data and Ran the Test Cases for SIT, End to End testing and Regression Testing.

Raised the Defects in Bugzilla

Generated SDLC report for all the stages of SDLC Process.

Environment:

Quantum DX100, DXI3500, DXI4500, DXI5500 and DXI7500, Rational tools (Rational Robot, Rational Unified Process), Solaris 2.x, Linux (kernel 2.4/2.6), RAID Manager 6, IP Multicasting, Storage Array: Connector SAN directors and switches, File System (VxFS), NFS, NAS, DNS, JumpStart, TCP/IP, SSH, Brocade switch, JNI/Qlogic/Emulex fibre boards and configuration, Fibre Channel switch, Brocade 3800 FC switches, SATA-based storage arrays, FC switches, Bugzilla, Atlassian JIRA, Atlassian Confluence.

Employer: Systime Computer Systems Ltd, India

Date: Jun 2002 to Dec 2004

Project: E-Commerce, ERP, SAP Packages

Role: QA Test Engineer

Roles and Responsibilities:

Analysis of Business/Functional requirements and uploading the requirements to Requisite Pro and tagging them in Requisite Pro as Functional and Business requirements.

Pushing the requirements from Requisite Pro to Quality Center using ICART – RMSync.

Writing the Test Cases/Test Scripts in Quality Center and mapping them to the respective requirements.

Created the Test Data for each Test Case and Executed the Test Cases in Quality Center.

Generating the RTM by running SQL Scripts/BOXI.

Executed the Test Cases/ Test Scripts in Quality Center for all the New Releases.

Creating automated test scripts using WinRunner and VB Script tools.

Created test data for the automated scripts for the Development, System Test, and User Acceptance Test environments, executed the automated scripts in those environments, and provided the results.

Generated the Test Plan and Test Result Summary documents.

Wrote SQL queries with respect to the business requirements and executed them in SQL Server 2000.

Maintaining the Admin Role for Quality Center and Clear Quest.

Analyzed the data and different report formats (bar, pie, etc.) in SSRS.

Developed the Polygraphs in Excel Sheet for the Test Results for all types of Testing.

Worked extensively with SQL queries.

Involved in the review process for the Test Cases.

Identification and notification of Showstopper issues, escalation if and when required during testing. Proper communication with management/offshore team to relay details of issues and their impact on the application.

Environment:

Quality Center, SQL Server 2000, Oracle 9i, Visual Basic, Java, JSP, ASP, WinRunner, VB Script, JavaScript, XML, HTML, DHTML.

QUALIFICATION:

B.E. [E & C]: Bachelor of Engineering in Electronics & Communication, India

M.B.A. [Master of Business Administration], India.

ADDITIONAL INFORMATION:

SKILLS

Java (6 years), Cucumber (6 years), Selenium (6 years), Maven (6 years), API (6 years), .NET (4 years), UFT (6 years).

LoadRunner (5 years), JMeter (4 years), BlazeMeter (2 years), QTP (4 years), SOAP UI (4 years), REST web services (4 years), QC (8 years).

Database (10 years), SQL Server (10 years), Oracle (5 years), PostgreSQL (2 years).

Jira (8 years), Agile (8 years), PVCS (2 years), UNIX (2 years).

TECHNICAL SKILLS:

Testing Tools: LoadRunner, JMeter, BlazeMeter, SoapUI, RESTful web services, APIs.

Languages: Java, C, C++, ASP.NET, VB.NET, Cucumber, Selenium, Maven, XML, HTML

Database: Oracle 11g/12c, MS SQL Server, DB2, PostgreSQL.

Scripting: JavaScript, VB Script, Unix Shell, Python, PL/SQL, and Ruby

Management Tools: UFT, Jira, GIT, Jenkins, Bamboo, HP ALM, Remedy.

Methodologies: Waterfall, V-Model, Iterative, Agile, Scrum.

Operating System: Windows 7/XP/NT, Linux, Unix.

Servers: IBM WebSphere, JBoss, Apache, Tomcat, IIS.

Networking: LAN, WAN, Router, Modem, Switch

Reporting Tools: Crystal Reports, SSRS


