Location: Lake Forest, CA
Posted: April 07, 2020

QA Test Consultant/Analyst

Name : Madhusudhan Gayam

E-mail : adcpfu@r.postjobfree.com Mobile: +1-762-***-****

Summary: 14+ years of experience in the IT industry, specializing in Manual, Automation (in-house and open source), Functional, API Web services (REST and SOAP), System, Integration and UAT/CAT testing, as well as test data and test environment setup. Proficient in delivering Mobile, Client/Server, Web and Cloud applications. Strong experience in domains such as Salesforce (Sales Cloud, Service Cloud, Customer Portal, etc.), CRM, Cloud Security, Pharma, Telecom (IPTV), Media, Entertainment, Mobile, Retail, Finance, E-Commerce, ServiceNow, Insurance, Healthcare, Gaming (Gambling, Slots, Regulations and Compliance), Environmental Science, Digital, etc.

Good Experience in Waterfall, V-Model and Agile (Scrum, Kanban and customized Agile) methodologies

Well versed with manual testing concepts, processes, techniques, approaches and methodologies

8+ years of QA Lead experience handling onshore and offshore teams, clients, stakeholders, etc.

Experience in functional automation using Selenium WebDriver and other in-house tools (a brief illustrative sketch follows this summary)

Proficient in DB, data migration and back-end testing through preparing and executing SQL queries (see the sketch after this summary)

Good experience in handling omnichannel testing, with exposure to POS (point of sale) and WMS

Well versed with browser compatibility and both desktop and mobile browser responsive testing

Experience with system, integration and end to end testing on both mobile and desktop platforms

Well versed with Sales Cloud, Service Cloud, Chatter Cloud and the Customer Portal of Salesforce.com and force.com, Visualforce, both Classic and Lightning features, and CRM configurations

Well versed with Lead, Account, Contact, Opportunity and Case management of SFDC CRM

Experience in testing Validation and Sharing Rules, Workflows, Custom objects, Tabs, Record Types, Page and Search Layouts, Profiles, Roles, Emails and Permission Sets for different users at different levels of organization

Actively participating in Bug Triage and SCRUM meetings and interacting with Client and Team

Experience in setting up a TCOE and being part of the Test Center of Excellence team

Experience in Bug Reporting, Reviewing, Verification, Compilation, Tracking and Prioritizing them

Experience on STBs (Set-top-Box), Middleware, Embedded, Retail, Media, CRM, Real time OS and APIs testing

Well versed with Android and iOS apps (both native and hybrid), risk based and provisioning-based testing

Experience in creation of Test Plan, Estimation, Metrics, Reports, Schedules, Daily Status and process docs

Strong Experience in Conducting Agile ceremonies like Backlog grooming, Planning, Review and Retrospection

Good experience in casino gaming, compliance, regulations, infrastructure and hardware peripheral testing of slot machines

Proficient with bug tracking and test management tools like Jira, IBM Clear Quest, Test Track Pro, Rally, TFS, Source Forge, Test Link and QA Sync

Thorough understanding and knowledge of STLC, SDLC, Test Process, Test Methodologies, Testing Fundamentals, Bug Life Cycle and Release Process

Writing and executing functional, UI, acceptance, regression and integration tests on multi-tiered web applications in cutting-edge Agile and Waterfall environments

Experience in compilation/deployment of builds and building test environments from the ground up, including setup of test engines/servers in a variety of environments

Experience in User Interface (GUI), Accessibility (ADA, WCAG, 508, etc. compliances), Backend, Ad-hoc, Compatibility, Exploratory, Database, End-End, System, Integration, Performance, Retesting and UAT testing

Researching, recommending, building and/or implementing standardized tools to help move faster and get more efficient in carrying out day to day QA activities and tasks

Key client contact during User Acceptance Test phases; managing all reporting, test metrics and scope estimation, and providing ongoing support to the application development project manager

Experience in designing, developing, executing and maintaining automated test, smoke and performance scripts using open-source, in-house and/or commercial automation tools

Identifying, developing and reviewing test suites for various functional areas based on product requirements, and evaluating and reviewing product specifications to identify gaps

Coordinating all aspects of the project with other teams such as Data Architecture, Reporting, Quality Assurance, Support, PMOs, Infrastructure, product managers/owners and business analysts/owners, and coordinating different application/environment teams in running integrations/interfaces
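
To illustrate the Selenium WebDriver automation referenced in this summary, the minimal Java sketch below drives a login flow through a simple page object. It is only an illustrative sketch: the URL, locators, credentials and expected title are placeholders, not details of any project listed here.

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    // Hypothetical page object for a login page (locators are assumptions).
    class LoginPage {
        private final WebDriver driver;
        LoginPage(WebDriver driver) { this.driver = driver; }

        void login(String user, String password) {
            driver.findElement(By.id("username")).sendKeys(user);
            driver.findElement(By.id("password")).sendKeys(password);
            driver.findElement(By.id("loginButton")).click();
        }
    }

    public class LoginSmokeTest {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();        // assumes a matching chromedriver is available
            try {
                driver.get("https://example.com/login");  // placeholder URL
                new LoginPage(driver).login("qa.user", "secret");
                // Minimal assertion: the post-login page title is expected to change (assumed value).
                if (!driver.getTitle().contains("Dashboard")) {
                    throw new AssertionError("Login smoke check failed, title: " + driver.getTitle());
                }
                System.out.println("Login smoke check passed");
            } finally {
                driver.quit();
            }
        }
    }

In a real suite this would typically run under TestNG or JUnit via a Maven build rather than a main method.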
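
Similarly, for the SQL-based back-end and data-migration checks mentioned above, the sketch below compares row counts between a source and a target table over JDBC. The JDBC URLs, credentials and table names are assumptions, and the matching JDBC drivers would need to be on the classpath.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class RowCountCheck {
        // Runs SELECT COUNT(*) against the given table and returns the count.
        static long count(String jdbcUrl, String user, String pass, String table) throws Exception {
            try (Connection c = DriverManager.getConnection(jdbcUrl, user, pass);
                 Statement s = c.createStatement();
                 ResultSet rs = s.executeQuery("SELECT COUNT(*) FROM " + table)) {
                rs.next();
                return rs.getLong(1);
            }
        }

        public static void main(String[] args) throws Exception {
            // Placeholder connection strings and table names.
            long source = count("jdbc:oracle:thin:@//src-host:1521/SRC", "qa", "pwd", "CUSTOMERS");
            long target = count("jdbc:postgresql://tgt-host:5432/tgt", "qa", "pwd", "customers");
            System.out.printf("source=%d target=%d%n", source, target);
            if (source != target) {
                throw new AssertionError("Row count mismatch after migration");
            }
        }
    }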

Domain Knowledge:

Casino Gaming, Embedded, Telecom (IPTV), Set-top-Box (STB), Web Applications, Finance, Insurance, Digital, Media, Entertainment, Waste Management, e-commerce, Retail, CRM, Pharmaceutical Claims, HealthCare, Cloud Security, Data Science, Data Analytics, DLP, Anomaly Detection (Machine Learning and Artificial Intelligence), Mobile (iOS, Android, Hybrid and Native) Apps, API Web services (REST and SOAP), ServiceNow, and Salesforce (SFDC, Sales Cloud, Service Cloud, Customer Portal, Chatter Cloud, etc.)

Educational Qualification:

M.Sc. (Computer Science), Nagarjuna University, March 2003, with 78%

Technical Skills:

Operating Systems : Windows XP/NT/2000/9x, MS-DOS, UNIX, Linux

Programming Languages : C, C++, Core Java, VB

Scripting and Web : JavaScript, VBScript, HTML, DHTML

Test Automation Tools : Shark, Crawler, Build Server, Selenium (WebDriver, POM), JMeter

Test and Bug Tracking Tools : HP ALM, HP QC, IBM ClearQuest (IBM CQ), Test Track Pro (TTP), Jira, Rally, Source Forge, TFS, qTest, Mantis, QA Sync, SharePoint, TestLink, ServiceNow

Version Control Tool : VSS, SVN, Clarity, PSuit, Jenkins, BitBucket, Git, Azure DevOps

Other Tools : Putty, WinSCP, WinSend, TOAD, PgAdmin, Anywhere Device, Confluence, Slack, SOAPUI, Postman, Swagger, Wrike, IBM Sterling OMS, Lucid Charts, Smartsheets

Application & Web Servers : WebLogic 7.1, Orion 2.0.2, JBoss 4.0.3, Tomcat 7 & 8, IIS 6.0 and 7.5

Databases : Oracle 9i, MySQL, SQL Server, Postgres, HBase, Hadoop, Vertica, Mongo, Ping

Devices : iOS (iPhone and iPad mini) and Android (Nexus and S4)

Experience Summary:

#9

Client: Capital Group, Irvine, CA Jan'20 – Present

Role: IT - Technical Test Lead CRM SFDC Sales Cloud

Description: iCRM, Retirement Plan, Institutional Consultants and Clients

Responsibilities:

Analyzing and reviewing Requirements, User Stories and identifying the gaps

Keeping all stakeholders up to date on the status of the project/Sprints

Functional, Regression, Ad-hoc, Exploratory, End-End and API Web services (REST) testing

Bug Reporting, Validation, Compilation, Review, Prioritizing and Verification

Plan, Estimate and Organize the work within the team

Attending scrum and client calls to give daily/weekly/monthly status updates

Account, Contact, Opportunity, customer portal, Case and product management of SFDC CRM

Validation and Sharing Rules, Workflows, Custom objects, Tabs, Record Types, Page and Search Layouts, Profiles, Roles, Emails and Permission Sets for different users at different levels of the organization (a hedged example follows this list)

Performing testing on Salesforce Classic and Lightning Components, Lightning Web Components, etc.

Analyzing Salesforce debug logs to help other teams in troubleshooting and finding root cause of the issues

Mentoring, guiding, helping, motivating, inspiring, directing and providing solutions to team members

Maintaining query logs, project related docs and conducting project, defect status meetings

Cross-team collaboration with engineering/dev, PMOs, Product Managers and partner teams

Testing the application in different kinds of environments such as QA and Staging/Pre-Prod (UAT)
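
As a hedged illustration of validation-rule testing through the Salesforce REST API, the Java sketch below posts a record expected to violate a (hypothetical) validation rule and checks that the API rejects it. The instance URL, API version, token variable and payload are all assumptions, not details of this engagement.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class ValidationRuleCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder instance URL and API version.
            URL url = new URL("https://example.my.salesforce.com/services/data/v52.0/sobjects/Account/");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Authorization", "Bearer " + System.getenv("SF_ACCESS_TOKEN"));
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setDoOutput(true);

            // Record intentionally missing a field that a hypothetical validation rule requires.
            String body = "{\"Name\":\"QA Validation Rule Check\"}";
            try (OutputStream os = conn.getOutputStream()) {
                os.write(body.getBytes(StandardCharsets.UTF_8));
            }

            int status = conn.getResponseCode();
            // A blocked create is expected to return HTTP 400 (validation error details come back in the error body).
            if (status == 400) {
                System.out.println("Validation rule rejected the record as expected");
            } else {
                throw new AssertionError("Expected HTTP 400 from validation rule, got " + status);
            }
        }
    }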

Environment: HP QC/ALM, Jira, SQL Server, ServiceNow, Apex, qTest, Salesforce.com (SFDC, Sales Cloud, Partner, Customer Community Portal, PSE Console and Service Cloud), Postman, SharePoint, RocketDocs, WebEx, AEM

#8

Client: S&P Global Platts, Denver, CO Jul’19 – Jan’20

Role: Software Developer in Test

Description: Benport, Platts Dimensions, PACE Automation, WRD API (OData Services) testing

Responsibilities:

Analyzing and reviewing Requirements, User Stories and identifying the gaps

Keeping all stakeholders up to date on the status of the project/Sprints

Regression, Ad-hoc, Exploratory, End-End, OData and API Web services (REST) testing (see the sketch after this list)

Bug Reporting, Validation, Compilation, Review, Prioritizing and Verification

Plan, Estimate and Organize the work within the team and resource planning

Attending scrum and client calls to give daily/weekly/monthly status updates

Review bug reports for errors prior to them being entered into the tracking software

Functional, System, UI, Accessibility, Integration, Database, Data Migration, Mobile (hybrid and native) and Browser Compatibility testing

Performing functional, automation, database and API testing for a wide range of enterprise applications

Attending and contributing to sprint planning sessions, designing test approaches to test user stories within the same sprint, and looking for opportunities to automate tests by identifying areas so as to minimize retesting

Initiating and supporting performance and security testing

Writing test cases and scenarios from the requirements/designs; manage bug tracking

Test Analysis and Reporting - investigating new testing methodologies, keeping abreast of the latest advances/techniques and ramping up the team accordingly

Creating and maintaining functional, non-functional and system test scenarios and plans

Preparing test plan, providing estimations, reviewing performance and security test plans

Writing, Updating, Reviewing, Refining and Executing test cases and creation of test data

Cross-team collaboration with engineering/dev, PMOs, Product Managers and partner teams

Testing the application in different kinds of environments such as Test, Staging, Pre-Prod and Production

Setting up and reviewing team/personal goals, giving continuous feedback and reviewing team performance

Enabling the team to prepare for Test Readiness based on IT services knowledge
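
As an illustration of the OData/REST checks listed above, the sketch below issues a GET against a placeholder OData endpoint and asserts the HTTP status and the basic shape of the payload; the service URL and entity set are assumptions, not the actual WRD API.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class ODataSmokeCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder OData service URL; a real check would target the service under test.
            URL url = new URL("https://services.example.com/odata/Products?$top=1");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            conn.setRequestProperty("Accept", "application/json");

            int status = conn.getResponseCode();
            if (status != 200) {
                throw new AssertionError("Expected HTTP 200, got " + status);
            }

            StringBuilder payload = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
                String line;
                while ((line = in.readLine()) != null) payload.append(line);
            }
            // OData JSON responses wrap entities in a "value" array.
            if (!payload.toString().contains("\"value\"")) {
                throw new AssertionError("Response does not look like an OData payload");
            }
            System.out.println("OData smoke check passed");
        }
    }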

Environment: Azure DevOps, TFS, .Net, CCP, Java Script, SharePoint 2010 and 2013, Java, Selenium, SQL Server, Maven, Jenkins, ALM, VSTS, Angular JS, NodeJS, OData Services, Postman, SOAP API, Swagger, JMeter, iOS, Android, Confluence, WebEx Team, ServiceNow, Microsoft Teams

#7

Client: Charter Communications (Spectrum, TWC, BHN), Denver, CO Apr’19 – Sep’19

Role: Principal Technologist I

Description: spectrum.net, business.net, My Spectrum app, IT Services, Microservices and SOA Architecture

Responsibilities:

Analyzing and reviewing Requirements, User Stories and identifying the gaps

Keeping all stakeholders up to date on the status of the project/Sprints

Regression, Ad-hoc, Exploratory, End-End and API Web services (RESTful and SOAP) testing

Bug Reporting, Validation, Compilation, Review, Prioritizing and Verification

Plan, Estimate and Organize the work within the team and resource planning

Attending scrum and client calls to give daily/weekly/monthly status updates

Review bug reports for errors prior to them being entered into the tracking software

Functional, System, UI, Accessibility, Integration, Database, Data Migration, Mobile (hybrid and native) and Browser Compatibility testing

Understanding technical details of Xplatform programs such as Biller Isolation, SDP and Company-wide Payment Services, articulating the impact for SIT LOBs and developing a Test Strategy for the same

Creating and maintaining functional, non-functional and system test scenarios and plans

Preparing test plan, providing estimations, reviewing performance and security test plans

Writing, Updating, Reviewing, Refining and Executing test cases and creation of test data

Cross-team collaboration with engineering/dev, PMOs, Product Managers and partner teams

Testing the application in different kinds of environments such as Test, Staging, Pre-Prod and Production

Setting up and reviewing team/personal goals, giving continuous feedback and reviewing team performance

Enabling the team to prepare for Test Readiness based on IT services knowledge

Identification of IT services corresponding to consuming microservices

Validation of IT web services interfacing with microservices (a hedged SOAP sketch follows this list)

Validation of functionalities end to end, from the consuming application to microservices, web services and DB

Enabling the team to update the API test suite (automation) on the IT web services side

Helping the project team to understand Automation Scripts/Framework for the interfacing calls

E2E Architecture Understanding & Workflow Documentation

Identify and document Sequence Diagrams for E2E Workflows from UI - SE layer - IT layer

Developing Ball-drop artifacts depicting E2E workflows across the Architecture
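
As a hedged illustration of validating an IT web service behind the microservices layer, the sketch below posts a hand-built SOAP envelope and checks the HTTP status. The endpoint, namespace, operation and SOAPAction value are placeholders, not actual interfaces from this engagement.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class SoapSmokeCheck {
        public static void main(String[] args) throws Exception {
            // Hypothetical operation and namespace for illustration only.
            String envelope =
                "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\" " +
                "xmlns:svc=\"http://example.com/account\">" +
                "<soapenv:Body><svc:GetAccountStatus><svc:accountId>12345</svc:accountId>" +
                "</svc:GetAccountStatus></soapenv:Body></soapenv:Envelope>";

            URL url = new URL("https://it-services.example.com/account");  // placeholder endpoint
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
            conn.setRequestProperty("SOAPAction", "GetAccountStatus");
            conn.setDoOutput(true);
            try (OutputStream os = conn.getOutputStream()) {
                os.write(envelope.getBytes(StandardCharsets.UTF_8));
            }

            int status = conn.getResponseCode();
            if (status != 200) {
                throw new AssertionError("Expected HTTP 200 from SOAP endpoint, got " + status);
            }
            System.out.println("SOAP smoke check returned HTTP 200");
        }
    }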

Environment: SharePoint 2010 and 2013, Java, Ping DB, SQL Server, Maven, Jenkins, ALM, Jira, Angular JS, NodeJS, Postman, MuleSoft, SOAP API, Swagger, iOS, Android, Confluence, WebEx Team, ServiceNow, ICOMS

#6

Client: OSISoft, San Francisco area, CA Jun’18 – Mar’19

Role: QA Lead SFDC Service Cloud and Customer Portal/Sr. SFDC Data Migration and Integration Engineer

Description: Salesforce Service Cloud and Customer Portal and SFDC Data Migration and Integration

Responsibilities:

Managing a team of 4 engineers and 1 to 4 projects in parallel, with onshore and offshore coordination

Analyzing and reviewing Requirements, User Stories and identifying the gaps

Conducting risk-based and provisioning-based testing

Keeping all stakeholders up to date on the status of the project/Sprints

Regression, Ad-hoc, Exploratory, End-End and API Web services (RESTful) testing

Bug Reporting, Validation, Compilation, Review, Prioritizing and Verification

Lead, Account, Contact, Opportunity, customer portal, Case and knowledge management of SFDC CRM

Tested Validation and Sharing Rules, Workflows, Custom objects, Tabs, Record Types, Page and Search Layouts, Profiles, Roles, Emails and Permission Sets for different users at different levels of the organization

CTI/InContact, Coveo Search, Bomgar and other tool integration testing with Service Cloud and the Customer Portal

Performing testing on Salesforce Classic and Lightning Components, Lightning Web Components, etc.

Plan, Estimate and Organize the work within the team and resource planning

Attending scrum and client calls to give daily/weekly/monthly status updates

Review bug reports for errors prior to them being entered into the tracking software

Functional, System, UI, Accessibility, Integration, Data Migration, UAT and Browser Compatibility testing

Creating and maintaining functional, non-functional and system test scenarios and plans

Preparing test plans, providing estimations, adding new test cases and covering ad-hoc and negative test cases

Writing, Updating, Reviewing, Refining and Executing test cases and creation of test data

Mentoring, guiding, helping, motivating, inspiring, directing and providing solutions to team members

Maintaining query logs, project related docs and conducting project, defect status meetings

Cross-team collaboration with engineering/dev, PMOs, Product Managers and partner teams

Testing the application in different kinds of environments such as Test, Staging and Pre-Prod (UAT)

Environment: SharePoint 2010 and 2013, SQL Server, ServiceNow, Apex, Salesforce.com (SFDC, Sales Cloud, Partner, Customer Community Portal, PSE Console and Service Cloud), Postman, Dell Boomi, pForce, Smartsheets, Bomgar, CTI, InContact, Coveo Search, Lucid Charts

#5

Client: Republic Services Inc, Phoenix, AZ Aug’15 – Dec’17

Role: Lead QA/QA Manager

Project Description: Classic/Digital MR, RS.com (eCommerce), Capture (SFDC, Sales Cloud) and Sales Com

The client is in the environmental services business: it collects waste/garbage from residents, industries and communities across the USA, processes some of it and landfills the rest. The applications mentioned above give end users the ability to review all available services, create an account, maintain their profile, raise requests and make payments.

Responsibilities:

Managing a team of 7 engineers and 1 to 4 projects in parallel

Analyzing and reviewing Requirements, User Stories and identifying the gaps

Browser compatibility, desktop and mobile browser responsive testing

Keeping all stakeholders up to date on the status of the project/Sprints

Experienced in Automation and execution of Test cases using Selenium

Bug Reporting, Validation, Compilation, Review, Prioritizing and Verification

Testing Lead, Account, Contact, Opportunity and Case objects and Visualforce and Lightning of SFDC CRM

Tested Validation and Sharing Rules, Workflows, Custom objects, Tabs, Record Types, Page and Search Layouts, Profiles, Roles, Emails and Permission Sets for different users at different levels of the organization

Regression, Ad-hoc, Exploratory and API Web services (RESTful and SOAP) testing

Plan, Estimate and Organize the work within the team and resource planning

Attending scrum and client calls to give daily/weekly/monthly status updates

Review bug reports for errors prior to them being entered into the tracking software

Functional, System, UI, Accessibility (ADA, WCAG, 508, etc. compliances), Integration, Database, Data Migration, Mobile (hybrid and native) and Browser Compatibility testing

Creating and maintaining functional, non-functional and system manual test scenarios and plans

Preparing test plan, providing estimations, reviewing performance and security test plans

Adding new test cases to an existing test plan and covering ad-hoc and negative test cases

Writing, Updating, Reviewing, Refining and Executing test cases and creation of test data

Mentoring, guiding, helping, motivating, inspiring, directing and providing solutions to team members

Formed a TCOE team with 4 key resources for the products I was looking after

Interviewing, hiring and placing candidates in various test positions within the organization

Maintaining query logs, project related docs and conducting project, defect status meetings

Cross-team collaboration with engineering/dev, PMOs, Product Managers and partner teams

Testing the application in different kinds of environments such as Test, Staging, Pre-Prod (UAT) and Production

Setting up and reviewing team/personal goals, giving continuous feedback and reviewing team performance

Environment: SharePoint 2010/13, SQL Server, HP ALM, Rally, Selenium, WebDriver, qTest, BitBucket, Jenkins, ServiceNow, Confluence, Slack, Android and iOS mobiles and tabs, Angular JS, Node JS, Apex, Salesforce.com (SFDC, Sales Cloud and Service Cloud), force.com, Lightning Components, Visualforce, Postman, SOAPUI, Oracle BigMachines

#4

Client: CipherCloud, San Jose, CA Oct’13 – Aug’15

Role: Sr. Lead QA/Associate QA Manager

Product Description: Cloud Discovery, ServiceNow, Salesforce (SFDC) DLP, Monitoring and Anomaly

The Unified Management Console aims to build out a single pane of glass that allows CipherCloud customers to perform "DLP Data and Cloud Service Discovery" and "User Activity and Anomaly Monitoring" and to build "Data Flow Controls". This is in addition to many other strategic items that help customers manage and monitor their CipherCloud software components (such as a distributed cluster of CC gateways). The goal is to provide greater visibility, both at a high level as well as at a granular level, and very granular controls over SFDC data across users, devices and the different activities being performed. This provides significant additional value compared to analyzing logs through a generic log analysis platform such as Splunk or LogRhythm.

Responsibilities:

Analyzing and reviewing Requirements, User Stories and identifying the gaps

Regression, Ad-hoc, Exploratory and API Web services (RESTful) testing

Review Test Scenarios and Cases and Bug Reporting, Validation, Compilation, Prioritizing and Verification

Testing Lead, Account, Contact, Opportunity and Case objects and Visualforce and Lightning of SFDC CRM

Tested Validation and Sharing Rules, Workflows, Custom objects, Tabs, Record Types, Page and Search Layouts, Profiles, Roles, Emails and Permission Sets for different users at different levels of organization

Performance, Functional, System, Integration, UAT, DB and Browser Compatibility testing (an illustrative latency sketch follows this list)

Attending scrum and HQ QA Director calls and providing daily/weekly/monthly status updates

Creating and maintaining functional, non-functional and system test scenarios and plans

Preparing test plan, providing estimations, reviewing performance and security test plans

Adding new test cases to an existing test plan and covering ad-hoc and negative test cases

Writing, Updating, Reviewing, Refining and Executing manual test cases and creation of test data

Formed a TCOE team with 5 key resources for the products I was looking after

Mentoring, guiding, helping, motivating, inspiring, directing and providing solutions to team members

Maintaining query logs, product related docs and conducting project, defect status meetings

Cross-team collaboration with engineering/dev, PMOs, Product Managers and partner teams

Setting up and reviewing team/personal goals, giving continuous feedback and reviewing team performance

Managing a team of 3 to 13 engineers, both full-time and outsourced, and 1 to 5 products in parallel
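
Performance testing in this role was driven through JMeter; purely to illustrate the underlying idea (timing repeated requests and reporting rough percentiles), the small Java sketch below probes a placeholder URL. It is a sketch only and not a substitute for a JMeter test plan.

    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    public class LatencyProbe {
        public static void main(String[] args) throws Exception {
            // Placeholder URL; a real run would point at the console or API under test.
            URL url = new URL("https://console.example.com/health");
            List<Long> millis = new ArrayList<>();

            for (int i = 0; i < 20; i++) {
                long start = System.nanoTime();
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestMethod("GET");
                conn.getResponseCode();              // force the request to complete
                conn.disconnect();
                millis.add((System.nanoTime() - start) / 1_000_000);
            }

            Collections.sort(millis);
            long p50 = millis.get(millis.size() / 2);
            long p95 = millis.get((int) (millis.size() * 0.95) - 1);
            System.out.printf("p50=%dms p95=%dms max=%dms%n", p50, p95, millis.get(millis.size() - 1));
        }
    }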

Environment: Linux CentOS 6.6, Jira, JMeter, Selenium, WebDriver, TestLink, BitBucket, Jenkins, Putty, WinSCP, Splunk, Flume, MongoDB, PostgreSQL, HBase, Hadoop, Hive, Impala, Cloudera, Vertica, Apache Tomcat, Apex, Salesforce.com (SFDC, Sales Cloud and Service Cloud), force.com, Lightning Components, Visualforce, ServiceNow, Postman, REST

#3.A

Client: Abercrombie & Fitch, New Albany, OH Aug’12 – Oct’13

Project Description: A&F, A&F Kids (Retail, eCommerce, POS)

#3.B

Client: First Lab, KOP, PA

Project Description: Ulliance (Tasks and Messaging)/PHM (E-Forms)/Health Care

#3.C

Client: ScriptSave, KOP, PA

Project Description: Pharmaceutical Claims (Pharma, PBM)

#3.D

Client: Harleysville Insurance (Nationwide), Harleysville, PA

Project Description: Browser Agnostic

Responsibilities:

Analyzing requirements, User Stories and identifying the gaps

Attending scrum and client calls to give updates on daily status

Regression, Ad-hoc and Exploratory testing to uncover bugs

Bug Reporting, Validation, Compilation, Review, Prioritizing and Verification

Regression, Ad-hoc, Exploratory and API Web services (RESTful) testing

Performance, Functional, System, Integration and end-end testing

Writing, Updating, Reviewing and Executing manual test cases and creation of test data

Browser compatibility, desktop and mobile browser responsive testing

Review bug reports for errors prior to them being entered into the tracking software

Creating and maintaining functional, non-functional and system test scenarios and plans

Testing the application in different kinds of environments such as Test, Staging, Pre-Prod (UAT) and Production

Environment: Windows XP, .Net, ASP.Net, VB.Net 6.0, C#, MVC 3.0, Entity Framework, HP ALM, Rally, IBM Sterling OMS, TFS, Selenium, VSTS, SVN, SharePoint, SQL Server 2005, 2008, ASP, Java Script, HTML, XML, XLS, JQuery, AJAX, Kendo UI Controls, IIS 7.5, IE 7/8/9, Chrome, Mozilla, Safari, Postman, REST

#2

Organization: Aristocrat Technologies, Sydney, Australia Jun’10 – Jul’12

Other Clients: Win Systems, Octavian, and Konami Aug’08 – May’10

Product Description: Casino Gaming (Entertainment, Media, Real Time Embedded system)

Responsibilities:

Preparing test cases for Build Server automation tool

Hardware peripheral, infrastructure and compatibility testing

Test Case writing, Updating, Reviewing and Execution

Bug Reporting, Validation, Compilation, Review/Verification

Game Compliance, Functional, integration, System and UAT Testing

Executing test cases and doing exploratory testing to uncover bugs

Mentoring, guiding, inspiring, directing and providing solutions to team members

Troubleshooting machines, hardware, games and simulator setups

Testing high-priority games and testing games in parallel to meet deadlines

Compilation of build, preparing test report and providing delivery package

Deliver projects on-time and to specification with an appropriate level of quality

Preparing and updating checklists/test cases based on the bugs found after the release

As a member of the TCOE, reviewed games returned from the field and from external authorities

Environment: Windows XP, openSuse Linux, C, C++, Tomcat, IBM Clear Quest, Build Server, Test Track Pro, GIG, Beyond Compare, VSS, SVN, Clarity, SharePoint, DACOM, SAS, LAB, Mikohn, ASP, X, Gampro, G2S, Eurasia, Macau, RSA, NSW, Crown, TAS, NZ, USA, Spanish, Russia…

#1

Clients: Nokia Siemens Network, Myrio, Qwest, and Avot Media Sep’05 – Aug’08

Project Description: Home Entertainment Solution (HES) (IPTV, Telecom, Digital and Media)

Myrio entered the emerging IP video services market through its roots as an Internet service provider (ISP), formerly Source Net Corp. Myrio pioneered Internet Protocol TV-based application delivery for telephone operators, demonstrating digital TV channels, video on demand, and internet access on the TV. The Total Manage application server is deployed on Sun servers running the Solaris operating system. Total Manage provides services to the subscriber such as digital TV channel lineups, Music, IPG/EPG, PPV, VOD, Web portal, Web on TV, Emergency Alert System (EAS), Caller ID, Pause Live TV, Parental Controls, PVR and NPVR, Favorites, Widgets, HDTV, MPEG2, MPEG4, etc. Total Manage is divided into 7 modules: Subscriber Management, Product Management, Content Management, Packaging and Pricing, System Management, Transaction Processing, and Reporting.

Responsibilities:

Involved in Functional, API, System and Integration testing

Bug Reporting, Verification and Regression testing

Creating new databases and applying database patches

Troubleshooting Set-top-Boxes and server setups

Hardware peripheral, infrastructure and compatibility testing

Test Case Creation, Updating, Reviewing, and Execution

Regression, Ad-hoc, Exploratory and API Web services (SOAP) testing

Understanding the test strategy, test plan, Use Cases and test cases

Involved in User Interface (UI), Database and Backward Compatibility testing

Involved in Functional and Performance testing using Shark and Crawler tools

Environment: Windows XP, Win 2000, Solaris, Linux, Java 1.4, Orion, Jboss, Sony Ericsson K790i, Nokia N95, Motorola Razr V3x, J2ME, CLDC 1.1, MIDP 2.0, Shark, Crawler, QA Sync, Source Forge, Putty, WinSCP, TOAD, PgAdmin, VSS, PSuit, Oracle 10g and PostgreSQL 7.4.6


