Hari V. (Harry) Menon - 952-***-**** / 612-***-****
******************@*****.*** / ***************@*****.***
Skill set and work experience – 18 years – brief background
Industries/verticals - Software products, Credit score companies, Healthcare, Global banking, Insurance, Financial services, Brokerage, Payroll processing, Diversified Publishing and Mortgage, Business Intelligence, Legal services, Freight transportation logistics, Government/Defense contractors, Aviation Control Systems and Mobile testing.
Dev, QA/QC Lead with full SDLC experience including onsite and offshore mentoring/training. 12 years
Agile SCRUM team lead and team member experience. 6 years
J2EE/JDK, Apache, Java Platform and Eclipse. 6 years
Visual Basic, .NET platform, VC++, C#, Visual Studio 2003, VBScript, JavaScript and JScript. 12 years
Visual Studio 2008, 2010, 2012 and 2013 TFS, Development and Test editions with SharePoint 2010 and Microsoft Test Manager. 6 years
Oracle DB, Berkeley DB, Oracle Enterprise Manager, SQL Server 2005/2008 with Reporting Services, IBM UDB with SQL scripting. 8 years
Selenium GRID and RC with Firefox and IE 6.0 to 9.0, YSlow and Firebug. 4 years
Mercury (HP) LoadRunner/HP Performance Center 6.5 to 11.5 + all monitors, and HP Quality Center/ALM up to 11.5. 13 years
HP Service Test up to 11.20 and Unified Functional Testing 11.5. 2 years
Mercury (HP) WinRunner 6.5 to 7.6 and Quick Test Pro (4.5 beta to 11). 8 years
Mercury (HP) Optane/Topaz/ProTune/Tuning Module, Deep diagnostics and Sitescope 11.20, Compuware Gomez. 5 years
Windows, SUSE Linux, Sun SOLARIS and Red Hat Linux (multiple versions). 15 years
VMware, vSphere, Cloud computing hosting/development. 4 years
SilkTest, SilkPerformer, Rational Rose, Rational Robot, Rational Test Suite, TestComplete, TestExecute, AQTime, LoadUIWebPro, SoapUI, NeoLoad 3.1 AJAX edition, Automation Anywhere 6.1, Micro Focus TestPartner, Wily Introscope. 5 years
WebSphere, WebLogic, BizTalk/ISA, MQ Series, JBoss. 4 years
Shunra Performance Suite, Network Catcher, Shunra for LoadRunner/Performance Center and STN Network Appliance device. 1 year
Shunra Transaction Builder, Analyzer, Reports and Deep Diagnosis. 1 year
Testing Application Deployment, Cloud Migration, Data Center Relocation, Unified (VOIP) Communications and WAN Acceleration Validation. 2 years
Custom Mobile application testing with Objective C and DeviceAnywhere. 3 years
TIBCO, F5 BigIP 8800, BMC Patrol EBusiness line of products. 4 years
CMM/TQM/Six Sigma/ITIL – diploma and experience. 2 years
Certified Software Test Engineer from QAI. 2006
Assembly language and Win32 SDK, API and MFC with ATL. 6 years
Office – Word, Excel, Access, PowerPoint, Visio, Project (2000 to 2013). 14 years
Silverlight 3.x/4.x testing. 1 year
DynaTrace monitoring with LoadRunner integration, JMeter. 2 years
BrowserMob cloud-based load testing, Testing Anywhere 6.50. 3 years
McCabe IQ – source code based quality/performance analysis. 1 year
Education
Quality Assurance Institute – Certified Software Test Engineer - CSTE (2006).
HIPAA Certified BF18099 – HIPAA Business Associate III (Non-Clinical) V8 – June 2013
Certification in HIPAA Privacy and Security – Nov 18th 2014.
Certification in Medicare Compliance – Nov 18th 2014.
Certification in Healthcare Fraud and Abuse – Nov 18th 2014.
Indian Institute of Management & Technology, India
• Diploma in Total Quality Management (1999).
• Post Graduate Diploma in International Business (2000).
• Diploma in Software Capability Maturity Model (2004).
All India Council for Management Studies, India
• Diploma in Electrical & Electronics Engineering (1995).
• Diploma in Electrical & Communication Engineering (1996).
Frederick Institute of Technology, Cyprus
• 2 Semesters in Electrical Engineering (09/1993 - 06/1994)
Experience
Verisk Health, Eden Prairie, Minnesota (04/2013 – 05/2014) - Distributed/Parallel Software/Infrastructure Performance Architect
• Researched and identified a suitable test tool; the application’s front end was developed in Object Pascal (Delphi), which limited the ability of most tools to recognize its user interface components.
o TestComplete was selected because it could both record scripts and run custom ones, ensuring that all client-facing, front-end user interface components could be validated for expected functionality.
• Refactored the application for performance by testing its functionality and source code directly, in order to:
o Optimize application functionality.
o Identify and track code and module interdependence and re-modularize critical code (to ensure that fixing a bug in one location doesn’t break something else that is working).
o Identify and re-engineer redundant code using cyclomatic complexity analysis (listing code and module blocks that can be combined or restructured for maintainability, enhancements, file sizes, etc.).
o Reduce operational footprint (memory and CPU resources).
o Identify database access inconsistencies (and prevent data corruption).
o Identify redundancies in third-party control utilization.
o Promote compact and modular code design.
o Validate consistent interface behavior across different versions of Windows.
• Create a repeatable Quality Control validation process to ensure complete coverage from application source code onward in addition to release-based business functions/features, unique client enhancements and existing defects.
• Create a stand-alone, independent, in-house test environment that supports comprehensive automated (functional, performance, regression and data validation) testing; increasing coverage and frequency as per the project’s requirements.
• Create a dedicated, stand-alone, in-house QC team responsible for testing multiple iterations of varying application builds within a fixed execution window/schedule with minimal to no dependence on the development team.
SWAT Solutions, Minneapolis, Minnesota (03/2013 – 05/2013) – Principal Software Performance Architect
• Planning, creating, executing, and analyzing high-level performance testing strategies.
• Manage the design and implementation of performance testing tools and scripts.
• Performance Monitoring Setup
• Performance Benchmark Strategy and execution
• Validate and recommend monitoring frameworks.
• Design and implement Performance Profiling and Monitoring of multiple, varied environments.
• Code optimization and platform tuning.
Avnet Technology Solutions (formerly Genilogix), Philadelphia, Pennsylvania (07/2012 – 02/2013) – Senior Performance Test Architect
• Implemented high-end consulting engagements in the area of Performance Engineering, Mentoring and Corporate Training.
o Advised customers and drove engagements on Performance Engineering of large systems from Strategy, Architecture, Design, Benchmark, Technology Evaluation, Tooling perspectives.
o Planning, defining, executing, and analyzing high-level performance testing strategies and solutions through business, functional and technical expertise.
Manage the design and implementation of performance testing tools and scripts.
Supervising the maintenance, repair and tuning of applications; directing user support activities; managing preventive maintenance activities.
Performance Monitoring Setup
Performance Benchmark Strategy and execution
Validate and recommend robust production monitoring frameworks, the outcome and reporting dashboard modules.
Design and implement Performance Profiling and Monitoring of .NET and Java environments.
Code optimization and platform tuning in Windows and xNix.
Code instrumentation in Visual Studio 2008 and Java.
• Conducted corporate training for over 250 participants in HP tools across different onsite locations and via remote learning platforms.
• Presales Support for Performance Engineering opportunities.
o Carrying out due-diligence for specific key accounts.
o Liaison with strategic partners for specific opportunities.
o Presented a technical demonstration/pre-sales of Performance Center for Target in Sep 2012 along with an HP representative (Chuck Johnston), which led to HP Performance Center and ALM being purchased by Target and deployed by my team within a 30/45 day period.
o Worked with the development team at UPS to architect/design a customized plug-in for QTP to enable testing of mainframe applications where the client interface was a green-screen front end with no user interface components (QTP doesn’t support screen-scraping based testing out of the box).
SAKAI Foundation, Ann Arbor, Michigan (06/2012 – 07/2012) – Senior Performance Test Consultant
• Created the performance test strategy and methodology based on requirements and functional specifications.
• Validate that application functions as expected and is interfaced as documented to external devices/applications, etc.
• Created batch automation scripts to validate the migration of different components across different technologies initially.
• Tested the scalability and performance of the Learning Management System using Selenium and Grinder across varied platforms.
• Selenium was specifically selected to validate the front-end, end-user facing components, as the user interface was composed of varied open-source widgets built upon a custom asynchronous communication mechanism (an illustrative sketch of this style of check follows below).
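A minimal illustrative sketch (Java, Selenium RC client API) of the style of front-end check described above; the host, port, URL and locators below are hypothetical placeholders, not the actual SAKAI test assets:

import com.thoughtworks.selenium.DefaultSelenium;
import com.thoughtworks.selenium.Selenium;

public class AsyncWidgetCheck {
    // Poll for an element instead of relying on the page-load event, since the
    // widgets render asynchronously after the page itself has loaded.
    static void waitForElement(Selenium s, String locator, long timeoutMs) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (!s.isElementPresent(locator)) {
            if (System.currentTimeMillis() > deadline) {
                throw new RuntimeException("Timed out waiting for " + locator);
            }
            Thread.sleep(250);
        }
    }

    public static void main(String[] args) throws InterruptedException {
        // Attach to a Selenium RC server assumed to be running locally on port 4444.
        Selenium selenium = new DefaultSelenium("localhost", 4444, "*firefox", "http://lms.example.edu/");
        selenium.start();
        try {
            selenium.open("/portal");
            selenium.waitForPageToLoad("30000");

            // Validate that an asynchronously loaded widget appears and responds.
            waitForElement(selenium, "css=div.widget-container", 15000);
            long start = System.currentTimeMillis();
            selenium.click("css=div.widget-container a");
            waitForElement(selenium, "id=toolContent", 15000);
            System.out.println("Widget responded in " + (System.currentTimeMillis() - start) + " ms");
        } finally {
            selenium.stop();
        }
    }
}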
Assurant, Minneapolis, Minnesota (02/2012 – 05/2012) – Senior Performance Test Consultant
• Created performance test strategy and methodology based on requirements, functional specifications and meetings with project and product management.
• Parameterized the test data, introduced variables to represent and handle it within the script, then debugged and finalized test script creation.
• Set up the requisite monitors and scenarios and executed the performance tests (baseline, benchmark with differing loads and ramp-up/ramp-down behavior in scalability and stability tests, spike tests, bell curve tests and failover tests).
• Published documentation with results and recommendations, established a build-based or revision-dependent performance baseline and proceeded with additional tests in distributed systems configurations and on multiple back-end database platforms.
• Integrated DynaTrace monitoring and diagnostic framework with LoadRunner in order to help middle tier team with deeper diagnostics.
Shunra, Philadelphia, Pennsylvania (10/2011 – 12/2011) – Senior Application Performance Engineering Consultant
• Implemented a customized network simulation and infrastructure performance testing architecture for the U.S. Department of Veterans Affairs in Tampa, FL. Imparted customized onsite training for the same.
• Implemented the QA and performance testing strategy at Lincoln Financial Group in Greensboro, NC for a suite of financial and back office applications integrated with PeopleSoft, with J2EE custom code executing on WebLogic, powered by SAP Business Objects business rules and workflow orchestration on IBM AIX 6. The application integrated with Oracle 11i and a JIT in-memory database built on a NetApp powered architecture.
• Imparted customized advanced training on the Performance Engineering with onsite implementation of the Shunra Network Appliance with Shunra Performance Suite, Network Catcher, and LoadRunner on a virtual test environment.
• Created a Performance and user experience test strategy for Twitter to validate the integrity of a data center relocation process by scaling the custom application engine.
• Creating testing and quality control roadmaps, documents and onsite tools/technology training for
o Virtual Machine Build and Rollout Strategy (VMM and VDI).
o Cloud Performance testing.
o Performance in Development.
o Network Capacity Planning.
Senior Performance, Automation and Quality Control Project Lead for multiple companies (concurrent engagements, 05/2011 – 10/2011)
• Created an end to end application and infrastructure test strategy and led an implementation team for the VirtuWell suite of applications at HealthPartners, Bloomington, MN.
• Worked with the development team to test the performance and scalability of the TrackWell suite of mobile applications using DeviceAnywhere and custom Objective C code at HealthPartners, Bloomington, MN.
• Conducted end-user experience based performance optimization at T-Chek Systems, Eden Prairie, MN. Using LoadRunner, custom tools and a network profiler, a solution was identified and implemented that resolved user access impairments caused by hardware and operating system configuration mismatches on the client side.
T-Chek Systems (C.H. Robinson), Eden Prairie, Minnesota (02/2011 – 05/2011) – Senior Performance Test Engineer
• Created a performance test strategy to deliver an end-to-end user experience metric analysis for a proof of concept implementation of a Freight Transport Logistics application.
• Worked with the architect, development, project management and business analyst teams to formulate a comprehensive quality assurance strategy for the initial version of the application.
• Extensively researched all available automation test tools and test suites for Functional, Regression, Smoke, Performance/Load, Database, End-to-End and Custom control application tests across the .NET platforms (in standard and proprietary modes).
• Worked with development and deployment team members to source application code in order to fashion a wraparound harness mechanism to enable testing tied to the compiler (Visual Studio 2008).
• The application architecture was based on .NET 4.0 custom AJAX components built with a SQL Server Reporting Services web services layer, and the front-end interface was built using Microsoft’s standard toolkit of AJAX widgets auto-generated on the front end at runtime.
• The test tool used to test the front-end components, the underlying infrastructure and the backend simultaneously was NeoLoad.
• The proposed QA/QC strategy for the team was based on a combination of available open source, proprietary and mix-and-match tool suites including Selenium, Visual Studio 2010 Test Edition and NeoLoad.
• Performed infrastructure load testing with custom harness code for 1000 concurrent requests simulating custom AJAX controls from SQL SSRS on a combination of desktops and VM images.
Wells Fargo NA, Minneapolis, Minnesota (10/2010 – 12/2010) – Senior Performance Test Consultant
• Created performance test strategy and methodology based on requirements, functional specifications and meetings with project and product management.
• Parameterized the test data, introduced variables to represent and handle it within the script, then debugged and finalized test script creation.
• Set up the requisite monitors and scenarios and executed the performance tests (baseline, benchmark with differing loads and ramp-up/ramp-down behavior in scalability and stability tests, spike tests, bell curve tests and failover tests).
• Published documentation with results and recommendations, established a build-based or revision-dependent performance baseline and proceeded with additional tests in distributed systems configurations and on multiple back-end database platforms.
• Integrated DynaTrace monitoring and diagnostic framework with LoadRunner in order to help middle tier team with deeper diagnostics.
• Identified/selected/created an offshore team with mentoring, technical training and technical architecture leadership for multiple projects.
Ceridian, Bloomington, Minnesota (05/2010 – 10/2010) – Senior Automation Test Engineer and Offshore Team Lead
• Architected the process and technical mechanisms required to enable a transition from manual test cases to automated test scripts in QTP 10.
• Integrated QTP with QC (and additional Excel macros) to enable download of manual test steps onto a spreadsheet and upload of the same into automated test scripts, thereby reducing the burden of mapping existing manual tests to automated ones – for the design steps in particular.
• Created/configured the remote QTP test engine systems to enable execution of automated tests remotely from QC. This has been successfully implemented: my team in India can initiate any automated test through QC, which then launches on a test server in my cube.
• Created an end-to-end process that tracks the test cycle in Agile mode from test cases to defects with traceability matrix tracking.
• Created a dashboard mechanism within the process to give management instant feedback on the state of tasks for each sprint.
• Integrated QC with TFS for cross-level defect, task and script management using the provided plugins and extension kits; this has been somewhat complicated, and some cross-tool compatibility issues remain at present.
• Created a framework to test Silverlight based applications with QTP 10 and the Web 2.0 Feature Pack.
• Built, trained and mentored a team of entry-level engineers (onsite and offshore) to script an application with proprietary and custom controls via a combination of the correct usage of QTP’s add-ins, selective VBScript programming, appropriate usage of multiple vs. single actions, global variables and the Object Repository.
• Allocated team tasks based on TFS and SharePoint backlog items, per business requirements prioritized by severity.
ING Banking/ING Brokerage Advisors Network/PrimeVest Financial (Brokerage) Services, St. Cloud, Minnesota (02/2010 – 05/2010) – Datacenter Migration, Performance, Automation
• Worked with the architects to understand the requirements of the migration in order to create a process mechanism to chart, detail and document the expectations and possible results based on timelines for deliverables and available resources for the brokerage business unit’s transition from ING to PrimeVest.
• Worked on understanding a comprehensive infrastructure while documenting and mapping the dependencies between the varied systems in different locations, which needed to be migrated at the same time.
• Provided mapping and firewall port documentation required to enable automation across different locations to the different network and infrastructure teams.
• Created basic batch automation scripts to validate the migration of different components across different technologies initially.
• Created additional tests and a regression test suite in QTP once a tool license became available for the same.
• Tested the scalability and performance of the brokerage application using Performance Center in multiple iterations.
• Coordinated, trained, mentored and led an onsite team on the steps, processes and technologies mentioned above.
Thomson Reuters, Eagan, Minnesota (10/2008 – 12/2009) - Senior Performance Test Framework Architect
• Architected and created an in-house performance testing tool to test an AJAX-based, multi-vertical application for up to 10,000 users, based on Selenium, Visual C# and SQL Server 2008.
• Ported and rebuilt the same tool using Java 6 Update 14 and Selenium RC to mitigate the cost of the individual licenses required by the earlier C# approach (a representative sketch of this style of harness appears after this list).
• Built a test framework using Apache and Oracle 10i to scale the tool to Amazon's cloud computing and HP's data center infrastructure to accommodate larger-scale loads and longer-duration tests.
• Worked with VMware's virtualization center and SDK to enable ESX server performance monitoring during test execution to provide auto-load-balancing capabilities.
• Performance tested the conversion of an existing multi-server-farm based legal/law application from legacy technology to functionality and subject-matter driven .NET and Java information silos (i.e. breaking up a consolidated legal application into different server farms based on the type of information sought – a form of application-functionality based architectural abstraction intended to segregate code clutter and maintenance from future additions of logic and overall application maintenance).
• The test scripts were prepared and executed in Silk Performer, and monitoring was done using a combination of built-in, third-party and additional in-house tools.
• Different testing methodologies and plans had to be developed, as one large legacy application was to be hosted on different technologies, which led to many differences in test environment preparation, data collation, script creation and usage, as well as in monitoring tools, since different operating systems were also involved.
• Performance/capacity testing and scalability determination of the application (while it was being constructed) to deduce the overall number of transactions that could be handled during peak-hour usage (computed from current legacy application usage patterns), monitoring system resources over the entire test infrastructure (built to mirror production as closely as possible) and translating test results such as transactions/sec into a final statement such as:
‘The xxxxxx vertical will require xxx Sun 5120 CoolThreads servers at xxxxxx configuration to be able to support xxxx transactions/sec with transaction response times ranging from xx seconds to xxx seconds.’
• The next step in this exercise was to re-create the number of possible users based on the number of transactions pushed through the test, giving the business unit something to base their calculations on. This capacity test was to be executed for each functionality-driven silo (or vertical), thereby enabling production purchase forecasts during development.
• Failover and redundancy tested the application and its platforms of operation (necessitated by the intermingling of different technologies, especially operating systems). The entire logic here was based on custom Tcl code written in the F5 BIG-IP 8800 load balancer, with ‘context-session’ support supplemented by a row of WebLogic servers. The tasks here were to flood the entire infrastructure with data packets and bring down different sections (initially) and complete data farms (eventually) while testing the ability of the load balancer and the suite of WebLogic servers to always ensure that the user would be brought/led to the expected restart point without loss of the current ‘context-session’ information, upon which application usage billing was calculated and which was hence paramount to the entire application.
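A representative sketch, referenced in the Selenium RC bullet above, of how an in-house Java harness can drive many concurrent browser sessions through a Selenium RC server or Grid hub; the URL, locators, user count and results sink are illustrative assumptions only, not the actual tool described here:

import com.thoughtworks.selenium.DefaultSelenium;
import com.thoughtworks.selenium.Selenium;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SeleniumLoadHarness {
    // Each thread represents one virtual user; real runs scale out across Grid nodes.
    static final int VIRTUAL_USERS = 50;

    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(VIRTUAL_USERS);
        for (int i = 0; i < VIRTUAL_USERS; i++) {
            final int user = i;
            pool.submit(() -> runOneUser(user));
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.MINUTES);
    }

    static void runOneUser(int user) {
        // One browser session per virtual user, brokered by the RC server/Grid hub.
        Selenium selenium = new DefaultSelenium("localhost", 4444, "*firefox", "http://app.example.com/");
        selenium.start();
        try {
            long start = System.currentTimeMillis();
            selenium.open("/search");
            selenium.waitForPageToLoad("60000");
            selenium.type("id=query", "sample query " + user);
            selenium.click("id=searchButton");
            selenium.waitForPageToLoad("60000");
            // Timings would normally be persisted (e.g. to a database) for later analysis.
            System.out.println("user " + user + " search took " + (System.currentTimeMillis() - start) + " ms");
        } finally {
            selenium.stop();
        }
    }
}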
Cognos Incorporated, Bloomington, Minnesota (04/2007 – 10/2008) – Senior Automation & Performance Test Engineer – Cognos Planning Contributor version 8.1 to 8.4
• Create test plans based on requirements, functional specifications and meetings with project and product management, including test strategy and methodology.
• Parameterize the test data, introduce variables to represent and handle it within the script, debug and finalize test script creation.
• Set up requisite monitors and scenarios and execute the performance tests (baseline, benchmark with differing loads and ramp-up/ramp-down behavior in scalability and stability tests, spike tests, bell curve tests and failover tests).
• Publish documentation with results and recommendations, establish a build-based or revision-dependent performance baseline and proceed with additional tests in distributed systems configurations and on multiple back-end database platforms.
• Work with remote team and management in disseminating product capability performance and participate in road map discussions.
• Lead a team of locally based test engineers in coordination with the remote team to enable comprehensive product testing coverage.
Wolters Kluwer FS, St. Cloud, Minnesota (05/2005 – 04/2007) – Senior Automation & Performance Test Engineer - COM+, .NET and J2EE platform implementations
• Creation of test infrastructure and installation of test application and supplementary components.
• Secure valid test data in a version-controlled, restricted access storage.
• Validate that application functions as expected and is interfaced as documented to external devices/applications, etc.
• Create test plans based on requirements, functional specifications, use cases and meetings with project and product management, development, marketing and support-line teams, including test strategy and methodology (scenarios, etc.).
• Based on technical documents identify the components by filename (.exe/.dll/.ocx), appropriate type library name, class name and specific interface which implements the code that needs to be tested.
• Hook onto the specific interface code’s entry point using LoadRunner’s COM+ recorder and write/record (C/VBA/VBScript) the script keeping track of context states and other connection behavior.
• Parameterize the test data, introduce variables to represent and handle it within the script, debug and finalize test script creation.
• Set up requisite monitors and scenarios and execute the performance tests (baseline, benchmark, same/differing loads with varying ramp-up/ramp-down behavior in scalability and stability tests (over 1 to 11 hours), spike tests, bell curve tests and failover tests).
• Collate results, analyze data and produce final results and recommendations documents, such as:
In its current configuration, the application can process 7,501 documents (consisting of a total of 1,905,254 pages) for 25 concurrent users threaded to 8 concurrent requests in 60 minutes.
• Creation of a Performance Management Architecture Framework to pre-emptively build performance and scalability features within application design.
Performance Testing Methodology.
Performance Baseline.
Performance Benchmark.
Performance Tuning Methodology.
Allianz Life, Golden Valley, Minnesota (11/2004 – 05/2005) – Enterprise Infrastructure Architect Consultant
• Design and implement an 18-month strategy to enable a multiple-domain Infrastructure/Platform/Application/Services monitoring solution derived from ITIL/CMM practices to do the following:
o Automatic and aided infrastructure device (platform/network appliance/business application) detection at the physical (Layer 1) and logical (Layers 2 to 7) levels across protocols.
o Fine grained Fault Monitoring of applicable components across the infrastructure to proactively prevent service outages and failures.
o Implement Event processing, Cross Domain Correlation and Cascading Downstream failure detection & Notification mechanisms to enable Infrastructure wide assessment of the scope of outages.
o Interface monitoring mechanisms to feed into and derive from the Change Management Database to validate mapping of individual components to business workflows and service functions.
o Identify and implement best-of-breed practices for each domain as performance rules to enable proactive Performance Management.
o Create separate interfaces to Support desk systems and process views to predict and assess business impact and service outages based on individual component/device failure.
o Design and document a Statistical Gap Analysis Methodology to rate and rank each individual domain support group’s current capability on a custom performance management scale - to assess the quantum of work required to implement the project.
BMC Software, Pune, India (12/2003 – 05/2004) – Software Quality Assurance Project Lead (Manual & Automation testing)
• Design the test strategy framework for enhancements and defect testing for the Patrol EBusiness product suite of tools.
• Creation of an automation strategy for converting existing and new manual scripts into QuickTestPro scripts and implementing Test Director as the central test repository for the complete testing lifecycle.
• Execute the tests, track defects and work with the development team to analyze/resolve the fixes.
Citigroup, London, England (12/2001 – 07/2003) – Assistant Vice President of the Regional Testing Team
• Test strategy design for the execution of the rollout/pilot deployment phase of multiple applications simultaneously.
• Designed Test Plan templates detailing scope/pre-requisites/entry & exit criteria/deliverables/schedules & milestones/risk & assumptions/test preparation/test creation/test execution/defect management lifecycle & traceability, etc.
• Created a defect management lifecycle detailing the logic of identifying a defect, pre-requisites to open/assign/work/close/test/re-open defects, differentiating between defect and enhancement identification (to prevent scope creep), implementing traceability logic and classifying and linking requirement priority (identified during requirements decomposition) to defect severity (to determine the number of rounds of testing required and the scope of regression testing).
• Designed/Created the test lab based on application architecture and procured sample test data for selected transactions. Initiated a process to create, populate and harvest a growing reusable test bed for different versions of each application.
• Created manual test plans and scripts in Test Director based on application domain knowledge, interaction with subject matter experts, business analysts, technical architect, design/development team, technical documentation team and training team. All test scripts were linked to the requirement/s represented to enable traceability from requirement to defect and vice versa.
• Led a team of 12 personnel to implement a Siebel Call Centre for 3 countries and 5 lines of business for Citibank Europe, based in Barcelona, and conducted the following:
functional, load, stress, volume, fail-over, redundancy, network/infrastructure, cluster/failsafe/high availability, user interface/user experience, real-time monitoring, system tuning, application tuning, benchmark quality & performance measurements, metrics development and identification of critical service levels for the following projects:
o Mutual Funds & Brokerage, Citibank Belgium, Citibank Greece.
o Siebel Call Centre, Citibank Spain.
o Orbit Payment System, Citibank Belgium.
o TestDirector Deployment, Citibank Germany.
o WebFarm