Patrick Leinus
Email: *************@*****.***
Saint Augustine, Florida
Work status: Green card
Professional Summary:
● Experienced in all stages of Software Development Life Cycle and Software Testing Life Cycle.
● Expert understanding of Testing Stages/Levels/Phases, Testing Types, and Testing Techniques.
● Participated in the complete testing lifecycle: test plan, test cases, test execution, and defect reporting.
● Attended review sessions to understand the Business Requirements/Functional Specifications and reviewed the documents to understand the client business logic.
● Developed and executed SQL queries to verify database updates, inserts, and deletes of records, and to extract data from target tables and validate reports.
● Experienced in analyzing Business Requirement and Technical Specification documents.
● Performed back-end testing by executing SQL queries against various database products on the Windows platform.
● Analyzed business requirements document, functional specifications document to prepare Test plan and Test cases.
● Developed and executed complex SQL queries.
● Developed and executed Test Cases and Test Scripts for Functionality and Regression testing.
● Performed functional testing, integration testing, system testing, performance testing, stress testing and load testing.
● Performed data validation followed by manual back-end testing.
● Performed web services and workflow testing and automated the test suite and test cases using SoapUI.
● Performed Data Driven test using test data from flat files and analyzed the test results.
● Verified data with the TOAD interface tool by running SQL queries against the Oracle database.
● Recorded, reported, and closed defects using Quality Center through the entire test lifecycle.
● Followed up with the development team to verify bug fixes and update bug status.
● Involved in the data loading and data conversion process and tested the data.
● Involved in performing Functional, Integration, System, and Regression Testing.
● Set up test environment and created test data for both positive and negative test cases.
● Executed test plans and test scripts using Quality Center.
● Executed Test Cases and verified actual results against expected values.
● Performed End-to-End testing Manually and was associated with User Acceptance Testing.
● Prepared test cases to test the back-end database by retrieving data from the tables using SQL.
● Created positive and negative testing scenarios and tested them manually
● Tested all the web services, including the SOAP messages, and validated the XML using the SOAtest tool.
● Performed performance testing of bulk data loading on IS DataStage 11.3.0 (Picasso) from SAP ECC to target databases (Oracle and DB2) using Oracle and ODBC connector stages in jobs.
● Tested APAR patches that resolved IBM CRM tickets raised by IBM clients for DataStage bugs on IS 8.7, 9.1, and 11.3 environments (Windows, Red Hat, and SUSE).
● Worked on UI testing: wrote test cases based on the rules logic, hard-coded inputs in the code, then executed the code and verified the expected fire/no-fire results by the highlighted fields on the UI.
● Worked on regression testing of the ABAP, BAPI, and IDoc stages on IS DataStage 8.7, 9.1, and 11.3 environments (Windows, Red Hat, and SUSE with SAP packs 7.0 and 7.1).
● Worked extensively with IBM RTC (Rational Team Concert) to create stories, tasks, and defects, assigning them to the appropriate team member; once fixed, retested, changed the status to “Approved”, and assigned the item to the lead.
● Worked extensively with IBM IDA (InfoSphere Data Architect) to develop and execute jobs by creating logical and physical models and generating DataStage jobs.
● Used the IBM Eclipse IDE extensively for application setup and plug-in installation.
● Tested the ABAP stage through RFC, qRFC, FTP, and local-file data transfers by extracting a table or joining two tables using the SQL build query or extraction object, then generating the SAP program from the SAP data repositories.
● Tested the BAPI stages (BAPI Extract, BAPI Load, BAPI Extract & Load) with custom and standard BAPIs to and from sequential files.
● Tested the IDoc stages (IDoc Extract and IDoc Load connectors) by creating a partner profile and an RFC destination or listener on the SAP server to retrieve IDocs on the DataStage server using DataStage Administrator for SAP, then extracted IDocs from SAP and loaded IDocs to SAP.
● Handled DataStage administration: created DataStage users and mappings under user, domain, and engine credentials by assigning roles; troubleshot DataStage login and hang issues; restarted DataStage servers by stopping and starting the DataStage engine, the ASB agent, and then WAS.
● As part of the technical interview panel, interviewed candidates for the software engineer role.
● Designed complex jobs using the Join, Merge, Aggregator, Sort, and Remove Duplicates stages in DataStage.
● Created sequence jobs with complex parallel and server job designs using hash and round-robin partitioning techniques.
● Imported Unicode and non-Unicode transports on SAP ECC (EHP) and SAP R/3 (4.6C and 4.7) systems.
● Worked on IVT (Installation Verification Testing) and FVT (Functional Verification Testing) of IS (8.7, 9.1, 11.3) and SAP packs (6.5, 7.0, 7.1); updated DataStage Installer and IDA Installation Manager versions and applied patches on Windows, Red Hat, and SUSE in both graphical and command-prompt modes.
● Installation, configuration, upgrade, and maintenance of IBM WebSphere DataStage and .EAR plug-ins on UNIX and Windows.
● Installed and uninstalled IDA 8.5 with the RMRG (Rapid Modeler & Rapid Generator) plug-in on the client and RMRG.EAR on the web console DataStage server.
● Installed and uninstalled IDA 8.5 with the CW (Conversion Workbench) plug-in on the client and installed CW.EAR on the web console DataStage server.
● Performed FVT on RMRG (7.0 and 7.1) for ABAP programs generated from DataStage and RMRG.
● Worked on creating logical and physical models in RMRG, supplying job parameters hard-coded from RMRG or using parameter sets from DataStage.
● Worked on CW testing, including Cognos gap reporting, DataStage assets, and FastTrack.
● Worked on SAP BW (BI) testing, which involved creating process chains and open hub destinations and using the BW Extract and BW Load (push/pull) stages from DataStage.
● Wrote test cases for given scenarios consisting of a scenario check (SC), resource count check (RC), column level check (CL), column check (CC), and an audit file check.
● Wrote SQL scripts for each step of a particular test case; on failure, raised a defect, provided the analysis to the developer, and followed up.
● Worked on DataStage 8.1 and 8.5 and developed jobs using the necessary transformations; used DataStage Director extensively to analyze the error log, determine the exact cause of a job abort, and fix it with the developer.
● Checked the generated reject files and reconciled source tables against target files using UNIX commands and shell scripts.
● Experienced in DB testing (Oracle 11g) and performance tuning; daily partitions were allocated for the daily run and appended to the monthly table by the end of the day.
● Hands-on experience in writing, testing, and implementing triggers, stored procedures, and functions at the database level using PL/SQL.
● Knowledge in Unit testing, DB testing, System integration testing, implementation, maintenance and Performance tuning.
● Worked on HP Quality Center 9.0 as the test management tool for the manual testing process.
● Worked on shell scripting and UNIX commands in production support, ensuring that the large daily transaction volumes from different sources (Siebel and MCR) were gathered and filtered through MDM to our server, where the files were executed through DataStage jobs and delivered to the target file systems; worked on UNIX and shell scripting over SSH.
● Scheduled and unscheduled jobs from crontab for daily and weekly execution of DataStage jobs; also worked with the CA7 job scheduler.
● Worked on Slowly Changing Dimensions, conformed dimensions and facts.
● Good knowledge of relational database modeling and dimensional modeling (star schema, snowflake schema) using the ERwin tool.
● Basic DBA knowledge (import and export, SQL*Loader, Data Pump, performance tuning).
● Good understanding of Change Data Capture (CDC) techniques such as slowly changing dimensions and slowly growing targets.
● Deployed DataStage jobs into production and updated the job parameters.
● Ability to meet critical deadlines without compromising quality.
● Excellent communication skills and extremely proactive, process abiding and result oriented.
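The back-end SQL validation described in these bullets (record-count checks and source-to-target comparisons) can be sketched as below. This is a minimal illustration: the table and column names are hypothetical, and an in-memory SQLite database stands in for the Oracle targets queried through TOAD.

```python
import sqlite3

# In-memory database stands in for the Oracle target; names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

# Record-count check: source and target row counts must match after the load.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

# Column-level check: rows present in source but missing from target.
missing = cur.execute(
    "SELECT order_id FROM src_orders "
    "EXCEPT SELECT order_id FROM tgt_orders"
).fetchall()
assert missing == [], f"rows missing from target: {missing}"
print("validation passed")
```

The same two queries, pointed at the real source and target schemas, cover the count check and the missing-row check that most ETL test cases start from.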
Education:-
High school: Immaculate Heart of Mary’s Matriculation Higher Secondary School (2001)
Diploma in Computer Technology: Nanjappa Institute of Technology (2005)
Bachelor of Engineering, Information Technology: Karunya University (2008)
Technical skills:-
IBM DataStage (8.1, 8.5, 8.7, 9.1, 11.3.*) (testing, developing, administering), QA testing (Agile, Waterfall, UI testing, manual testing, ETL testing (regression, sanity, performance, functional, black-box testing)), system testing, DB testing, SIT, IVT, FVT, SQL, Oracle, Teradata, UNIX, Java, JIRA, C++, C#, ASP.NET, Bootstrap, SQL Server 2016, CA7, crontab, HP Quality Center, IBM RTC (Rational Team Concert), IBM IDA (InfoSphere Data Architect), Conversion Workbench, Cognos, IBM RMRG, SAP R/3 & BW, DB2, shell script, IBM Eclipse IDE, SoapUI, Informatica.
Professional Experience:
Postal Fleet services- Saint Augustine, Florida (April 2016 – Present)
C#/ASP.NET Developer, SQL Server DBA, Data Modeler
Developed a front-end payroll software application using C# in Visual Studio.
Installed and set up the SQL Server database and designed the data model for business data tables.
Built the company website using ASP.NET in Visual Studio Express with Bootstrap, including a built-in job application form on the website.
Worked on JSON files and jQuery as part of the logistics API.
Prepared business requirements and transformed them into technical specifications.
Developed a data entry tool using VBA (Visual Basic for Applications) and automated report updates using Excel formulas.
Managed and tested the Load Trek application and analyzed the application information.
Entered all the business information into Excel using formulas and macros.
Transformed Excel data into an MS Access database and created and wrote queries on the database.
Streamlined the field data into the database.
Worked on automation tools and on logistics: tracking vehicle locations and analyzing arrival times.
Implemented data warehousing and built the logical and physical models.
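The JSON handling for the logistics API described above amounts to parsing vehicle-tracking payloads. The sketch below is illustrative only: the field names and payload shape are hypothetical, not the actual Load Trek schema.

```python
import json
from datetime import datetime, timezone

# Hypothetical payload from a fleet-logistics API; field names are
# illustrative, not the actual Load Trek schema.
payload = '''{
  "vehicle_id": "TRK-104",
  "lat": 29.9012, "lon": -81.3124,
  "last_ping": "2016-05-01T14:30:00+00:00"
}'''

record = json.loads(payload)
# Parse the ping timestamp so arrival-time analysis can compare it
# against the scheduled stop time.
ping = datetime.fromisoformat(record["last_ping"])
print(record["vehicle_id"], ping.astimezone(timezone.utc).isoformat())
```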
IBM (March 2014 – Dec 2015)
QA Test Engineer/Tester
Performed performance testing of bulk data loading on IS DataStage 11.3.0 (Picasso) from SAP ECC to target databases (Oracle and DB2) using Oracle and ODBC connector stages in jobs.
Tested APAR patches that resolved IBM CRM tickets raised by IBM clients for DataStage bugs on IS DataStage 8.7, 9.1, and 11.3 environments (Windows, Red Hat, and SUSE).
Provided support to other developers in accurately mapping source attributes into the Teradata Financial Services Logical Data Model (FSLDM) and in interpreting business requirements.
Worked on regression testing of the ABAP, BAPI, and IDoc stages on IS DataStage 8.7, 9.1, and 11.3 environments (Windows, Red Hat, and SUSE with SAP packs 7.0 and 7.1).
Worked extensively with IBM RTC (Rational Team Concert) to create stories, tasks, and defects, assigning them to the appropriate team member; once fixed, retested, changed the status to “Approved”, and assigned the item to the lead.
Used the IBM Eclipse IDE extensively for application setup and plug-in installation.
Worked extensively with IBM IDA (InfoSphere Data Architect) to develop and execute jobs by creating logical and physical models and generating DataStage jobs.
Tested with both manual and automation tools; tested the ABAP stage through RFC, qRFC, FTP, and local-file data transfers by extracting a table or joining two tables using the SQL build query or extraction object, then generating the SAP program from the SAP data repositories.
Tested the BAPI stages (BAPI Extract, BAPI Load, BAPI Extract & Load) with custom and standard BAPIs to and from sequential files.
Tested the IDoc stages (IDoc Extract and IDoc Load connectors) by creating a partner profile and an RFC destination or listener on the SAP server to retrieve IDocs on the DataStage server using DataStage Administrator for SAP, then extracted IDocs from SAP and loaded IDocs to SAP.
Handled DataStage administration: created IS users and mappings under user, domain, and engine credentials by assigning roles; troubleshot DataStage login and hang issues; restarted DataStage servers by stopping and starting the DataStage engine, the ASB agent, and then WAS.
Served on the hiring team’s technical panel, interviewing candidates for ETL tester roles.
Designed complex jobs using the Join, Merge, Aggregator, Sort, and Remove Duplicates stages in DataStage.
Created sequence jobs with complex parallel and server job designs using hash and round-robin partitioning techniques.
Imported Unicode and non-Unicode transports on SAP ECC (EHP) and SAP R/3 (4.6C and 4.7) systems.
Worked on IVT (Installation Verification Testing) and FVT (Functional Verification Testing) of IS DataStage (8.7, 9.1, 11.3) and SAP packs (6.5, 7.0, 7.1); updated DataStage Installer and IDA Installation Manager versions and applied patches on Windows, Red Hat, and SUSE in both graphical and command-prompt modes.
Installed and uninstalled IDA 8.5 with the RMRG (Rapid Modeler & Rapid Generator) plug-in on the client and RMRG.EAR on the web console DataStage server.
Installed and uninstalled IDA 8.5 with the CW (Conversion Workbench) plug-in on the client and installed CW.EAR on the web console DataStage server.
Performed FVT on RMRG (7.0 and 7.1) for ABAP programs generated from DataStage and RMRG.
Worked on automation tools such as Selenium; created logical and physical models in RMRG, supplying job parameters hard-coded from RMRG or using parameter sets from DataStage.
Worked on CW testing, including Cognos gap reporting, DataStage assets, and FastTrack.
Worked on SAP BW (BI) testing, which involved creating process chains and open hub destinations and using the BW Extract and BW Load (push/pull) stages from DataStage.
41st Parameters (July 2014- November 2014)
Quality Test Engineer/Tester
Worked on UI testing based on the rules logic: wrote and executed test cases, supplying hard-coded inputs to check the highlighting of UI fields, which determined the result as fire or no fire.
Worked on rule-based testing: analyzed the Java/Groovy code, modified the hard-coded values in the JSON file, validated the JSON file, then executed it and verified the results (fire or no fire) by the highlighting of the UI fields, or vice versa.
Experienced with automation-tool-based testing methodologies.
Worked extensively with the JIRA tool to create defects and stories, manage test cases, and assign and track them until the status changed to fixed.
Tesco Bank (November 2013- February 2014):-
IT/Software Consultant
This engagement involved both SIT and system testing: we executed existing test cases (TCs) and wrote new TCs for different modules according to the technical specification and mapping documents. On average, 8-15 TCs were executed, depending on complexity, and TCs were also written for given scenarios.
HP Quality Center: wrote test cases for a given scenario, consisting of a scenario check, resource count check, column level check, column check, and an audit file check.
SQL scripts: wrote complex, long SQL scripts for each step of a particular TC; on failure, raised a defect, provided the analysis to the developer, and followed up.
UNIX and shell scripts: checked the generated reject files and reconciled source tables against target files using UNIX commands and shell scripts.
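The reject-file check mentioned here boils down to a simple reconciliation rule: every source row must land in either the target file or the reject file. A minimal sketch of that rule follows; the counts are hypothetical, and Python stands in for the original UNIX shell scripts (which would use `wc -l` and a `COUNT(*)` query).

```python
# Hypothetical row counts; in practice these come from `wc -l` on the
# target and reject files and a COUNT(*) on the source table.
source_rows = 1000
target_lines = ["row"] * 990      # rows that loaded successfully
reject_lines = ["row"] * 10       # rows routed to the reject file

# Reconciliation rule: every source row must land in the target or the
# reject file; any shortfall means rows were silently dropped.
loaded, rejected = len(target_lines), len(reject_lines)
assert loaded + rejected == source_rows, (
    f"{source_rows - loaded - rejected} rows unaccounted for"
)
print(f"reconciled: {loaded} loaded, {rejected} rejected")
```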
Capco (TDMBNA-Toronto Dominion & Maryland bank of North America Bank)
IT/Software Consultant (April 2013- October 2013):-
HP Quality Center/Selenium: performed manual and automation testing with positive and negative scenarios; wrote test cases, provided the manually created source file and the steps to run to the coordination team, executed them, and provided the results.
Testing and deploying jobs in DataStage: fed source flat files into DataStage with different data and tested positive and negative scenarios against the output.
Typically, the source file has to be placed in the directory matching the source-path parameter in DataStage; the shell script is then run, which triggers the job and produces the output in the target directory, with any rejects going to the reject directory.
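That file-drop flow (source file placed in the parameterized directory, job triggered, output and rejects routed to their own directories) can be sketched as below. The directory names and the routing rule are hypothetical, and the Python "job" stands in for the real shell-triggered DataStage job.

```python
import os
import tempfile

# Hypothetical directory layout mirroring the DataStage job parameters.
root = tempfile.mkdtemp()
src_dir, tgt_dir, rej_dir = (os.path.join(root, d) for d in ("src", "target", "reject"))
for d in (src_dir, tgt_dir, rej_dir):
    os.makedirs(d)

# Step 1: place the source file where the job's source-path parameter points.
with open(os.path.join(src_dir, "input.txt"), "w") as f:
    f.write("1,ok\n2,bad\n3,ok\n")

# Step 2: the "job" reads the file and routes each row to target or reject.
with open(os.path.join(src_dir, "input.txt")) as f, \
     open(os.path.join(tgt_dir, "output.txt"), "w") as out, \
     open(os.path.join(rej_dir, "rejects.txt"), "w") as rej:
    for line in f:
        # Hypothetical routing rule: rows flagged "bad" go to the reject file.
        (rej if "bad" in line else out).write(line)

print(sorted(os.listdir(tgt_dir)), sorted(os.listdir(rej_dir)))
```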
Access Provisioning - coordination
Provided coordination support for environment access requests with the different TD vendor teams.
Licensing management – coordination
Continuous integration and transformations.
Coordinated license requests for IBM products in the environment with the ITS Asset Management team for new licensing requests and license upgrades.
Code deployment – Integration deployment - coordination
Coordinated code deployment requests with the development team and the TEMD team to deploy the ETL scripts into the SIT environment.
Mobily, Saudi Arabia (July 2012-Feb 2013)
IT/Software Consultant
As an operations analyst and tester: MDM performs the data reconciliation from the MCR and Siebel databases to the ETL server; I then processed the data and delivered it to the different stakeholders.
Scheduled DataStage jobs for minutely, hourly, daily, weekly, and monthly runs in DataStage Director.
Wrote shell scripts for processing and moving files.
Developed ETL DataStage jobs, passing soft-coded and hard-coded job parameters, job routines, and sequence jobs.
Worked extensively with different types of stages in DataStage, such as ODBC Enterprise, Sequential File, Lookup, Join, Merge, Funnel, Copy, Transformer, Filter, Peek, Pivot, Row Generator, Column Generator, and Change Capture.
Extracted data from sources such as Oracle and flat files, transformed it using business logic, and loaded the data into the data warehouse.
Designed and implemented server and parallel jobs, including batch jobs and sequencers, in DataStage.
Worked on various stages such as Transformer, Aggregator, Lookup, Join, TDD, Merge, Remove Duplicates, Funnel, Filter, Pivot, and ODBC in DataStage.
Expertise in data warehousing techniques for data cleansing and the slowly changing dimension phenomenon.
Knowledge of regression testing, Selenium, and end-to-end integration testing, along with application methodologies on Jenkins, Ant, and Maven.
Experience in developing PX jobs that include both pipeline parallelism and partition parallelism techniques.
Extensively used multiple configuration files (environment variables) to increase the nodes according to the varying processing needs.
Experienced in working with the XML transformations and all types of files including the flat files.
Experience in integration of various data sources like IBM DB2, Oracle, SQL Server and Flat Files using Active/Passive Stages available in Data Stage Tool.
Experience in using the third party Scheduler like crontab to schedule the jobs.
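Crontab scheduling of the kind described here typically uses entries of the following shape; the script paths, log paths, and run times below are hypothetical illustrations, not the actual production schedule.

```
# min hour dom mon dow  command
0    2    *   *   *    /home/dsadm/jobs/run_daily_ds_load.sh  >> /var/log/ds_daily.log  2>&1
30   3    *   *   0    /home/dsadm/jobs/run_weekly_ds_load.sh >> /var/log/ds_weekly.log 2>&1
```

The five time fields (minute, hour, day of month, month, day of week) come first; `0` in the day-of-week field runs the weekly job on Sundays.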
Excellent knowledge of studying the data dependencies using Metadata of Data Stage and preparing job sequences for the existing jobs to facilitate scheduling of multiple jobs.
Experience with RDBMS, E-R models, logical models, and dimensional models.
Experience in programming using SQL, PL/SQL and Unix Shell Programming/Scripting.
Experience in writing, testing and implementation of Stored Procedures, Functions, Cursors and Triggers at Database level using PL/SQL.
Expertise in using Oracle 6/8i/9i/10g databases.
Proficient in SQL*Loader, SQL*PLUS, PL/SQL developer and TOAD applications to manage database.
Experienced in Tuning SQL Statements and Procedures for enhancing the load performance in various schemas across databases.
Excellent analytical, problem solving, communication and interpersonal skills, with ability to interact with individuals at all levels.
Extensive experience with the Extraction, Transformation, and Loading (ETL) process using Ascential DataStage EE/8.0/7.5/7.1/6.0/5.2.
Strong understanding of business processes and interface with IT.
Experienced in writing system specification, translating user requirements & creating design specifications.
Experienced in Testing tools, Test Driven Development (TDD), BI Tools, Full Life Cycle and Methodology for implementing Data warehouse and Business Intelligence Reporting Systems.
Experienced in design and implementation of ER, Star Schema and Snowflake Schema.
Experience in developing different ETL DataStage jobs using stages (Lookup, Join, Change Capture, Aggregator, Datasets, XML Input, XML Output, Sequential files, ODBC, OraOCI, Transformer, Oracle and DB2 Enterprise, Merge, Modify and Copy).
Designed Parallel jobs, Job Sequencers, Batch jobs and Server jobs.
Expertise in developing strategies for Extraction, Transformation and Loading (ETL) processes using DataStage, expertise in LLD (low level design) and requirement analysis gathering.
Skilled in integration of various data sources such as SQL Server, Oracle, DB2 and MS Access into staging area.
Involved in Performance Tuning of the DataStage Parallel jobs using Performance Statistics, Dump Score.
Experience in handling xml files using Datastage.
Experienced in scheduling and running jobs using DS Director & Unix scripting.
Created dashboard-style operational/ad hoc reports for various departments using Cognos 8.x/CRN 1.1 tools (Framework Manager, Report Studio, Query Studio).
Used QualityStage within DataStage Parallel Extender version 8 to investigate, standardize, and cleanse names, addresses, phone numbers, etc.
Experience in integration of various data source, testing and user documentation.
Installation, configuration, upgrade, and maintenance of IBM WebSphere DataStage on UNIX and Windows.
Worked with complex flat files using logic to transform data for mapping.
Worked extensively with Business Analyst during requirement gathering.
Was responsible for Testing, Requirement Analysis, Design and Development of the project.
Performed extensive source Data analysis for effective design.
Extracted, transformed, and loaded data into Oracle databases according to the required provisioning using DataStage 7.5.
Designed Star schema model for Target system.
Responsible for mapping and workflow techniques.
Enhanced job performance by using proper partitioning methods and analyzing the resources utilized using the Job Monitor.
Extensively used most of the stages available in DataStage (Transformer, Aggregator, Join, Lookup, Merge, Sort, Link Collector, Link Partitioner, Hashed File, Sequential File).
Used DataStage Manager for importing and exporting projects, individual components, and metadata definitions from and to the Repository.
Involved in all the stages of Software Development Life Cycle (SDLC).
Extensive experience working with complex mainframe EBCDIC files using the Complex Flat File stage.
Experience in UNIX shell scripting for various functions such as maintenance, backup, and configuration.
Automated job control using batch logic and batch testing to execute and schedule various DataStage jobs.
Designed Datastage jobs for loading history data into Warehouse table.
Ensured data quality by validating the data and maintaining proper data structures.
Used Environment Variables and Stage Variables for developing Parameter Driven Jobs and debugging them.
Enhanced job performance by using proper partitioning methods, integrations, deployments, and transformations, and by analyzing the resources utilized using the Job Monitor.
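The slowly-changing-dimension work mentioned in this role (and the CDC techniques in the summary) follows the Type 2 pattern: when a tracked attribute changes, expire the current dimension row and insert a new version. The sketch below is a generic illustration; the dimension layout and column names are hypothetical, not the actual warehouse schema.

```python
from datetime import date

# A generic Type 2 customer dimension: one row per version, with
# effective dates and a current-row flag. Column names are illustrative.
dim = [
    {"cust_id": 7, "city": "Riyadh", "eff_from": date(2012, 1, 1),
     "eff_to": None, "is_current": True},
]

def apply_scd2(dim, cust_id, new_city, load_date):
    """Expire the current row and insert a new version if the city changed."""
    for row in dim:
        if row["cust_id"] == cust_id and row["is_current"]:
            if row["city"] == new_city:
                return  # no change: nothing to version
            row["eff_to"], row["is_current"] = load_date, False
    dim.append({"cust_id": cust_id, "city": new_city,
                "eff_from": load_date, "eff_to": None, "is_current": True})

apply_scd2(dim, 7, "Jeddah", date(2012, 8, 1))
current = [r for r in dim if r["is_current"]]
print(len(dim), current[0]["city"])
```

In DataStage this logic maps onto the Change Capture stage feeding update and insert links, but the expire-then-insert rule is the same.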
Cablevision (June 2010 - June 2012)
Worked on Slowly Changing Dimensions.
Manually mocked up data in the source files according to the scenario, with header and trailer records, and also mocked up the table data in the DB2 database according to the scenario.
Triggering CA7 batch jobs: used the CA7 scheduler to schedule DataStage jobs for daily, weekly, and monthly runs, acknowledging successful executions and otherwise researching and troubleshooting the CA7 jobs for errors.
Involved in unit testing, Selenium (automation) testing, and QC.
Incorporated ETL best practices.
Processed delimited text file data using SQL*Loader scripts and delivered the output file.
Dashboard updates.
File system and storage issues.
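The delimited-text processing above (parse a delimited input, deliver a cleaned output file) can be sketched as follows. The columns and the pipe delimiter are hypothetical, and Python's `csv` module stands in for the original SQL*Loader control-file step.

```python
import csv
import io

# Hypothetical pipe-delimited input of the kind fed to SQL*Loader.
raw = "id|name|amount\n1|alpha|10.5\n2|beta|20.0\n"

# Parse each record, converting typed columns as a loader control file would.
reader = csv.DictReader(io.StringIO(raw), delimiter="|")
rows = [{"id": int(r["id"]), "name": r["name"], "amount": float(r["amount"])}
        for r in reader]

# Deliver a cleaned, comma-delimited output file (here: an in-memory buffer).
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["id", "name", "amount"])
writer.writerows([r["id"], r["name"], r["amount"]] for r in rows)
print(len(rows), "rows delivered")
```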
Dell (Feb 2009 - May 2010)
Created a dashboard of problem tickets using Patrol event alerts on any job failures for the queued jobs transforming the Siebel input for the different target stakeholders.
Minimized the workload on source systems.
Consolidated the data into a central repository and linked it to target systems.
Leveraged a single version of the truth: any consumer system (reading data from the EDH) can join tables from different source systems.
Standardized data management tools across the organization.
Can be used as a base for data profiling and data cleansing.
Minimized the resources and skills required for batch integration.
Sathya Power Loom (June 2008 - January 2009)
Worked widely with Microsoft Excel, using macros for data analysis and validation and preparing data charts.
Worked with crontab for job scheduling in the UNIX environment.
Troubleshot system, infrastructure, and network (wired and wireless) issues and configured system setups.
Trained on Informatica, Selenium, and Cognos.
Worked on SoapUI testing of Java code.