SHANTHABAI. D
Senior Software Engineer.
Email: ad0vrf@r.postjobfree.com
Mobile: +91-886*******
Candidate Number: VB36068438
PROFESSIONAL SUMMARY:
8+ years of experience in ETL Development and Testing (Analysis, Design, and Implementation).
Development and testing experience in Informatica (Power Center 10.1.1 and 10.2, Test Data Management/TDM, DVO, and Secure@Source/DPM) and Talend.
Good knowledge of Power Center objects such as transformations (active and passive), mappings, sessions, session logs, and workflows, as well as Data Warehousing concepts.
Experience in web application functional and system testing, Application Developer, Core Java, XML, Selenium, and VB Scripting.
Expertise in manually writing and executing SQL queries against the back-end database.
Good knowledge of Dimensional Data Modelling using Star & Snowflake Schemas and Fact & Dimension tables.
Experience with SAP BO and Cognos reporting tools for report testing.
Involved in preparing, reviewing, and executing Test Cases.
Well acquainted with the Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC) and Bug life cycle.
Well experienced in Agile methodology, including sprints, daily stand-ups (DSU), and retrospective meetings.
Experience with the Quality Center (ALM) test management tool and JIRA.
Hands-on experience with Alteryx Designer.
Quick learner and team player; flexible, hardworking, with a positive attitude and willing to learn new technologies.
Excellent communication and interpersonal skills.
WORK HISTORY:
Working as Staff-4 with Ernst & Young, Hyderabad from March 2021 to date.
Worked as Senior Software Test Engineer with Cigniti Technologies Pvt. Ltd., Hyderabad from April 2018 to Feb 2021.
Worked as Software Test Engineer with eMids Technologies Pvt. Ltd., Bangalore from Nov 2016 to Jan 2018.
Worked as Software Test Engineer with Cenduit (India) Services Pvt. Ltd., Bangalore from Jan 2016 to Nov 2016.
Worked as Process Analyst in Fidelity National Financial India Pvt. Ltd., Bangalore from September 2013 to April 2015.
CERTIFICATIONS & AWARDS:
Certified in ISTQB Advanced Level - Test Manager.
Certified in Microsoft Azure Fundamentals (AZ-900).
Certified in ISTQB Foundation Level.
Received the CAUSE FOR APPLAUSE award for outstanding performance and continued dedication at eMids Technologies.
Certified in SQL Server 2003 R2 – Basic with New Horizons Computer Learning Centers, India, while at Cenduit (India) Services Pvt. Ltd.
EDUCATIONAL PROFILE:
MCA (Master of Computer Applications) from Sri Indu College of Engineering & Technology, Hyderabad in 2011, affiliated to JNTU Hyderabad.
TECHNICAL DETAILS:
Testing : ETL/DWH testing & manual testing
Operating Systems : Windows, UNIX
Databases : Oracle 10g (SQL Developer 3.1.07), Teradata, SQL Server 2005, 2008, 2012
ETL Tools : Informatica Power Center 10.1.1, 10.2, Talend
Data Access Tool : TOAD
Test Data Management Tool : Informatica Test Data Management
Data Security Tool : Informatica Secure@Source/DPM
Project Management, Test Management & Defect Tracking Tools : JIRA, HPQC V11, Team Tracker
Scheduling Tool : Tidal
Reporting Tools : Informatica DVO (Data Validation Option), SAP BO reporting tool, Cognos
EXTRA CURRICULUM ACTIVITIES:
Volunteered at and helped organize events, including the Dandiya event, birthday bashes, flash mob & dance performances, and team outings; SMASH team member.
WORK EXPERIENCE:
Project# 1:
Title : Morgan Stanley Broadridge.
Client : Morgan Stanley
Environment : Informatica 10.2, Teradata, Windows, UNIX, JIRA, TDM, WinSCP, Azure DevOps, Agile methodology, Application Developer
Role : ETL Informatica Developer
Duration : April 2021 to date
Project Description:
Morgan Stanley is a global financial services firm that, through its subsidiaries and affiliates, provides its products and services to a large and diversified group of clients and customers, including corporations, governments, financial institutions, and individuals.
The scope of the project is to convert stocks data and load it into different databases as requested by the client. Source data is received in different formats: flat files such as .csv and .txt, and Excel sheets. This source data is loaded into the Development server by the development team using transformation logic in Power Center. Once the data is loaded into the Dev system, the QA team performs the transformations based on a mapping sheet. The Dev team has to ensure that all source table data is loaded into the QA environment, where the testing process is the same as that conducted in Dev.
Once Dev and QA testing is done, the data is loaded into UAT for a final round of testing. If there are no defects in UAT, the transformed data is loaded into production after confirmation from the client side.
Responsibilities:
Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.
Developed ETL programs using Informatica to implement the business requirements.
Created UNIX scripts to tune the ETL flow of the Informatica workflows.
Created parameter files defining mapping variables, workflow variables, and FTP connections.
Used the Debugger to identify bugs in existing mappings by analyzing the data flow and evaluating transformations.
Created and ran jobs/workflows for the ETL process.
Prepared and ran SQL queries in Teradata.
Verified the data in the target database after the ETL process.
Performed column data mapping between source & target databases.
Analyzed and fixed defects in JIRA.
Interacted with BA & Dev teams to resolve issues.
Identified sensitive fields (customer personal information) in the database.
Created masking rules, data domains, and policies (e.g., PCI, PII) in TDM for database columns.
Created plans for the rule-applied columns, generated workflows, and executed plans.
Validated the obfuscated data against the original data in the database and the consistency between them.
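The target-database verification above (confirming all source rows reach the target after the ETL run) can be sketched as a row-count and source-minus-target comparison. This is a minimal illustration on an in-memory SQLite database; the table and column names (src_stocks, tgt_stocks, trade_id) are invented for the example, not taken from the actual project.

```python
import sqlite3

# Minimal sketch of post-load verification: compare row counts and flag
# keys present in the source but missing from the target.
def validate_load(conn, source_table, target_table, key_column):
    """Return (source count, target count, keys dropped by the load)."""
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    # Source-minus-target: any key returned here never reached the target.
    missing = cur.execute(
        f"SELECT {key_column} FROM {source_table} "
        f"EXCEPT SELECT {key_column} FROM {target_table}"
    ).fetchall()
    return src_count, tgt_count, [row[0] for row in missing]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_stocks (trade_id INTEGER, symbol TEXT);
    CREATE TABLE tgt_stocks (trade_id INTEGER, symbol TEXT);
    INSERT INTO src_stocks VALUES (1, 'MS'), (2, 'IBM'), (3, 'AAPL');
    INSERT INTO tgt_stocks VALUES (1, 'MS'), (2, 'IBM');
""")
src, tgt, missing = validate_load(conn, "src_stocks", "tgt_stocks", "trade_id")
print(src, tgt, missing)  # 3 2 [3] -> trade 3 never reached the target
```

In practice the same minus-query pattern runs as hand-written SQL against the real source and target schemas rather than through a script.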
Project# 2:
Title : Informatica Test Data Management
Client : Synovus, USA
Environment : Informatica 10.6.2, Informatica TDM, DVO, SQL Server 2008, 2012, Windows, Agile methodology, Application Developer
Role : Senior Software Engineer, Test
Duration : April 2018 to Feb 2021
Project Description:
The Test Data Management tool is used at Synovus to discover sensitive production customer data in the lower environments, obfuscate it, and ensure that the sensitive columns are obfuscated. After successful obfuscation, application testing is performed by replacing the current application database with the obfuscated database, ensuring the application works as usual without any issues.
Data validation is the process of verifying the accuracy and completeness of data integration operations such as the obfuscation, migration, or replication of data. Using Informatica Data Validation Option, we validate the original data against the obfuscated data and confirm that all sensitive column data is obfuscated. The Data Validation Option tool helps generate PDF reports to submit for IRR approvals.
Responsibilities:
Involved in functional study of the application.
Identified sensitive fields (customer personal information) in the database.
Created database connections and projects in TDM for databases that need to be obfuscated.
Created obfuscation rules, data domains, and policies (e.g., PCI, PII) in TDM for database columns.
Created plans for the rule-applied columns, generated workflows, and executed plans.
Validated the obfuscated data against the original data in the database and the consistency between them.
Troubleshot errors from TDM or Informatica while obfuscating data.
Created test plans and tests in Data Validation Option for the obfuscated data against backup data.
Ran test plans and generated reports in DVO.
Validated generated reports and uploaded them to SharePoint.
Reported daily, weekly, and monthly status.
Prepared data quality and operational metrics; also prepared project estimations for the team.
Handled PCI for the project as the POC.
Handled a team of 2, mentoring them to execute tasks and explaining the business functionalities.
Ran UFT scripts to update source data, providing test data requested by sub-teams in Synovus.
Ran web services scripts in SOAP to create test data in the source system for sub-teams in Synovus.
Created and edited workflow mappings in Informatica Power Center to pull matched records from the source system as requested by sub-teams.
Created customers and accounts, and modified customer details, in the FIS application to supply sub-teams in Synovus.
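The obfuscation validation described above, performed with Informatica DVO in the project, can be sketched in Python: for each key, sensitive columns must differ from the original, while non-sensitive columns must be untouched. The customers schema and column names below are hypothetical, not Synovus's.

```python
import sqlite3

# Hedged sketch of masking validation against an original/masked table pair.
def check_obfuscation(conn, original, masked, key, sensitive_cols, stable_cols):
    """Return (column, key) pairs that violate the masking expectations."""
    cur = conn.cursor()
    problems = []
    for col in sensitive_cols:  # value unchanged -> sensitive data leaked
        rows = cur.execute(
            f"SELECT o.{key} FROM {original} o JOIN {masked} m ON o.{key} = m.{key} "
            f"WHERE o.{col} = m.{col}").fetchall()
        problems += [(col, r[0]) for r in rows]
    for col in stable_cols:  # value changed -> masking touched too much
        rows = cur.execute(
            f"SELECT o.{key} FROM {original} o JOIN {masked} m ON o.{key} = m.{key} "
            f"WHERE o.{col} <> m.{col}").fetchall()
        problems += [(col, r[0]) for r in rows]
    return problems

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, ssn TEXT, city TEXT);
    CREATE TABLE customers_masked (id INTEGER, ssn TEXT, city TEXT);
    INSERT INTO customers VALUES (1, '111-22-3333', 'Columbus'), (2, '444-55-6666', 'Atlanta');
    INSERT INTO customers_masked VALUES (1, '999-99-9999', 'Columbus'), (2, '444-55-6666', 'Atlanta');
""")
issues = check_obfuscation(conn, "customers", "customers_masked", "id", ["ssn"], ["city"])
print(issues)  # [('ssn', 2)] -> customer 2's SSN was not obfuscated
```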
Project# 3:
Title : Informatica Secure@Source
Client : Synovus, USA
Environment : Informatica Secure@Source 5.0, SQL Server 2008, 2012, Windows, Agile methodology, Application Developer, Core Java, XML, Selenium, VB Scripting
Role : Senior Software Engineer, Test
Duration : April 2018 to Feb 2021
Project Description:
Informatica Secure@Source provides global visibility of sensitive data assets, actionable insights into sensitive data risks, timely detection of insider and outsider threats, and accurate identification of high risk conditions to support data security, compliance, and governance.
Using this tool, Synovus scans all of its databases to find sensitive fields and assess the risk conditions of those databases, in order to decide whether to keep the data as-is or obfuscate it before providing it to lower environments.
Responsibilities:
Involved in functional study of the application.
Created data domains and classification policies (e.g., PCI, PII) in Secure@Source for database columns.
Created data stores for databases in Secure@Source.
Created scans over those data stores, both for metadata only and for metadata with data.
Ran the scans and identified the sensitive fields.
Validated sensitive fields in the back end by running SQL queries in MS SQL.
Exported scans to local machines and placed them in SharePoint.
Handled a team of 2, mentoring them to execute tasks and explaining the business functionalities.
Reported daily, weekly, and monthly status.
Prepared data quality and operational metrics; also prepared project estimations for the team.
Handled PCI for the project as the POC.
Ran UFT scripts to update source data, providing test data requested by sub-teams in Synovus.
Ran web services scripts in SOAP to create test data in the source system for sub-teams in Synovus.
Created customers and accounts, and modified customer details, in the FIS application to supply sub-teams in Synovus.
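The back-end validation of scan results mentioned above (confirming that a column flagged as sensitive really holds sensitive values) can be sketched with a simple pattern check. The accounts table and the SSN pattern below are assumptions made for the example, not the actual Synovus schema or the Secure@Source matching logic.

```python
import re
import sqlite3

# Hedged sketch: what fraction of a column's non-null values look like SSNs?
SSN_RE = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def column_sensitivity(conn, table, column):
    """Return the fraction of non-null values matching the SSN pattern."""
    values = [r[0] for r in conn.execute(
        f"SELECT {column} FROM {table} WHERE {column} IS NOT NULL")]
    if not values:
        return 0.0
    return sum(1 for v in values if SSN_RE.match(str(v))) / len(values)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (acct_no TEXT, tax_id TEXT)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("A1", "123-45-6789"), ("A2", "987-65-4321"), ("A3", "n/a")])
ratio = column_sensitivity(conn, "accounts", "tax_id")
print(round(ratio, 2))  # 0.67 -> 2 of 3 values look like SSNs
```

A high match ratio supports the scan's classification; a low one flags a likely false positive worth reviewing manually.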
Project# 4:
Title : ACS
Client : QuintilesIMS
Environment : Talend, Oracle 12c, Windows, JIRA, HP QC (ALM) 11.0
Role : Sr. Test Engineer
Duration : Nov 2016 to Jan 2018
Project Description:
QuintilesIMS has contracted with the American College of Surgeons (ACS) to build a new registry platform, called the Infosario Registry Platform (IRP). As the IRP is a single platform that can underpin any clinical registry, it becomes a rich source of data for any organization that requires aggregated clinical data from across care settings. In order to ensure that all historical data from registries is also included, legacy data from registries needs to be migrated into the IRP platform.
ACS – Data Migration is developed in Talend DI & ESB, which migrates all the legacy registries (NSQIP, SSR, Trauma & Cancer) to the IRP system using web services. JSON & XML files are generated and fed to the respective services to be consumed and stored in the IRP system.
There are a total of six registries managed by ACS, viz.:
NSQIP Adult
NSQIP Pediatric
NSQIP Bariatric
Surgeon-Specific Registry (SSR)
Cancer
Trauma
In order to go live with the new system, it is mandatory to migrate the legacy registry data to the IRP system. This project was constituted for that migration.
Responsibilities:
Involved in functional study of the application.
Built E2E ETL testing scenarios by developing data preparation steps and SQL queries.
Generated test data using Java scripts.
Created test data to execute the scenarios in a controlled environment and UAT.
Executed SQL queries to validate the results.
Reported and tracked defects; verified changes and closed bugs in HPQC Application Lifecycle Management.
Uploaded to and followed the process of the project management tool JIRA.
Handled a team of 3, mentoring them to execute the test cases and explaining the business functionalities.
Developed and maintained ETL testing test cases using the Oracle 11g SQL standard.
Participated in the daily status calls to address issues.
Designed and developed a reconciliation report between the legacy database and the XML DB.
Interacted with BA & Dev teams to resolve issues.
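The legacy-database-to-XML reconciliation above can be sketched by normalizing both sides to comparable tuples and diffing the sets. The registry schema and XML layout below are invented for the example, not the actual IRP feed.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Illustrative reconciliation between a legacy table and the XML generated
# for migration: any legacy row absent from the XML was missed by the feed.
XML_FEED = """
<patients>
  <patient id="1"><name>A</name></patient>
  <patient id="2"><name>B</name></patient>
</patients>
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE legacy_patients (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO legacy_patients VALUES (?, ?)",
                 [(1, "A"), (2, "B"), (3, "C")])

# Normalize both sides to (id, name) tuples, then diff the sets.
db_rows = {(r[0], r[1]) for r in conn.execute("SELECT id, name FROM legacy_patients")}
xml_rows = {(int(p.get("id")), p.findtext("name"))
            for p in ET.fromstring(XML_FEED).iter("patient")}

missing_from_xml = sorted(db_rows - xml_rows)   # legacy rows the migration missed
extra_in_xml = sorted(xml_rows - db_rows)       # spurious rows in the feed
print(missing_from_xml, extra_in_xml)  # [(3, 'C')] []
```

The two difference lists are exactly what a reconciliation report summarizes: records missed by the migration and records that appear in the target with no legacy counterpart.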
Project# 5:
Title : IRT Management
Client : Spring Bank Pharmaceuticals, Inc., USA
Environment : Informatica 9.6.2, Oracle 10g, Windows, UNIX, Toad 9.7.2, QC 9.0
Role : Test Engineer
Duration : Jan 2016 to Nov 2016
Project Description:
Interactive Response Technology (IRT) systems automate patient and drug management, reducing errors and increasing efficiency. As the largest IRT specialist in the world, Cenduit has the IVRS vendor expertise to empower sponsors for success with a completely personalized system that puts them in control of their clinical trials. With the needs of investigator sites and patients top of mind, Cenduit offers clinical supply chain intelligence and clinical operations know-how through its IRT-driven services.
The client wanted a system that could provide intelligent informational reports on the existing business situation and help analyze the business: a Data Warehouse (DW) to facilitate near-current reporting, and an ETL process to populate the DW.
Responsibilities:
Involved in functional study of the application.
Ran the jobs/workflows for the ETL process.
Prepared and ran SQL queries.
Designed and executed test cases.
Verified the data in the target database after the ETL process.
Performed column data mapping between source & target databases.
Prepared test data for testing.
Analyzed and reported defects in QC.
Interacted with BA & Dev teams to resolve issues.
Involved in report testing.
Reported the daily testing status.
Project # 6:
Title : BKFS.
Client : Black Knight Financial Services, USA
Environment : Informatica 9.6.2, Oracle 10g, Windows, UNIX, Toad, Quality Center 9.0
Role : Process Analyst
Duration : April 2014 to April 2015
Project Description:
Black Knight Financial Services Data & Analytics is a trusted resource for mortgage lenders, servicers, and investors, providing market-leading data and analytics solutions that deliver powerful, unparalleled insight. These offerings enable clients to improve performance, proactively identify risk, create mitigation strategies, and accurately estimate collateral value.
The client wanted a system that could provide intelligent informational reports on the existing business situation and help analyze the business: a Data Warehouse (DW) to facilitate near-current reporting, and an ETL process to populate the DW.
Responsibilities:
Involved in functional study of the application.
Ran the jobs for the ETL process.
Verified the ETL data in the target database.
Performed source and target data mapping.
Verified the data loaded into the target database.
Interacted with BA & Dev teams to resolve issues.
Reported the daily testing status.
Developed, reviewed, and executed test scripts and designed test data.
Identified test scripts for regression testing.
Analyzed and reported defects in QC.
Project # 7:
Title : Title Point Express.
Client : FNF
Environment : ASP.NET, SQL Server 2005, Team Tracker 5.0
Role : Process Analyst
Duration : September 2013 to March 2014
Project Description:
TitlePoint sets the standard for efficiency in title searching and order management. Its leading-edge architecture and design enable users to easily produce comprehensive, examination-ready title packages – with the convenience of a Web-based browser.
Using TitlePoint's smart, powerful, and full-featured search engine, you can quickly find the information you need and streamline the search process. Perform combination searches of tax, title, and images, including starter records. Create merged searches that combine multiple parcels into a single package. Monitor open orders for new plant postings, and much more.
Responsibilities:
Involved in understanding the Software Requirement Specification.
Developed, reviewed, and executed test cases.
Test Activities included Functional Testing and Regression Testing.
Defect Analyzing and Reporting.