CHANDINI KUMAR *** New York Avenue, Jersey City, NJ *7307
********@*****.*** • Mobile 551-***-****
Professional Summary:
9+ years of experience in Analysis, Development, and Implementation of business applications in various domains like Banking and Wealth/Asset Management.
Relevant professional experience as a Data Analyst, with a focus on data analysis for Regulatory reports.
Extensive experience in the complete SDLC, including System Requirements gathering, Design, Development, Testing, Production Support, Maintenance and Enhancement, across a variety of technological platforms.
Experience in interacting with business users to analyze business processes and requirements, transforming requirements into technical terms, converting Functional Specs to Technical Specs, and rolling out the deliverables.
Identify and document the scope of projects, create work plans, assign tasks, review progress, and organize future work efforts.
Worked extensively on Informatica with various sources and targets such as flat files, XML files, Mainframe (through PowerExchange), and conventional RDBMSs like Oracle and SQL Server.
Strong knowledge and work experience in designing, building and managing large complex DW systems.
Proficient with data quality, cleansing, migration, and warehousing activities, with exposure to ETL process design.
Responsible for all activities related to the development, implementation, administration and support of ETL processes for large scale data warehouses using Informatica Power Center 8.x and 9.x.
Expertise in implementing and managing new infrastructure for a large user base, with comprehensive knowledge of setting up and maintaining data warehouses (including ETL tool selection) to ensure systems run smoothly.
Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL processes.
Possess capabilities in constructing triggers, tables, user-defined functions, views, indexes, user profiles, relational database models, data dictionaries, and data integrity constraints.
Hands-on experience in performance tuning: identifying bottlenecks and improving performance at the database, Informatica mapping, and session levels.
Well versed in developing complex SQL queries with unions and multi-table joins, and experienced with views.
Experience in Informatica Administration, Informatica Analyst, and Informatica metadata querying.
Expertise in creating the end-to-end lineage for key business elements (KBEs/PDEs) using the Informatica Metadata Manager & Business Glossary.
Good experience in UNIX Shell Scripting and the Autosys scheduling tool.
Extensive experience in Mainframe technologies COBOL, CICS, DB2, JCL, VSAM, Easytrieve
Proficient in using tools such as TSO/ISPF, File-AID, Endevor, Xpediter, StarTool, BMC, Alchemist, Changeman, SPUFI, GMI Harness, Test Harness, WebSphere Debugger, IBM Rational Developer for System z (RDz), Princeton Optim, JIRA, SNOW, SM7, Peregrine, OPC scheduler, Autosys, PuTTY, WinSCP, Toad, and Microsoft Visio.
Performed rigorous unit testing and system testing for both batch COBOL and online CICS modules.
Experience in different testing levels like Unit testing, regression testing, systems testing & user acceptance testing.
Experience training end users through online and in-class sessions.
Extensive experience interacting with Department Managers and different levels of users and developers.
Ability to communicate business and technical issues to both business users and technical users.
Experience in working in an onsite-offshore structure and effectively coordinated tasks between onsite and offshore teams.
Extensive experience in development, support and maintenance projects.
A highly motivated self-starter and a good team-player with excellent verbal and written communication skills
Passionate about learning new applications, learning the functional and technical concepts of new systems.
Education:
SRM University, Chennai, India
Bachelor of Engineering (B.Tech) – 8.6 GPA
Technical Skills:
ETL Tools: Informatica PowerCenter 8.x, 9.x, 9.6.x, Informatica Metadata Manager, Informatica Analyst/IDQ
Databases: Oracle 9i, 10g, 12c, UDB/DB2, SQL Server 2012
Methodologies: Star Schema, Snowflake Schema, Dimensional Modeling
Operating Systems: z/OS, Windows 2000/XP, UNIX
Scheduling Tools: Autosys
Languages: SQL, PL/SQL, UNIX Shell Scripting, COBOL, JCL, CICS, Easytrieve
File System: VSAM
Mainframe Tools: Xpediter, File-AID, Alchemist, Changeman, ISPF, BMC, SPUFI, Endevor
Ticketing & ALM Tools: JIRA, HP Quality Center, SNOW
Other Tools: MS Office, MS PowerPoint, Visio, WinSCP, PuTTY, SVN
Professional Experience:
Lenmar Consulting [Client: BNP Paribas]
Senior DQ Mart Analyst/Developer [Onsite, Jersey City, NJ] April 2016 – Present
IHC DataSolutions:
The Intermediate Holding Company (IHC) program, structured at the U.S. level across BNP Paribas's poles of activity, provides guidance, supports analysis and impact assessment, and drives adjustments to the U.S. platform's operating model in response to the drastic changes introduced by the Enhanced Prudential Standards (EPS) for Foreign Banking Organizations (FBOs), finalized by the Federal Reserve in February 2014 to implement Section 165 of the U.S. Dodd-Frank Act. Now that the rules stemming from the EPS have been finalized, the IHC program will also recommend and drive implementation of the new 'normal' state required by those rules and regulations, irrespective of the location, business, function, or activity of BNP Paribas in the U.S.
Datastore
The Datastore was a replica of the landing database, used to keep the landing data for the 6-month retention period. All Datastore tables were interval-partitioned on business date and locally indexed on the business date and landing ID key. ETL processes were designed to pull data from Landing to Datastore based on the source system. The Landing area was transient and retained data only until the next scheduled load began.
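As a minimal sketch of the partitioning scheme described above (assuming Oracle and illustrative table/column names, not the actual IHC schema), such a Datastore table could be defined roughly like this:

```sql
-- Illustrative only: interval-partitioned Datastore table keyed on business date,
-- with a daily interval so Oracle creates a new partition per business date.
CREATE TABLE ds_position (
    land_id       NUMBER        NOT NULL,   -- landing key carried over from the Landing layer
    business_date DATE          NOT NULL,   -- partitioning key
    source_system VARCHAR2(30)  NOT NULL,
    position_amt  NUMBER(18,2)
)
PARTITION BY RANGE (business_date)
INTERVAL (NUMTODSINTERVAL(1, 'DAY'))
(
    PARTITION p_initial VALUES LESS THAN (DATE '2016-01-01')
);

-- Local indexes on the keys used by the Landing-to-Datastore ETL lookups.
CREATE INDEX ds_position_bd_ix  ON ds_position (business_date) LOCAL;
CREATE INDEX ds_position_lid_ix ON ds_position (land_id, business_date) LOCAL;
```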
DQ Mart – Data Quality Exceptions
Currently there is no centralized data quality process in place across all US Operations overseeing the quality and ownership of data. Quality controls exist for some perimeters, and controls exist on the input of information into the systems (4-eye reviews, level 1 and 2 controls), but there is no centralized process ensuring overall quality and ownership of the data throughout the lifecycle of the information.
DQ Mart is a consolidated repository for collecting data from sources such as OFSAA and AXIOM in the IHC journey and for storing data quality results on the data that is subject to data quality controls. DQ Mart was intended to support all data quality reports and dashboards developed to inform the data governance community of data quality issues and to facilitate root cause analysis and remediation.
The data quality checks consisted of technical rules, usually resulting from technical or integrity violations. Some of the technical data quality checks were record counts, hash totals, invalid source file extract, data type validations, data precision, mandatory field/null constraint checks, business key validation/uniqueness checks, and primary key and foreign key constraints.
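To illustrate the kind of technical checks listed above, the hedged SQL below sketches a mandatory-field/null check and a business-key uniqueness check against a hypothetical landing table; the real checks were implemented as Informatica IDQ rules and PowerCenter mappings, and all names here are assumptions.

```sql
-- Hypothetical mandatory field / null constraint check:
-- rows failing the rule become exception records for DQ Mart.
SELECT 'NULL_CHECK_ACCOUNT_ID' AS dq_rule_id, l.*
FROM   lnd_account l
WHERE  l.business_date = DATE '2016-06-30'
AND    l.account_id IS NULL;

-- Hypothetical business key validation / uniqueness check:
-- business keys appearing more than once on a business date are exceptions.
SELECT 'UNIQUE_CHECK_ACCOUNT_BK' AS dq_rule_id,
       l.account_bk,
       COUNT(*) AS duplicate_count
FROM   lnd_account l
WHERE  l.business_date = DATE '2016-06-30'
GROUP  BY l.account_bk
HAVING COUNT(*) > 1;
```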
Responsibilities:
Gathering business requirements for FR-Y9C, FFIEC-002, and CCAR regulatory reports.
Analyzing data based on profiling results and identifying the impact on regulatory reports such as FFIEC-002, FR-Y9C, FR-14A, FR-14Q, 5G, and US LCR on a monthly or quarterly basis.
Involved in the Project Design/Development/implementation of integrated data lineage and DQ Mart.
Worked in IDQ to generate the Data Quality rules to cover various DQ dimensions (Completeness, Precision, Validity, Integrity, Uniqueness, Accuracy).
Designed and implemented ETL processes in Informatica PowerCenter to capture the exception data that does not meet a data quality rule and to generate a summary report consolidating the exception data for each rule.
Perform pre-assessments on these rules to identify any exceptions and discuss them with the respective data stewards to clean up the data or modify the rules based on their recommendations.
Streamlined tasks to meet deadlines for ETL teams both onsite & offshore
Gathering functional requirement specifications from the Data Governance team and understanding the business requirements in order to convert business rules into technical rules/checks.
Analyzing and providing estimates for ETL deliverables and monitoring progress to ensure quality ETL deliverables.
Performing other activities such as:
Moved data from DB2 to SQL server as part of DB2 decommission.
Identified the impacted components for DB2 decommission and modified the shell scripts and Informatica Workflows to pull the data from SQL server tables as part of DB2 decommission.
Worked on the migration of Informatica from 9.1.0 to 9.6.1
Worked on the migration of application from old Linux server to the new Linux server as part of consolidation and upgrade.
Designed the database purge process (see the sketch after this list).
Prepared technical design documents, low level functional design documents and velocity documents
Created complex mappings and Mapplets to implement various DQ checks at multiple hops in the IHC architecture and published the exception data into DQ Mart.
Developed the Informatica Mappings/Mapplets/Worklets and Workflow for loading the finance and reference data to Oracle Data Marts
Developed PL/SQL and UNIX Scripts
Optimizing and performance-tuning mappings to achieve faster response times.
Generating Autosys jobs that schedule the Informatica Data Loads
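The database purge process mentioned above can be sketched as partition maintenance against the interval-partitioned Datastore tables, dropping business-date partitions older than the six-month retention window; the table name and exact retention logic below are assumptions for illustration, not the production procedure.

```sql
-- Illustrative purge sketch: drop a Datastore partition older than the
-- 6-month retention window. DROP PARTITION FOR() addresses the interval
-- partition containing the given business date without needing its name.
ALTER TABLE ds_position
    DROP PARTITION FOR (TO_DATE('2015-12-31', 'YYYY-MM-DD'))
    UPDATE INDEXES;

-- A simpler (but slower) row-level alternative:
DELETE FROM ds_position
WHERE  business_date < ADD_MONTHS(TRUNC(SYSDATE), -6);
```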
Senior Informatica Metadata Developer
Informatica Metadata Lineage
The Metadata Manager project was implemented to effectively address BCBS 239 rules and regulations and to help the bank's data stewards, business analysts, and data architects collaborate on defining and maintaining a common taxonomy of metadata definitions, along with their business names. The implementation was driven by regulatory expectations, control function expectations, the need to control risks, and support for the broader Enterprise Data Management strategy for all reporting-identified Key Data Elements (KDEs). The end-to-end lineage is a dynamic visual map showing all the data flows and dependencies in the data integration environment, across all systems, processes, and business glossaries, from source to target.
This end-to-end data lineage helps the bank's IT organization, data stewards, business analysts, data architects, and auditors understand what the data means, where it came from, where it is going, how it has been transformed, and which DQ rules (business/technical) are applied to each of the Key Data Elements across the organization.
Responsibilities:
Gathering business requirements for FR-Y9C, FFIEC-002, FR-14A, FR-14Q, 5G, and US LCR regulatory reports.
Created complex custom metadata models and templates using Informatica Metadata Manager
Created various reusable custom models for creating the metadata catalogs
Created complex mappings using Designer to pull the metadata from csv files of various application systems using ETL tool Informatica
Optimizing and performance-tuning Metadata Manager resources to achieve faster response times.
Involved in creating Metadata manager templates, resources, catalogs.
Created the End to End Lineage for IHC using Metadata Manager
Led the design and development of scheduling for all batch and real-time jobs using Autosys.
Optimizing and performance-tuning mappings/sessions/workflows to achieve faster response times.
Worked with Data Stewards of each group to gather all the Business Terms and definitions.
Created Business Glossary and loaded the Business terms and definitions provided by the business user in Informatica Analyst.
Created the workflow process, based on business requirements, so that defined business terms go through the appropriate approvals before publishing.
Work interactively with data stewards and business users to resolve any issues they come across during the workflow process for any business term.
Created automatic process to bulk upload the business terms in Informatica Analyst.
Created automatic process to bulk publish/reject the business terms in Informatica Analyst.
Environment: Informatica Metadata Manager 9.6.1, Informatica Business Glossary, Informatica PowerCenter 9.6.1, Informatica Analyst, Oracle 12c, Oracle SQL Developer, UNIX Shell Scripting, Autosys, SVN and HP ALM
IRIS Software Inc [Client: Bank of America] Sep 2015 to March 2016
Senior Informatica & Metadata Developer [Onsite, Jersey City, NJ]
RDA: The Data Quality Index (DQI) and Data Quality Measure (DQM) calculations are run on a weekly basis to analyze breaks and notifications for all controls linked to a particular KBE. The DQI/DQM calculations are run and analyzed by the ARCTIC DQ&DG team and are then shared with the Business-GCF (business owners) of each KBE and associated control. Separately, a consolidated DQI/DQM calculation is rolled up at the KBE level, and these metrics are utilized as KBE-level data within the info portal.
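As a rough sketch of what such a consolidated KBE-level roll-up might look like in SQL (the actual DQI/DQM formulas were defined by the ARCTIC DQ&DG team, and the table and column names here are hypothetical):

```sql
-- Hypothetical roll-up: weekly pass rate per KBE across all linked controls.
SELECT kbe_id,
       run_week,
       SUM(passed_records) AS passed_records,
       SUM(total_records)  AS total_records,
       ROUND(100 * SUM(passed_records) / NULLIF(SUM(total_records), 0), 2) AS dqi_pct
FROM   dq_control_results          -- assumed table of per-control DQ results
GROUP  BY kbe_id, run_week
ORDER  BY kbe_id, run_week;
```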
The Metadata Manager project was implemented to help the bank's data stewards, business analysts, and data architects collaborate on defining and maintaining a common taxonomy of metadata definitions, along with their business names. The implementation was driven by regulatory expectations, control function expectations, the need to control risks, and support for the broader Enterprise Data Management strategy for all BASEL-reporting-identified Key Business Elements (KBEs). The end-to-end lineage is a dynamic visual map showing all the data flows and dependencies in the data integration environment, across all systems, processes, and business glossaries, from source to target.
This end-to-end data lineage helps the bank's IT organization, data stewards, business analysts, data architects, and auditors understand what the data means, where it came from, where it is going, and how it has been transformed.
Responsibilities:
Led the project Design/Development/Implementation of the integrated data store.
Led and coordinated the offshore team for design and development.
Involved in the preparation of documentation for ETL standards, procedures, and naming conventions.
Captured the entire data lineage (Source to Target information) in Informatica Metadata extensions.
Created complex mappings to retrieve the data lineage information from metadata extensions and load the data into the MDM landing layer.
Created reusable transformations and mapplets for reuse across multiple mappings and more efficient coding.
Created the mappings for setting up cleansing and standardization rules to load the data into MDM staging layer.
Created Match & Merge rules in MDM Hub to create consolidated Golden record
Created Metadata manager models.
Created Metadata manager templates, resources, catalogs.
Created End to End Lineage in diagrammatic view in Metadata Manager tool.
Led the design and development of scheduling for all batch and real-time jobs using Autosys.
Coordinated all releases successfully in accordance with compliance rules.
Optimized and performance-tuned mappings/sessions/workflows to achieve faster response times.
Environment: Informatica PowerCenter 9.1 (MDM – Metadata Manager), DB2, SQL Server, Windows NT, UNIX Shell Scripting, Autosys scheduler.
HCL Technologies [Client: UBS]
Senior Software Engineer [Onsite, Weehawken, NJ] June 2012 to Sep 2015
FA Comp: The FA Compensation system is used to set up, maintain, and administer Financial Advisor compensation plans and FA numbers. This system receives transactions from various feeds and creates trade and order records that are used to calculate FA compensation.
Business Analytics: A business reporting and analytics system designed for Financial Advisors. It extracts advisors' accounts, assets, production, T12, net new money, and trading trends, and puts them together for business reporting. It provides a central location for transparent reporting of Financial Advisor compensation, loans, practice management (assets, revenue, NNM, client accounts), and FA profile information, and enables FAs to maximize growth potential and enhance the UBS client experience. The system is also intended to provide WMUS with analytics and data management capabilities that enable deeper client and FA insight to influence WMUS's marketing, sales, and service strategies.
CAS: The Client Analytics System is a combined mainframe and Informatica system intended to provide WMUS with analytics and data management capabilities that enable deeper client and FA insight to influence WMUS's marketing, sales, and service strategies.
As part of this project, also worked on a few other applications, such as EAA (Emerging Affluent Accounts), OCC (Organization Cross Coverage), VBP (Value Based Pricing), and LOU (Loan Application).
Responsibilities:
As part of this enhancement/maintenance project, worked as technical lead and coordinated with the offshore team to provide technical support and solutions.
Gathered requirements in discussions with the business and worked on enhancements/changes to the Mainframe & ETL process flows.
As part of enhancements/change requests, produced technical specifications for the various Mainframe & ETL process flows and created mapping sheets based on the requirements for the various ETL processes.
Worked on Mainframe code (COBOL/JCL/CICS) enhancement/development to load data into DB2.
Worked on enhancement/development of Informatica mappings, workflows, PL/SQL to pull data from various source systems/EDW data warehouse and load into data marts/transactional tables in Oracle.
Worked with business users to discuss issues raised on client accounts and provide solutions.
Analyzed and debugged existing COBOL programs, ETL, and PL/SQL code to find the root cause of technical and data issues.
Generated data fixes, tested them, and implemented them in production.
Performed impact analysis on the functional and technical side for code/data changes and releases.
Responsible for coding, testing, and driving the release.
Coordinated with upstream/downstream application teams to analyze the impact of new changes introduced by any upstream application.
Generated and provided ad hoc reports for business users.
Involved in Production Support in fixing the production abends.
Found the root cause of production job abends/issues and applied appropriate fixes.
Managed and coordinated the weekly and monthly releases of data and code changes/fixes to address business issues and new business requirements.
Reviewed all unit test plans, coding, data fixes, and other deliverables of the team.
Responsible for reviewing the team's code, test cases, test results, and other deliverables for changes and delivering them to the business on the expected date.
Coordinated an offshore team of 6 developers.
Environment: z/OS, Informatica PowerCenter 9.1, DB2, Oracle 10g, Windows NT, PL/SQL, UNIX Shell Scripting, Autosys scheduler.
HCL Technologies [Client: Lloyd’s Banking Group]
Software Engineer [Offshore, Chennai] November 2008 to May 2012
Mortgage Processing: Mortgage Processing was an HBOS application; during the integration of HBOS with Lloyds TSB Group plc, the business agreed to retain the HBOS mortgage processing application as the target application.
eChannel Harmonisation: Corporate Online is an HBOS banking application for corporate customers; during the integration of HBOS with Lloyds TSB Group plc, the business agreed to retain the Corporate Online application as the target application. This was a development project that followed the complete SDLC model and began integrating Lloyds customers and their specific products into the COL application. Corporate Online manages the banking processes of corporate customers; its front end is .NET and its back end is mainframe. Registration, account administration, and payment processing are handled at the front end, and all data is sent to the mainframe as XML through a message switch. The necessary validations, processing of input requests, and communication with third-party vendors are then handled on the mainframe side.
COG Bottom Copy: The COG system is part of the Wholesale – Advance Bank Platform. COG maintains the historical information on Corporate and Commercial Banking customers used for risk analysis at HBOS. This project added Lloyds customers to the existing HBOS COG system during the integration of Halifax Bank of Scotland with Lloyds Bank.
Responsibilities:
Designed and developed Informatica mappings and workflows to extract data from various sources such as flat files and DB2 and load it into Oracle databases after incorporating various business rules.
Implemented Slowly Changing Dimensions for accessing the history of account and transaction information (see the sketch after this list).
Developed email, command tasks, Mapplets, Reusable transformations and worklets.
Extensively worked with Aggregator, Sorter, Router, Filter, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator transformations.
Developed mapping parameters and variables to support SQL overrides.
Worked on workflows and sessions, including Event Raise, Event Wait, Decision, Email, Command, Worklet, and Timer tasks.
Responsible for the performance tuning of the ETL process at source level, target level, mapping level and session level.
Used session partitions, dynamic cache memory, and index caches to improve the performance of the Informatica server and services.
Developed Reusable Transformations and Mapplets using Transformation Developer and Mapplet Designer for use in multiple Mappings.
Designed mappings that involved Target Load Order and Constraint Based Loading.
Worked on SQL stored procedures, functions in SQL.
Investigated and fixed problems encountered in the production environment on a day to day basis.
Created basic UNIX shell scripts to automate sessions.
Involved in unit testing and user acceptance testing to verify that the data loaded into the targets, extracted from different source systems according to user requirements, was accurate.
Used debugger to test the mapping and fixed the defects.
Wrote technical documentation to describe program development, logic, coding, testing, changes and corrections.
Updated detailed test plans and test cases conforming to the code release and project requirements.
Responsible for pre-implementation and post-implementation activities.
Prepared a production monitoring and support handbook for ETL Process.
Provided warranty support after the integration.
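A minimal sketch of the Slowly Changing Dimension pattern referenced above (Type 2, keeping history with effective dates); in the project this was built as Informatica mappings rather than hand-written SQL, and the table and column names below are assumptions for illustration.

```sql
-- Hypothetical SCD Type 2 handling for an account dimension:
-- 1) close out the current version of rows whose attributes changed,
-- 2) insert a new current version for changed and brand-new accounts.
UPDATE dim_account d
SET    d.eff_end_date = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (
           SELECT 1
           FROM   stg_account s
           WHERE  s.account_id = d.account_id
           AND    (s.branch_code <> d.branch_code OR s.status <> d.status)
       );

INSERT INTO dim_account (account_id, branch_code, status,
                         eff_start_date, eff_end_date, current_flag)
SELECT s.account_id, s.branch_code, s.status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_account s
WHERE  NOT EXISTS (
           SELECT 1
           FROM   dim_account d
           WHERE  d.account_id   = s.account_id
           AND    d.current_flag = 'Y'
       );
```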
Environment: z/OS, Informatica PowerCenter 8.6, DB2, Oracle 9i, Windows NT, UNIX Shell Scripting, Autosys scheduler.