
Data Developer

Location:
Edison, NJ
Posted:
September 23, 2020


NAME: SMITA KHILARI Mob: +1-201-***-****

Email: adgc3r@r.postjobfree.com

Location: Edison, New Jersey

SUMMARY:

IT professional with 13.5+ years of experience in data warehousing, data integration, and data governance, spanning project design, architecture, development, and testing of applications built with ETL tools under AIX UNIX, Linux, and Windows environments, the Collibra data governance tool, the MuleSoft data integration tool, and Oracle, Teradata, and SQL Server databases.

Experience building next-generation global financial market data systems to support front-office trading, middle-office processes, and back-office risk.

Lead, design, and develop forward-looking solutions using cutting-edge technologies.

Extensive experience developing complex, high-volume Ab Initio ETL applications, including requirement gathering, design, implementation, unit testing, debugging, and documentation in a Unix environment.

Working experience in the design and development of Collibra governance workflows using BPMN with Flowable/Activiti.

Experience with the MuleSoft data integration tool.

Excellent Data warehousing technical skill set with robust knowledge of DWH architecture.

Experience analyzing both structured and semi-structured data (JSON, XML).

Hands-on experience in PDL metaprogramming, XML data processing, generic graph implementation, advanced Ab Initio components, psets, Conduct>It plans, Continuous Flows, ICFF, and Micrographs, and in Ab Initio products such as the GDE, Co>Operating System, Operational Console, EME Technical Repository, Metadata Hub, Control Center, Express>It, BRE, Query>It, Data Profiler, Bridge, and Reporter.

Strong experience in both waterfall and agile methodologies, as well as the SDLC and PMLC.

Experience in requirement gathering, design, impact analysis, data cleansing, data transformations, data relationships, and source system analysis. Sound knowledge and experience of data warehouses, data marts, data modeling, dimensional modeling, OLTP, OLAP, star and snowflake schemas, fact and dimension tables, logical and physical data modeling, ETL processes, SCDs, data analysis, surrogate keys, and normalization/de-normalization.

Strong working experience in RDBMS development using Oracle, PL/SQL, Teradata, and MS SQL Server, and in large-scale database management, including building and tuning complex SQL queries. In-depth understanding of Teradata architecture and load utilities.

Highly proficient in T-SQL for developing complex stored procedures, triggers, and functions to meet high-performance business needs.

Highly proficient in reading query optimizer execution plans to identify bottlenecks and tune queries.

Good exposure to UNIX shell scripting and commands, with strong problem-solving and analytical capabilities.

Knowledgeable about Collibra and its integration with other systems using MuleSoft/Collibra Connect and the Ab Initio software suite.

Experienced in implementing a technical metadata repository using Ab Initio Metadata Hub.

Good experience in unit, regression, system integration, and manual testing, including preparation of test plans, test cases, test case documents, and test data.

Extensive knowledge of and experience with banking, financial, and retail domain applications.

Experience providing 24x7 production support and real-time issue resolution for ETL applications.

Excellent communication and interpersonal skills, with a strong ability to work as part of a team as well as handle independent roles, and a positive attitude toward responsibilities and challenges.

TECHNICAL SKILLS:

ETL Tools: Ab Initio (GDE 3.3.4.1, Co-Operating System 3.3.4.6), Continuous Flows, ICFF, Micrographs, EME, Ab Initio Operation Console 3.1.3.2, Metadata Hub 3.1.3.1, Technical Repository, Control Center, Conduct>It, Express>It 3.3.1.2, BRE, Query>It, Talend 7.1, MS SQL BI (SSIS, SSAS, SSRS)

Databases: Teradata 13.0, Oracle 11.2.0.4, PL/SQL, MS SQL Server 2000, Oracle SQL Developer 3.2.2, Toad for Data analyst 2.6

Data Modeling: Star schema and Snowflake Schemas, Enterprise Data Modeling, Logical Data Modeling, Physical Data Modeling

Tools: IBM Job Scheduling Console, Secure CRT, Lotus Notes, SQL Assistant, Service Now, JIRA, Control M, Autosys, HP ALM QC, RLM, GitHub Desktop, Anypoint Studio, Eclipse, Postman, Power BI, Cognos, SVN Tortoise

Programming: C, C++, SQL, UNIX Shell Scripting, Korn Shell, Java, Python, Groovy

Operating Systems: Linux 2.6.32, Windows 7/NT/98/2000/XP, IBM AIX 5.0/5.2/5.3, Mainframe

Data Governance: Collibra BPMN Workflow 5.7.2

Data Integration Tool: Mulesoft 4.2.1

Big Data: Hadoop (HDFS, HBase, MapReduce, Hive, Apache Pig, Spark)

EDUCATION: BE Computer (First class with Distinction)

PROFESSIONAL EXPERIENCE:

Organization: Incandescent Technology Inc, NY

Period: 19th Dec 2016 – till Date

Address: One World Trade Center, 285 Fulton St #83e, New York, NY 10006

Role: Principal Consultant

Client: Wells Fargo, NJ, USA

Role: Onsite Senior ETL and DGC Developer

Environment: Abinitio (GDE 3.3.5, Co-Operating System 3.3.5.8), EME, Metadata Hub, Technical Repository, Control Center, Conduct It, Express>It 3.3.1.2, BRE, UNIX shell Scripting, Oracle 11.2.0.4, Oracle SQL Developer 3.2.2, HP ALM QC, Service Now, JIRA, Teradata 13.0, Collibra BPMN Workflow 5.7.2, Mulesoft 4.2.1, Postman, Github Desktop, Talend 7.1, MS SQL BI(SSIS, SSAS, SSRS)

CDGC: The Collibra Data Governance Center (DGC) is used for enterprise data governance and reporting. It is the designated interface for input, stewardship, and reporting of enterprise metadata. Metadata management involves coordination of technical and business team members, roles, responsibilities, processes, repository metadata content, technology, and governance across the organization. Collibra custom workflows are built to meet the organization's needs. MuleSoft integration services are developed to move data between Collibra DGC, other Wells Fargo interfacing systems, and the Ab Initio Metadata Hub. The PDE Export application focuses on exporting from the Ab Initio Metadata Hub into EDGC. Data quality metrics are calculated and displayed in the Collibra data governance tool.
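
For illustration only, a completeness-style data quality metric of the kind surfaced on Collibra dashboards can be computed with a simple SQL aggregation. The table and column names below (customer_stg, email_addr) are hypothetical, not the project's actual schema.

-- Percentage of staged customer records with a populated email address
-- (COUNT(email_addr) counts only non-null values)
SELECT COUNT(email_addr) * 100.0 / NULLIF(COUNT(*), 0) AS email_completeness_pct
FROM customer_stg;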

Responsibilities:

Design and develop generic Ab Initio framework components, psets, and plans to deliver complex business solutions.

Prepared complex Unix scripts to support business-critical application rollouts in production, working in Agile and Waterfall delivery models.

Developed applications for data preparation, data quality, data integration, application integration, and data management using the Talend ETL tool.

Conduct design reviews, code reviews to ensure code developed meets the requirements.

Interact with release management, configuration management and quality assurance, database modeling support, upstream and downstream development team, operation support team.

Conducting unit, module, regression, and system integration testing of scripts, graphs, psets, plans, and queries; providing support for DIT, SIT, UAT, and PROD testing.

Set up Collibra communities, domains, types, attributes, and statuses, including custom dashboards with data quality metrics, status, workflow initiation, and issue management for each domain's specific requirements.

Collaborated with business stakeholders and data architects to analyze Collibra BPMN workflow requirements.

Developed custom workflows to manage the business rules, data quality rules, and data quality metrics lifecycle.

Working on Ab Initio Metadata Hub integration with Collibra for exchanging DQ metadata and results.

Developed MuleSoft utilities for PDE Export and DGC asset existence checks.

Educate, train, and mentor the user community on the use of Collibra, the data stewardship process, and issue management, along with their roles and responsibilities, adhering to defined SLAs and change requests.

Client: Citigroup, Jersey City, NJ, USA

Role: Onsite Senior ETL Developer and Lead

Environment: Abinitio (GDE 3.2.7.2, Co-Operating System 3.2.6.3), EME, Metadata Hub, Technical Repository, Control Center, Conduct It, Express>It 3.3.1.2, BRE, UNIX shell Scripting, Oracle 11.2.0.4, Oracle SQL Developer 3.2.2, HP ALM QC, Service Now, JIRA, Teradata 13.0

CitiKYC: CitiKYC is a consistent repository for KYC information across the entire bank, including all businesses and regions. It facilitates methods for assessing money-laundering risk for clients and enables a single shared KYC record. It is being rolled out for products across ICG (Institutional Client Group) and the Global Consumer Bank (GCB). To implement a globally consistent KYC approach, customers are assigned to one of four main client categories: Individual, Financial Institutions, Corporate, and Government. For information collection and due diligence purposes, they are further divided into subcategories.

Involved in the global framework design of the Risk on Demand (ROD) application, used when there is a need to recompute risk or correct existing risks. This backend solution, built using Ab Initio plans, graphs, and Unix scripts, provides the ability to refresh the risk score and risk rating and calculates the next periodic review date for selected customers, which is reflected on the UI screen. The ROD application mainly builds risk lookups, builds risk requests, computes risks, loads calculated risk into a temporary area, validates it, updates the risk in the main tables, audits changes in risk attributes, and generates control reports for compliance team validation. Involved in requirement gathering, design, implementation, testing, defect fixing, enhancements, performance tuning, building custom components, data patch requirements, PRB resolutions, environment swim-lane setup across multiple servers, etc.
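
For illustration only, the validate-then-update-with-audit pattern described above can be sketched in Oracle-style SQL roughly as follows; the table and column names (risk_stage, customer_risk, risk_audit) are hypothetical, not the project's actual schema.

-- Record the rating changes that are about to be applied
INSERT INTO risk_audit (cust_id, old_risk_rating, new_risk_rating, change_ts)
SELECT m.cust_id, m.risk_rating, s.risk_rating, CURRENT_TIMESTAMP
FROM customer_risk m
JOIN risk_stage s ON s.cust_id = m.cust_id
WHERE s.validation_status = 'PASS'
  AND m.risk_rating <> s.risk_rating;

-- Apply validated risk score, rating, and next periodic review date to the main table
UPDATE customer_risk m
SET (risk_score, risk_rating, next_review_dt) =
    (SELECT s.risk_score, s.risk_rating, s.next_review_dt
       FROM risk_stage s
      WHERE s.cust_id = m.cust_id
        AND s.validation_status = 'PASS')
WHERE EXISTS (SELECT 1
                FROM risk_stage s
               WHERE s.cust_id = m.cust_id
                 AND s.validation_status = 'PASS');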

Within GCB, the current CitiKYC functionality has been developed to support the Mexico Retail Bank and extended further to support Mexico GCB credit cards. Also involved in providing solutions and implementation for Mexico Cards, which also includes enhancements to retail processing. Core, Member, Country, and APF-related data is processed.

Responsibilities:

Build the next generation of global financial market data systems to support front-office trading, middle-office processes, and back-office risk. Lead, design, and develop forward-looking solutions using cutting-edge technologies.

Work with business users to translate requirements into system flows and data flows, and develop functional requirement, technical design, impact analysis, and data mapping documents.

Design and develop generic Ab Initio framework components, psets, and plans using parallel processing, phasing, etc., to deliver complex business solutions.

Performance tuning of Ab Initio applications and frequently used SQL queries; query optimization; script building; testing and debugging applications; automating processes; and scheduling and monitoring jobs in Control Center.

Writing complex Unix scripts to support business-critical application rollouts in production, working in Agile and Waterfall delivery models.

Conduct design reviews, code reviews to ensure code developed meets the requirements.

Implemented end to end applications using Abinitio graphs, psets, plans, Unix Shell scripting.

Working proactively and independently to articulate issues/challenges and address project delivery risks; preparing RCA and lessons-learned documents.

Interact with release management for change requests, configuration management and quality assurance, database modeling support, upstream and downstream development team, operation support team.

Conducting unit, module, regression, and system integration testing of Unix scripts, graphs, psets, plans, and SQL queries.

Mentor and guide offshore counterpart team.

Organization: Cognizant Technology Solutions Pvt Ltd

Period: 27th Aug 2012 – 23rd June 2016

Address: Plot # 26, Rajiv Gandhi Infotech Park, MIDC, Hinjewadi, Pune, India 411057

Role: Senior Associate-Project

Client: KeyBank, Cleveland, OH, USA

Environment: Abinitio (GDE 3.1.6.2, Co-Operating System 3.2.4.6), EME, Metadata Hub, UNIX shell Scripting, Teradata 13, Mainframe, Oracle SQL Developer 1.5.5, Unix AIX, HP ALM QC, Control M

The Oracle Banking Platform (OBP) application re-platforms digital offerings to bring ease, value, and expertise to clients. It involves the technical components of OBP, MDM, and security for the digital program to enable forthcoming business capabilities for online and mobile banking.

The ETL part of the OBP program is handled in this project. MDM is the primary source of customer information for various products, whereas CIX provides the remaining information as part of the enriched feed in OBP. The process covers the orphan record process, exception handling, and creation of new extracts for individual, organization, account, account-to-account, customer-to-account, merge, debit card, and credit card. MDM source files are fetched from the harmonization layer and enriched with CIX information by applying various business rules and transformations; CDC is performed, and staging files are generated for each extract. Extracts are then split into header, detail, trailer, and master records and loaded into Oracle staging tables. After load completion, a message is triggered through a JMS queue to the downstream OBP team, which fetches the data from staging and loads it into the OBP main tables.
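
As a rough illustration of the staging-to-target handoff described above (the table, column, and record-type values such as obp_individual_stg and rec_type are hypothetical, not the project's actual schema):

-- Load only detail records for the published batch from the Oracle staging table
-- into the main table; 'H'/'D'/'T'/'M' stand for header/detail/trailer/master rows
INSERT INTO obp_individual (cust_id, first_name, last_name, load_batch_id)
SELECT s.cust_id, s.first_name, s.last_name, s.load_batch_id
FROM obp_individual_stg s
WHERE s.rec_type = 'D'
  AND s.load_batch_id = :batch_id;   -- batch id announced downstream via the JMS message

COMMIT;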

Responsibilities:

Implemented end-to-end Ab Initio applications for one of the individual extracts, a real-time mini-batch processing application, and a process to auto-generate and send a consolidated statistics report for all extracts, using Ab Initio graphs, psets, plans, and Unix shell scripting.

Requirement gathering by interacting with client/business, creating impact analysis and design documents, proposing solutions, defect fixing and estimating the hours. Working proactively and independently to articulate issues/challenges to address project delivery risks.

Worked in an onshore-offshore model and led the offshore team. Interacted with release management, configuration management and quality assurance, database modeling support, other upstream and downstream development teams, and the operations support team.

Handling various kinds of activities like development of complex code, enhancements, testing, problem resolution etc.

Providing unit, module, regression, and system integration testing of scripts, graphs, psets, plans, and queries.

Code reviews to ensure code developed meets the requirements.

Provide KT sessions, prepare documentation, and mentor junior resources.

Client: Sainsburys, London, UK

Role: Onsite ETL Tech Lead and Developer

Environment: Abinitio (GDE 3.1.5, Co-Operating System 3.1.5), EME, Metadata Hub, UNIX shell Scripting, Teradata 13, Mainframe, Unix AIX, HP ALM QC, Control M

The Helios CMT Program personalizes every interaction with JS customers by establishing an integrated multi-channel campaign management capability. It allows optimization of inbound and outbound customer contacts across all channels, lines of business, and potential future joint ventures. It enables Sainsbury's to provide customers with fully personalized, real-time targeting through IBM EMM solutions and to operate in ways that add more value.

The first phase of the application focuses on the core functionality of the EMM product suite, a basic data mart, and the migration of 3 key campaigns to validate the technology and demonstrate new ways of working.

The second phase is based on agile methodology. It finalizes EMM product functionality, delivers integration between Sainsbury's and external third parties, and migrates most of the existing campaigns; it includes around 5 campaigns to help customers. The overall process involves loading the CDM using Ab Initio and Teradata utilities from different source systems, namely EDW, ODS, and a feed from a SAS server. The CDM data is further accessed by campaign analysts to create campaigns, produce selection files, and append PII data using views present in the CDM. All dimension data that involves direct mapping is loaded using Teradata BTEQ scripts, while complex transformations and integration with different data sources are done using Ab Initio.
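
For illustration only, a direct-mapping dimension load of the kind typically wrapped in a Teradata BTEQ script might look roughly like this; the database, table, and column names (cdm.dim_store, stg.store_feed) are hypothetical, not the project's actual model.

-- Full refresh of a directly mapped dimension from its staging table
DELETE FROM cdm.dim_store;

INSERT INTO cdm.dim_store (store_id, store_name, region_cd, open_dt)
SELECT store_id, store_name, region_cd, open_dt
FROM stg.store_feed;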

The source data and SAS scripts are analysed during requirement gathering. The key steps are the implementation of the base layer using Ab Initio, Teradata, and Unix functionality, followed by a presentation layer consisting of views built on top of it. My involvement is in building the campaign functionality through Ab Initio graphs and Teradata scripts and testing them. Finally, the presentation-layer views are accessed by the downstream team through the IBM EMM tool to achieve the required functionality.

Responsibilities:

Involved in requirement gathering meetings, data planning, sprint planning, retrospective sessions and stand up calls with client, UAT with business users. Worked in agile methodology.

Prepared impact analysis and design documents, proposed solutions, fixed defects, and estimated hours.

Managing all development tasks and activities in JIRA.

Working proactively and independently to articulate issues/challenges to address project delivery risks.

Worked in an onshore-offshore model, coordinating the team from onsite.

Interact with release management, configuration management and quality assurance, database modeling support, other upstream and downstream development teams and operation support team.

Handling various kinds of activities like development of complex code, enhancements, testing, problem resolution etc.

Code reviews to ensure code developed meets the requirements.

Provide KT sessions, prepare documentation, and mentor junior resources.

Client: John Lewis Partnership, London, UK

Role: Onsite ETL Tech Lead and Developer

Environment: Abinitio (GDE 3.1.2.2, Co-Operating System 3.1.2), EME, Metadata Hub, UNIX shell Scripting, Teradata 11, Mainframe, Unix AIX 7.1, HP ALM QC, Abinitio Operation Console, SQL server 2008, Service Now, Toad for data analysis, Conduct IT, BRE

The John Lewis IT operations team provides live support to the JL branches, distribution centers, and other divisional units, and is responsible for investigating problems, resolving incidents, and implementing and rolling out systems. The JLP IT Ab Initio work involves applications designed with key features such as modular design, operational stability, and scalable, parallel processing, including file transfer to/from target systems. Applications are developed using generic modules, with effective use of Ab Initio plans for arranging sequences of tasks/graphs/psets, applying transformations, batching the service to restart from the point of failure, adding archive and purge processes, and rejecting any inbound data record that fails validation/transformation rules with an appropriate error message while processing of the remaining records continues up to a threshold defined by a reject percentage. The work involves all end-to-end activities, from requirement gathering and analysis using Metadata Hub and Operational Console, to development of Ab Initio processes, enhancement of existing applications, unit testing, change requests, fixing live job abends, resolving incidents by interacting with various teams, implementing permanent solutions to recurring problems, and monitoring live jobs through Hitbox and Service Now, along with admin activities such as providing user access to Operational Console and Metadata Hub.

The POS (Point of Sale) module has a customer requirement to run a sales report at specific times, compare current sales values with the previous year's sales, and report the total profit and profit percentage across various branches in a specified format. This report has to be generated at 4 different times a day, with a summary provided in a spreadsheet. The process involves extracting data from various tables in Teradata and implementing the complex calculations to derive sales and profit. A generic Ab Initio process has been built to give customers the flexibility to generate the comparison report at a specified time, including the sales figure calculations. This automated process completes and generates the report within a short time span and saves manual effort.
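
As a rough sketch of the year-over-year comparison described above (the table and column names such as pos.daily_branch_sales are hypothetical, not the project's actual schema, and one row per branch per day is assumed):

-- Compare today's sales with the same calendar day last year, by branch
SELECT cur.branch_id,
       cur.sales_amt                                  AS current_sales,
       prv.sales_amt                                  AS prior_year_sales,
       cur.sales_amt - cur.cost_amt                   AS total_profit,
       100.0 * (cur.sales_amt - cur.cost_amt)
             / NULLIF(cur.sales_amt, 0)               AS profit_pct
FROM pos.daily_branch_sales cur
LEFT JOIN pos.daily_branch_sales prv
       ON prv.branch_id = cur.branch_id
      AND prv.sales_dt  = ADD_MONTHS(cur.sales_dt, -12)
WHERE cur.sales_dt = CURRENT_DATE;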

Responsibilities:

Involved in requirement gathering meetings, data planning, and client meetings to understand new requirements and existing application problems.

Prepared impact analysis and design documents, proposed solutions, fixed defects, and estimated hours.

Working proactively and independently to articulate issues/challenges to address project delivery risks.

Worked in an onshore-offshore model, coordinating the team from onsite.

Interact with release management, configuration management and quality assurance, database modeling support, other upstream and downstream development teams and operation support team.

Handling various kinds of activities like development of complex code, enhancements, testing, problem resolution etc.

Code reviews to ensure code developed meets the requirements.

Provide KT sessions, prepare documentation, and mentor junior resources.

Providing support to developed applications, resolving problems and closing tickets.

Generating reports for weekly status meetings with clients.

Client: Keybank, Cleveland, OH, USA

Role: Offshore ETL Tech Lead and Developer

Environment: Abinitio 3.0.5, 3.1.5, SQL server, Teradata, Unix AIX, Mainframe, Autosys

The KeyBank Data Factory project's objective is to accelerate revenue growth; improve cross-sell and deepen customer relationships; quickly spot marketing opportunities through stronger decisions; reduce overall cost by using reusable graphs, components, and scripts; improve data quality and consistency; and increase overall business productivity. It consists of various categories of projects, such as sourcing, consumption, and enhancement projects.

EDP (Enterprise Data Program) is one of the sourcing modules under which data is extracted from heterogeneous sources. It consists of various source applications such as MDH (Mortgage Data Hub), LCP (Line of Credit), CRI (Credit Risk Information), KMS (Key Merchant Services), and FXG (Foreign Exchange). The main stages of the project are data profiling, sourcing, harmonization, consumption, creation of load-ready files in the staging area, data loading into staging tables and then the target table, and FTPing target files to a remote server. Data profiling consists of collecting statistics about source data, which helps business analysts with requirement gathering. Data sourcing includes data extraction, application of DQ checks and harmonization rules, and creation of the final harmonized output file. The harmonization process includes loading reference data, unloading the enterprise-level reference values, and checking the source data against them through lookup functionality. If any DQ or harmonization check fails, the corresponding record is flagged, and the erroneous data with its required statistics is collected and loaded into EDQ tables for the Cognos team to use in generating reports. Harmonized files act as input for consumption applications, which are further processed to create load-ready files in the staging layer by applying transformations. The load-ready file is loaded into the staging tables as well as the target table.
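
For illustration only, the lookup-based harmonization check with error capture described above can be sketched in SQL as follows; the tables src_feed, ref_country, and edq_error are hypothetical, not the project's actual schema.

-- Flag source records whose country code has no enterprise-level reference value
INSERT INTO edq_error (src_system, src_key, error_cd, error_desc, load_ts)
SELECT s.src_system,
       s.src_key,
       'HARM001',
       'Country code ' || s.country_cd || ' not found in reference data',
       CURRENT_TIMESTAMP
FROM src_feed s
LEFT JOIN ref_country r
       ON r.country_cd = s.country_cd
WHERE r.country_cd IS NULL;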

The TSYS module uses mainframe files, which are daily credit card feeds sent to KeyBank. The data from these files is cleansed, processed according to business needs, and made available in the harmonization layer. Challenges such as processing huge volumes of data for some feeds and creating large conditional DMLs from copybook layouts are overcome during the harmonization, consumption, and loading processes.

The FINDUR application is one of the consumption applications. Earlier, the process required manual execution of a large Teradata SQL query involving around 18 source tables. Under this module, the entire Teradata SQL query has been analysed, split, and implemented in an Ab Initio graph to create a final file, which is then FTPed to a remote server.

The ACAPS application uses sourcing, harmonization, and consumption processes, which include implementation of specific logic per business requirements to identify delta records after applying all the filter and join criteria specified by the business. The challenge in this module was populating a single file from data residing in different mainframe files with large data volumes. Ab Initio graphs are implemented to handle all the scenarios and verify the client-specific unit test cases.

The HRS application stores online HR information in XML files. The XML data has been processed by creating a separate generic graph that FTPs the XML data, processes it per business needs, and creates a final file. Various reusable components for CDC processes, reusable generic graphs, sourcing, loading, and data profiling of applications, as well as generic wrapper scripts for creating job definitions, cleaning up disk usage, etc., are built under this project.

Responsibilities:

Implemented end-to-end Ab Initio applications for one of the individual extracts, a real-time mini-batch processing application, and a process to auto-generate and send a consolidated statistics report for all extracts, using Ab Initio graphs, psets, plans, and Unix shell scripting.

Requirement gathering by interacting with the client/business, creating impact analysis and design documents, proposing solutions, fixing defects, and estimating hours. Working proactively and independently to articulate issues/challenges and address project delivery risks. Worked in an onshore-offshore model and led the offshore team.

Interacting with release management, configuration management and quality assurance, database modeling support, other upstream and downstream development teams and operation support team.

Handling various kinds of activities like development of complex code, enhancements, testing, problem resolution etc.

Providing unit, module, regression, and system integration testing of scripts, graphs, psets, plans, and queries.

Code reviews to ensure code developed meets the requirements.

Provide KT sessions, prepare documentation, and mentor junior resources.

Organization: Capgemini India Pvt Ltd, Pune, India

Period: 13th Sep 2010 – 21st Aug 2012

Address: A-1, Technology Park, M I D C Talwade, Pune, India, 412114

Client: Barclays, USA

Role: Associate Consultant

Environment: Ab Initio (GDE 3.0.2.1, Co-Operating System 3.0.2) and (GDE 1.15, Co-Operating System 2.15), UNIX shell Scripting, Teradata, Control M

The Wealth Program extracts credit and market risk data relating to Barclays Wealth customers from various source systems and loads it into the Wealth Management Risk Data Store (WMRDS). Data identified for Basel II reporting at the group level is extracted from WMRDS.

Barclays Wealth Management is a participating business 'cluster' that contributes risk data to the overall group Basel II program as well as its aggregation and reporting functions. The process extracts core risk information from disparate source systems to contribute to the overall Barclays Basel II reporting. In the wealth_wi phase, the standardized approach is used to create a CFF file. The wealth_wic and wealth_bsm phases use the AIRB (Advanced Internal Ratings-Based) approach to create a CFF file. The wealth_bsm phase consists of modifications per requirements and processing of data on a daily and monthly basis. It includes extraction of data from various landed source systems, processing it per business needs, and creation of Asset and Liability, Customer, and Collateral CFF (Common File Format) files for each landed source system: UKBA (United Kingdom Branch Accounting), GTSA (General Treasury Settlement and Accounting), TM (Trade Manager), and ALS (Advance Lending System). The term Advanced IRB, or A-IRB, is an abbreviation of the advanced internal ratings-based approach and refers to a set of credit risk measurement techniques proposed under the Basel II capital adequacy rules for banking institutions.

Under A-IRB, banks use their own quantitative models to estimate PD (probability of default), EAD (exposure at default), LGD (loss given default), and other parameters required for calculating RWA (risk-weighted assets). The total required capital is then calculated as a fixed percentage of the estimated RWA.
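
For illustration, the standard Basel II relationship (not a project-specific formula) is:

K = f(PD, LGD, M)            -- capital requirement per exposure, from the IRB risk-weight function
RWA = 12.5 * K * EAD         -- risk-weighted assets for that exposure
Total required capital >= 8% of aggregate RWA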

Responsibilities:

Understanding the business requirements, analyzing the FDR, creating impact analysis, technical design documents and source to target mapping documents.

Interacting with client to get the requirement and provide technical solutions.

Lead and mentor the junior resources of team.

Developing SQLs in Teradata for data analysis, verifying output and debugging of issues.

Developing Abinitio graphs to process data coming from source systems by extracting it from Teradata tables, applying business rules to create required final Unix file.

Unit testing, Module testing, Regression Testing and System Integration Testing of the scripts, graphs, psets, plans, queries.

Creation of test cases, test case documents as well as test data.

Code reviews, test case reviews to ensure code developed meets the requirements.

Suggesting the schedule and dependencies for the jobs.

Organization: Bitwise Solutions Pvt Ltd, Pune, India

Period: 3rd April 2007 to 3rd Sept 2010

Address: Bitwise World, Off Int'l Convention Centre, Senapati Bapat Road, Pune, India - 411 016

Client: Discover Financial Services, Chicago, USA

Role: Programmer Analyst

Environment: Ab Initio (GDE 1.13, 1.14, 1.15), Teradata V2R5, IBM AIX 5.3, Secure CRT, IBM Tivoli JS Console

Discover Financial Services (DFS) is one of the largest credit card companies in the US, both as


