
Data Analyst

Location:
Mason, OH
Posted:
April 12, 2021


Ravindra Thammegowda

adll21@r.postjobfree.com

+1-513-***-****

CAREER OBJECTIVE

To apply my talent and contribute significantly to the field of technology through continuous, focused hard work, innovation, and research, in a role that utilizes my skills and provides opportunities for both personal and organizational growth.

SUMMARY

Over 10 years of experience in requirement understanding and analysis; data modeling, including creating conceptual, logical, and physical data models and data dictionary documents; creating ETL design documents such as data mapping documents, HLD, LLD, and TSD; hands-on ETL development; database development with complex SQL/T-SQL queries; and QA, UAT, production support, and deployment activities for ETL and databases.

Proficient in the Erwin tool, IBM InfoSphere DataStage 11.5 and 8.5, SQL/T-SQL, PL/SQL, complex queries, Unix scripts, development, and production support.

Responsible for understanding the business/functional specification documents, understanding every source structure (flat file, complex flat file, CSV, Excel, XML, database), and performing data profiling to meet the requirements.

Understanding every field in the mapping document and in the existing system with the help of subject matter experts and through detailed sessions with business analysts, data architects, and data analysts.

Creating conceptual/logical/physical data models once requirement analysis is complete, generating the data dictionary, and having it reviewed by the architect before sharing it with the team.

Creating the high-level design document once the complete workflow is available for the respective subject area, then working on the low-level design document and reviewing both with architects prior to ETL development.

Leading the team through the design phase, providing complete guidance such as explaining every piece of the data model and the complete workflow (HLD/LLD) diagrams of the ETL processes.

Driving agile stand-up meetings to provide complete details to project managers on behalf of the team, such as updates from data analysts, ETL developers, and the QA team.

Leading the ETL development team, making sure the team follows best practices and guidelines so that jobs load large volumes of data efficiently, and maintaining code quality by reviewing the ETL code for every process once development is completed.

Ensuring the team prepares the Technical Specification Document, unit test case document, and deployment guide for moving jobs from lower to higher environments; reviewing these documents and providing feedback; and ensuring all documents are uploaded to the SharePoint path.

Proactively prioritizing multiple tasks for the team, helping the project manager stay aligned with the timeline, and bringing issues to the project manager's notice.

Working with multiple teams: the primary focus is the core team, whose work is critical to completing the project on time, while the other team works on the merger process (additions to the existing architecture); mentoring both teams and providing guidance and direction to bring everyone onto the same page.

Consolidating the complete status of the project, preparing weekly and monthly status reports for client management, and presenting them along with the project manager.

Reviewing test case execution results, ensuring the required details are available before passing them on to the line-of-business or other teams, presenting the results to the line-of-business team, and helping them enhance their testing.

Reviewing the deployment guide and working with the scheduler team to ensure jobs are scheduled as per the requirements.

Reviewing the standard operating procedure document (run book) for production support and helping the production team maintain the jobs going forward.

Extensively worked on DataStage job design, development, scheduling, execution, testing, and defect fixing.

Involved in creating the logical and physical design of the Data Warehouse (Fact and Dimension tables) using Star Schema and Snowflake schema approaches.

Good experience with agile methodology and with communicating development-effort estimates to project managers to meet project milestones and deliverables.

Proficient in stored procedures, functions, views, and indexes; T-SQL performance tuning and query optimization, such as interpreting execution plan metrics, building indexes where required, and rewriting queries.

Involved in Deployment activities — like migrating database DDL scripts from lower to higher environment or migrating ETL jobs from lower to higher environments

Positive attitude, strong work ethic with good analytical skills.

Flexible and adaptable to new technologies and environments; highly results-oriented and driven for success.

Created reports using Oracle BI Publisher and maintained Oracle BI, WebLogic Server, and Oracle Enterprise Manager middleware (POC project).

Production support, including incident management and resolving tickets per SLA.

Working on the new proposals with business relationship manager

Some tool experience with Informatica PowerCenter 9.0.1 (POC project).

EDUCATIONAL QUALIFICATIONS

Bachelor of Engineering in Computer Science (B.E.) from Visvesvaraya Technological University, Belgaum, Karnataka.

Technical Skills:

Data Modeling Tool : Erwin Data Modeler

ETL Tools : IBM InfoSphere DataStage 11.5/8.5, SQL Server Integration Services (SSIS), Informatica PowerCenter 9.0.1

Reporting Tools : Oracle BI Publisher, SQL Server Reporting Services (SSRS)

Languages : SQL, T-SQL, PL/SQL, Unix shell scripts

Operating Systems : Windows 7/XP/2003, Windows Server 2008 R2, AIX

Databases : SQL Server 2008/2008 R2, Oracle 9i, Oracle 12c, DB2

Query Tools : SQL Server Management Studio (SSMS), Oracle SQL Developer, Advanced Query Tool (AQT)

Trainings Attended:

Reporting Tools : Oracle Business Intelligence Enterprise Edition (OBIEE), IBM Cognos

PROFESSIONAL EXPERIENCE

Worked as an ETL/SQL Developer and Data Modeler at SLK Software Services, Bangalore, from May 10, 2010 to Jun 23, 2017.

Worked as a Data Migration Lead Developer at SLK America Inc., USA, from Jun 24, 2017 to Dec 31, 2018.

Working as Production Support Lead, Data Warehouse Technical Lead, and Data Migration Lead at TATA Consultancy Services Limited, USA, from Jan 2, 2019 to date.

PROFESSIONAL PROFILE

PROJECT DETAILS:

Project #1 : Consumer ETL enhancements and operation

Client : Fifth Third Bank, Cincinnati Ohio USA

Duration : Mar-2020 to date

Role : ETL Technical Lead and production support lead

Environment : Erwin, IBM InfoSphere DataStage 11.5, DB2 queries using Advanced Query Tool

Summary: Consumer Business Intelligence (CBI), Wealth Asset Management (WAM) & Personal Deposit Account (PDA)

Fifth Third Bank has many applications in the consumer space. The main objective is to maximize consumer experience and profitability. The Consumer ETL project houses many ETL projects and enhancements. It covers Wealth Asset Management (Investment Advisor/Brokerage) and consumer/commercial applications: consumer/commercial referrals, balance spend products, OneView consumer services, commercial services, and the personal digital assistant.

Investment Advisor (Wealth Asset Management) has several source systems that send various types of data into the data mart, such as customer, account, revenue and commission, and advisor license information. The data mart is critical to the line of business's day-to-day activities. There are three streams of work: development of new processes, production support, and enhancement of existing features.

The Personal Deposit Account system runs batches to process payments: many files are dropped into a given folder location in batches, and the ETL picks up each file based on its SLA time and processes it. The ETL runs round the clock.

Responsibilities:

Responsible for understanding the business/functional specification documents, understanding every source structure (flat file, complex flat file, CSV, Excel, XML, database), and performing data profiling to meet the requirements.

Understanding every field in the mapping document and in the existing system with the help of subject matter experts and through detailed sessions with business analysts, data architects, and data analysts.

Creating conceptual/logical/physical data models once requirement analysis is complete, generating the data dictionary, and having it reviewed by the architect before sharing it with the team.

Working with SMEs, business analysts, and the LOB to prepare the data mapping document; in other words, playing the data analyst role.

Assigning feasible development work to the team, providing required support, and handling development of complex requirements.

Production support for 11,000+ jobs.

Recovering job aborts and closing tickets with the TWS team.

Performing daily manual activities such as running jobs, setting the Unifi trigger, producing the morning report, and updating abort trackers.

Checking for jobs finishing with warnings and analyzing them for fixes.

Monitoring critical jobs that have an abort history or recent production issues.

Monitoring 55+ FTP transfers and taking action on failed ones.

Running weekly/monthly/quarterly jobs when due and updating stakeholders.

Analyzing jobs with a frequent abort history and recommending permanent fixes.

Preparing System Maintenance Technical Document run books that help the team understand functionality and take action on aborted jobs.

Documenting scheduled outages and working per plan on outage days.

Analyzing jobs for client requests across different subject areas, as per the requirements.

CRQ creation, execution, and support.

Planning and reviewing with leads.

Validation of jobs post implementation

Execution of the jobs as per the plan

Coordination with different teams involved.

Value Additions and Improvements


Project #2 : Anti-Money Laundering (AML)

Client : Fifth Third Bank, Cincinnati Ohio USA

Duration : Jan-2019 to Feb-2020

Role : ETL Technical Lead and Data Migration Lead

Environment : Erwin, IBM InfoSphere DataStage 11.5, DB2 queries using Advanced Query Tool

Summary: Anti-Money Laundering

The Bank Secrecy Act (BSA) requires financial institutions to monitor customer activity to detect certain known or suspected violations of federal law or suspicious transactions related to money laundering or violations of the BSA. The current (legacy) AML system reached end of life in December 2018; this project replaces it in accordance with applicable bank compliance and AML regulations. AML provides an automated solution that is cutting-edge in terms of analytical environments while giving investigators the necessary capabilities to swiftly investigate and disposition cases, along with filing Suspicious Activity Reports (SARs) with the Financial Crimes Enforcement Network (FinCEN).

Identify and implement a platform to monitor transactions and associated alerts across all relevant channels in order to identify potentially suspicious activity related to money laundering

Allow Financial Crimes Compliance to work AML cases in an integrated case management solution

The current (legacy) transaction monitoring system requires a multi-level triage/investigative review of each alerted potentially suspicious activity event. That process is resource-intensive, consists of various manual steps, lacks adequate controls, and is insufficient for mitigating AML risk. There is therefore a need to replace the existing processes with an automated transaction monitoring system that supports a risk-based approach, focusing analytical and investigative resources on the areas of the bank requiring the most scrutiny for financial crimes.

Responsibilities:

Responsible for understanding the business/functional specification documents, understanding every source structure (flat file, complex flat file, CSV, Excel, XML, database), and performing data profiling to meet the requirements.

Understanding every field in the mapping document and in the existing system with the help of subject matter experts and through detailed sessions with business analysts, data architects, and data analysts.

Creating conceptual/logical/physical data models once requirement analysis is complete, generating the data dictionary, and having it reviewed by the architect before sharing it with the team.

Working with SMEs, business analysts, and the LOB to prepare the data mapping document; in other words, playing the data analyst role.

Gained data analyst exposure in this project through the opportunity to work with different teams, such as the data warehouse and eCIF teams.

Creating the high-level design document once the complete workflow is available for the respective subject area, then working on the low-level design document and reviewing both with architects prior to ETL development.

Leading the team through the design phase, providing complete guidance such as explaining every piece of the data model and the complete workflow (HLD/LLD) diagrams of the ETL processes.

Driving agile stand-up meetings to provide complete details to project managers on behalf of the team, such as updates from data analysts, ETL developers, and the QA team.

Leading the ETL development team, making sure the team follows best practices and guidelines so that jobs load large volumes of data efficiently, and maintaining code quality by reviewing the ETL code for every process once development is completed.

Ensuring the team prepares the Technical Specification Document, unit test case document, and deployment guide for moving jobs from lower to higher environments; reviewing these documents and providing feedback; and ensuring all documents are uploaded to the SharePoint path.

Proactively prioritizing multiple tasks for the team, helping the project manager stay aligned with the timeline, and bringing issues to the project manager's notice.

Working with multiple teams: the primary focus is the core team, whose work is critical to completing the project on time, while the other team works on the merger process (additions to the existing architecture); mentoring both teams and providing guidance and direction to bring everyone onto the same page.

Consolidating the complete status of the project, preparing weekly and monthly status reports for client management, and presenting them along with the project manager.

Reviewing test case execution results, ensuring the required details are available before passing them on to the line-of-business or other teams, presenting the results to the line-of-business team, and helping them enhance their testing.

Reviewing the deployment guide and working with the scheduler team to ensure jobs are scheduled as per the requirements.

Reviewing the standard operating procedure document (run book) for production support and helping the production team maintain the jobs going forward.

Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data from different sources (databases, files, etc.) into DB2 v11.1.4.4.

Used various stages such as Complex Flat File, Change Data Capture, Sequential File, Join, Aggregator, Transformer, Filter, Lookup, Sort, Remove Duplicates, Funnel, Column Generator, ODBC, File Set, and Data Set for designing jobs in DataStage.

Source system feed files fed into the AML system include Amtrust, NFS, Time Deposit, GPR, TSYS, AFS, ACBS, UDS, MICR, SWIFT, MTS, ALS, Loan Serv, IDW, and Wall Street. Historical data migration from the legacy system involved pulling 30 million accounts from the data warehouse and migrating 2.9 billion transactions from the legacy to the new system, with millions of transactions arriving daily from these various source systems.

Used DataStage Director to run, monitor, and schedule jobs.

Worked with DataStage Designer to create jobs, parameter sets, and data connection parameters, and to export and import jobs.

Worked with DataStage Administrator to create environment variables.

Involved in creating and executing test cases for migrating 3.7 billion records (about 8 TB including staging data), validating/cleansing the data, and moving the cleansed data into the data mart.

Worked on handling/capturing exception and error records in parallel jobs and on handling exceptions in sequence jobs.

Involved in preparing the deployment/implementation guide for migration and the daily batch load process.

Involved in migrating jobs from Development to UAT/QA to Production servers.


Project #3 : 1) Wealth Management System, 2) ACBS, 3) BPM, 4) Master Card

Client : FHN (First Tennessee Bank), Memphis, TN, USA

Duration : July-2014 to Dec-31-2018

Role : Data Modeler, ETL Developer and SQL/TSQL developer, Erwin

Environment : SQL/T-SQL, Erwin, IBM InfoSphere DataStage 8.5, SSIS 2008, SSRS 2008, SQL Server 2008, SQL Server 2012

Summary: Wealth Management System

The wealth management system is an application which is the system of record for

Sale of wealth management products

Revenue earned on the sale of wealth management products (investments, insurance, trust accounts, etc.) by the frontline bank sales staff

Commission paid on revenue earned

Incentive paid on referrals created

Bonus paid

In order to meet the business needs of this solution's different user groups, streamline existing processes, and support a wide range of needs going forward, there is a need to rewrite this application.

The replacement application, while retaining the existing functionality and interfaces, aims to streamline the menu options and reports and to provide scalability for adding new products, incentive structures, and roles.

Responsibilities:

Responsible for obtaining and understanding the business/functional specifications and preparing the technical specs.

Understanding every field in the existing system with the help of reverse engineering in the Erwin tool, then building the business story from the existing application and LOB sessions.

Identified issues in the existing application during code analysis, explained the business scenarios, and provided solutions to streamline the process.

Created logical/physical data models using the Erwin tool and prepared the data dictionary in the Erwin data model.

Walked the data model through with all audiences: BAs, data architects, application architects, developers, etc.

Understanding the HLD with the help of the business analyst/project owner, then preparing the LLD.

Understanding the source and target systems, then preparing the mapping document.

Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the MS SQL Server 2008 R2 database.

Used various stages such as Join, Aggregator, Transformer, Filter, Lookup, Sort, Remove Duplicates, Funnel, Column Generator, Sequential File, ODBC, File Set, and Data Set for designing jobs in DataStage.

Used DataStage Director to run, monitor, and schedule jobs.

Worked with DataStage Designer to create parameter sets and to export and import jobs.

Worked with DataStage Administrator to create environment variables.

Involved in creating and executing test cases.

Worked on handling exceptions in sequence jobs and capturing exception records in parallel jobs.

Involved in preparing the deployment/implementation guide for migration.

Migrated jobs from lower to higher environments along with database scripts.

Involved in performance tuning of DataStage jobs and in writing SQL queries and stored procedures.

Coordinated and worked on multiple projects with multiple teams during the merger/acquisition period.

Project #4 : Marketable Securities (MarginNet)

Client : FHN (First Tennessee Bank) Bank, US

Duration : OCT-2012 to Jun-2014

Role : ETL Developer, Data Modeling, and development of SPs

Environment : SQL/T-SQL, Informatica PowerCenter 9.0.1, SSRS 2008, SQL Server 2008, Erwin

Summary: MarginNet

At FHN Bank, statements of marketable securities are received from different sources, and the RM has to manually compile the values and update the respective spreadsheets based on monthly statements.

The application/tool aims at tracking and monitoring the performance of loans against collateral comprising marketable securities. The tool automates the process of listing accounts managed by RMs, pulls information from files created by data pools from different source systems, runs a match process, and executes mathematical calculations using fields from both files. It displays the information in an integrated format on a dashboard.

Responsibilities:

Responsible for obtaining and understanding the business/functional specifications and preparing the technical specs.

Understanding every field in the existing system with the help of reverse engineering in the Erwin tool, then building the business story from the existing application and LOB sessions.

Identified issues in the existing application during code analysis, explained the business scenarios, and provided solutions to streamline the process.

Created logical/physical data models using the Erwin tool and prepared the data dictionary in the Erwin data model.

Walked the data model through with all audiences: BAs, data architects, application architects, developers, etc.

Understanding the HLD with the help of the business analyst/project owner, then preparing the LLD.

Understanding the source and target systems, then preparing the mapping document.

Applied a methodology supporting data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica PowerCenter and the Developer tool.

Extensively worked on Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, and Mapplet and Mapping Designer.

Used Informatica Mapping Designer to develop processes for extracting from various sources (flat files, fixed-width files, COBOL files, relational databases, etc.), cleansing, transforming, integrating, and loading data into Oracle and DB2 v11.1.4.4 databases.

Used various transformations such as Source Qualifier, Lookup, Joiner, Aggregator, Expression, Filter, Sorter, Union, Update Strategy, Router, and Sequence Generator, along with source and target definitions.

Project #5 : Systematic Collateral Management System (STOC) and SmartLine

Client : M&T Bank, US

Duration : APR-2011 to Sep-2012

Role : BI Developer, Data Modeling, and development of SPs

Environment : SQL/T-SQL, SSIS 2008, SSRS 2008, SQL Server 2008, Erwin

Summary: STOC

In securities lending and borrowing, collateral comprises assets given as a guarantee by a borrower to secure a securities loan and subject to seizure in the event of default. Collateral Management refers to the handling of all tasks related to the monitoring of collateral posted by a borrower

SmartLine

SmartLine is similar to an overdraft: a loan arrangement under which a bank extends credit up to a maximum amount (the overdraft limit) against which a current (checking) account customer can write checks or make withdrawals. SmartLine allows the individual to continue withdrawing money even if the account has no funds in it; essentially, the bank allows people to borrow a set amount of money. If an overdraft is secured by an asset or property, the lender has the right to foreclose on the collateral if the account holder does not pay, but SmartLine is an unsecured line of credit product.

SmartLine has eligibility rules, so only qualified customers can use the product:

Offers a small amount ($500 maximum); customers cannot access an amount higher than their credit limit.

Credit card holders are not eligible for the SmartLine product.

After using SmartLine, if the customer pays within 10 days, the SmartLine account remains active; otherwise, the account is frozen.

Project #6 : Banking Built for Business (BBFB)

Client : M&T Bank, US

Duration : DEC-2010 to APR-2011

Role : BI Developer

Environment : SQL/T-SQL, SSIS 2008, SSRS 2008, SQL Server 2008

Summary:

A package of services designed to help companies save money and time. BBFB is a business bundle that includes incremental benefits for business owners who enroll, with the ability to receive even greater value by consolidating their business and personal relationships with M&T; as preferred business/personal customers, they automatically qualify for exclusive benefits on both sides of their banking relationship.

The BBFB project comprises various technologies: database design and implementation, application design, real-time data movement using an ESB, and non-real-time data movement using ETL.

Project #7 : Sonic Inventory Management System

Client : Crowe Horwath, US

Duration : Mar -2010 to NOV-2010

Role : SQL, T-SQL Developer & BI Developer

Environment : SSMS 2008 R2, SSIS 2008 R2, SSRS 2008 R2, SQL Server 2008 R2

Summary:

Sonic Automotive's dealerships provide comprehensive services, including sales of new and used cars and light trucks, sales of replacement parts, vehicle maintenance, warranty, paint, and collision repair services, and arrangement of extended warranty contracts, financing, and insurance for the company's customers. The key objective was to design the web application. The SIMS project comprises various technologies, including database design and implementation and application design.

Responsibilities:

Understood the business requirements and interacted with end users to understand the requirements and the purpose of the changes to the data model.

Created the logical and physical models, applied changes to the data model, and prepared the corresponding data dictionary.

Designed and developed SSIS packages, stored procedures, configuration files, and functions, implementing best practices to maintain optimal performance.

Used control flow tasks such as Execute SQL, File System, Send Mail, Data Flow, Execute Process, and Execute Package.

Constructed SSIS packages to pull data from heterogeneous sources.

Incorporated notification, auditing, logging, and package configuration as standard procedure.

Performed data profiling from PS to ES, such as concatenating dates, removing blank spaces, and handling NULLs.

Understood configuration control (check-in/check-out mechanism).

Performed peer reviews and participated in code walkthroughs with the client.

Performed unit tests (including test case development, review, and execution).

Applied relevant software engineering processes (reviews, testing, defect management, configuration management, release management).

Deployed the packages to the test and cert environments and supported testing.

Ravindra


