
Manager Data

Location:
Eden Prairie, MN
Posted:
February 25, 2015


Resume:

Meganathan Malli Viswanathan

**** **** ****** *****, *** # 406,

Edina, MN 55435

Phone: 612-***-****

E-mail: **********.***********@*****.***

Profile

10+ years of work experience in object-oriented design, modeling, programming and testing, data warehousing, relational databases, ETL tools, Java, and J2EE.

Extensive experience in data warehouse design; provided solutions to handle high-volume data.

Experience in project management and team leadership.

Experienced in all phases of the Software Development Life Cycle (SDLC).

Expertise in data warehousing, system integration, performance tuning, batch design, and object-oriented design.

Led teams both technically and functionally.

Experienced in resource forecasting, task assignment, estimation, offshore and onshore coordination, and working with multiple cross-commit teams.

Expertise in UNIX Shell Script programming.

Expertise in relational databases, including Oracle and PL/SQL, covering stored procedures, triggers, functions, indexes, and packages.

Hands-on experience with ETL tools such as Informatica 9.1 and Data Junction.

Hands-on experience with RDBMS concepts; involved in performance tuning of batch processes that handle high-volume data.

Hands-on experience with MS Visio, VPN, PuTTY, WinSCP, FTP, SFTP, text editors (UltraEdit, TextPad, EditPad), Test Director, etc.

Hands-on experience in object-oriented programming using Java and J2EE-related technologies.

Strong knowledge of design, including J2EE design patterns such as Singleton, MVC, DAO, Builder, Business Delegate, Session Facade, and Service Locator.

Good knowledge of Teradata.

Good domain knowledge of the banking and financial industry.

Strong analytical skills with the ability to quickly understand clients' business needs.

Involved in requirement gathering and requirement elicitation discussions.

Ability to accept challenges and a passion to learn.

Research-oriented, motivated, proactive self-starter with strong technical, analytical, and interpersonal skills.

Education

Master of Computer Applications (2000 - 2003), Madurai Kamaraj University, India.

Bachelor of Science in Physics (1997 - 2000), Madurai Kamaraj University, India.

Technical Expertise

Data Warehousing Technologies: Informatica 9.1, SQL Loader, Data Junction

Databases: Oracle, DB2 UDB, SQL Server, MS Access

Programming Languages: Java, J2EE, SQL, UNIX shell scripting, PL/SQL, markup languages

Source Control: VSS, CVS, CMSynergy

Middleware: IBM MQ Series

ETL Tools: Data Junction, Informatica 9.1

Web/App Servers: WebLogic, WebSphere, Tomcat, Oracle 10g Application Server

Tools: SQL Developer, Informatica, Eclipse, SQL Loader, Test Director, Microsoft Visio

Experience

Current Role & Responsibilities: Application Architect

Discuss proposed solutions with portfolio architects, data architects, and other stakeholders.

Help the team make strategic and technical decisions by analyzing existing processes.

Involved at every stage of the software development life cycle while also working as a tech lead.

Participate in design reviews, JAD sessions, and detailed design discussions.

Create the application architecture document and design artifacts.

Conduct design reviews with portfolio architects and senior architects.

Mentor the team to deliver high-quality code by defining standards and checklists.

Coordinate design discussions with the internal team, cross-commit teams, and key stakeholders.

Analyze and resolve issues to ensure high-quality deliverables at each stage of the Software Development Life Cycle (SDLC), within the guidelines and policies of the Ameriprise Quality Management System (AQMS).

Conduct code reviews with architects and the application team.

Subject Matter Expert:

As the Subject Matter Expert (SME) for the Historical Operational Data Store (HODS) application, provided ideas to improve batch process performance and meet the Service Level Agreement (SLA).

Guided the Informatica team both technically and functionally during migration. Trained new resources on the Historical Operational Data Store (HODS) batch application and other processes.

Key solutions provided to improve batch performance:

Pre-scheduled the copy-partition-statistics job, which yielded a significant improvement in the batch process for high-volume files. This idea was implemented during the May 2012 conversion, and the batch process was able to handle high-volume delta files without breaching the Service Level Agreement (SLA).

Provided a solution to handle high-volume update files. This solution was implemented in production, and the system was able to process 60 million records against 150 million existing records in less than 4 hours.

BDIL/HODS Technical Initiatives

Role: Application Architect

Mar 2014 - Present, Ameriprise Financial, Minneapolis, MN, USA

BDIL, the Brokerage Data Integration Layer, receives data from different sources and maintains it. HODS is a data store that maintains the brokerage data and serves it to clearing and other business areas for their daily reporting and reconciliation activities. Neither the BDIL nor the HODS data loads had a restart feature, which led to data synchronization and performance issues whenever a failure occurred during the batch run. Proposed several options to address this issue and worked closely with the application team to introduce restart logic with minimal performance impact.

Proposed the Informatica Metadata Manager tool to create lineage between sources and targets. This was incorporated in the test environment, and the business was impressed with how much the tool eased the analysis process. These lineages are used extensively by the application team for code analysis and downstream impact analysis.

Environment:

Informatica Power Center 9.1 (Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, Power Exchange), Oracle 11g, MS Visio, Windows, UNIX Shell Script, PL/SQL and SQL Loader.

Enterprise Money Movement - E2E reporting

Role: Technical Lead / Solution Architect

Jan 2014 - Oct 2014, Ameriprise Financial, Minneapolis, MN, USA

EMM E2E (Enterprise Money Movement End-to-End) reporting is an initiative from the EMM application to identify fraudulent information or suspicious transactions by verifying client addresses and other details. The application stores client and account information in different data stores, and multiple consumers use the same information in different layouts. To avoid redundancy, the system needed to identify the master data and federate it by keeping it in the appropriate exposure layer.

Informatica Master Data Management principles were applied to identify the master data from multiple data sources with different layouts. Held extensive data mapping and data profiling sessions with data architects, solution architects, and business partners to identify the master data. Master data rules were defined in the Informatica mapping transformations.

Environment:

Informatica Power Center 9.1 (Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, Power Exchange), Oracle 11g, MS Visio, Windows, UNIX Shell Script and PL/SQL.

AITT

Role: Technical Lead & Project Lead

Feb 2013 - Feb 2014, Ameriprise Financial, Minneapolis, MN, USA

AITT is an enterprise-level program at Ameriprise Financial to upgrade and relocate the servers in the St. Louis data center. As part of this program, the ETL Informatica servers, the HODS application server, and the database servers were migrated from the legacy data center to the new data center, and existing functionality had to be migrated without any issues through application remediation.

HODS is an 18 TB data store containing historical and operational data, and migrating 18 TB of data without impacting the business was a challenging task.

Planned carefully with DBAs and business partners to avoid impacting their day-to-day activities.

Planned migration activities such as hard and soft cutovers.

Optimized batch execution.

Performed impact analysis with Data Guard turned on at the database level.

Redesigned the batch to reduce archive log generation.

Validated the ETL jobs, UNIX shell scripts, and database migration.

Performed secured file transfers between servers using SFTP and FTP.
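
As an illustrative sketch only (the host, paths, and user below are hypothetical, not the actual environment), a batch-mode SFTP push of a nightly extract could look like this:

#!/bin/ksh
# Hypothetical sketch: push a nightly extract to a downstream server over
# SFTP using key-based authentication and a batch command file.
# Host, directories, and file names are illustrative.
EXTRACT=/data/hods/outbound/trade_txn_extract.dat
REMOTE_HOST=downstream.example.com
REMOTE_DIR=/inbound/hods
BATCH_FILE=/tmp/sftp_batch.$$

cat > "$BATCH_FILE" <<EOF
cd $REMOTE_DIR
put $EXTRACT
bye
EOF

# -b runs the commands from the batch file non-interactively
sftp -b "$BATCH_FILE" batchuser@"$REMOTE_HOST"
RC=$?
rm -f "$BATCH_FILE"
exit $RC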

Environment:

Informatica Power Center 9.1 (Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, Power Exchange), Oracle 11g, MS Visio, Windows, UNIX Shell Script, PL/SQL and SQL Loader.

HODS (Historical & Operational Data Store) BAU

Role: Project Lead / Architect

May 2012 - Present, Ameriprise Financial, Minneapolis, MN, USA

The project involves development and maintenance of the BETA Data Integration Layer (BDIL) and the Historical and Operational Data Store (HODS). The system keeps track of account and trade transaction details from the BETA system (provided by Thomson Reuters). BDIL publishes this data from BETA to all Ameriprise internal systems, and HODS stores it for compliance and regulatory purposes. The data in HODS is used by the Ameriprise clearing team for client data analysis, data distribution via reports, etc. This is a development and maintenance project involving requirements elaboration, design, build, testing, implementation, and support, following the System Development Life Cycle method.

HODS is a nightly batch application with an SLA of 6 AM CST, and there is a potential risk of breaching the SLA as data volumes increase.

Introduced stored procedures to gather table statistics and copy partition statistics, which improved batch performance.
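
Purely as an illustration (not the production code), a minimal shell sketch of such a stored-procedure call is shown below; the schema, table, and partition names are hypothetical, and DBMS_STATS.COPY_TABLE_STATS is the standard Oracle procedure assumed here:

#!/bin/ksh
# Hypothetical sketch: pre-seed optimizer statistics on the partition the
# nightly load will fill by copying them from the last loaded partition.
# Schema, table, and partition names are illustrative; DB_USER, DB_PASS,
# and DB_ALIAS are assumed to be exported by the batch profile.
SRC_PART=$1   # e.g. the previously loaded partition
DST_PART=$2   # the partition about to receive tonight's high-volume load

sqlplus -s "$DB_USER/$DB_PASS@$DB_ALIAS" <<EOF
BEGIN
  -- copy existing partition statistics so the optimizer has sane estimates
  -- before statistics can be gathered on the freshly loaded partition
  DBMS_STATS.COPY_TABLE_STATS(
      ownname     => 'HODS',
      tabname     => 'TRADE_TXN',
      srcpartname => '$SRC_PART',
      dstpartname => '$DST_PART');
END;
/
EXIT;
EOF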

Pre-scheduled the copy-partition-statistics job, which yielded a significant improvement in the batch process for high-volume files. This idea was implemented during the May 2012 conversion, and the batch process was able to handle high-volume delta files without breaching the Service Level Agreement (SLA).

Provided a solution to handle high-volume update files. This solution was implemented in production, and the system was able to process 60 million records against 150 million existing records in less than 4 hours. All UPDATE and DELETE statements, which are normally time-consuming DML operations, were avoided; one common way to do this is sketched below.
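
Purely for illustration, one common Oracle technique for applying a large delta without UPDATE or DELETE is to rebuild the affected partition image in a staging table and swap it in with EXCHANGE PARTITION; the sketch below is an assumption of how this could look (the tables TRADE_TXN and TRADE_TXN_DELTA and partition P_CURRENT are hypothetical), not the actual production implementation:

#!/bin/ksh
# Hypothetical sketch: apply a high-volume "update" file without UPDATE or
# DELETE statements by rebuilding the partition image and swapping it in.
# Connection variables, table, and partition names are illustrative.
sqlplus -s "$DB_USER/$DB_PASS@$DB_ALIAS" <<EOF
-- Build the merged image: delta rows win, unchanged rows are carried over.
CREATE TABLE trade_txn_stg NOLOGGING AS
SELECT * FROM trade_txn_delta
UNION ALL
SELECT t.*
  FROM trade_txn PARTITION (p_current) t
 WHERE NOT EXISTS (SELECT 1
                     FROM trade_txn_delta d
                    WHERE d.txn_id = t.txn_id);

-- Swap the rebuilt image in place of the old partition (fast DDL, no DML).
ALTER TABLE trade_txn
  EXCHANGE PARTITION p_current WITH TABLE trade_txn_stg
  WITHOUT VALIDATION;

DROP TABLE trade_txn_stg;
EXIT;
EOF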

Extensively involved in performance tuning of the ETL process by identifying bottlenecks at various points such as targets, sources, mappings, sessions, and systems, which led to better session performance.

Changed the mapping properties to run with dynamic partitioning, which internally distributes the load across multiple nodes.

Enabled bulk mode on the database target mapping, which uses a direct-path load instead of a conventional load.

Held discussions with data architects to decide on the data model, data quality, data integration, and exposure patterns.

Developed PL/SQL and UNIX shell scripts for scheduling Informatica sessions.
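
As a hedged illustration of such a wrapper script (the domain, integration service, folder, and workflow names are made up; pmcmd is the standard PowerCenter command-line client):

#!/bin/ksh
# Hypothetical wrapper a scheduler could call to start an Informatica
# workflow and propagate its exit status. All names are illustrative;
# INFA_PASS is assumed to be exported securely by the environment.
INFA_DOMAIN=Domain_ETL
INFA_SERVICE=IS_ETL_PROD
INFA_USER=infa_batch
FOLDER=HODS
WORKFLOW=wf_hods_nightly_load

# -wait blocks until the workflow completes so the return code is meaningful
pmcmd startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PASS" \
      -f "$FOLDER" -wait "$WORKFLOW"
RC=$?

if [ $RC -ne 0 ]; then
    echo "Workflow $WORKFLOW failed with return code $RC" >&2
fi
exit $RC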

Experienced in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.

Worked with sessions and batches using Server Manager to load data into

the target database.

Responsible for migrating mappings across development, test, and production environments using the Repository Manager.

Tuned the Informatica repository and mappings for optimum performance. Effectively played a multi-faceted role, facing all challenges and managing work on similar projects as well.

Used Informatica Data Quality for data cleansing and data profiling.

Secured file transfer using sftp and ftp between different servers.

Environment:

Informatica Power Center 9.1 (Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, Power Exchange), Oracle 11g, MS Visio, Windows, UNIX Shell Script, PL/SQL and SQL Loader.

FINOPS (Financial Operations) Reporting

Role: Technical Lead

Jan 2012 - May 2012, Ameriprise Financial, Minneapolis, MN, USA

FINRA (the Financial Industry Regulatory Authority) ensures that all financial organizations maintain the most current stock exchange details and audits the reports from different financial organizations every year. This is a regulatory project to generate FINOPS reports from multiple sets of NYSE (New York Stock Exchange) data.

Extensively involved in performance tuning of the ETL process by identifying bottlenecks at various points such as targets, sources, mappings, sessions, and systems, which led to better session performance.

Changed the mapping properties to run with dynamic partitioning, which internally distributes the load across multiple nodes.

Enabled bulk mode on the database target mapping, which uses a direct-path load instead of a conventional load.

Developed PL/SQL and UNIX shell scripts for scheduling Informatica sessions.

Experienced in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.

Worked with sessions and batches using Server Manager to load data into the target database.

Responsible for migrating mappings across development, test, and production environments using the Repository Manager.

Tuned the Informatica repository and mappings for optimum performance. Effectively played a multi-faceted role, facing all challenges and managing work on similar projects as well.

Used text editors (EditPad, UltraEdit) to analyze the data.

Used Informatica data masking transformations to mask some of the confidential data.

Used Informatica Data Quality for data cleansing and data profiling.

Secured file transfer using sftp and ftp between different servers.

Environment: Informatica Power Center 9.1 (Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, Power Exchange), Oracle 11g, MS Visio, Windows, UNIX Shell Script, PL/SQL and SQL Loader.

CDS (Clearing Data Store)

Role: Technical Lead

Aug 2010 - Dec 2011, Ameriprise Financial, Minneapolis, MN, USA

Clearing Data Store is an extension of the HODS application, migrating toward Informatica technology. The inbound process loads the files using Informatica, and outbound APIs were developed to send updates to the Thomson Reuters BL server using Informatica web services. SSL (Secure Sockets Layer) certificates were set up for the web services using Informatica 9.1, and the new Informatica Metadata Manager tool was introduced and used as part of this project. UNIX scripts are used for all the validations, and TWS is used for scheduling.

Responsibilities:

. Participated in the design team and user requirement gathering meetings.

. Performed business analysis and requirement gathering with end users and managers.

. Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, transformations, and reusable transformations.

. Created different source definitions to extract data from flat files and relational tables for Informatica Power Center.

. Used a star schema approach to design the database for the data warehouse.

. Developed a standard ETL framework to enable reuse of similar logic across the board.

. Created different target definitions using the Warehouse Designer in Informatica Power Center.

. Used the Mapping Designer to generate different mappings for different loads.

. Created different transformations such as Joiner, Lookup, Rank, Expression, Aggregator, and Sequence Generator transformations.

. Created stored procedures, packages, triggers, tables, views,

synonyms, and test data in Oracle.

. Extracted source data from Oracle, flat files, and XML files using Informatica, and loaded it into the target database.

. Created medium-to-complex PL/SQL stored procedures for integration with Informatica on Oracle 10g.

. Developed complex mappings involving SCD Type-I, Type-II, and Type-III logic in Informatica to load data from various sources (an illustrative SCD Type-II sketch follows this list).

. Used text editors (EditPad, UltraEdit) to analyze the data.

. Involved in extensive performance tuning by identifying bottlenecks in sources, mappings, and sessions.

. Created models based on the dimensions, levels, and measures required for the analysis.

. Validated the data in the warehouse and data marts after the load process by balancing against the source data.

. Worked closely with the business analyst team to resolve problem tickets and service requests, and helped the 24/7 production support team.

. Good experience as an onsite tech lead coordinating offshore teams.
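
The SCD Type-II logic itself lived in Informatica mappings; purely as an illustration of the equivalent SQL (table and column names are hypothetical, not the actual schema), a shell-wrapped sketch:

#!/bin/ksh
# Hypothetical sketch of SCD Type-II processing in SQL: expire the current
# dimension row when a tracked attribute changes, then insert the new
# version. Table and column names are illustrative; DB_USER, DB_PASS, and
# DB_ALIAS are assumed to be exported by the batch profile.
sqlplus -s "$DB_USER/$DB_PASS@$DB_ALIAS" <<EOF
-- Close out current rows whose tracked attribute changed in the staging feed.
UPDATE dim_account d
   SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_account s
                WHERE s.account_id = d.account_id
                  AND s.account_status <> d.account_status);

-- Insert the new version for changed rows and for brand-new accounts.
INSERT INTO dim_account
       (account_id, account_status, eff_start_dt, eff_end_dt, current_flag)
SELECT s.account_id, s.account_status, TRUNC(SYSDATE), NULL, 'Y'
  FROM stg_account s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_account d
                    WHERE d.account_id     = s.account_id
                      AND d.current_flag   = 'Y'
                      AND d.account_status = s.account_status);

COMMIT;
EXIT;
EOF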

Environment:

Informatica Power Center 9.1 (Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, Power Exchange), Oracle 10g, MS Visio, Windows, UNIX Shell Script, PL/SQL and TWS Scheduler.

HODS (Historical & Operational Data Store)

Role: Technical Lead

Feb 2009 - Dec 2010, Ameriprise Financial, Bangalore, India

The project involves development of the BETA Data Integration Layer (BDIL) and the Historical and Operational Data Store (HODS). The system keeps track of account and trade transaction details from the BETA system (provided by Thomson Reuters). BDIL publishes this data from BETA to all Ameriprise internal systems, and HODS stores it for compliance and regulatory purposes. The data in HODS is used by the Ameriprise clearing team for client data analysis, data distribution via reports, etc.

Environment:

Oracle 10g, Informatica 8.6, MS Visio, Windows, UNIX Shell Script, PL/SQL, SQL Loader and Control-M Scheduler.

SRA - Short Response Allocation

Role: Programmer Analyst

Jun 2008 - Jan 2009, Goldman Sachs Group, Bangalore, India

The Goldman Sachs Group, Inc. is a bank holding company and a leading global investment banking, securities, and investment management firm. GASS is an existing system containing many modules for short position allocation, entitled position calculation, client communication, etc. The asset servicing technology group decided to rewrite the existing code with the latest technologies to improve performance and scalability and to handle high volumes during peak periods. Short response allocation is the first module of this re-engineering effort. It contains UIs for entitled positions, instruction management/SRA, manual position adjustment, manual allocation, audit history, exceptions, etc. It is built on top of the Ocean desktop application; Ocean is an in-house application developed by Goldman Sachs.

PENTA (Mortgage Lending and Fund Management)

Role: Module Lead

May 2007 - May 2008, LaTrobe Financial, Chennai, India

LaTrobe Financial is one of the leading financial institutions in Australia and a leader in mortgages. LaTrobe planned to move its application online for ease of maintenance. The system is an intranet application that automates the life cycle of a mortgage to make the business easier. It consists of around 14 modules; I worked on the Financial Control, Customer Services, Securities, and EOD (End of Day) batch process modules.

Environment:

Windows XP, Eclipse, Oracle 10g Application Server, Oracle 9i, Stateless Session Beans.

Faster Payment Services

Role: Software Engineer & Module Lead

Oct 2006 - Mar 2007, Halifax Bank of Scotland (HBOS), Edinburgh, Scotland

Under the new UK banking policy, every bank in the United Kingdom has to provide the Faster Payments Service (FPS) to its clients. FPS is a faster payment service in which a payment has to be cleared within 15 seconds. HCL was engaged to provide a couple of services that supply important information about the account and the beneficiary bank: the Sort Code Lookup Service and the Account Location Service. The Sort Code Lookup Service confirms whether a beneficiary bank can accept a Faster Payment or not; its other responsibility is to refresh the sort code data store, a bulk refresh scheduled once a week. The Account Location Service gives information on a particular account for incoming payments and provides both an online update service and a batch update service.

Environment:

IBM WebSphere RAD, Oracle 10g, WebSphere Application Server 6.0, MQ Series, MDB.

TEPP R1 Investigation

Role: Software Engineer & Onsite Coordinator

Apr 2006 - Sep 2006, Halifax Bank of Scotland (HBOS), Edinburgh, Scotland

Payments are among the most important daily activities in a bank, and payment investigation is another important activity carried out when a payment goes wrong. HBOS uses various systems for payments; when a payment is not realized the customer makes a call, and from that point the investigation process starts. Previously, the entire investigation process was handled manually and was therefore time consuming. HBOS decided to automate the process by integrating all the systems with the Smart Investigate application (a Pega product). Each system needs an interface that communicates with Pega SI.

NEXTGEN Maintenance

Role: Software Engineer

Apr 2005 - Mar 2006, Deutsche Bank Singapore, Bangalore, India

dbDirect Internet is a web-based application of Deutsche Bank for its customers, with many modules each catering to a specific need. The DBDI application is based on the NextGen architecture. NextGen Maintenance covers all the modules of the DBDI application, including Asian Payments, FI Payments, CPIEC, Direct Debits, Free Format Instructions, EAI, Loan Deposits, F3I, Netting, File Transfer, System Features, FX, etc. NG Maintenance is an enhancement and maintenance project covering all of these modules.

Environment:

Java, JSP, Servlets, JUnit, jWebUnit, Oracle, Data Junction.

F3I (Fixed File Format Import)

Role: Software Engineer

Sep 2004 - Mar 2005, Deutsche Bank Singapore, Bangalore, India

File upload is one of the modules of dbDirect, the web-based application of Deutsche Bank for its customers. The customer uploads a payment file through the application, and the file is converted into an intermediate UDF. This conversion is performed by a DJS file developed in the Data Junction tool: the Java code calls the corresponding DJS file during the upload and converts the payment file into the UDF if the file is in the correct format.

Environment:

Java, JSP, Servlets, JUnit, jWebUnit, Oracle, Data Junction.

CPIEC (Corporate Payments Import Export Consolidation)

Feb 2004 - Aug 2004, Deutsche Bank Singapore, Bangalore, India

dbDirect Internet is a web-based application of Deutsche Bank that provides multiple features to its clients, each catering to a specific need. 'Payments' is an important module of the dbDirect Internet application; it caters to domestic and international payments and is further divided into single and bulk payments. Much functionality was added, and the code is now sleeker and more maintainable than the previous version, with extensive use of patterns such as Value Object and DAO. CPIEC won Best Project for the year 2004. Extensively worked on the 'Extended Payment Details' module.

Environment:

Java, JSP, Servlets, JUnit, jWebUnit, Oracle.

Recognitions:

Appreciation from the Technical Director and Vice President of the Ameriprise brokerage team for the solution to process high-volume files.

Received the Sleuth Award for detailed analysis of an issue and providing the solution.


