MANOJ
**********@*****.***
• Over * years of IT experience in RDBMS and data warehousing, spanning modeling, analysis, development, testing, and support of enterprise data warehouses, with focused experience as a Data Analyst in Brokerage, Investment Management, and Financial Services.
• Good understanding of business processes in Brokerage, Banking, Investment Management and
Financial industries.
• Strong knowledge of ACATS (Automated Customer Account Transfer Service), portfolio management, securities, and Alternative Asset Management (Private Equity, Real Estate), used to support strategic planning and goal setting.
• Experienced in requirements analysis, data design and development, test preparation, test execution, defect management, and management reporting.
• Experience developing conceptual, logical and physical data models as per enterprise standards.
• Experience designing data models using star and snowflake schema modeling.
• Strong understanding of dimensional and relational modeling techniques; well versed in normalization/denormalization techniques for optimum performance in relational and dimensional database environments.
• Strong background in developing data mappings, performance tuning, and identifying bottlenecks in sources, mappings, targets, and sessions, with a solid understanding of data warehouse modeling techniques such as star and snowflake schemas.
• Experience working with BI reporting teams that use tools such as MicroStrategy, Business Objects, and SSRS, as well as developing ETL mappings and scripts.
• Strong understanding of data quality assurance processes and procedures.
• Experience with software development life cycle (SDLC) methodologies such as Waterfall, Agile, and Rational Unified Process (RUP). Very good T-SQL and PL/SQL programming skills, including query optimization and performance tuning.
• Used SQL Profiler, Execution Plans, Performance Monitor, and DBCC commands. Created complex DTS packages for data migration and transformation.
• Experienced in data migration from Oracle/Access/Excel Sheets to SQL Server 2005/2000/7.0.
• Experience in MS SQL Server upgrades from 7.0 to MS SQL Server 2000 and from 2000 to MS SQL Server 2008.
• Very strong background in a disciplined software development life cycle (SDLC) process, with excellent analytical, programming, and problem-solving skills.
TECHNOLOGIES:
LANGUAGES: Visual C#, Visual Basic, C, C++
TOOLS: ERwin, MS Visio, Informatica v9.1.0, Rational Suite, Microsoft .NET, RequisitePro, MicroStrategy, ClearQuest, JavaScript, ClearCase, HTML, Rational Rose, CSS, Axure Pro, XML/XSL/XSLT, Blueprint, Microsoft IIS 7.0/6.0/5.0, CaliberRM, Microsoft Server 2003, Business Objects XI, Visual Studio 2008/2005, Cognos BI Studio, SharePoint 2007, MS Office Suite 2007, iRise, Telelogic DOORS, MS Project, Documentum 6.x, HP Quality Center v9.0
DATABASES: MS SQL Server 2008/2005/2000, MS Access, DB2 for z/OS v8, Oracle 8/10g/11i, MySQL
OPERATING SYSTEMS: Windows 98/NT/XP/2007, Linux, UNIX
EXPERIENCE:
Implementation Partner - Advantage Technical Resourcing
End Client - Wells Fargo Advisors, Saint Louis, MO.
Duration: May 2013 – Present
Sr. Data Analyst
Responsible for data analysis work within the Data Management team at Wells Fargo Advisors. The Data Management team manages the Brokerage Data Warehouse (BDW) and Operational Data Store (ODS), along with several other applications that support internal and external applications built to serve end users. Several projects were initiated to meet the growing needs of end users and to serve clients better.
Advisory Best Practices: Securities Backed Lending – Phase II, Wave III
Service Request Workflow entity changes and Brokerage Account Current Dimension changes. As part of the Securities Backed Lending initiative, securities-backed lending needs to be reported via Actuate Reporting, which requires additional attributes in the Brokerage Account Current Dimension and in the Service Request Workflow entities.
Drafted requirements for entity-related changes on the Brokerage Data Warehouse.
Participated in JAD sessions with the application team to understand the business requirement to report Securities Backed Lending related asset amounts.
Conducted impact analysis using an existing in-house UNIX tool and AutoSys functions, and queried the database system catalog to understand and draft data requirements.
Performed data profiling to understand the usage of account product class code by account category, including counts (a profiling sketch follows this list).
Conducted review sessions for Functional Data Requirements.
Presented the Functional Data Requirements during a formal review session with the review committee and had the Functional Data Requirements document approved.
Participated in physical database design, including star schema and data mart design.
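A minimal sketch of the kind of profiling query this involved; the table and column names below are illustrative placeholders, not the actual BDW objects:

    -- Hypothetical names, for illustration only
    SELECT acct_category_cd,
           acct_product_class_cd,
           COUNT(*) AS account_count
    FROM   brokerage_account_current_dim
    GROUP  BY acct_category_cd, acct_product_class_cd
    ORDER  BY account_count DESC;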
Implementation Partner - Merritt Technical Associates
End Client - T. Rowe Price Corp, Baltimore, MD. Oct 2012 – Apr 2013
Data Analyst/Modeler
Responsible, within Global Business Solutions – IT at T. Rowe Price Corp, for the AUM (Assets Under Management) Data Warehouse and BI Reporting, along with upstream systems on the ODS such as IDM-ODS (Investment Data Management – Operational Data Store), PRD (Product Reference Database), and RPS (Retirement Planning Services).
AUM (Assets under Management) – Support for Monthly/Quarterly and Year End Reporting:
Responsibilities:
Understand the defects flagged through HP Quality Center v9.0 by business personnel and perform initial analysis.
Summarize and present a solution that delivers a short-term fix for the Assets under Management Reporting Team (ART), the Assets under Management Project Team (APT), and Corporate Finance.
After delivering the short-term solution, deliver a long-term fix: automated ETL load procedure changes or new processes, Business Objects changes or new reports, or a file deliverable built via ETL processes and delivered through FTP servers for auditing assets in monthly, quarterly, and year-end reporting.
Responsible for logical and physical modeling per business requirements using ERwin.
Created new entities and modified existing entities in the data model for short-term and/or long-term fixes for defects flagged through HP Quality Center v9.0 by the Assets under Management Reporting Team.
Clean up reference data and/or source data using DML statements in order to fix the defect.
Develop load procedures in DbVisualizer as SQL overrides in Informatica's Source Qualifier, pulling from multiple sources such as entities in different schemas (typically the ODS), source files provided by internal systems such as HiNet, and files generated from the Product Reference database (see the sketch after this list).
Develop source-to-target mapping documents for ETL development, providing all the necessary transformations, aggregations, and look-up operations.
Develop test strategy and test cases for the changes being made for the fix, validate data in QUAL region
after the unit testing is complete in DEV region.
Work with business users and provide insights on any questions or concerns that the reporting team might have while performing UAT.
Validate data in PROD region post implementation.
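A minimal sketch of a Source Qualifier SQL override of this kind, assuming hypothetical schema and table names (ods.position, prd.product_reference) and a bind variable for the reporting date:

    -- Hypothetical names; actual entities, keys, and transformations differ
    SELECT p.portfolio_id,
           p.as_of_dt,
           r.product_cd,
           SUM(p.market_value) AS total_market_value
    FROM   ods.position p
           JOIN prd.product_reference r
             ON r.portfolio_id = p.portfolio_id
    WHERE  p.as_of_dt = :report_month_end
    GROUP  BY p.portfolio_id, p.as_of_dt, r.product_cd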
Product Reference – Sub-Strategy Changes
The Product Reference database contains strategies, sub-strategies, share classes, external identifiers such as CUSIPs, and other investment vehicle portfolio details. This initiative was to update several sub-strategy and strategy names and to establish relationships between existing or new strategies and newly established sub-strategies.
Responsibilities:
Initial analysis to understand the existing relationships and profile the data to summarize the number of underlying accounts/portfolios that fall under a given strategy/sub-strategy.
Summarized the analysis performed and raised questions with the Assets under Management Reporting Team in order to determine the requirements.
Developed SQL scripts to implement the changes in lower environments, and unit- and regression-tested the data in those environments (an illustrative change script follows this list).
Regression-tested data in the AUM Data Warehouse, whose source of investment vehicle data is the Product Reference DB.
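A minimal sketch of the type of change script involved, with placeholder table and column names and bind variables standing in for the actual values:

    -- Rename a sub-strategy (placeholder names and bind variables)
    UPDATE sub_strategy
    SET    sub_strategy_nm = :new_sub_strategy_nm
    WHERE  sub_strategy_id = :sub_strategy_id;

    -- Relate an existing or new strategy to a newly established sub-strategy
    INSERT INTO strategy_sub_strategy_rel (strategy_id, sub_strategy_id, effective_dt)
    VALUES (:strategy_id, :sub_strategy_id, CURRENT_DATE);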
AUM – RPS OMNI Data (Retirement Planning Services – Large Plans) Integration.
A single source system is currently unavailable to report all of the Assets under Management under Retirement Planning Services to senior management, and reports are currently generated with manual intervention. Part of the RPS-related data that helps report Assets Under Management is already available in the AUM Data Warehouse via standardized feeds from several external data providers, which were integrated into the warehouse in 2011. I was part of this initiative from the design phase, which involved integrating the RPS asset-related data not yet available in the AUM Data Warehouse.
Responsibilities:
Participated in JAD sessions with senior solution analysts and business users to understand the requirements, the existing small retirement plans data, and the source data structures.
Prepared data design specifications for business party information – plan and client related data.
Prepared ETL technical specifications for plan, client, positions, and business party relationships.
Prepared test data and test strategy for transactions, positions, business parties, and business party relationships.
Implementation Partner - Advantage Technical Resourcing
End Client - WELLS FARGO ADVISORS, ST. LOUIS, MO.
Jul. 2011 – Sep. 2012
Data Analyst
Responsible for data analysis work within the Data Management team at Wells Fargo Advisors, which manages the Brokerage Data Warehouse (BDW) and Operational Data Store (ODS), along with several other applications that support internal and external applications built to serve end users. Several projects were initiated to meet the growing needs of end users and to serve end clients better.
ISI Release 4 – ACATS: ISI is a merger initiative designed to support business planning and decision making through the delivery of information. ACATS (Automated Customer Account Transfer Service) is a system that tracks the transfer of accounts from one firm to another, and a project was initiated to provide a solution that tracks the transfer of these accounts along with the assets associated with each account. The solution was to build new database tables on top of which a reporting application (via MicroStrategy) serves reporting needs to end users with appropriate data.
Responsibilities:
Worked on data profiling to understand the source data available in existing tables using TOAD for Oracle and DB2.
Validated the existing data in the Brokerage Data Warehouse to ensure that it would meet the requirements of end users.
Met with business users to understand some of the drawbacks in the existing source data in order to provide a better solution; these items were tracked using Microsoft SharePoint services.
Worked in JAD sessions with data modelers to develop a proper design in ERwin to track and store data per the requirements, which involved data analysis to provide manual reports developed using SQL scripts.
Drafted test strategy to validate the data loaded into tables per requirements for both historical and
ongoing data loads.
Developed Test scripts and executed the same to validate the data during system integration testing.
Performed root cause analysis on invalid data that was loaded into several tables during system integration testing.
Reported defects with analysis details so that code changes could be made in time to install the product on schedule.
Provided user acceptance support at end users' request to help them understand the design in detail and perform validations.
Supported the system post implementation and provided users with manual reports from the database that could be compared with reports generated from MicroStrategy.
Responsible for analysis to resolve post-implementation issues with source data, and developed SQL scripts to remove bad data that had been loaded into the database and did not meet the requirements.
Worked on performance enhancement of SQL scripts for validation of data during and after the implementation of ACATS and Margin, and planned additional data validations.
Completed Master Validation List and Validation tracker for implementation.
Validated data loaded into dimension tables and corresponding fact tables post implementation and triaged defects (see the validation sketch after this list).
Helped fix bugs detected in the code during the warranty period through quicker analysis and faster resolution, since the analysis had been completed during the validation phase of implementation; this resulted in proper reporting of metrics to report end users.
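A minimal sketch of one such validation, checking for fact rows whose dimension key is missing after a load; table and column names are placeholders, not the actual BDW objects:

    -- Orphaned fact rows indicate an incomplete or out-of-order dimension load
    SELECT COUNT(*) AS orphan_fact_rows
    FROM   account_transfer_fact f
           LEFT JOIN brokerage_account_dim d
             ON d.account_key = f.account_key
    WHERE  d.account_key IS NULL;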
FiNet Gaps – Human Resources: Following the merger of Wachovia Securities into Wells Fargo Advisors, several applications were migrated; one of the applications to be decommissioned is an HR application that tracks contracting-employee details pertaining to accounts. Involved in this project from the design phase.
Responsibilities:
Conducted impact analysis on the existing processes using UNIX, databases on both Oracle and DB2, and manual research on existing documentation such as source system analysis documents, which detail the processes in place.
Drafted a summary of the impact analysis on AutoSys and broadcast it via Microsoft Office Exchange to a large user group to communicate the impact of the changes that would be made as part of the solution.
Worked with team members to build a solid test strategy, since there was a known impact to existing processes already in place.
Worked in JAD sessions with data modelers and the test lead to help understand the requirements, and to design and test a system that meets the requirements without breaking existing processes.
ELID Remediation – Human Resources: ELIDs (Employee IDs) are assigned to all employees (full-time employees and contractors) at Wells Fargo. Due to the merger, a few contractors and employees from legacy Wachovia were not assigned one and use a Wachovia ID to access systems at Wells Fargo. This project was initiated to provide a single Employee ID to all employees, with which they can access applications and systems.
Responsibilities:
Profiled employee IDs that are not part of the Wells Fargo employee ID set, for modeling and design purposes, with the intent of not losing any historical information an employee has had.
Used TOAD for Oracle and DB2 to perform the required analysis and profiling on several tables that host historical employee information.
Used the Impact Grepper tool on the UNIX platform to produce a list of all impacted downstream applications; the impact of the changes was communicated to the downstream interface partners/applications.
Drafted Functional Data Requirements for the project to establish a system that helps update employee information across multiple platforms and streamlines accessibility.
Participated in JAD sessions with data modelers and tech leads to come up with a solution that helps build the system in a robust fashion, using tools such as RDL (an MS Access application that helps identify impacted downstream partners).
Drafted end-to-end data lineage for the changes and their respective impacts on downstream partners, and presented it to several interface partners to address the impact of the changes.
Drafted the master test strategy for system integration testing using HP ALM Quality Center.
Responsible for post-implementation validations and support during user acceptance testing.
Asta CRS Inc. Mar 2010 – Jun. 2011
Implementation Partner - Tata Consultancy Services
End Client - RUSSELL INVESTMENTS, SEATTLE, WA.
Data Analyst/Modeler
Russell Investments was implementing an enterprise-wide solution for Enterprise Data Management (EDM), which included modules for onboarding new data feeds to meet end-user data needs, a Fund Reference Data Mart, Index Constituents, construction of an EDW (Enterprise Data Warehouse), and Metadata Management. Russell initiated the DQM (Data Quality Management) program to feed quality data into the EDW and the other new feeds in the process of onboarding. Engaged in both the Data Quality Management and Metadata Management programs.
Responsibilities:
Involved from the beginning of the project; published the charter for EDM (Enterprise Data Management) and the Statement of Work (SOW) for DQM and MDM.
Worked with data from the MCH and FDR systems sent by the custodians (namely Pricing, FXA, and Positions).
Assisted the Project Manager with scope estimation and the work breakdown structure (WBS) to meet project milestones. Performed gap analysis on ABOR data sets to streamline the process of data acquisition, consistent transformation, and storage for implementing an enterprise SOA.
Instrumental in developing a scorecard for the DQM program to report the current state of quality versus the future state.
Profiled the data in order to understand the current state and statistics of the numerous attributes that Russell consumes on a daily basis.
The current state of data quality was analyzed manually, using MS Access to store data and generate reports.
Responsible for separating the data quality rules into tiers to ensure that rules were prioritized during implementation.
Worked closely with SMEs to ensure that all requirements were captured up front, documenting them to form a technical requirements package. Performed source-to-target mapping to identify the source data that needed to be moved into the target tables.
Conducted workflow, process diagram and gap analyses to derive requirements for existing system
enhancements.
Designed and created a Data Quality baseline flow diagram, which includes error-handling data flow through various internal and external data sources before reporting.
Developed SSIS packages to implement a Proof of Concept (POC) for the Data Quality Management program, including complex queries to update tables with records that have inconsistencies in the data provided by the custodian.
Developed script tasks in SSIS for file movement and table updates on databases.
Responsible for designing Conceptual Data Model (CDM), Logical Data Model(LDM) and
Physical Data Model (PDM), Star and Snowflake schema design, implementation and support.
Created the data model using ERwin for DQM. Gathered requirements for the reporting needs of the end user (SSRS).
Acted as a liaison between business users and the technical development team to meet reporting needs through SQL Server Reporting Services (SSRS). Created ad-hoc reports and tested the reports developed using SSRS.
Raised tickets using Remedy and prioritized them as needed, analyzed the tickets to track status, and followed up with the IT Help Desk team to get problems resolved.
Configured different types of applications, such as SQL Server, Oracle, ERwin, MS Access databases, MS Office applications, and modeling tools, to load metadata into Informatica PowerCenter.
Observed Workflows and lineages from several sources to their respective targets for various
applications across the enterprise.
Documented and discussed with the Project Manager and Application Support team the changes that could be made for a robust and streamlined flow of data.
Participated in and ran use cases for the Metadata POC using Informatica Metadata Manager.
Developed SharePoint site for holding metadata.
Created a workflow by documenting all the tasks a metadata admin should perform for consistent and proper maintenance of metadata (Metadata Admin Guide).
Developed a user guide to help first-time users of the metadata site understand the structure and content of the site; detailed all the tasks an end user could perform, such as adding or recommending changes to the current metadata for an application the user works with daily (Metadata User Guide).
Developed SQL queries to help the metadata site admin maintain the site when updating any future changes to data structures, tables, and columns (see the catalog query sketch after this list).
Populated the SharePoint site with metadata for various data structures, tables, columns, and their constraints using Structured Query Language (SQL).
Planned and developed use cases and documented them in detail for user acceptance testing.
Developed test cases and ran the same for metadata implementation using SharePoint.
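A minimal sketch of a catalog query that could feed this kind of metadata maintenance, assuming a SQL Server source and its standard INFORMATION_SCHEMA views:

    -- Pull current column-level metadata to reconcile against the SharePoint site content
    SELECT TABLE_SCHEMA,
           TABLE_NAME,
           COLUMN_NAME,
           DATA_TYPE,
           IS_NULLABLE
    FROM   INFORMATION_SCHEMA.COLUMNS
    ORDER  BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION;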
IRIS Info, Hyderabad, India Jul 2006 – Apr. 2011
DB2 Database Developer
IRIS Info is a leading software company providing offshore software development and solutions, with services such as outsourced application development, product development, web development, e-strategy consulting, WAP development, ERP, SAP, multimedia, and design solutions. A system was developed to track the sales, inventory, and financial subsystems, including counter sales maintenance and daily cash and bank reconciliation.
Responsibilities:
Involved in the entire System study, analysis and Design.
Involved in gathering user requirements along with the Business Analyst.
Used star schema methodology in building and designing the logical data model for the dimensional models.
Part of the team responsible for the analysis, design and implementation of the business
solutions.
Gathered and translated business requirements into detailed, production-level technical
specifications, new features and enhancements to existing technical business
functionality.
Prepared business case for the data mart and then developed it.
Part of team conducting data analysis and data modeling JAD sessions, communicated
data-related standards.
Determined data rules and conducted Logical and Physical design reviews with business
analysts, developers and DBAs.
Created Entity relationship diagrams, Function relationship diagrams, data flow diagrams
and enforced all referential integrity constraints using Oracle Designer.
Involved in the creation of schema objects like indexes, views, stored procedures and
synonyms.
Redefined many attributes and relationships in the reverse-engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.
Involved in generating, documenting and maintaining Metadata.
Involved in the analysis of how the purchase order process was organized.
Created SQL tables with referential integrity and developed queries using SQL,
SQL*PLUS and PL/SQL (Legacy database).
Involved in summarizing Triggers which internally called procedures and functions in the
legacy database.
Involved in testing the database for queries that were generated and handled the
performance issues effectively.
Involved in documenting the entire technical process.
Supported technical teams involved with DB2 Database issues and operations.
Involved in the maintenance of the database. Performed tuning of SQL queries to improve response time. Created new indexes and provided hints in the process of cost-based tuning (an illustrative example follows this list).
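A minimal sketch of the index-plus-hint pattern described above, using placeholder table, column, and index names and Oracle-style hint syntax:

    -- Supporting index for a frequent date-range filter (placeholder names)
    CREATE INDEX ix_order_line_order_dt ON order_line (order_dt, item_id);

    -- Hint steering the optimizer toward the new index
    SELECT /*+ INDEX(ol ix_order_line_order_dt) */
           ol.order_id, ol.item_id, ol.qty
    FROM   order_line ol
    WHERE  ol.order_dt >= :start_dt;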
Environment: DB2 for z/OS V8.0, Oracle 8.0, PL/SQL, Toad for DB2, Oracle Designer, and UNIX.
EDUCATION:
Bachelor’s Degree in Information Sciences and Technology, Koneru Lakshmaiah University, Guntur, AP, India.