
Data Analyst

Location:
Tysons, VA
Posted:
February 24, 2019


Anwar

Email: *****.****@*****.***

Contact : 559-***-****

Business/Data Analyst

PROFESSIONAL SUMMARY:

7+ years of industry experience as a Data/Business Analyst in Information Technology, with expertise in data warehouse design, development, implementation, and maintenance. Worked extensively with Microsoft Excel (macros, VLOOKUPs, and pivot tables). Proficient in Microsoft Word, PowerPoint, Access, and Lotus Notes.

Strong knowledge of Portfolio Management, Performance Measurement, Fixed Income, Equity, Credit Derivatives, and Interest Rate Derivatives.

Expertise in Project Management, i.e., Project Scoping, Planning, Estimating, Scheduling, Organizing, Directing, Controlling, Budgeting, and Drafting Remedy Procedures.

Extensive experience in conducting Market Research, Feasibility Studies, Data Analysis, Data Mapping, Data Profiling, Gap Analysis, Risk Identification, Risk Assessment, Risk Analysis, and Risk Management.

Developed requirements and functional documents, including BRDs, FRDs, and High-Level Design documents.

Worked as a Commercial Fire and General Liability Insurance domain expert.

Experience in working on the Policy Administration tool (POINTIN) and Agent Portal (AGENCY LINK).

Prepared policies for different scenarios and submitted them in different environments such as QA and Development.

Sound knowledge of data architecture concepts, data warehouses, and data marts.

In-depth knowledge of the Rational Unified Process (RUP) methodology, Use Cases, Software Development Life Cycle (SDLC) processes, and Object-Oriented Analysis and Design (OOA/D).

Competent in creating Unified Modeling Language (UML) diagrams such as Use Case Diagrams, Activity Diagrams, Class Diagrams, and Sequence Diagrams.

Expertise in a broad range of technologies, including business process tools such as Microsoft Project, MS Excel, MS Access, and MS Visio; technical assessment tools; MicroStrategy; and Data Warehouse Data Modeling and Design.

Experience with data migration (ETL development) and documenting data manipulation processes and scripts.

Extensively worked with Wintel systems (Windows running on Intel processors); created RAID volumes to improve overall performance.

Experienced in building data warehouses and data marts for business intelligence reporting and data mining, along with developing and documenting process flows for business processes.

Developed extensive SQL queries using Query Manager against an Oracle database for data verification and backend testing.

Responsible for tracking, documenting, capturing, managing, and communicating requirements using a Requirements Traceability Matrix (RTM), which helped control the numerous artifacts produced by the teams across a project's deliverables.

Strong experience in conducting User Acceptance Testing (UAT) and documentation of Test Cases. Expertise in designing and developing Test Plans and Test Scripts.

Interfaced with clients from Operations, Marketing, Sales, and Technology, and with outside vendors, acting as their primary point of contact as project lead.

Highly motivated team player with excellent interpersonal and customer-relations skills; proven communication, organizational, analytical, and presentation skills; and leadership qualities.

PROFESSIONAL EXPERIENCE:

Client: Wells Fargo Bank, Minneapolis, MN Aug 2017 – Present

Role: Sr. Business Data Analyst

Wells Fargo has been providing banking and financial services for more than 150 years. Wells Fargo is in the process of implementing Basel II, the revised capital framework. Basel II compliance has become a top global priority within the financial industry. By introducing capital charges and measures for operational risk, Basel II encourages financial institutions to address operational risk management more actively. The scope of this project was to implement the processes necessary to provide financial information for Basel II reporting. It is a web-based, master-ID-compliant application that allows users to import data files from different source systems, reconcile and enrich finance data, and create output files and send them to the GCBC (Global Consumer Basel Calculator) system.


Role & Achievements:

Responsibilities included gathering business requirements, developing a strategy for data cleansing and data migration, writing functional and technical specifications, creating source-to-target mappings, designing data profiling and data validation jobs in DataStage, and creating ETL jobs in DataStage.

Performed data analysis, data reconciliation, and data retrieval from various systems, along with identification of data gaps and data quality issues, for the Basel II Credit Risk Mitigation System (CRMT).

Developed BRD documents for Securitized Inventory (CMO, ABS, MBS, and ABCP). BRD targeted new Basel II Compliance requirements.

Gathered requirements from the Line of Business, the application development team, and database developers, and wrote and updated the BRD accordingly. Facilitated Joint Application Development (JAD) sessions.

Proficient in Agile Scrum methodologies; performed the role of Scrum Master, facilitating sprint and stand-up sessions, used Excel extensively to write user stories, analyzed Iteration Burn-Down charts, and reviewed defects.

Performed Financial Statement Analysis (FSA) relating to Fund of Funds (FOF) portfolio valuations in comparison to the financial statements of the underlying securities and brokerage reports.

Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.

Worked as a Data Analyst to gather requirements, meeting each deadline well ahead of the timeline. Methods employed were in compliance with IIBA standards.

Conducted extensive backend testing to verify the correctness of report data using SQL scripts and data comparison.

Led multiple project teams of technical professionals through all phases of the SDLC, using technologies including Oracle, Erwin, DataStage, Data Warehousing, WebSphere, and Cognos.

Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.

Created a test plan and a test suite to validate the data extraction, data transformation and data load and used SQL and Microsoft Excel.

Reviewed the data model and reporting requirements for Cognos Reports with the Data warehouse/ETL and Reporting team.

Worked with developers to make sure that they understood the Use Cases.

Obtained Data requirements, identified data sources, determined the content of data fields and created Data Mapping Documents and performed Data Extraction and Data Compilation using SQL queries.

Worked with Business, Operations, and Engineering stakeholders to define functional integration and end-to-end test cases for the CRM project.

Used Requisite Pro for requirements document preparation and prepared Business Process Models covering all business activities from the conceptual to the procedural level. Followed a top-down, leveled technique for building Business Process Models.

Worked with key members from various cross-functional teams to analyze third-party CRM solutions and make a 'buy vs. build' decision. Conducted cost/benefit and impact analyses.

Conducted Joint Application Development (JAD) sessions with IT groups, identified the key changes, and participated in stakeholder management to communicate effectively with stakeholders.

Created and maintained relational databases for information access and dissemination using MS SQL Server, MS Access, and Windows Server.

Conducted a current-state study that covered data collection methodologies, data quality and data integration problems, and data storage and infrastructure issues.

Designed and implemented basic SQL queries for QA Testing and Report / Data Validation.

Documented business workflows textually as well as in UML diagrams (State diagrams) according to RUP, for the stakeholder review.

Converted data from other platforms to the native database; created customized modules for data maintenance.

Performed all necessary programming, testing and documentation for COBOL, DB2 batch programs and sub-routines.

Responsible for Data Extraction, Data Compilation, Data Analysis, Data Manipulation, and Data Validation using SQL queries in an MS SQL Server 2005 environment.

Prepared test data sets and performed data testing using PL/SQL scripts. Also used MS Excel for data mining, data cleansing, data mapping, data dictionaries, and data analysis.

Used Data warehousing for Data Profiling to examine the data available in an existing database.

Worked with the Data Warehouse team on the development and execution of data conversion, data cleaning, and standardization strategies and plans, as several small tables were combined into a single data repository in an MDM (Master Data Management) system.

Used Microsoft Visio for Business Process Modeling and Data flow diagrams.

Involved in managing a data modeling project, from logical design through implementation, of a Sybase database.

Worked on data modeling and produced data mapping and data definition documentation

Used Informatica to extract and transform data from various DB2 databases to the data warehouse.

Conducted User Acceptance Testing, gathered and documented User Manuals and Business Rules

Environment: Microsoft Office Suite, Microsoft Visio, Documentum, Windows NT/XP, Oracle, SQL, SQL Server, Sybase, EVPN, RUP, VBScript, CRM, SharePoint, Unix, DB2, Adobe Photoshop, Caliber RM, MIS-IP, C#.NET, ASP.NET, Business Objects, TestDirector, LoadRunner.

Client: Wells Capital Management, Minneapolis, MN Jan 2016 – July 2017

Role: Business Analyst

As the Investment Management Division of Wells Fargo & Company, Wells Capital Management has over $193 billion under management. The project aimed at integrating the Bloomberg Order Management System to support Fixed Income trading (Corp. Bonds, Muni Bonds, US Treasuries, ABS, MBS/CMO/Pass-Through/TBA) and to support Equity and Derivative trading (Options, Futures, Forwards, and Credit Default Swaps), in order to realize Straight-Through Processing internally from the Front Office to downstream systems (e.g., ADP & PMR) and custodian agents. The new system helps portfolio and asset managers, financial advisors, and traders gain a simple, cohesive, and transparent view of service data, and enables multiple specialists and bankers to collaborate more seamlessly to serve customers with complex money management and investment concerns.

Role & Achievements:

Designed and developed the logical and physical data models to support the Data Marts and the Data Warehouse.

Applied custom changes to Calypso software in order to replicate vanilla and exotic OTC derivative products.

Interacted with department heads to finalize business requirements, functional requirements, and technical requirements, and created the business process model.

Worked with SMEs to understand various assets and security classes, exchange traded derivatives (such as futures, options on futures, equity options and index options), OTC derivatives (such as interest rate swaps, swaptions, credit default swaps and total return swaps) and portfolio allocation and management.

Conducted Gap Analysis and gathered user requirements through interviews, user meetings, JAD sessions, and requirement elicitation sessions.

Acted as the first point of contact from Wintel Services for VMware and Systems Management Server activities.

Extensively used Visual Basic to create macros in order to make data migration testing more useful and compared results from the legacy databases to the new CRM database.

Creation of Data Comparison Utilities and Excel Macros for IDW to Data Mart Comparisons

Identified/documented data sources and transformation rules required to populate and maintain data warehouse content.

Highly proficient in writing user stories and creating Use Cases, Use Case Diagrams, Workflow Diagrams, Sequence Diagrams, and Class Diagrams. Extensively used Rational Rose and MS Visio for UML.

Used MS Visio for process modeling, process mapping, and business process flow diagrams.

Designed and developed Use Cases using UML and Business Process Modeling.

Worked on performance tuning and data loading for fast access to reports in the client/database tier; also worked on server load balancing, business rules implementation, metadata, and data profiling.

Worked with data management and other groups to make sure that quality procedures are integrated into their work processes.

Performed Data mapping, logical data modeling, created class diagrams and ER diagrams and used SQL queries to filter data

Prepared graphical depictions of Use Cases, Use Case Diagrams, State Diagrams, Activity Diagrams, Sequence Diagrams, Component-Based Diagrams, and Collaboration Diagrams, and created the technical design (UI screens) using Microsoft Visio.

Created Source to target data mapping documents identifying key data elements and prepared Data Flow Diagrams

Used Visual Basic Editor (VBE) for developing and testing Macros in Excel

Created SQL reports, data extraction and data loading scripts for different databases and Schemas.

Wrote complex SQL code with business logic for Production Data Maintenance (PDM) impact analyses and followed up with implementation monitoring.

Involved with all the phases of Software Development Life Cycle (SDLC) methodologies throughout the project life cycle.

Used Test Director and Mercury Quality Center for updating the status of all the Test Cases & Test Scripts that are executed during testing process.

Prepared a handbook of standards and Documented standards for Informatica code development.

Performed backend testing using SQL queries and analyzed server performance on UNIX.

Designed conceptual and logical data models of Data Warehouse and tables.

Used Shared Containers and created reusable components for local and shared use in the ETL process.

Prepared Logical Data Models that contains set of Entity Relationship Diagrams and Data Flow Diagrams and supporting documents and descriptions of the Relationships between the data elements to analyze and document the Business Data Requirements.

Identified duplicated data and columns between tables using TOAD and SQL.

Verified the Business Scenarios on new builds to allow extended testing by the QA team.

Environment: Rational Enterprise Suite (Rose, ClearCase, ClearQuest), RUP, Visual Basic, SQL, SQL Server, Oracle, DB2, CRM, Wintel, Unix, MS Visio, MS Office, LoadRunner, WinRunner, DOORS, Project Management.

Client: Liberty Mutual June 2013 - Dec 2015

Role: Data Analyst

Liberty Mutual is one of the largest insurance companies dealing in commercial as well as personal lines of insurance. The project was to automate the claims processing systems and enhance the CRM by providing real time monitoring for its Crop insurance division.

Role & Achievements:

Prepared Business Context Diagram, Use Case diagrams and corresponding Activity Diagrams using Rational Rose to depict the workflows to be incorporated into the development of Pega Business Process Management (BPM) tool.

Worked on data migration (ETL development) and documented data manipulation processes and scripts.

Used data transformation tools such as DTS, SSIS, Informatica, and DataStage.

Used UNIX console commands to check that the database was connected and to verify other functionality.

Wrote database interface specifications and documented in Data Manager data dictionary

Worked on the development of an information access strategy for the client, including integrated data and information requirements and tools-and-process recommendations.

Created and maintained data model/architecture standards, including master data management (MDM).

Analyzed the existing as-is system for Insurance Personal Lines, tracked defects, and provided recommendations for enhancements in the future to-be system.

Involved in generating test cases for Property and Casualty insurance at different levels of business.

Extensively involved in the modeling and development of Reporting Data Warehousing System.

Performed Business Process Mapping and performed AS IS and TO BE analysis

Created and managed Project Templates, Use Case Project Templates, Requirement Types and Traceability Relationships in RequisitePro.

Prepared scenarios, Use Cases & UML State Diagram for scenarios using Rational Rose.

Integrated data from a wide range of sources, including in-house clinical data management systems, labs, and contract research organizations, and performed data cleaning.

Involved in creating automated Test Scripts representing various Transactions, Documenting the Load Testing Process and Methodology. Created meaningful reports for analysis and integrated the Performance Testing in the SDLC.

Responsible for 24x7 coverage of various mainframe-based COBOL applications using DB2, VSAM, and IMS databases.

Developed and managed Project Plans and Schedules. Managed resolution of Project issues and conflicts.

Provided technical support to the Database Team on Oracle 9/10 based architecture, developed SQL queries for data extraction purposes.

Assisted in mapping the requirements to the source systems, the DW and applicable data marts.

Organized the data for the court report performing data extraction from different data repositories

Tested the final application for usability to verify that all user requirements were met by the application.

Environment: Rational ClearCase, DOORS, IBM WebSphere, XML, Java, J2EE, JSP, RUP, VoAVPN, Oracle, MS Visio, MS Office, DB2, Sybase, SQL Server Reports, WinRunner, LoadRunner, QuickTest Pro

Client: Yes Bank Jan 2012 – May 2013

Role: Data Analyst

Role & Achievements:

Completed a thorough customer analysis, prepared a Request for Proposal (RFP) document, and got the shortlisted vendor document signed off. Ultimately, the stakeholders selected a CRM solution from among the shortlisted vendors for final implementation.

Created and maintained Use Cases and visual models, including activity diagrams, logical business process models, and sequence diagrams, using UML.

Created source table definitions in the Data Stage Repository by studying the data sources.

Prepared Use Cases, Data Flow Diagrams, and Workflow Diagrams, considering the scope of the project, with MS Visio 2003.

Reviewed Stored Procedures for reports and wrote test queries against the source system (SQL Server) to match the results with the actual report against the Data mart (Oracle).

Created and scheduled Sessions (run on demand, run at a scheduled time, or run only once) using Informatica Server Manager.

Developed and demonstrated prototypes, including data models, business objects universes, reports, and business dashboards, and gathered feedback to define the information access strategy for an integrated data warehousing and reporting solution.

Worked on MS Excel Macros for creating automatic reports

Responsible for executing User Interface Testing, System Testing, Data Quality Testing on the configuration design and prototypes.

Prepared a Data Comparison tool using VBA (Excel macros).

Ensured that all artifacts were in compliance with corporate SDLC policies and guidelines.

Functioned as the primary liaison between the Business line, operations, and the technical areas throughout the Project Cycle.

Defined the test criteria and project schedules, and baselined the Test Plan with the help of project meetings and walkthroughs.

Used master data management (MDM) to define reference data

Developed Functional Specification Document and Supplementary Specification (non-functional) Document.

Participated in the Logical and Physical Design sessions and developed Design Documents.

Developed schemas for extraction, transformation, and loading (ETL) using Solonde Warehouse Workbench to expedite data integration between systems.

Worked with Quality Control Teams to develop Test Plan and Test Cases.

Developed User Manuals and Training Manuals per project specifications and timelines.

Environment: Microsoft Office Suite, MS Visio, Windows XP, Dreamweaver, CRM, SQL, PL/SQL, XML, VB, ASP, Crystal Reports, Adobe Photoshop, Mercury TestDirector, WinRunner, LoadRunner
