

RAMYA KRISHNA MUVVA

312-***-****

add9ip@r.postjobfree.com

OBJECTIVE

Seeking a Sr. Business Analyst/Data Analyst position where my skills and experience can make a positive contribution to the organization.

SUMMARY

●Over 10 years of IT experience in the Data Analysis, Design, Development, Testing, Implementation, Administration, and Support of Database Systems in UNIX, Mainframe (MVS), and Windows environments.

●Extensive knowledge of the Financial Services and Mortgage industries.

●Excellent Business Analysis skills and extensive experience in Requirements gathering, SDLC and writing functional Specifications, Use Case Requirements, Test Plans and full life cycle development using Rational Rose.

●Expertise in data modeling, project planning and requirement analysis.

●Experience in working with SAS procedures such as Proc Print, Proc Report, Proc Summary, Proc Freq, Proc Means, Proc Transpose, Proc Download, Proc Upload, and Proc SQL.

●Created Financial and Retail Modeling Datasets. Used the procedures SQL, TABULATE, FREQ, FORMAT and GRAPH to generate modeling reports.

●Involved extensively in Data Extraction, Transformation, Loading, Analysis, and MIS Reports for Banking.

●Expertise in using MS Access, MS Excel and SAS macro programming to analyze data and generate reports.

●Strong knowledge of Software Development Life Cycle (SDLC) methodologies, including Iterative, Agile, and Waterfall.

●Successfully implemented projects using the Agile/Scrum methodology.

●Participated in daily Scrum stand-ups and bi-weekly Sprint Planning and Retrospective sessions, and updated the team on the status of upcoming User Stories.

●Experience in creating Ad Hoc reporting ready universes using Business Objects Universe Designer.

●Involved in UAT validations and creating test data for UAT based on business scenarios. Specific expertise includes UAT, End-to-End, GUI, Functional, and Integration Testing.

●Involved in Data Migration between DB2, Teradata, MS SQL Server, and Oracle.

●Excellent Programming experience in DB2 SQL, Teradata SQL, Stored Procedures, Macros and Triggers.

●Proficient in Data Warehousing concepts, including Data Marts and Data Mining, using Dimensional Modeling (Star Schema and Snowflake Schema design); see the dimensional-model sketch after this list.

●Expert in writing Stored Procedures, Stored Functions, Database Triggers, Packages using PL/SQL.

●Experience in building and retrieving data using Universes, Personal data files, and creating complex ad-hoc Reports using the Business Objects suite applications.

●Expertise in conducting walkthroughs with stakeholders and end-users.

●Excellent analytical skills in understanding business requirements, developing functional and technical specifications for the refinement and automation of business processes.

●Proficiency in writing requirements that match technical system setups across front, middle, and back office, and test scripts for client/server and web-based applications.

●Proficient in creating mapping spreadsheets, Source to Target mapping documents including the transformation rules.

●Proficient in creating the Test plans, Test Case Templates & Test cases, reporting and tracking defects and Modification Requests, maintaining the defects database.

●Experienced with IBM Rational toolset: Requisite Pro, Rational DOORS.

●Detail-oriented, organized, and enthusiastic about working in a fast-paced, team-oriented environment.

●Capable of handling multiple projects simultaneously under tight deadlines.
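
A minimal dimensional-model sketch of the Star Schema design mentioned above (hypothetical table and column names, generic SQL; not taken from any specific engagement):

    -- Hypothetical star schema: one fact table keyed to two dimensions
    CREATE TABLE dim_customer (
        customer_key INTEGER     NOT NULL PRIMARY KEY,
        customer_id  VARCHAR(18) NOT NULL,
        segment_cd   CHAR(3),
        open_dt      DATE
    );

    CREATE TABLE dim_date (
        date_key     INTEGER  NOT NULL PRIMARY KEY,
        calendar_dt  DATE     NOT NULL,
        month_nbr    SMALLINT,
        year_nbr     SMALLINT
    );

    CREATE TABLE fact_account_balance (
        customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
        balance_amt  DECIMAL(18,2),
        txn_cnt      INTEGER
    );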

TECHNICAL SKILLS

Operating Systems

Windows 7/8.1/10, UNIX, IBM Mainframe

Programming Languages

SQL, PL/SQL, Python, Excel VBA, C, VB.NET

Software Packages

MS Access, MS Excel, MS Word, MS PowerPoint, MS Visio, ER Studio, MS Project

Databases

AWS RedShift, DB2, Teradata V2R13/V2R14, Oracle 10.x/11.x, SQL Server 2012, MS Access

SAS Tools

SAS/Base, SAS/Stat, SAS/Access, SAS/Graph, SAS/Macro, SAS/SQL and SAS/ODS

Business Modeling Tools

Rational Rose, Enterprise Suite, MS Visio, MS Project, Erwin

Automation tools

Data Testing Framework (DTF), HP Quality Center, ALM, TestManager, Requisite Pro, Mercury Interactive WinRunner, LoadRunner, TestDirector

ETL Tools

DataStage 7.5/8.1, Informatica 9.5/9.0, Ab Initio

Reporting Tools

Business Objects XI 3.0/3.1, Tableau 10.0/10.1/10.2

Change Management Tools

Rational Rose, Rational Clear Case, Rational Clear Quest

Agile Tools

Jira, Rally

WORK EXPERIENCE

Capital One McLean, VA

Role: Sr. Data/Business Intelligence Analyst Aug 2015-Present

Capital One is one of the most diversified financial companies in the world, with activities spanning credit cards, banking, and financial services, and some of the most recognized brands in America. Capital One continues to be defined by two things – great people and bold strategies. The Small Business Banking (SBB) Incentive Program is one of the elements driving these competitive advantages and is an integral part of the banker’s compensation package. It is designed to reward new and existing SBB bankers for exceptional performance in acquiring new customers, retaining and deepening existing relationships, and generating revenue for the bank. SBB offers Deposit, Lending, and Treasury Management products and also serves Merchant services with Vantive and Spark Pay.

Responsibilities:

●Involved in gathering business requirements, interacting with business users, and translating the requirements into ETL high-level and low-level designs.

●Documented both high-level and low-level designs and was involved in the ETL design and development of the data model.

●Performed GAP analysis, built the Requirements Traceability Matrix (RTM), and carried out Data Modeling and Data Mapping.

●Worked with the AWS Cloud Platform and its features, which include S3, Analytics (Athena), Redshift, CodeCommit, and Compute (EC2, Lambda).

●Extensively worked with AWS Redshift using SQL Workbench and Zeppelin (see the Redshift load sketch after this list).

●Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, flat files, and SQL Server 2005.

●Developed complex ETL mappings and worked on the transformations like Source qualifier, Joiner, Expression, Sorter, Aggregator, Sequence generator, Normalizer, Connected Lookup, Unconnected Lookup, Update Strategy and Stored Procedure transformation.

●Delivered mapping specifications (Source to Target), maintained the Metadata Repository, reviewed source systems, and proposed the data acquisition strategy.

●Gathering requirements to create the BRD and User Story acceptance criteria, refining specifications, business processes, and recommendations for system application solutions.

●Working closely with design and development teams to make sure they understand the requirements.

●Working with the Product Owner and the team on the Product Backlog, Sprint Backlog, Sprint Burndown, and Releases.

●Facilitating sprint planning and daily stand-up calls as Scrum Master for the scrum team.

●Creating and updating user stories in Rally by analyzing the business requirements.

●Maintained the project requirement documents on the project’s Confluence page.

●Experienced with Rally, an Agile-based tool.

●Configure JIRA for each project and track all the defects.

●Involved in creating the Jira Stories with the detailed description and acceptance criteria.

●Participate in requirements analysis and compile system test documents.

●Validate the acceptance criteria for each user story with the Product Owner.

●Implement the Agile/Scrum methodology using best-practice processes.

●Worked on loading data from different sources such as Oracle, DB2, EBCDIC files (created copybook layouts for the source files), and ASCII-delimited flat files into Oracle targets and flat files.

●Used SAS extensively for match-merge, append, data validation, data correction, and cleansing.

●Extracted data from the Teradata database using SAS/Access and SAS SQL procedures and created SAS data sets.

●Extracted data using various SAS/Access methods, including LIBNAME statements and the SQL Pass-Through facility.

●Worked on loading data from several flat-file sources into Staging using Teradata MLOAD, FLOAD, TPUMP, TPT, and BTEQ.

●Involved in performance tuning of mappings and SQL statements; worked with Query Optimization and Explain Plan utilities for optimum performance.

●Applied appropriate field level validations like date validations, default values for cleansing the data.

●Involved in designing an error handling strategy for data validation and error reporting.

●Used tasks like Command, Decision, Event Wait, Event Raise, Assignment, and Timer in the workflows and worklets to implement business requirements logic effectively.

●Used ETL methodology to support the data extraction, transformation, and loading process.

●Responsible for preparing ETL strategies for extracting data from different data sources like Teradata, Oracle, DB2, SQL Server, flat files, and XML.

●Involved in the logical and physical design of the database and creation of the database objects.

●Ensured data integrity and referential constraints for accurate database operations.

●Involved in the performance tuning of the application through creation of necessary indexes.

●Responsible for the migration of data from extract tables to flat files and relational tables.

●Performed Unit testing, Integration testing and generated various Test Cases.

●Worked with DBAs for the transition from Development to Testing and from Testing to Production.

●Worked with the users and testing teams to implement the business logic as expected.

●Wrote several Teradata BTEQ scripts to implement the business logic.

●Physical and logical design of the dimensional model using ERWIN/ER Studio.
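
A hedged sketch of the AWS Redshift staging-and-validation pattern referenced in this section (the bucket, IAM role, and table names are placeholders, not actual project objects):

    -- Load an incentive extract from S3 into a Redshift staging table
    COPY sbb_stage.banker_incentive
    FROM 's3://example-bucket/sbb/incentive/'                        -- placeholder path
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'  -- placeholder role
    FORMAT AS CSV
    IGNOREHEADER 1;

    -- Basic row-count reconciliation between staging and the core layer
    SELECT (SELECT COUNT(*) FROM sbb_stage.banker_incentive) AS staged_rows,
           (SELECT COUNT(*) FROM sbb_core.banker_incentive)  AS loaded_rows;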

Navy Federal Credit Union Vienna, VA

Role: Sr. Business Analyst/Data Analyst Nov 2014-Jul 2015

Navy Federal is a credit union that serves the Army, Navy, Marine Corps, Coast Guard, and Air Force with credit card, mortgage, and financial services; NFCU is the top credit union in the industry. Market and competitive trends are driving member expectations for personalized business relationships across multiple convenient channels, and NFCU increases income through pricing/risk analysis refinements, collections improvement, lending automation, and marketing/cross-servicing efficiencies. Lending Business Intelligence Opportunity (Lending BIO) is one of the projects within the MxData program. This project’s goal is to build and deploy the first Business Intelligence Opportunity (BIO). BIOs are collections of pre-selected production data that are loaded into the Integrated Data Warehouse and extracted into the best format to support the Navy Federal workforce.

Responsibilities:

●Extensively gathered the business requirements and interacted with the business team to implement improved plans and requirements.

●Participated in the full SDLC process using Requisite Pro, Quality Center, and SharePoint to document each phase in an Agile/Waterfall-style environment, including requirements, scope, the traceability matrix, and testing results summaries.

●Worked with business and tech teams on test planning by reviewing and analyzing requirements; prioritizing and evaluating testing coverage; designing/documenting test cases; and tracking and communicating project exit criteria.

●Executed test cases manually and ran SQL queries to verify expected results; performed defect analysis to determine severity and business impact.

●Documented and tracked defects/results, applying a high level of critical thinking to provide solutions to high-impact, complex, and cross-functional defects and enhancements.

●Assessed testing results in order to continually improve future testing processes and practices through in-depth inspection and adaptation of previous iterations.

●Mentored other team members in key aspects of technical as well as functional processes, and quality assurance best practices.

●Automated pricing/offers in the context of credit card and consumer lending to enhance collections and leverage risk splitting to drive incremental net income.

●Integrated data from source systems into a data warehouse environment and made data from the data warehouse available to business users in the form they require.

●Monitored the data flow using Informatica DVO from different sources to Staging, from Staging to Load Ready, and finally to the Integrated Data Warehouse.

●Developed advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.

●Developed test plans based on test strategy. Created and executed test cases based on test strategy and test plans based on ETL Mapping document.

●Conducted UAT for Functional and Regression Testing.

●Developed Positive and Negative Test Data based on business requirements and Source to Target (S2T) mapping document to make sure all the business rules are covered.

●Wrote SQL queries for the business to read the results of the dialer file.

●Reported bugs and tracked defects using ALM.

●Worked on issues with migration from development to testing.

●Participated in daily scrum meetings, coordinated with the developers and other Quality Analysts to resolve the defects and close them.

●Created reports and raw data in MS Access and pivot tables in Excel depending on customer needs.

●Worked on Ad-Hoc requests from the internal customers.

●Optimized and fine-tuned existing Oracle SQL scripts using optimizer hints such as PARALLEL and ORDERED and the WITH clause (see the tuning sketch after this list).
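
A minimal sketch of the Oracle hint and WITH-clause tuning style described in the last bullet (table and column names are illustrative only):

    -- Pre-aggregate dialer results in a WITH block, then join with hints
    WITH dialer_summary AS (
        SELECT account_id, MAX(call_dt) AS last_call_dt, COUNT(*) AS call_cnt
        FROM   dialer_result                  -- illustrative table
        GROUP BY account_id
    )
    SELECT /*+ PARALLEL(a 4) ORDERED */
           a.account_id, a.balance_amt, d.last_call_dt, d.call_cnt
    FROM   dialer_summary d
    JOIN   loan_account a ON a.account_id = d.account_id
    WHERE  a.delinquency_days > 30;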

Capital One McLean, VA

Role: Sr. Business Analyst/Data Analyst Aug 2012-Nov 2014

Capital One is one of the most diversified financial companies in the world, with activities spanning credit cards, banking, and financial services, and some of the most recognized brands in America. Credit Risk Management (CRM) represents a significant challenge for financial institutions as they seek to enhance current approaches to measuring and managing risk; it allows for interdependencies among credit risk, liquidity risk, and market risk and contrasts various quantitative analyses with qualitative considerations. The US Card Account Level Loss Forecasting (ALLM) model is used widely across business segments for LF quarterly assessments, as an input to US Card Basel Models, and as an alternative to scoring incoming portfolios. ALLM was developed using survival analysis with logistic regression and estimates the probability of charge-off over multiple horizons for four targets (CTCO, BKCO, DECO, and ATTR). The model leverages internal data (Charge-off Data, CIT Data, Driver Table Data, SPR Data, TRIP Data, Month End Data), bureau data (EFX Data), and economic data (unemployment rate, price indexes, BK filings).

Responsibilities:

●Involved in business requirements, technical requirements, high-level design, and detailed design process.

●Delivered mapping specifications (Source to Target), maintained the Metadata Repository, reviewed source systems, and proposed the data acquisition strategy.

●Involved in developing Unit Test cases for the developed mappings.

●Extracted data from the Teradata database using SAS/Access, SAS SQL procedures and created SAS data sets.

●Worked on loading data from several flat-file sources into Staging using Teradata MLOAD and FLOAD.

●Extensively worked in the performance tuning of transformations, Sources, Sessions, Mappings and Targets.

●Used ETL methodology to support the data extraction, transformation, and loading process.

●Responsible for preparing ETL strategies for extracting data from different data sources like DB2, SQL Server, flat files, and XML.

●Involved in the logical and physical design of the database and creation of the database objects.

●Developed Test Cases based on Business Requirements, System Requirements, High Level Design Documents and Detailed Design Documents.

●Developed UAT test plans based on business scenarios.

●Extensively used the Quality Center for writing Test Cases and updating the Test Results after execution of test cases.

●Involved in the entire QA life cycle, including designing, developing, and executing the entire QA process and documenting Test Plans, Test Cases, Test Procedures, and Test Scripts.

●Ensured data integrity and referential constraints for accurate database operations.

●Involved in the performance tuning of the application through creation of necessary indexes.

●Developed complex ETL mappings and worked on the transformations like Source qualifier, Joiner, Expression, Sorter, Aggregator, Sequence generator, Normalizer, Connected Lookup, Unconnected Lookup, Update Strategy and Stored Procedure transformation.

●Responsible for the migration of data from extract tables to flat files and relational tables.

●Performed Unit testing, Integration testing and generated various Test Cases.

●Performed Data analysis and Data validations.

●Coordinated tasks and issues with the Project Manager and the client on a daily basis.

●Worked with DBAs for the transition from Development to Testing and from Testing to Production.

●Worked with the users and testing teams to implement the business logic as expected.

●Wrote several Teradata BTEQ scripts to implement the business logic (see the BTEQ sketch after this list).

●Involved in gathering, analyzing, and documenting business requirements, functional requirements and data specifications for Business Objects Universes and Reports.

●Created complex and reusable Macros and extensively used existing macros and developed SAS Programs for Data Cleaning, Validation, Analysis and Report generation. Tested and debugged existing macros.

●Used the Import and Export facilities in SAS to exchange data between SAS and Microsoft Office environments (Excel, Access).

●Developed reports per business requirements, including various report types such as summary reports and tabular reports.

●Created Macros and developed SAS programs for Data cleaning, Validation, Analysis and Report generation using newly created as well as existing macros.

●Performed Statistical Analysis with Statistical procedures and Univariate procedures from Base SAS and SAS/STAT.

●Validated the Metadata changes and database schema changes during production roll out.

●Delivered data to end-user groups using VBA macros and implemented pivot tables.

●Physical and logical design of the dimensional model using ERWIN.
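
A hedged sketch of the kind of Teradata BTEQ script referenced above (the logon, object, and column names are placeholders, and the charge-off rule shown is illustrative, not the actual ALLM logic):

    .LOGON tdprod/etl_user,********;

    /* Flag accounts that charged off in the reporting month (illustrative rule) */
    UPDATE allm_stage.account_month
    SET    ctco_flag = 'Y'
    WHERE  chargeoff_dt BETWEEN DATE '2014-01-01' AND DATE '2014-01-31'
    AND    chargeoff_type_cd = 'CT';

    .IF ERRORCODE <> 0 THEN .QUIT 8;

    COLLECT STATISTICS ON allm_stage.account_month COLUMN (ctco_flag);

    .LOGOFF;
    .QUIT 0;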

Capital One McLean, VA

Role: Sr. Business Analyst/Data Analyst Jan 2010-Aug 2012

Capital One is one of the most diversified financial companies in the world, with activities spanning credit cards, banking, and financial services, and some of the most recognized brands in America. The Acquisitions CLIP decision science system enhances the value of technology by providing data warehousing and business intelligence capability for credit cards and banks to make strategic decisions supported by real-life data. The target warehouse is built by accessing data from geographically spread, heterogeneous data environments and assimilating it into information views for decision-making. The Enterprise Data Warehouse (EDW) is a data warehousing solution for the banking system covering all the business units of Capital One. EDW is implemented in 3rd normal form and contains terabytes of data coming mainly from Oracle, DB2, and legacy systems. It also holds data from supporting business areas such as direct marketing and Internet marketing.

Responsibilities:

●Participated in requirements collection and played a key role in analyzing the business requirements including project design, data collection, analysis, summary of findings, recommendations and presentation of results.

●Understood and articulated business requirements from user interviews and then prepared technical specification documents.

●Responsible for the construction, design and testing of the primary modules: Maintenance, Upload/Transfer of reports, Reconciliation and Determination date statement (MS Access, VBA, Proc SQL, and SQL).

●Analyzed the existing Teradata applications for complexity and Enhancement.

●Participated in Joint Analysis Design with user community and Business users.

●Worked on loading data from several flat-file sources using Teradata MLOAD and FLOAD.

●Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.

●Performed query optimization (explain plans, collect statistics, Primary and Secondary Indexes).

●Involved in physical and logical design of the applications.

●Transferred data between development and production environment using ARCMAIN.

●Worked on exporting data to flat files using Teradata FEXPORT.

●Rewrote the existing financial applications.

●Optimized database queries and applications.

●Used the Import and Export facilities in SAS to exchange data between SAS and Microsoft Office environments (Excel, Access).

●Used PC SAS to manipulate smaller pools of data as well as UNIX to extract and manipulate data stored in Teradata tables.

●Built tables (UPI, NUPI, USI, NUSI), macros, and stored procedures (see the table-design sketch after this list).

●Wrote several Teradata BTEQ scripts to implement the business logic.

●Wrote COBOL and DB2 programs.

●Developed new Control cards, JCL and PROC library members for data export, data loading and E-mail notification processes.

●Created tables, access views and update views in the development database for unit and system test.

●Developed FastExport, FastLoad and Multi Load scripts to simulate the environment in development database.

●Performed unit and system test for the modified code and loaded shadow data marts for testing prior to production implementation.

●Worked exclusively with the Teradata SQL Assistant to interface with Teradata.
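
A minimal sketch of the Teradata table-design and tuning work listed above (UPI/NUSI choice, collected statistics, and an explain plan; object names are hypothetical):

    /* Hypothetical account table with a unique primary index and a secondary index */
    CREATE TABLE edw_dev.card_account
    (
        account_id   DECIMAL(18,0) NOT NULL,
        customer_id  DECIMAL(18,0) NOT NULL,
        open_dt      DATE,
        credit_limit DECIMAL(12,2)
    )
    UNIQUE PRIMARY INDEX (account_id)
    INDEX cust_nusi (customer_id);

    COLLECT STATISTICS ON edw_dev.card_account COLUMN (customer_id);

    /* Check the optimizer's plan before promoting the query */
    EXPLAIN
    SELECT   customer_id, COUNT(*) AS account_cnt
    FROM     edw_dev.card_account
    GROUP BY customer_id;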

Freddie Mac McLean, VA

Role: Sr. Data Analyst Oct 2009-Dec 2009

Freddie Mac is a government-sponsored enterprise (GSE) chartered by Congress that operates in the U.S. secondary mortgage market and works with mortgage lenders and other primary mortgage market partners to help people obtain lower housing costs and better access to home financing at affordable rates.

Freddie Mac has three lines of business - Single-Family, Multifamily, and Capital Markets. This project is about data warehouses collecting data about mortgages and helping to control and manage Treasury-sponsored programs. FAS-140 and SOX-404 require annual certification of Freddie Mac’s internal control over financial reporting, and this project provided the required support in terms of CDW environments and ETL setup/execution.

Responsibilities:

●Attended meetings with business users to collect and organize data and to understand the process flow.

●Received files from various databases and compared the content of the files from one database to the other (see the comparison sketch after this list).

●Designed, executed, and analyzed data and programs supporting company strategic initiatives.

●Extracted data from various data sources and generated ad-hoc reports for end users, supporting senior management through special studies, project leadership, and tracking (involving extensive use of SQL).

●Generated data marts to support the above applications using DTS packages on SQL Server and SAS.

●Participated in Joint Analysis Design with user community and Business users.

●Created SAS data sets, cleaned data, extracted data and merged data sets using Base SAS, SAS/SQL and SAS/Macro.

●Read data from Excel and flat files into SAS and transferred data between databases.

●Enhanced several existing Mainframe ETL JCL job streams for a large DB2 CDW to incorporate several new data elements.

●Extracted data from the DB2 database using SAS/Access and SAS SQL procedures and created SAS data sets.

●Created detailed reports and graphs for management review.

●Manipulated multiple datasets, aggregating and merging them for analysis.

●Validated technical designs created by IT developers against functional specifications.

●Worked with QA team to design test plan and test cases for User Acceptance Testing (UAT).

●Implemented VBA object model for data manipulation and data calculation.

●Prepared low-level design documentation for implementing new data elements in the CDW.
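
A hedged sketch of the cross-database content comparison described above, assuming both extracts have been loaded into staging tables (all names are placeholders):

    -- Rows present in the DB2 extract but missing from the SQL Server extract
    SELECT loan_id, period_dt, upb_amt
    FROM   stg_db2_loan_extract
    EXCEPT
    SELECT loan_id, period_dt, upb_amt
    FROM   stg_sqlserver_loan_extract;

    -- The reverse direction catches extra rows as well as gaps
    SELECT loan_id, period_dt, upb_amt
    FROM   stg_sqlserver_loan_extract
    EXCEPT
    SELECT loan_id, period_dt, upb_amt
    FROM   stg_db2_loan_extract;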

EDUCATION

●Bachelor of Technology in Computer Science Engineering.

●Teradata Certified Master.

●SAS Certified Advanced Programmer for SAS 9.


