Mounika Bobba
● Cell: 972-***-**** ● Email: ************@*****.***
Professional Summary:
Through my education and work experience at various companies, I have gained practical and theoretical experience in data analysis, data visualization, problem solving, and data management. I have mastered tools and technologies such as SQL for data pulls and database development, Python and Excel for statistical analysis, ETL tools and data quality checks, and Tableau for data visualization.
WORK EXPERIENCE
Capital One, Richmond, VA Dec 2018-Feb 2020
Data Analyst
Enterprise Reporting
Worked on the enterprise reporting team, creating reports for the MBR and Board of Directors on the availability of the various digital transactions that Capital One customers can perform.
• Conducted a gap analysis to identify transactions for which the report was not generating availability metrics, enabling onboarding of metrics for 52 new transactions (a 75% increase in transactions reported).
• Communicated with various LOBs to understand methodologies for developing availability metrics for their respective transactions. Collated data from sources such as Snowflake, Splunk, InfluxDB, and ELK to populate the availability DataMart in MySQL (collation pattern sketched after this role).
• Developed Tableau visualizations for the availability metrics of different transactions, enabling easy analysis of the data and early identification of problems with reported metrics.
• Worked extensively on data quality and monitoring; collaborated with LOBs to resolve identified data issues.
• Reduced the report's cycle time, giving leadership visibility into the metrics earlier in the month.
• Implemented an RDBMS schema in MySQL for the availability metrics DataMart, which simplified reporting.
• Created data lineage for all the metrics, providing visibility into the data sources, data flows, systems, and people involved in generating the availability metrics data.
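A minimal sketch of the multi-source collation pattern behind these availability metrics, written in Python with pandas; the column names, sample data, and the success-ratio availability formula are illustrative assumptions, not the actual Capital One implementation:

```python
# Minimal sketch of collating per-source transaction events into an
# availability DataMart. All names and the availability formula are
# illustrative assumptions, not the actual implementation.
import pandas as pd

def availability(events: pd.DataFrame) -> pd.DataFrame:
    """Roll raw success/failure events up to an availability % per transaction."""
    grouped = events.groupby("transaction").agg(
        total=("status", "size"),
        ok=("status", lambda s: (s == "success").sum()),
    )
    grouped["availability_pct"] = 100.0 * grouped["ok"] / grouped["total"]
    return grouped.reset_index()

if __name__ == "__main__":
    # Stand-ins for event rows pulled from Snowflake, Splunk, InfluxDB, and ELK.
    sources = [
        pd.DataFrame({"transaction": ["bill_pay", "bill_pay"], "status": ["success", "failure"]}),
        pd.DataFrame({"transaction": ["transfer"], "status": ["success"]}),
    ]
    events = pd.concat(sources, ignore_index=True)
    print(availability(events))  # one availability row per transaction
```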
Capital One, Plano, TX Mar 2018-Nov 2018
Data Analyst
Data Transformation Project at Capital One Auto Finance: Worked on a data transformation project in the Capital One Auto Finance division. The purpose of the project was to create a data lake in the AWS cloud; the existing warehouse was built on Teradata and was being migrated to the Snowflake RDBMS on AWS.
• Performed data validation for the migration and remodeling of the data warehouse from Teradata to Snowflake.
• Coded several business metrics related to auto loans using SQL and Python; also performed UAT (user acceptance testing) and post-production validation.
• Remediated the existing Teradata scripts and reports on AWS Snowflake, using Python scripting to verify that report outcomes matched (reconciliation approach sketched after this list).
• Developed source-to-target mapping documents and identified the key data elements and business metrics for calculations.
• Documented and reported data quality issues to the data loading team for resolution.
• Worked closely with the Data Risk Management team to create data governance documents such as lineage (source-to-target mapping per LOB), Business Data Quality Rules (BDQ), and data classification documents.
• Used change management best practices for deploying DDLs for tables and for code movement for ongoing and historical data loads.
• Used GitHub, SVN, and Jenkins to version-control and store all scripts and documents used for metrics development.
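A minimal sketch of the kind of post-migration reconciliation described above, assuming both platforms' query results have already been pulled into pandas DataFrames; the metric, join key, and tolerance are illustrative assumptions:

```python
# Minimal sketch of Teradata-vs-Snowflake reconciliation: outer-join the two
# result sets and flag value mismatches. Column names are illustrative.
import pandas as pd

def reconcile(td: pd.DataFrame, sf: pd.DataFrame, keys: list, tol: float = 0.01) -> pd.DataFrame:
    """Join both result sets on their keys and flag values that disagree."""
    merged = td.merge(sf, on=keys, how="outer",
                      suffixes=("_teradata", "_snowflake"), indicator=True)
    value_cols = [c[: -len("_teradata")] for c in merged.columns if c.endswith("_teradata")]
    for col in value_cols:
        merged[f"{col}_match"] = (
            (merged[f"{col}_teradata"] - merged[f"{col}_snowflake"]).abs() <= tol
        )
    return merged

if __name__ == "__main__":
    # Stand-ins for the same metric computed on each platform.
    td = pd.DataFrame({"loan_id": [1, 2], "balance": [100.0, 250.0]})
    sf = pd.DataFrame({"loan_id": [1, 2], "balance": [100.0, 249.5]})
    report = reconcile(td, sf, keys=["loan_id"])
    print(report[~report["balance_match"]])  # rows where the platforms disagree
```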
Argus Information, White Plains, NY Dec 2014-Aug 2017
Data / Programmer Analyst
Client: Santander Bank
• Designed, developed, and tested the ALM (Asset and Liability Management) and CCAR (Comprehensive Capital Analysis and Review) data models, capital reports, and FED regulatory submissions (CCAR FR Y-14 Q, M, and A).
• Analyzed the client's existing data structure and created distinct logical DataMarts for loan-level, deposit-level, and customer-level information.
• Created statistical data models to support the development of client portfolio strategies.
• Interacted with the client team to understand the bank's federal reporting requirements; created a data warehouse and ETL procedures to automate the reporting processes across the client's different entities and various federal reporting requirements such as CCAR and ALM.
• Performed detailed analysis of metadata and created source-to-target data mapping documents, enabling easy identification of key data elements and giving users a clear explanation of the data elements used in reports.
• Led the UAT team for the ALM data model and handled multiple releases during this process.
• Onboarded new bank data into our system by normalizing the bank's data and validating its history.
Client: Royal Bank of Canada (RBC)
• Implemented edit checks for the client's submissions, which helped them validate the data.
• Performed end-to-end reconciliation for FED submissions using FR Y-9C data.
• Implemented testing, development, and production environments for the reporting applications.
• Triaged defects using the JIRA tool; experienced in agile methodologies for delivering projects.
• Provided banks with market share, customer preference, marketing strategy, and new product introduction insights through customized reports and financial analysis.
• Conceptualized and implemented the Argus toolkit, which helps the client monitor edit check results and data quality (edit-check pattern sketched below).
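A minimal sketch of rule-based edit checks over a submission file; the specific rules and column names are illustrative assumptions, not the actual Argus or client checks:

```python
# Minimal sketch of edit checks on a regulatory submission: each rule is a
# predicate over the frame, and failing rows are reported per rule.
# The rules and columns shown here are illustrative assumptions.
import pandas as pd

EDIT_CHECKS = {
    "balance_non_negative": lambda df: df["balance"] >= 0,
    "rate_in_range": lambda df: df["rate"].between(0, 1),
}

def run_edit_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Apply each edit check and return one row per failing record per rule."""
    failures = []
    for name, rule in EDIT_CHECKS.items():
        failed = df[~rule(df)].copy()
        failed["failed_check"] = name
        failures.append(failed)
    return pd.concat(failures, ignore_index=True)

if __name__ == "__main__":
    submission = pd.DataFrame({"account": ["A", "B"],
                               "balance": [500.0, -10.0],
                               "rate": [0.05, 1.2]})
    print(run_edit_checks(submission))  # account B fails both checks
```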
Procter and Gamble, Cincinnati, Ohio May 2014-Nov 2014
MS BI Developer
• Developed and deployed SSIS packages using different kinds of data flow transformations such as Aggregate, Conditional Split, Derived Column, Lookup, Merge Join, and Pivot.
• Created database objects such as tables, views, stored procedures, triggers, and indexes using T-SQL to structure the stored data and maintain the database efficiently; scheduled jobs for packages and stored procedures and maintained the jobs on a daily basis.
• Designed and coded standard tabular reports, including drill-down and drill-through functionality, and graphical presentations such as charts and dashboard-type metrics, using SSRS and Tableau.
Microsoft, Bellevue, WA Jan 2014-Apr 2014
MS BI Developer
• Performed analysis and requirements gathering for the SRS document.
• Created Cosmos scripts to extract data from log files.
• Developed and deployed SSIS packages using different types of transformations and maintained the jobs running on the server.
• Applied various data transformations such as Slowly Changing Dimension (Type 2 pattern sketched after this list), Aggregate, Sort, Multicast, Conditional Split, and Derived Column in SSIS.
• Used advanced T-SQL features to design and tune queries that interface with the databases and other applications efficiently.
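A minimal sketch of the Slowly Changing Dimension Type 2 pattern mentioned above, expressed in pandas rather than SSIS; the column names and the open-ended-date convention are illustrative assumptions:

```python
# Minimal sketch of SCD Type 2: expire changed dimension rows and append new
# versions with an open-ended validity window. Names are illustrative.
from datetime import date
import pandas as pd

OPEN_END = date(9999, 12, 31)  # sentinel marking the current version of a row

def scd_type2(dim: pd.DataFrame, incoming: pd.DataFrame,
              key: str, attrs: list, today: date) -> pd.DataFrame:
    """Expire changed rows as of `today` and append their new versions."""
    current = dim[dim["end_date"] == OPEN_END]
    merged = incoming.merge(current, on=key, suffixes=("", "_old"))
    changed = merged[
        (merged[[f"{a}_old" for a in attrs]].values != merged[attrs].values).any(axis=1)
    ]
    # Close out the superseded versions.
    dim.loc[dim[key].isin(changed[key]) & (dim["end_date"] == OPEN_END), "end_date"] = today
    # Append the new versions with an open-ended validity window.
    new_rows = changed[[key] + attrs].assign(start_date=today, end_date=OPEN_END)
    return pd.concat([dim, new_rows], ignore_index=True)

if __name__ == "__main__":
    dim = pd.DataFrame({"cust_id": [1], "city": ["Austin"],
                        "start_date": [date(2013, 1, 1)], "end_date": [OPEN_END]})
    incoming = pd.DataFrame({"cust_id": [1], "city": ["Dallas"]})
    # The Austin row is expired and a Dallas row is appended as current.
    print(scd_type2(dim, incoming, key="cust_id", attrs=["city"], today=date(2014, 2, 1)))
```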
EDUCATION
Texas A&M University-Commerce, Commerce, TX Aug 2012-Dec 2013 Computer Science
Jawaharlal Nehru Technological University, Hyderabad, India Aug 2008-May 2012 Bachelor of Science – Information Technology
SKILLS
Teradata, Snowflake, AWS, Python, MS SQL Server 2000/2005/2008/2012, SQL Server Integration Services (SSIS), SQL Server DTS, DataStage, SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), Rational Rose, Oracle 9i, TOAD 6.0, VB6, Java, Spring, Microsoft products (MS Word, MS Excel, VSS, MS Access, MS PowerPoint), SQL, T-SQL, PL/SQL, HTML, XML, C++, JavaScript, SharePoint, Unix, ODBC, MS Visio.