
Data Analyst

Location:
Georgetown, TX
Posted:
November 14, 2023


Resume:

SOWMYA CHADALAVADA

678-***-****

PROFESSIONAL SUMMARY

Nine years of experience in data analysis, production support and testing of applications on various platforms.

Experienced as a Data Analyst working with large datasets in the financial and telecom domains, supporting many production instances comprising medium- and high-complexity applications.

Experienced in working with Python, Databricks Cloud, and Databricks notebooks.

Experienced in creating Databricks analysis scripts using Snowflake and AWS S3 as data sources and sinks (a sketch of this pattern follows this summary).

Experienced in working on AWS Cloud.

Experienced in converting scripts from Teradata SQL to Snowflake SQL.

Experienced in handling financial organization data in a highly secure environment.

Expertise in using SQL, UNIX, and Python scripting to create reports.

Experienced in data analysis using SQL on large-scale Oracle and Snowflake databases.

Experienced in creating dashboards using Amazon QuickSight.

Experienced in working with Capital One internal software (Fractal) for customer credit adjustments and refunds.

Experienced in gathering and understanding requirements from business teams in a financial organization and implementing analytical reports to meet those requirements.

Expertise in the cable domain from working with Time Warner Cable.

Experienced in analyzing large datasets (over 600 million events per day) such as Channel tuning activity for Audience Measurement.

Experienced in identifying data issues by performing trend analysis and running ad hoc SQL reports.

Knowledge of data warehousing concepts and of working with the Teradata data warehouse.

Used statistical techniques to track and analyze data.

Interacted with developers, testers and all other teams to ensure quality application and played a key role in timely delivery of the application.

Experienced in writing complex SQL queries and using advanced features such as materialized views.

Used advanced Microsoft Excel features to create pivot tables and worked with VLOOKUP and other Excel functions.

Extensive experience in tracking bugs and issue status using JIRA, RALLY, TRELLO and ISSUE TRACKER.

Excellent technical and communication skills and ability to work effectively and efficiently in dealing with multiple teams.

Experienced in conducting daily status meetings with the offshore team to monitor the state of the project.

Highly motivated self-starter and can work independently in the support, testing, implementation and maintenance of daily jobs/processes.
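
The following is a minimal sketch of the Databricks pattern referenced above: a PySpark script that reads from Snowflake and writes to S3. All connection settings, the table name, and the S3 path are placeholders for illustration, not details from the actual projects.

# Minimal sketch: Databricks analysis script with Snowflake as the source
# and S3 as the sink. Every connection value here is a placeholder.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided by Databricks at runtime

sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",  # placeholder
    "sfUser": "svc_user",                               # placeholder
    "sfDatabase": "ANALYTICS",                          # placeholder
    "sfSchema": "PUBLIC",
    "sfWarehouse": "REPORT_WH",
}

# Read a source table through the Spark-Snowflake connector.
events = (spark.read.format("snowflake")
          .options(**sf_options)
          .option("dbtable", "DAILY_EVENTS")            # hypothetical table
          .load())

# Simple daily aggregate, then write the result to S3 as the sink.
daily_counts = events.groupBy("EVENT_DATE").agg(F.count("*").alias("EVENT_COUNT"))
daily_counts.write.mode("overwrite").parquet("s3://example-bucket/reports/daily_counts/")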

Technical Skills:

Cloud Technologies : AWS, Databricks (Spark), Snowflake.

Databases : Snowflake, Redshift, AWS S3, Teradata, PostgreSQL, Oracle 11g/10g, MySQL, Data Lake.

Languages : PySpark, SQL, SparkSQL, Python, Pandas, PL/SQL, UNIX shell scripting, SAS.

Issue Tracking : JIRA, ServiceDesk, TRELLO, RALLY, SharePoint.

Operating Systems : Windows XP/7/8/10, UNIX (Solaris).

Others : QuickSight, SQL Developer, FileZilla, PuTTY, MS Word, Excel, Fusion Works, VNC Server, VNC Viewer, BIRT iHub Designer and Dashboard, SharePoint, Kibana, Tableau Desktop, Spyder.

Education Qualification:

Bachelor of Engineering in Electrical and Electronics Engineering, JNTU, Kakinada, Andhra Pradesh, India.

Professional Experience:

Aug 2021 – May 2023 (Capital One)

Project Name: Retail Bank – Bank Operations

Role: Data Analyst

Company: Capital One

Location: McLean

Project Description:

Retail Bank – The Data Analytics Bank Ops team supports different Capital One Retail Bank operations teams with data analysis. We develop and create reports on Bank customer data per the business requirements: we collect requirements from the branch management teams, develop reports on the Databricks and Airflow platforms using Spark, PySpark, Python, Snowflake, S3, and data lakes, and run the reports as the teams require. We support different Bank Operations teams, such as the Small Business, AML/KYC, Bankruptcy and Escheatment, and ATM Operations teams, on a daily basis.

Roles & Responsibilities:

Responsible for gathering business requirements from the Bank Ops team and creating and testing scripts on Databricks, Airflow, and Snowflake per the business needs (a scheduling sketch follows this list).

Responsible for making changes to the existing reports based on Business Requirement Documents (BRD).

Created and maintained Tableau dashboards sourced from Excel sheets and Snowflake data.

Responsible for sending quality data to the business team and fixing data problems per the intent owner's requirements.

Responsible for demonstrating reports to business owners to gather feedback and make any necessary changes.

Coordinated with external vendor team (AIS) for report generation and delivery.

Responsible for manually sending NPI reports to the team in a highly secure environment.

Reviewed analysis data to ensure it met regulatory and company standards before sending reports to the business team.

Responsible for identifying the correct source data across platforms (S3, Snowflake, Redshift, Onelake) and confirming it with the intent owner as part of new report builds.

Responsible for researching data-related issues in both source and processed data.

Responsible for creating Jira cards and resolving data-related issues in production reports.

Attended daily status/standup meetings to follow up on and resolve production issues.

Created service desk tickets with the Snowflake team for source data issues and communicated with other teams (Databricks and Airflow) through Slack channels for platform issues.

Participated in monthly business review (MBR) meetings to learn about the new requirements and the business needs.
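
A minimal sketch of how a daily report of this kind can be scheduled on Airflow, as referenced in the first responsibility above. The DAG id, schedule, and report callable are illustrative assumptions, not the production definitions.

# Minimal Airflow DAG sketch for a daily Bank Ops report. The DAG id,
# schedule, and callable body are placeholders for illustration.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def build_report(**context):
    # Placeholder: in practice this step would query Snowflake and write
    # the report output for the business team.
    print("running report for", context["ds"])

with DAG(
    dag_id="bank_ops_daily_report",      # hypothetical name
    start_date=datetime(2023, 1, 1),
    schedule_interval="0 6 * * *",       # daily at 06:00
    catchup=False,
) as dag:
    PythonOperator(task_id="build_report", python_callable=build_report)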

Jan 2018 – March 2021 (Capital One)

Project Name: Judgmental Underwriting – card and US Card Customer Management

Role: Data Analyst

Company: Capital One

Location: McLean

Project Description:

Judgmental Underwriting – Card: The Judgmental Underwriting team used various reports running on a Teradata data warehouse via shell scripting and SQL. As part of the modernization project, these reports were migrated to a Snowflake data warehouse running on Databricks Cloud using Python.

US Card Customer Management: The US Card Customer Management team has a number of reports executed on Databricks and Snowflake to monitor regulatory and compliance metrics. As part of the modernization project, the conventional data warehouse was migrated to the cloud-based Snowflake data warehouse, and older reports developed in SAS and SQL were migrated to Python and Snowflake running on Databricks Cloud. We collected business requirements from the process management/business team, developed reports on the Databricks platform using Spark, PySpark, Python, Snowflake, S3, and data lakes, and ran the reports per the business team's requirements.
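
A hedged sketch of the conversion pattern described above: a Teradata-style report query rewritten in Snowflake SQL and run from Python with the snowflake-connector-python driver. The table, columns, and credentials are invented for illustration, not the actual Capital One objects.

# Teradata-to-Snowflake conversion sketch. The original Teradata query is
# shown in comments; the rewrite is executed against Snowflake.
import snowflake.connector

# Teradata original (for reference):
#   SEL account_id,
#       CAST(open_dt AS DATE FORMAT 'YYYY-MM-DD') AS open_date
#   FROM uw.applications
#   QUALIFY ROW_NUMBER() OVER (PARTITION BY account_id ORDER BY open_dt DESC) = 1;
#
# Rewrite: SEL becomes SELECT, the FORMAT cast becomes TO_DATE; QUALIFY is
# supported by Snowflake as-is.
SNOWFLAKE_SQL = """
SELECT account_id,
       TO_DATE(open_dt, 'YYYY-MM-DD') AS open_date
FROM uw.applications
QUALIFY ROW_NUMBER() OVER (PARTITION BY account_id ORDER BY open_dt DESC) = 1
"""

conn = snowflake.connector.connect(
    account="example_account",  # placeholder credentials
    user="svc_user",
    password="***",
    warehouse="REPORT_WH",
    database="ANALYTICS",
)
try:
    for row in conn.cursor().execute(SNOWFLAKE_SQL):
        print(row)
finally:
    conn.close()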

Roles & Responsibilities:

Converted Teradata scripts to Snowflake using Python and scheduled them on Databricks Cloud.

Converted Partnership FAV-Chicago reports from SAS/Teradata scripts to Python/Snowflake on Databricks Cloud.

Responsible for gathering business requirements and creating and testing scripts on Databricks and Snowflake per the business needs.

Responsible for sending daily/monthly reports to the Triage team for analysis.

Responsible for tokenization/detokenization using the Turing tool.

Responsible for monitoring daily reports and detecting anomalies.

Responsible for making changes in the existing reports based on Business Requirement Documents (BRD).

Responsible for sending quality data to the business team and fixing data problems per the intent owner's requirements.

Responsible for researching data-related issues in both source and processed data.

Responsible for identifying the correct source data across platforms (S3, Snowflake, Redshift, Whirl, Onelake) and confirming it with the intent owner as part of new report builds.

Attended daily status/standup meetings to follow up on and resolve production issues.

Created service desk tickets with the Snowflake team for source data issues and communicated with other teams through Slack channels for platform issues.

Provided a central point for questions and responses between technical resources and the business area.

Coordinated with the AWS and Databricks teams during system upgrades and maintenance, and was actively involved in bringing the system back up after maintenance.

Reviewed analysis data to ensure it met regulatory and company standards before sending reports to the business team.

Responsible for creating daily reports and analyzing them to find potential issues with data processing.

Responsible for demonstrating reports to business owners to gather feedback and make any necessary changes.

Responsible for running monetary adjustments and refunds for credit card customers using Fractal.

Jan 2017 – July 2017 (Openet Telecom)

Project Name: CABS for Time Warner Cable

Role: Data Analyst

Company: Openet Telecom

Location: Reston

Project Description:

The Carrier Access Billing System (CABS) processes usage data in a way that allows changes to be isolated up front when new usage sources are identified or existing usage sources change. The CABS system collects TWC's carrier network usage records from various switches, correlates them, and enriches them with reference data sources such as the LERG routing guide. Enriched records are written to files and delivered to downstream systems such as billing systems and the data warehouse.
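
As a rough illustration of the enrichment step described above, the following pandas sketch joins switch usage records to LERG-style reference data. All column names and sample values are invented for illustration, not the actual CABS schema.

# Illustrative sketch of the CABS enrichment step: usage records from the
# switches joined to LERG-style routing reference data.
import pandas as pd

usage = pd.DataFrame({
    "record_id": [1, 2],
    "from_npa_nxx": ["703555", "512555"],
    "minutes": [4.2, 11.0],
})
lerg = pd.DataFrame({
    "npa_nxx": ["703555", "512555"],
    "operating_company": ["OCN-A", "OCN-B"],
    "rate_center": ["WASHINGTON", "AUSTIN"],
})

# Enrich each usage record with its routing/ownership reference data; a
# left join keeps unmatched records so they fall out for investigation.
enriched = usage.merge(lerg, left_on="from_npa_nxx", right_on="npa_nxx", how="left")
print(enriched)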

Roles & Responsibilities:

Created and automated statistical analysis reports.

Analyzed the quality of output data stored in Oracle tables.

Created trend analysis reports on enriched data using SQL.

Detected anomalies and identified enrichment issues with reference to the source data.

Monitored daily jobs and automated manual processes.

Coordinated production implementations between the development and operations teams.

Responsible for day-to-day operations tasks and database cleanup/housekeeping tasks.

Actively involved in performance tuning to meet client timelines for data delivery.

Involved in gathering information on various scenarios for Change Data Capture.

Managed daily calls covering work status as well as weekly team status calls.

Involved in the installation, configuration changes, and upgrades of the production system.

Documented daily operational steps and configuration changes.

Researched data-related issues in both source and processed data.

Communicated with developers and other team members to fix problems and enhance the application.

Created service desk tickets with the UNIX team when the server was unresponsive and helped restart the entire system.

Worked with SQL stored procedures and tables.

Reviewed analysis data to ensure it met regulatory and company standards.

Apr 2013 – Jan 2017 (Openet Telecom)

Project Name: Audience Measurement for Time Warner Cable

Role: Data Analyst and Operations Support

Company: Openet Telecom

Location: Reston

Project Description:

The audience measurement application collects TWC customers' channel tuning activity from millions of active set-top boxes (STBs), correlates it with other reference data sources, and generates enriched channel tuning events that are then fed into the data warehousing system. Enriched tuning events are used to measure the ratings of cable networks.

The collection process gathers daily usage and reference data files from the source systems and ingests them into the database. The mediation process runs across all streams and correlates the channel tuning events to create complete viewing impressions. The correlated viewing impressions are then enriched with reference data such as subscriber data and channel lineups; this involves a series of lookups controlled by database scheduled jobs and procedures. Finally, the processed data is delivered downstream daily.

Roles & Responsibilities:

Responsible for creating daily reports and analyzing the reports to find out potential issues with data processing.

Responsible for running trend analytics, such as week-over-week viewing activity, to ensure data quality and identify issues with the enrichment of raw data (a sketch follows this list).

Responsible for researching issues in reference source data (subscriber data and channel lineups) and identifying anomalies.

Responsible for the daily validation process and issue analysis by running SQL queries and generating ad hoc reports.

Coordinating with various source teams regarding any data related issues in production.

Extensively used SQL queries to verify the processed data in tables and was involved in generating the various validation reports.

Used advanced Excel functions to generate spreadsheets and pivot tables.

Used macros and basic VBA to create reports in Excel.

Analyzed different types of fallout data scenarios and corrected them by reprocessing source files.

Hands-on experience with JIRA and TRELLO for tracking defects and product issue requests.

Provided a central point for questions and responses between technical resources and the business area.

Responsible for day-to-day operations tasks and database cleanup/housekeeping tasks.

Coordinated production implementations between the development and operations teams.

Responsible for working with the development team to review the data validation reports and devise a strategy to fix processing issues.

Analyzed queries for long-running jobs and collected statistics for the tables involved.

Mentored the offshore team in resolving production issues.

Involved in the installation, configuration changes, and upgrades of the production system.

Attended daily status/standup meetings to follow up on and resolve outstanding group issues.

Provided support to all external teams that depend on the application's data, and communicated with other teams when the application depended on their systems.

Analyzed the Business Requirement Documents (BRD) and laid out the steps for data extraction, business logic implementation, and loading into targets.

Participated in bi-weekly meetings with the source team to discuss source-related issues and to re-collect missing source data.

Created service desk tickets with the UNIX team when the server was unresponsive and helped restart the entire system.

Coordinated with other teams and team members to follow up on and close production-related issues in the Rally tracker.

Supported weekend system upgrades: provided technical support and knowledge of system availability times; when backouts were needed, set up conference calls to engage technical resources and coordinate an action plan.

Assisted in streamlining and automating processes to reduce the daily workload and make it more efficient.

Boosted efficiency by scheduling jobs for automated tasks and for notification of failures and alerts.

Directly responsible for ensuring that production order processing between multiple systems occurred without issue, and resolved any issues that did come up.

Followed the daily workload process, monitored timely starts of downstream feeds, and investigated and expedited any delays.

Worked with the offshore team to continue processing during offshore hours.
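
A minimal sketch of the week-over-week trend check mentioned in this list: compare each day's enriched event volume with the same weekday one week earlier and flag large drops. The 20% threshold, column names, and sample data are assumptions for illustration.

# Week-over-week trend check on daily enriched event volumes.
import pandas as pd

daily = pd.DataFrame({
    "date": pd.date_range("2016-05-01", periods=14, freq="D"),
    "enriched_events": [600, 610, 605, 615, 620, 590, 580,
                        598, 604, 310, 612, 618, 588, 577],  # in millions
})

# Compare each day with the same weekday one week earlier.
daily["prior_week"] = daily["enriched_events"].shift(7)
daily["wow_change"] = (daily["enriched_events"] - daily["prior_week"]) / daily["prior_week"]

# Flag any day whose volume dropped more than 20% week over week.
anomalies = daily[daily["wow_change"] < -0.20]
print(anomalies[["date", "enriched_events", "prior_week", "wow_change"]])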

Nov 2012 – Apr 2013 (Openet Telecom)

Project Name: Digital Phone Time Warner Cable

Role: Data Analyst and Operation Support

Company: Openet Telecom

Location: Reston

Project Description:

Digital Phone is the mediation and rating application for Time Warner Cable's wireline phone service. The application receives customers' call records from various network switches and receives reference data from the Enterprise Data Warehouse. It correlates multiple streams of data, enriches the call records with customer information, and rates the calls according to their rate plans.
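
As a toy illustration of the rating step described above, the sketch below prices a call by rounding its duration up to whole minutes and applying a per-minute rate from the customer's plan. The plan names and rates are invented, not TWC's actual rate plans.

# Toy sketch of the rating step: match a call to the customer's rate plan
# and price it per minute. Plans and rates are invented for illustration.
import math

RATE_PLANS = {"basic": 0.05, "unlimited_local": 0.00}  # hypothetical $/minute

def rate_call(duration_seconds: int, plan: str) -> float:
    """Price a call: round duration up to whole minutes, apply plan rate."""
    minutes = math.ceil(duration_seconds / 60)
    return round(minutes * RATE_PLANS[plan], 2)

print(rate_call(125, "basic"))            # 3 minutes at 0.05 -> 0.15
print(rate_call(125, "unlimited_local"))  # 0.0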

Responsibilities:

Monitored daily jobs and file processing components on the system.

Actively involved in performance tuning to meet client timelines for data delivery.

Automated daily repetitive processing steps for different reference files.

Involved in gathering information on various scenarios for Change Data Capture.

Managed daily work-status calls and weekly team status calls with the client.

Generated ad hoc processing reports using SQL stored procedures and tables.

Assisted in streamlining and automating processes to reduce the daily workload and make it more efficient.

Participated in daily stand-up calls and provided progress updates.

Created Excel reports using statistical information for the process.


