
Data Analyst

Location:
Richmond, VA, 23221
Salary:
$55/hr
Posted:
December 11, 2017

Contact this candidate

Resume:

Pavani (Visa: GC)

**********@*****.***

Data Analyst (AWS Certified – Developer Associate) 804-***-****

Profile: analyst experience, proficient worker, good time management, team building, service oriented, strong planner, talented multitasker, excellent communicator, determined executor

PROFESSIONAL SUMMARY:

Over five years of experience as a Data Analyst in the financial and logistics domains.

Comprehensive knowledge of the Software Development Life Cycle (SDLC), with a thorough understanding of its phases: requirements, analysis/design, development, and testing.

Working knowledge of Agile, Scrum, Waterfall, and Kanban methodologies.

Experience with Tableau 8.1.6 and 8.2 as an analyst in an enterprise environment.

Knowledge of Tableau Desktop, Tableau Reader, and Tableau Server.

Performed in-depth analysis and prepared reports using MS Excel, MS Access, SQL, PL/SQL, Teradata, and BTEQ.

Good knowledge of Teradata utilities (BTEQ, MultiLoad, FastLoad, TPump) and Oracle (SQL, PL/SQL).

Good experience producing ad-hoc reports using Teradata, Oracle, SQL Server, BTEQ, and UNIX.

Developed weekly, biweekly, and monthly reports using MS Excel, MS Access, SQL, PL/SQL, and UNIX.

Experience in shell scripting, UNIX automation, and task scheduling.

Experience using the import facility to extract data from flat files (CSV, tab-delimited, fixed-width) and load them into Teradata tables for business analysis.
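As a sketch of the kind of flat-file load described above, the script below validates a CSV and feeds it to BTEQ's .IMPORT facility. All names here (the tdprod logon, sandbox.cust_stage table, and column list) are illustrative placeholders, not details from this resume.

```shell
#!/bin/sh
# Sketch of a CSV-to-Teradata load via BTEQ .IMPORT.
# Logon, table, and column names are hypothetical.

# validate_csv FILE NFIELDS: print the number of rows whose
# comma-separated field count differs from NFIELDS.
validate_csv() {
    awk -F',' -v n="$2" 'NF != n { bad++ } END { print bad + 0 }' "$1"
}

# load_csv FILE: feed a BTEQ import script to the Teradata client
# (assumes the bteq utility is on PATH and TD_PWD is exported).
load_csv() {
    bteq <<EOF
.LOGON tdprod/analyst_user,${TD_PWD};
.IMPORT VARTEXT ',' FILE = $1;
.REPEAT *
USING (cust_id VARCHAR(10), txn_dt VARCHAR(10), amt VARCHAR(20))
INSERT INTO sandbox.cust_stage VALUES (:cust_id, :txn_dt, :amt);
.QUIT;
EOF
}
```

A wrapper would typically call validate_csv first and abort the load if any malformed rows are found.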

Good experience with the IBM Unica tool.

Experience in reporting, validation, and documentation.

Extensively helped users with their Teradata queries in SQL Assistant to extract required information, enabling better-informed decisions.

Worked as a liaison between Business users (stakeholders) and Technical team.

Experience scheduling UNIX jobs using crontab and the sleep command.
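For illustration, here is the shape of a crontab entry for that kind of scheduled job; the script path, log path, and schedule are hypothetical.

```shell
# crontab field order: minute hour day-of-month month day-of-week command.
# The entry below runs a weekly extract every Monday at 06:30 and appends
# stdout/stderr to a log; it would be installed with `crontab -e`:
#
#   30 6 * * 1  /home/analyst/bin/weekly_extract.sh >> /home/analyst/logs/weekly.log 2>&1
#
# For a one-off delayed run (e.g. waiting 10 minutes for an upstream feed),
# sleep can stand in for a scheduler:
#
#   sleep 600 && /home/analyst/bin/weekly_extract.sh
```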

Experience building VBA macros to automate recurring data-pull jobs for specific requirements.

Experience in data warehousing, ETL (extraction, transformation, loading), data analysis, data migration, and data preparation.

Interacted with clients to gather specifications for marketing campaigns.

Experience writing and implementing packages, triggers, stored procedures, functions, views, and indexes at the database and form levels.

Knowledgeable in loading data files in an AWS environment.

Experience running scripts in SAS Enterprise Guide and generating reports using pivot tables.

Experience writing Base SAS programs to convert Teradata tables into flat files (CSV, fixed-width, etc.).

Experience interacting with clients, gathering specifications for marketing campaigns, and mapping the requirements in the Unica Affinium Campaign tool.

Sound knowledge of the change management process.

Excellent communication, interpersonal, analytical skills and strong ability to perform as part of a team.

WORK AUTHORIZATION STATUS: EAD

TECHNICAL SKILLS:

Methodologies

Waterfall, Agile Methodology, Scrum

Tracking and Reporting Tools

Tableau, Business Objects, SAS

Languages

SQL, UNIX shell, VBA, MySQL

Databases

Teradata, Oracle 10g/9i, SQL Server 2000, MS Access, Amazon Redshift

GUI Tools

Affinium Campaign (V7.2), MS Office

Database Tools

SQL Server Enterprise Manager, Query Analyzer, SQL Server Management Studio, SQL Server Integration Services (SSIS), SQL Developer, TOAD, Teradata SQL Assistant, Informatica

Platforms

UNIX (HP-UX, Solaris 8.0), Windows

IBM Tools

Unica

Education:

Bachelor's in Computer Science

BUSINESS SKILLS:

Business Requirements and Functional Requirements gathering, mapping customer needs

Use Case and Business Process Modeling

Impact, Feasibility and Risk Analysis

Report generation

DATA ANALYSIS SKILLS:

Data Requirements Gathering

Data Processing and Cleaning

Data Analysis and Data Modeling

Data Mapping and Reporting

Capital One, Richmond, VA

Role: Data Analyst Oct 2015 – Present

Roles and Responsibilities:

Responsible for the design, development and administration of transactional and analytical data constructs/structures.

Expertise in data quality, data organization, metadata, and data profiling. Worked with very large data sets (hundreds of millions of records). Demonstrated ability to move from one assignment to the next as business needs and priorities change quickly.

Responsible for gathering requirements from Business Analysts and Operational Analysts and identifying the data sources required for the requests.

Worked with business analysts, senior project managers, and developers to gather business requirements and specifications.

Involved in merge project which requires merging of duplicate data in various systems.

Involved in a commercial banking process.

Highly experienced with the SRM and WCIS applications.

Used the SRM application to merge duplicate records.

Worked on data profiling, data analysis and validating the reports sent to third party.

Experienced in two-stage segmentation: deriving the base population, then applying general inclusion and exclusion criteria for strategy analysis and implementation.

Migrated code from the PROD environment to the PREP environment for production-code enhancements and resolved various SQL coding issues.

Designed the complete campaign intent as a flowchart using the Affinium tool.

Tested results against pre-approved criteria for each individual box/segment in the flowchart.

Submitted randomization samples for business approval and, where necessary, updated sampling quantities to keep the desired sample within the business range for all variables/parameters.

Submitted all reports necessary for Validation, Business and Process approvals.

Performed various data extracts from Teradata One View Data warehouse using SQL Assistant.

Performed data validations before submitting the change order to deploy the code to prod.

Submitted all data-validation forms for business approval, confirming that the campaign-eligible population and final segments conform to business requirements, with additional metrics for the business team where necessary.

Performed data testing to confirm business and data requirements.

Performed all data-confirmation validations with the vendor and in-house processors.

Extracted data from existing data sources and performed ad-hoc queries using Teradata SQL, SAS, MS Access, and UNIX.

Coded in Teradata BTEQ SQL; wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment.
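A minimal sketch of such a validate-then-execute wrapper is below; the logon string, the use of .RUN FILE, and the check_sql guard are all illustrative assumptions, not the resume's actual scripts.

```shell
#!/bin/sh
# Sketch of a UNIX wrapper that validates a SQL file before handing
# it to BTEQ. File names and the logon string are hypothetical.

# check_sql FILE: refuse scripts that contain top-level DELETE/DROP
# statements; print "ok" or "blocked".
check_sql() {
    if grep -Eiq '^[[:space:]]*(DELETE|DROP)[[:space:]]' "$1"; then
        echo "blocked"
    else
        echo "ok"
    fi
}

# run_sql FILE: execute the validated script through the bteq client,
# using .RUN FILE to read the SQL from disk.
run_sql() {
    [ "$(check_sql "$1")" = "ok" ] || { echo "refusing $1" >&2; return 1; }
    bteq <<EOF
.LOGON tdprod/analyst_user,${TD_PWD};
.RUN FILE = $1;
.QUIT;
EOF
}
```

In practice the validation step might also normalize whitespace or substitute environment-specific database names before execution.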

Monitored the performance of existing code and modified it for better performance.

Extracted data from various production databases and exported it to MS Excel using Teradata utilities, SAS, SQL, and UNIX.

Developed Teradata macros and views to integrate complex business rules.

Optimized SQL statements for better performance using the EXPLAIN plan method.

Assisted in the analysis, design, coding and development of building a new schema (staging tables, views, SQL files).

Prepared data for transfer to the internal lead-distribution system or a third-party vendor for delivery via an ETL process.

Created pivot tables, pivot charts, and reports in different formats using advanced features in MS Excel.

Performed in-depth quantitative analysis or data analysis.

Developed large size flow charts to segment population with added inclusions and exclusions for balance build events.

Used Tableau to generate reports based on business requirements.

Imported and exported large volumes of data between flat files and Teradata.

Communicated with business users and analysts on business requirements. Gathered and documented technical and business metadata about the data.

Responsible for data verification and validation to ensure that generated data meets requirements and is appropriate and consistent.

Environment: Teradata V15, Teradata SQL Assistant, Tableau, Oracle Siebel, Unica Affinium Campaign, UNIX, SQL, Microsoft Office Tools, AWS

Connexions Loyalty, Richmond, VA

Role: Data Analyst Jan 2014 – Sep 2015

Roles and Responsibilities:

Responsible for gathering requirements from Business Analysts and Operational Analysts and identifying the data sources required for the requests.

Designed and developed various SQL scripts as part of automation of manual monthly reporting to third party consumer.

Modified, maintained and developed the SQL codes to develop new enhancements.

Created, modified and reviewed Procedures, Packages, Functions and cursors to suit business requirements.

Created, maintained, and compared objects in multiple schemas for developers.

Troubleshot performance issues and bugs within PL/SQL packages.

Worked on creating stored procedures, functions, triggers and packages.

Worked on performance tuning using the execution plan, EXPLAIN PLAN, and Oracle hints.

Worked on materialized views and table partitioning for better performance.

Tuning of SQL Queries, Procedures, Functions and Packages using EXPLAIN PLAN and TKPROF.

Used Bulk Collect feature to improve performance.

Analyzed and computed statistics to improve response time for queries.

Involved in Performance tuning of complex queries.

Used Informatica to load large flat files in the database.

Involved in meetings with end users to help establish use cases and design business process automation solutions.

Performed changes to the interfaces on a weekly basis as requested by end users.

Shared domain knowledge within and across projects.

Performed troubleshooting within and across projects.

Worked on data profiling, data analysis and validating the reports sent to third party.

Designed and developed Ad-hoc reports as per business analyst, operation analyst, and project manager data requests.

Extracted, analyzed, and presented data in graphical formats that helped the project manager make business decisions.

Communicated with business users and analysts on business requirements. Gathered and documented technical and business metadata about the data.

Worked on data verification and validation to ensure that generated data meets requirements and is appropriate and consistent.

Environment: Oracle 11g R2/10g/9i, PL/SQL, TOAD 9, SQL*Plus, Solaris, Informatica 9.6, Microsoft-Office

New Breed Logistics, Greensboro, NC

Role: Data Analyst May 2012 – Dec 2013

Roles and Responsibilities:

Verified and imported flat files into Teradata tables using import facility to perform data analysis.

Developed UNIX BTEQ scripts for automation.

Performed data validation, analysis, and reporting on a daily basis.

Good exposure to mainframe systems and experience handling COBOL files.

Extracted data from mainframe flat files using SAS, created SAS data sets, and modified data using Base SAS.

Performed User Acceptance Testing (UAT) to verify that the data is provided according to the customer’s request.

Built VBA macros to automate recurring data-pull jobs for specific requirements.

Extracted data from existing data stores and developed and executed departmental reports.

Created complex SQL queries to retrieve the data requested using Teradata SQL Assistant.

Daily utilized SQL features including aggregates, set operators, CASE expressions, string functions, subqueries, inner and outer joins, the system calendar, OLAP CSUMs, and ranking functions.

Worked on Email campaigns, inbound, outbound, Direct Mail and In Statement Letters using Unica Affinium Campaign.

Interacted with clients, gathered specifications for marketing campaigns, and mapped the requirements in the Unica Affinium Campaign tool.

Created Macros, Triggers, Stored Procedures, and Views in Teradata SQL Assistant.

Converted SQL scripts into SAS PROC SQL scripts.

Automated Windows SAS Scripts on UNIX SAS platform.

Created volatile tables, global temporary tables, multiset tables, derived tables, and views daily as needed to retrieve requested data.

Used MS Excel to generate reports and pivot tables with VLOOKUP, HLOOKUP, INDEX, MATCH, and more.

Utilized MS PowerPoint to present the reports generated.

Environment: Teradata V13, Teradata SQL, PuTTY, Teradata SQL Assistant, TOAD, vi, EditPlus, BTEQ, SAS, Excel VBA, UNIX, Korn shell scripting, MS Excel, PowerPoint, Access

