Pricing Analyst

Location:
Covington, GA
Posted:
April 16, 2024

NAME: Suneetha G

Email: ad41n0@r.postjobfree.com PH: 470-***-****

Sr. Data Analyst

PROFESSIONAL SUMMARY:

I have over 10 years of experience in data analysis, software development, and design using Python, and in data analytics using Python/R/SQL/Power BI/Tableau.

I have expertise in Tableau, Tableau Desktop, Tableau Server, Tableau Online, Tableau Mobile, Tableau Public, Tableau Reader, SQL, PL/SQL, and Unix Shell Scripting.

Designed several business dashboards using Power BI, Tableau Desktop, and Tableau Server and hosted them on client servers.

Explored data in a variety of ways and across multiple visualizations using Power BI.

Responsible for creating SQL datasets for Power BI and Ad-hoc reports.

Expert in creating multiple kinds of Power BI reports, such as tabular reports.

Robust MS Excel skills with Power Pivot, Pivot table, Pivot Charts, Power View, Power BI, and VLOOKUP functions and formulas.

Strong exposure to writing simple and complex SQL, PL/SQL queries.

Experience with relational and non-relational databases such as MySQL and SQLite.

Expertise in requirements gathering, system analysis, application design, development, testing, configuration management, and client/user support services.

Experience with industry-leading digital analytics platforms such as Google Analytics.

Worked on several interactive, real-time dashboards using drill-downs and customizations with filtering, combining multiple data sources and blending the data to achieve the desired result.

Worked on data warehouses with expertise in all phases of the data warehousing lifecycle along with implementation and testing of software products.

Expertise in Relational Database Management Systems, Tableau, AWS cloud computing and Microsoft Azure Cloud.

Used SSRS to create basic reports containing tables and graphs and more complex data visualizations, using charts, maps, and sparklines.

Performed maintenance activities like cleaning logs and archiving log files at prescribed intervals.

Used Power BI and Tableau's flexibility to build a tabbed application with a wide range of options like quick filters, calculated fields, dynamic interaction, and flexible ad-hoc capability in various functional areas like finance, sales, marketing, and revenue management.

TECHNICAL SKILLS:

Databases: Teradata, MS SQL Server, DB2, AWS RDS, Oracle, MySQL
Programming Languages: C, C++, Visual Basic, SQL, Python, R
Cloud Technologies: AWS, Azure, EC2, S3, Docker
Scripting Languages: MS-DOS, Bash, Korn
Data Modeling: Sybase PowerDesigner / IBM Data Architect
BI & Reporting Tools: Business Objects, Cognos, SSRS, Crystal Reports
Visualization Tools: Tableau Desktop, Power BI, Python
ETL Tools / Tracking Tool: Informatica, SSIS, SSAS, SSRS / JIRA
Database Development: T-SQL and SA, Microsoft Hyper-V servers
Testing and Defect Tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), MS Visio & Visual SourceSafe
Operating Systems: Windows, UNIX, Sun Solaris

PROFESSIONAL EXPERIENCE:

Client: Epsilon, Irving, TX Dec 2023 to Present

Role - Sr. Data Analyst

Responsibilities:

Working with the business team to understand the customer data and gather the requirements to develop and analyze required systems.

Worked on requirement analysis, data analysis, and gap analysis of various source systems and data coming from multiple systems. Responsible for BI data quality.

Conducted JAD sessions to allow different stakeholders, such as editorial and design teams, to communicate their requirements.

Performed Business Process mapping for new requirements.

Performed Data Validation with Data profiling.

Created different charts such as Heatmaps, Bar charts, Line charts.

Experience in developing Spark applications using SQL in Databricks for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
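
For illustration only, a minimal Spark SQL sketch of this kind of usage aggregation in a Databricks notebook; the table and column names are hypothetical, not taken from the project:

    -- Hypothetical ingested events table; names are illustrative only
    CREATE OR REPLACE TEMP VIEW daily_usage AS
    SELECT customer_id,
           CAST(event_ts AS DATE) AS usage_date,
           COUNT(*)               AS events
    FROM   raw_events
    GROUP  BY customer_id, CAST(event_ts AS DATE);

    -- Daily active customers derived from the aggregated view
    SELECT usage_date, COUNT(DISTINCT customer_id) AS active_customers
    FROM   daily_usage
    GROUP  BY usage_date
    ORDER  BY usage_date;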

Experience with Unified Data Analytics on Databricks, the Databricks Workspace user interface, managing Databricks notebooks, and Delta Lake with Python.

Used SQL, PL/SQL to validate the Data going into the Data Warehouse.
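
A minimal sketch of this kind of validation check, reconciling row counts between a staging source and the warehouse target; the table names are hypothetical:

    -- Hypothetical staging and warehouse tables; any mismatch flags a load issue
    SELECT s.load_date,
           s.src_rows,
           t.tgt_rows,
           s.src_rows - t.tgt_rows AS row_diff
    FROM  (SELECT load_date, COUNT(*) AS src_rows
           FROM   stg_orders GROUP BY load_date) s
    JOIN  (SELECT load_date, COUNT(*) AS tgt_rows
           FROM   dw_orders  GROUP BY load_date) t
      ON   s.load_date = t.load_date
    WHERE  s.src_rows <> t.tgt_rows;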

Designed and developed ETL workflows and datasets in Informatica.

Used Informatica for Data preparation and then tableau for visualization and reporting.

Processed data in Informatica and created TDEs (Tableau Data Extracts) for Tableau reporting.

Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)

Verified and maintained Data Quality, Integrity, data completeness, ETL rules, business logic.

Verified Informatica sessions, worklets, and workflows in QA repository.

Extensively used SQL queries to check storage and accuracy of data in database tables.

Used SQL for Querying the Oracle database.

Performed all aspects of verification, validation including functional, structural, regression, load, and system testing.

Involved in backend testing for the front end of the application using SQL Queries in Teradata database.

Tested detail, summary, and on-demand reports using Report Studio.

Experienced in working with DB2, Teradata.

Client: Vertex Pharmaceuticals, Bangalore, India May 2016 to August 2021

Role - Data Analyst/Tableau Developer

Responsibilities:

Collaborated with team members, product owner, and technical manager to gather requirements, and clarify requests with various meetings utilizing agile methodology.

Implemented the data warehouse using sequential files from various source systems.

Developed Mapping for Data Warehouse and Data Mart objects.

Design and Develop ETL jobs using the DataStage tool to load data warehouse and Data Mart.

Developed Unix Shell Scripts to automate the Data Load processes to the target Data warehouse.

Responsible for interacting with stakeholders, gathering requirements, and managing the delivery. Proficient in using Tableau version 2019.2 and Snowflake, writing SQL queries to extract data from Snowflake for data analysis and dashboard development.

Used Azure Blob Storage as a scalable, highly available object storage solution for big data.

Used Power BI on Azure as a cloud-based business intelligence and data visualization service for analyzing and visualizing data.

Developed Tableau data visualizations using heat maps, scatter plots, geographic maps, pie and bar charts, symbol maps, horizontal bars, histograms, packed bubbles, and candlestick charts.

Wrote procedures, functions, and SQL logical statements for data analysis.
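
For illustration only, a small PL/SQL function of the kind written for such analysis; the name and logic are hypothetical assumptions, not taken from the project:

    -- Hypothetical helper function; name and logic are illustrative only
    CREATE OR REPLACE FUNCTION fn_pct_change (
        p_current IN NUMBER,
        p_prior   IN NUMBER
    ) RETURN NUMBER IS
    BEGIN
        -- Avoid division by zero or missing prior-period values
        IF p_prior IS NULL OR p_prior = 0 THEN
            RETURN NULL;
        END IF;
        RETURN ROUND((p_current - p_prior) / p_prior * 100, 2);
    END fn_pct_change;
    /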

Created complex calculations in Tableau reports including string manipulation, basic arithmetic calculations, custom aggregations and ratios, date math, logic statements, and quick table calculations.

Extensive Data Warehousing experience using Informatica as an ETL tool on various databases like Oracle, SQL Server, Teradata, and MS Access.

Worked on Azure Active Directory.

Responsible for migrating legacy applications to AWS & Azure clouds and migrating to SaaS solutions.

Prepared dashboards using calculations and parameters in Tableau. Designed and implemented near-real-time monitoring dashboards for Tableau Visualization users. Tested the data connections that brought the data from the database to the Tableau dashboards to develop the reports.

Designed and developed ETL workflows and datasets in Alteryx to be used by the BI reporting tool. Used Alteryx to perform data blending, created workflows to process data, and integrated with Tableau.

Used various ETL tools to extract data into Tableau Desktop and performed data wrangling using Alteryx in the data preparation phase. Performed end-to-end data validation.

Product owner for the revenue data rate/volume dashboard and the automated Python script for Price Bucket review.

Developed complex visualization reports using data from relational databases in Tableau Desktop.

Prepared technical specifications and documentation for Alteryx workflows supporting BI reports.

Report generation and automation using Visual Basic macros (Snowflake ODBC) and the Snowflake Python Connector. Automated revenue data refreshes and business reporting with Python scripts. Built complex SQL queries against Snowflake and Oracle enterprise data warehouses for use as data sources for Tableau and Excel reporting. Worked with business stakeholders to create engaging and informative business reports.

Client: ITC Infotech, Bangalore, India July 2013 to May 2016

Role - SQL Developer/ETL Developer

Responsibilities:

Worked with the development and testing teams about the ETL processes involved in the project.

Worked on requirement analysis, data analysis, and gap analysis of various source systems and data coming from multiple systems. Responsible for BI data quality.

Extracted data using SSIS from DB2, XML, Oracle, flat files, and Excel; performed transformations and populated the data warehouse.

Wrote Teradata SQL queries and created tables and views following Teradata best practices.
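
As an illustrative sketch only, the kind of Teradata DDL this involves, with an explicit primary index for even data distribution; the table, column, and view names are hypothetical:

    -- Hypothetical fact table with an explicit primary index
    CREATE MULTISET TABLE sales_fact
    (
      sale_id   INTEGER NOT NULL,
      store_id  INTEGER,
      sale_date DATE,
      amount    DECIMAL(12,2)
    )
    PRIMARY INDEX (sale_id);

    -- Reporting view built on the fact table
    CREATE VIEW v_sales_by_store AS
    SELECT store_id, sale_date, SUM(amount) AS total_amount
    FROM   sales_fact
    GROUP  BY store_id, sale_date;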

Performed Business Process mapping for new requirements.

Used SQL and PL/SQL to validate the data going into the data warehouse.

Created complex data analysis queries to troubleshoot issues reported by users.

Evaluated data mining request requirements and helped develop the queries for those requests.

Worked with Business Analysts and developers while reviewing the Business Requirement Documents and when there are enhancements in the applications.

Defined data requirements and elements used in XML transactions.

Developed Test Cases for Deployment Verification, ETL Data Validation, Cube Testing and Report testing.

Tested the ETL process for both before data validation and after data validation process. Tested the messages published by ETL tool and data loaded into various databases.

Independently designed and developed project document templates based on SDLC methodology.

Prepared entry and exit criteria for testing cycles and reviewed them with functional leads and the PMO.

Exported Requirements, test plans and test cases to QC 9.2

Created Traceability Matrix to ensure that all requirements are covered in test cases.

Prepared Kick-off meeting PowerPoint and gave a presentation on testing activities for each cycle of testing and for different releases.

Wrote complex SQL and PL/SQL testing scripts for backend testing of the data warehouse application. Expert in writing complex SQL/PL-SQL scripts for querying Teradata and Oracle.

Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting Test cases, Test Scripts in Quality Center 9.2

Created defects, managed defect reports and defect status in ClearQuest tool.

Extensively assessed Business Objects reports by running SQL queries on the database and reviewing the report requirement documentation.

Client: Strata Greeks Software India Pvt. Limited Aug 2011 to June 2013

Role – SQL Developer

Responsibilities:

Developed and managed SQL databases by planning, developing, and maintaining the databases.

Used structured query language (SQL) to create and modify database tables using CRUD SQL commands.

Involved in Analysis, Design, Development and Testing phases.

Created technical specifications, test plans and test data to support ETL data flows.

Created complex stored procedures, triggers, cursors, tables, views, and other database objects using T-SQL.
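
For illustration only, a minimal T-SQL stored procedure of the kind this covers; the object and column names are hypothetical, not taken from the project:

    -- Hypothetical procedure returning a customer's open orders
    CREATE PROCEDURE dbo.usp_GetOpenOrders
        @CustomerId INT
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT o.OrderId, o.OrderDate, o.TotalAmount
        FROM   dbo.Orders AS o
        WHERE  o.CustomerId = @CustomerId
          AND  o.Status = 'OPEN'
        ORDER  BY o.OrderDate DESC;
    END;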

Created Reports using stored procedures as per customer requests.

Building databases and validating their stability and efficiency.

Creating program views, functions, and stored procedures.

Writing optimized SQL queries for integration with other applications.

Used Toad Client for accessing Oracle databases that contained views of the domestic application.

Designing database tables and dictionaries.

Responsible for performing T-SQL tuning, multi-tasking, and optimizing queries for reports that took a long time to execute on SQL Server 2005/2008.

Created Complex ETL Packages using SSIS to extract data from staging tables to partitioned tables with incremental load.
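
A minimal sketch of the incremental-load pattern such a package can execute, using a T-SQL MERGE from staging into a partitioned target; the table and column names are hypothetical:

    -- Hypothetical incremental load: update changed rows, insert new ones
    MERGE dbo.FactSales AS tgt
    USING dbo.StgSales  AS src
       ON tgt.SaleId = src.SaleId
    WHEN MATCHED AND src.UpdatedAt > tgt.UpdatedAt THEN
        UPDATE SET tgt.Amount    = src.Amount,
                   tgt.UpdatedAt = src.UpdatedAt
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (SaleId, Amount, UpdatedAt)
        VALUES (src.SaleId, src.Amount, src.UpdatedAt);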

Created and gathered requirements for SSRS reports and Spotfire reports.

Design and implementation of Security models, policies, and procedures.

Query Optimization using Query Analyzer, Profiler, Show plan, Index Tuning and Stats Time Tool.

Configured transactional replication between the production server and the reporting server to move data to servers in various locations.

Configured a mail profile for sending automatic emails to the respective people when a job fails or succeeds.

Developing scripts, procedures, and triggers for application development.

Involved in designing of reports using SSRS.

Maintaining data quality and backups and overseeing database security.

Certifications:

SQL – MySQL for Data Analytics and Business Intelligence

Google Data Analytics

Visa Status:

H1-B Visa valid from Dec 2023 until 2026.


