Posted: February 01, 2018

Pranay T

Phone: 937-***-****
E-mail: ac4anf@r.postjobfree.com

SUMMARY

Over 6 years of experience in analysis, report design, dashboard development, testing, and implementation of business applications and client/server applications.

Strong expertise in using Tableau software for BI data analytics, reporting, and dashboard projects.

Strategic expertise in design of experiments, data collection, analysis, and visualization.

Troubleshot application errors, addressed and managed requests from user and development teams, and analyzed and supported large financial data sets.

Worked with technology teams to diagnose and resolve problems and incidents.

Data research, reconciliation breaks, job/processing re-architecting, and testing.

Data migration across multiple environments.

Worked autonomously as part of a technology support team supporting multiple platforms.

Analyzed data using Microsoft Excel, Microsoft SQL Server, and Teradata; made extensive use of Unix scripts, stored procedures, and databases.

Expert in data visualization development using Tableau to create complex, intuitive, and innovative dashboards.

Proven ability to deliver and ensure success in high-pressure/high-risk situations.

Involved in many aspects of the SDLC, including requirements gathering, proof-of-concept delivery, implementation, user support, and maintenance.

Administered users, user groups, and scheduled instances for reports in Tableau.

Hands-on development assisting users in creating and modifying worksheets, data visualizations, and dashboards.

Extensively used the tabadmin and tabcmd commands to create and restore backups of Tableau.

Deployed Tableau Server in a clustered environment by adding worker machines.

Established best practices for the enterprise Tableau environment and for application intake and development processes.

Data Warehousing: strong data warehousing experience using Informatica PowerCenter 9.5/9.0/8.6/8.1, Informatica Data Quality 9.5/9.0, OLTP, OLAP, data mining, and scheduling tools.

Worked with heterogeneous source systems such as flat files, XML, and RDBMS.

Data Modeling: knowledge of dimensional data modeling, star schema, snowflake schema, fact and dimension tables, physical and logical data modeling, and denormalization techniques.

Database: Oracle 11g/10g, Teradata TD13/TD12, Netezza, SQL, PL/SQL. Proficient in writing stored procedures and triggers in PL/SQL. Expertise in WinSQL and TOAD.

Proficient in analyzing business process requirements and translating them into technical requirements.

Worked with the testing team to create test cases for unit and integration testing; expertise in user acceptance testing in consultation with SMEs.

Involved in creating detailed design documents and performing proofs of concept (POC).

Good understanding of relational database management system (RDBMS) concepts.

Experience in managing Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3, data marts, OLAP, OLTP, and Change Data Capture (CDC).

Provided 24/7 production support for Tableau users.

SKILL SUMMARY:

Data Management

Process Improvement

Business Development

Reporting

Analytics

Financial Domain

TOOLS:

MS Excel, PuTTY, Tableau 9.2/10.1/10.2, Tableau Server 10.3, MS Office, HP ALM, DataStage (IBM WebSphere DataStage and QualityStage 8.0, Ascential DataStage 7.5.2/5.1/6.0, ProfileStage 7.0), SSIS (SQL Server 2005), Data Integrator, SSRS (SQL Server 2005), Informatica PowerCenter 9.5/9.1.0/8.6.0/8.1.1, PowerMart 6.2/5.1/4.7, Informatica Data Quality 8.5, Power Exchange 9.1

DATABASE SERVERS:

MS SQL Server, MS Access, Oracle, Netezza (Aginity Workbench), etc.

LANGUAGES:

SQL, PL/SQL, Oracle, C, C++

DATA OPERATIONS:

Operational Research, A/B Testing, Pattern Recognition, Predictive Analysis, Visualization, Machine Learning (Supervised & Unsupervised), Spreadsheets, Data Analysis
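The A/B testing named above is commonly implemented as a two-proportion z-test. A minimal pure-Python sketch (the sample counts below are invented for illustration, not figures from any project described here):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    conv_*: conversion counts; n_*: sample sizes.
    Returns the z statistic and a two-sided p-value
    (normal approximation to the binomial).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 260/5000 vs A's 200/5000.
z, p = two_proportion_ztest(200, 5000, 260, 5000)
print(z, p)
```

At the conventional 5% level, a |z| above 1.96 would flag the difference between variants as statistically significant.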

PROFESSIONAL EXPERIENCE:

Nationwide Insurance, Columbus, OH Oct ’17 – present

Role: Data Analyst/Tableau/ETL Developer

Description: The project tracked the spam emails that Nationwide employees receive on average per year, created dashboards showing how many employees responded to them as though they were work emails, and enrolled those employees in the required security trainings.

Responsibilities:

Performed data modeling and data analysis for the Preferred Segment Technology lines of business.

Designed, developed, and implemented enterprise data warehouse and data mart models, logical data models, and physical data models. Analyzed requirements, programmed and configured warehouses of database information, and provided support to warehouse users.

Worked with business users to maintain the associate information and alternative hierarchy application, which provides the ability to access complete associate details.

Gathered requirements from the business to create prototypes for the dashboards.

Created parameters, calculated fields, reference lines, trend lines, sets, and groups to match the conditions of the requirements while creating the charts.

Built various charts such as bar graphs, pie charts, funnel charts, scatter plots, donut charts, and bubble charts per the requirements.

Created action filters in the dashboards to navigate and map the relevant data.

Created detailed reports to drill down into the data and help users view it at a very granular level.

Educated users on utilizing the dashboards to retrieve the significant data within seconds.

Managed user access on the SharePoint site and scheduled automation of reports.

Distributed Tableau reports as packaged workbooks and PDFs to different user communities.

Provided customer support to Tableau users and wrote custom SQL to support business requirements.

Expert-level capability in Tableau calculations, applying complex, compound calculations to large, complex data sets.

Designed mappings using transformations such as Lookup (connected, unconnected, dynamic), Expression, Sorter, Joiner, Router, Union, Transaction Control, Update Strategy, Normalizer, Filter, Rank, and Aggregator in Informatica PowerCenter Designer.

Analyzed data sources such as Oracle, DB2, flat files (delimited and fixed-width text files), SQL Server, and XML files from which the contract and billing data came; understood the relationships by analyzing the OLTP sources and loaded the data into the Oracle data warehouse.

Worked on Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3 to maintain the full history of customers.
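The SCD Type 2 pattern mentioned above can be sketched in a few lines of SQL. A minimal self-contained example using Python's sqlite3 (the customer dimension, its columns, and the sample values are hypothetical, not taken from the project): instead of overwriting a changed attribute (Type 1), the current row is expired and a new version inserted, preserving full history.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        eff_date    TEXT,
        end_date    TEXT,     -- NULL while the row is current
        is_current  INTEGER
    )
""")

def scd2_upsert(conn, customer_id, city, as_of):
    """Apply an SCD Type 2 change: expire the current row, insert a new version."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if row is None or row[0] != city:
        # Close out the old version (no-op if the customer is new) ...
        conn.execute(
            "UPDATE dim_customer SET end_date=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (as_of, customer_id))
        # ... then insert the new current version.
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, city, as_of))

scd2_upsert(conn, 1, "Columbus", "2017-10-01")
scd2_upsert(conn, 1, "Troy", "2018-01-15")   # attribute change -> new version
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_date").fetchall()
print(rows)   # both versions retained; only the latest is flagged current
```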

Responsible for performance optimization: writing SQL overrides instead of using transformations, implementing active transformations such as Filter as early as possible in the mapping, and selecting sorted input when using Aggregator or Joiner transformations.
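The "filter as early as possible" idea above can be illustrated with a small sqlite3 sketch (table and column names are invented for illustration): a SQL override pushes the predicate into the source query, so only qualifying rows ever leave the database, rather than fetching everything and filtering downstream.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE billing (account_id INTEGER, amount REAL, status TEXT)")
conn.executemany("INSERT INTO billing VALUES (?, ?, ?)",
                 [(1, 100.0, "OPEN"), (2, 50.0, "CLOSED"), (3, 75.0, "OPEN")])

# Slow pattern: pull every row, then filter late in the pipeline.
all_rows = conn.execute(
    "SELECT account_id, amount, status FROM billing").fetchall()
open_rows_late = [r for r in all_rows if r[2] == "OPEN"]

# Faster pattern: the override filters (and sorts) at the source,
# so downstream stages see only the rows they actually need.
open_rows_early = conn.execute(
    "SELECT account_id, amount, status FROM billing "
    "WHERE status = 'OPEN' ORDER BY account_id").fetchall()

print(open_rows_early)
```

Both paths yield the same rows; the difference is how much data crosses the source boundary, which dominates cost on large financial data sets.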

Implemented performance tuning at all levels: source, target, mapping, and session.

Created and configured workflows, worklets, and sessions using Informatica Workflow Manager.

Used workflow-level variables for code reusability.

Used mapping parameters and variables to apply object-orientation techniques and facilitate code reuse.

Environment: Windows 7, Agile methodologies, Tableau Server 10.x, Tableau Desktop 10.1, Advanced Excel, IBM Netezza, Aginity Workbench, Informatica PowerCenter 9.1/Power Exchange/Informatica Cloud, IDQ 9.1, Tivoli, Teradata, Oracle 11g, PL/SQL, DB2, XML, MS SQL Server 2005, SQL*Plus, MS Excel, UNIX (AIX), UNIX Shell.

Enqbator, Troy, MI Jan ’17 – Sep ‘17

Role: Data Analyst/Tableau Developer

Responsibilities:

Responsible for data modeling, data analysis, and testing for the Corporate Treasury and Corporate Investment lines of business.

Worked with business users to produce forecasts of the ending Federal Reserve account balance.

Troubleshot application errors, addressed and managed requests from user and development teams, and analyzed and supported large financial data sets.

Worked with technology teams to diagnose and resolve problems and incidents.

Data research, reconciliation breaks, job/processing re-architecting, and testing.

Used Tableau Desktop to analyze and obtain insights into large data sets.

Responsible for data extraction and transformation from disparate sources such as Oracle and XML files, loading into XML files using Informatica PowerCenter 9.1.

Created visually compelling, actionable, interactive reports and dashboards.

Developed Tableau data visualizations using cross maps, scatter plots, geographic maps, pie charts, bar charts, page trails, and density charts.

Created advanced connections, joined new tables, created and managed extracts, and monitored queries using SQL Assistant.

Created complex workbooks and dashboards by connecting to multiple data sources using data blending.

Extensively worked on Tableau Desktop, creating complex Sears/Kmart reports and dashboards.

Created Tableau scorecards, dashboards, and heat maps using the Show Me functionality.

Balanced manipulating data in Tableau against using a programming language to find the right approach.

Updated reports according to client requirements.

Prepared unit test cases for reports.

Performed gap analysis and feasibility studies for implementing extract/reporting enhancements.

Involved in understanding reporting requirements and providing Tableau reporting solutions.

Developed, maintained, and documented highly visual, comprehensive dashboards using Tableau Desktop and published them to Tableau Server to meet business-specific challenges.

Worked with subject matter experts to identify data relationships and report field mappings and to translate them into technical requirements for development.

Presented discovered trends and analysis, forecast data, recommendations, and identified risks to higher management.

Coordinated and conducted business and systems requirements analysis meetings with business and technology partners, business analysts, data modelers, solution architects, DBAs, and project managers.

Designed mockups/wireframes for various reports using MS Visio and Excel.

Wrote and executed SQL scripts and stored procedures to develop reports and perform data/report validation.

Environment: Tableau Server 9.x, Tableau Desktop 9.3, Advanced Excel, Informatica PowerCenter 9.1/Power Exchange/Informatica Cloud, IDQ 9.1, Tivoli, Teradata, Oracle 11g, PL/SQL, DB2, XML, MS SQL Server 2005, SQL*Plus, MS Excel, UNIX (AIX), UNIX Shell.

Client: Microsoft Bing R&D, India Jan ’14 – Dec ’16

Role: Data & Reporting Analyst

Responsibilities:

Analyzed massive and highly complex Data sets, performing ad-hoc analysis and data manipulation.

Analyzed journal entries provided by the client for equality, completeness, and abnormalities.

Validated data formats and verified control totals against the general ledger.

Imported and exported data from text files, saved queries, and databases; used automatic outlining, inserted subtotals, created advanced filters, and used database functions.

Researched, updated, and validated data underlying spreadsheet production, strategically filling gaps.

Created pivot tables and charts using worksheet data and external resources; modified pivot tables, sorted items, grouped data, and refreshed and formatted pivot tables.
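The pivot-table operation described above (group rows by one key, spread another key across columns, aggregate the cells) can be sketched in pure Python. The sales rows below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical worksheet rows: (region, quarter, amount).
rows = [
    ("East", "Q1", 100), ("East", "Q2", 150),
    ("West", "Q1", 80),  ("East", "Q1", 40),
]

# Pivot: rows keyed by region, columns by quarter, cells summing amount.
pivot = defaultdict(lambda: defaultdict(int))
for region, quarter, amount in rows:
    pivot[region][quarter] += amount

for region in sorted(pivot):
    print(region, dict(pivot[region]))
```

This mirrors what a spreadsheet pivot table does internally; refreshing the pivot is just re-running the aggregation over the updated rows.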

Designed, recorded, and executed macros to automate data entry inputs.

Formatted spreadsheets and workbooks for print, document reproduction, and presentations.

Executed standardized and audit-objective-specific scripts for journal entry testing and performed data analysis testing.

Mapped the data to the Bing database according to the results and ran the data through the data tool for validation.

Validated the data with different validation tools such as Swiss Knife, Hit App, Pin Pointer, Data Clustering, and Data Templates.

Expertise in documenting, at a functional level, how procedures work within the data quality applications.

Expertise in using data quality tool IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.

Developed complex mappings using Lookup (connected and unconnected), Rank, Sorter, Joiner, Aggregator, Filter, and Router transformations to transform the data per the target requirements.

Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the Workflow Manager.

Used Workflow Monitor to monitor the jobs, review error logs that were generated for each session, and rectified them.

Worked on performance tuning of the ETL processes. Optimized/tuned mappings for better performance and efficiency.

Designed and developed ETL logic for implementing CDC by tracking changes in critical fields required by the user, using SSIS.
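One common way to implement the CDC logic described above is to diff a current snapshot against the previous one on the critical fields. A minimal Python sketch (the field names and sample records are illustrative, not from the source project):

```python
# Critical fields whose changes should be captured (hypothetical).
CRITICAL = ("status", "balance")

def detect_changes(previous, current):
    """Compare two keyed snapshots; return (inserts, updates, deletes).

    Snapshots are dicts mapping a business key to a record dict.
    A record counts as updated only if a critical field changed.
    """
    inserts = {k: v for k, v in current.items() if k not in previous}
    deletes = {k: v for k, v in previous.items() if k not in current}
    updates = {
        k: v for k, v in current.items()
        if k in previous and any(previous[k][f] != v[f] for f in CRITICAL)
    }
    return inserts, updates, deletes

prev = {1: {"status": "OPEN", "balance": 100},
        2: {"status": "OPEN", "balance": 50}}
curr = {1: {"status": "CLOSED", "balance": 100},   # critical field changed
        3: {"status": "OPEN", "balance": 10}}      # new record
ins, upd, dele = detect_changes(prev, curr)
print(sorted(ins), sorted(upd), sorted(dele))
```

In an SSIS package the same effect is typically achieved with a lookup against the previous load plus a conditional split on the compared columns; the snapshot diff above is the language-agnostic core of that pattern.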

Coded complex BTEQ scripts to populate data mart tables from the EDW to cater to specific reporting needs.

Escalated issues and interacted with business managers regarding insertion of bulk data and other possible mapping entities.

Sent daily reports to managers about scheduled work.

Maintained the quality of Data analysis, researched output and reporting, and ensured that all deliverables met specified requirements.

Environment: Windows (7, 8 & 10), Pinwheel, Quality Center, Geo-Coding Tool, Data Integration, MS Excel, Conditional Formatting, MS Office, various Microsoft tools (Swiss Knife, Prediction Analysis Tools for Bing Predictions), Access, MS SQL Server, Microsoft Team Foundation Server 2015, Microsoft Visio, SSRS.

Client: Google India Jun ’12 – Dec ’13

Role: Analyst

Description: My role required me to clean and analyze geospatial data from GIS domains and ingest it into the Google Maps API per country-specific security policies using Techmate.

Responsibilities:

Interpreted data, analyzed results using statistical techniques, and provided ongoing reports.

Developed and implemented data collection systems and other strategies that improved efficiency and data quality.

Acquired data from primary and secondary data sources and maintained databases/data systems.

Collected, organized, and documented infrastructure project attributes, data, and project metrics.

Processed data load requests, manually entered data, reconciled data conflicts, and created data extracts and reports.

Analyzed and interpreted trends and patterns in complex data sets.

Filtered and cleaned data; reviewed computer reports, printouts, and performance indicators to locate and fix issues with the code.

Identified spam businesses and took them off Google Maps to provide a better user experience.

Synthesized data and reported ad hoc utilizing Excel, SQL Server, and Crystal Reports.

Provided quantitative performance results through recurring reports and presentations.

Fulfilled reporting requirements and assisted in developing recurring reports for internal and external clients using macros and pivot tables.

Updated listings for businesses that had closed or relocated on their Google+ pages.

Supported the collection, analysis, harmonization, and loading of metadata into the repository.

Performed detailed analysis of technical specifications, data dictionaries, and data lineage documentation to translate/import the information from these documents.

Environment: Windows 7, Pinwheel, Excel, Google AdWords, Quality Center Tool, Geo-Coding Tool, Google Analytics, SQL Server, various Google tools (Under-clustering Tool, Address Extraction, Push-Pin), etc.

EDUCATION DETAILS:

Bachelor’s Degree from TKR College of Engineering & Technology – Jawaharlal Nehru Technological University, Hyderabad (2008 – 2012)


