
Dinesh Sudagoni

Data Analyst Lead

Phone: 980-***-**** Email: ac8ndt@r.postjobfree.com

Summary:

•A meticulous Business Intelligence and Data Analytics professional with 7.2 years of experience in data analytics, business intelligence and ETL, with extensive knowledge of business processes in the healthcare, manufacturing, financial and insurance sectors.

•Extensive hands-on experience with Tableau, SAS, Qlik Sense, QlikView, Informatica, Talend, Amazon Redshift and SQL, covering system analysis, data modeling, design, development, testing and implementation of projects across the complete SDLC; capable of handling responsibilities independently as well as working as a proactive team member.

•Excellent communication, team-management and analytical skills; responsible for collaborating with business partners, clients and other stakeholders.

•Excellent knowledge of dimensional modeling concepts (star schema, snowflake schema) and of loading data from multiple sources.

•Good experience in creating use cases, process flow charts, activity diagrams and sequence diagrams in Unified Modeling Language (UML) using MS Visio and Rational Rose.

•Thorough understanding of SDLC methodologies such as Waterfall and Agile/Scrum.

•Extensive experience in data analysis, including profiling, cleansing, transforming and conforming data.

•Experience in building data models and scripts that handle high volumes of data from multiple sources, applying optimization techniques and statistical calculations.

•Implemented Section Access and binary loads, and abstracted the data layer using the QlikView library, QVD and QVX files. Coordinated new data development with the ETL development team, including creating new structures to meet business requirements and streamlining existing processes to reduce overhead on the existing warehouse.

•Introduced several levels of dashboard drill-down and links to interrelated performance metrics.

•Upgraded Workbooks and Dashboards from Tableau 8 to Tableau 9.

Certifications:

•Qlik Certified Business Analyst, Tableau Certified Associate, SAS Certified Base Programmer and Oracle Database SQL Certified Expert.

Education:

Bachelor of Technology in Computer Science and Engineering – 2010.

Technical Skills:

Analytical Tools

Tableau 10.5/9/8.3, SAS, Qlik Sense 1.2/2.0/3.2, Qlik NPrinting, Pentaho

Languages

SQL, PL/SQL, QlikView Scripting, Python, R, SAS, HTML, CSS, HQL

Database Technologies

Oracle 11g/10g, MS SQL Server, Netezza, PostgreSQL, DB2, MS Access, Amazon Redshift, Azure SQL

Big Data Tools

Hadoop, Hive, Splunk

Data Integration

Informatica, Talend

Operating Systems

Windows, Unix, Linux, Mac OS X

Data Modeling

ER Studio

Methodologies

Waterfall/Agile Software Development

Professional Experience:

Client: Johnson & Johnson (Raritan, NJ) Dec 2017 – Present

Role: Sr Data Engineer / Lead

Responsibilities:

•Led multiple projects, including Global Pricing, Janssen Connect, Care Path, iQlik and Connected Visibility.

•Worked with Business SMEs to analyze and document business requirements and functional specifications.

•Implemented a three-tier architecture: data extract, data transform, and data model/application. The extract layer loads data from the source systems into QVDs, the transform layer reshapes the data, and the application layer is the user-interface design layer where all charts, tables and list boxes are laid out.

•Implemented multiple tiers of QVD architecture depending on the complexity of the application; the final application reload duration was 6 hours.

•As the data architect for one of the projects, designed the data flow and configuration setup, implemented best practices, and coordinated with offshore developers and the testing team to build reports and to log and fix defects and enhancements.

•Expertise in QlikView scripting, Qlik NPrinting reports, set analysis and Section Access, as well as in database programming: SQL, tables, views, indexes, joins and sub-queries.

•Designed and implemented incremental loads using QVDs to reduce load time and improve loading performance, so that Qlik Sense loads only one day's worth of data and appends it to the historical data already stored in QVDs.

•Performed QMC tasks such as extraction, reload, distribution and reduction tasks, app deployment, scheduling, user management, license management and clustering; tested applications against review checklists and data-quality criteria before delivery to end users; analyzed the data and defined key performance indicators (KPIs).

•Improved QlikView/Qlik Sense application performance by pushing complex calculations down to views and stored procedures in the Amazon Redshift database. Designed and developed a Qlik Sense application based on an existing QlikView data model; scheduled the jobs and distributed them to end users from the QMC.

•Extensive experience in ETL methodology for data profiling, data migration, extraction, transformation and loading using Talend; designed data conversions from multiple source systems including Netezza, Oracle, DB2, SQL Server, Teradata, Hive and non-relational sources such as flat files and XML.

•Used database components such as tMSSqlInput, tMSSqlRow, tMSSqlOutput, tOracleOutput, tOracleInput, etc., and created Talend jobs to copy files from one server to another using Talend FTP components.

•Created ETL job infrastructure using Talend Open Studio and worked on the design, development and testing of Talend mappings.

•Created Talend ETL jobs to receive attachment files from POP email using tPop, tFileList and tFileInputMail, loaded the data from the attachments into the database and archived the files.

•Used ETL methodologies and best practices to create Talend ETL jobs; followed and enhanced programming and naming standards and improved the performance of Talend jobs.

•Worked on Talend components such as tReplace, tMap, tSort, tFilterColumn, tFilterRow, tConvertType, etc.

•Worked with various file components such as tFileCopy, tFileCompare, tFileExist, tFileDelete and tFileRename.

•Created triggers for Talend jobs to run automatically on the server; worked on exporting and importing Talend jobs.

Environment: Qlik Sense, Talend, Amazon Redshift, PostgreSQL, SQL Server, Windows 2008 R2 Server, Qlik Sense Management Console, Amazon AWS, MS Excel.

Client: Charter Communications (Stamford, CT) Aug 2016 – Nov 2017

Role: Sr. Data Analyst

Responsibilities:

•Worked collaboratively with business analysts, product owners and data architects to understand reporting needs and to design and build the required reports.

•Loaded the structured data (Web data and Digital Data) and Splunk logs into HDFS.

•Created analytical views in HDFS as per the reporting requirement and for cross channel analysis.

•Developed dashboards for analyzing non-registered users who call the call center, with drill-downs into the companies these users work for, demographics, etc.

•Built reports for analyzing the total members contacting Anthem each day, the interactions made and the demographics of these members over different time periods.

•Worked with customer data for over 120 million users and created numerous custom charts along with basic Tableau chart types, including a Venn diagram representing total members and interactions for the digital channel, the solution center and both channels combined.

•Performed cross-channel analysis to understand why registered users call the call center and how many members make a digital interaction and then use the solution center, drilling down into the reasons for these interactions based on total digital interaction errors, interaction types, etc.

•Designed the metadata layer for Tableau reports and managed Tableau metadata, maintaining consistency across server data sources.

Environment: Tableau, Hive, HDFS, Splunk, MS Excel, Python.

Client: Allstate (Atlanta, GA) Apr 2015 – July 2016

Role: Data Engineer

Responsibilities:

•Worked in a fast-paced agile environment, using the data available within the company to identify market and business trends that increase profit and efficiency, while coordinating with teams and clients.

•Designed, built and automated insightful business intelligence dashboards using the Pentaho BI Suite to help stakeholders and clients plan future sales and marketing strategies.

•Analyzed, retrieved and aggregated data from multiple client-verified sources such as databases, sales orders and dial-up reports to perform data mapping, so that incorrectly mapped or missing data could be accurately appended to the respective data sources.

•Created 6 Python applications for ETL and for report data extraction and distribution across clients.

Environment: Python, PostgreSQL, SQL, Redshift, Pentaho, PL/SQL, MS Excel

Client: SunTrust Bank (Arlington, TN) July 2014 – Mar 2015

Role: Data Analyst

Responsibilities:

•Used good judgment in estimating “what-if” impact to help business understand risks and opportunities before taking actions.

•Gathered end-user requirements from the business and finance teams to develop executive dashboards covering financial forecasting, pre-sales, risk and underwriting.

•Directly involved in gathering complex business requirements pertaining to recent or historical patterns of abnormal behavior or activity on different channels that would affect the bank's end customers.

•Designed the needed aggregations, transformations and mappings of the data entities to suit the build and execution phases of the applications.

•Used SAS, SQL, XML, PL/SQL and Windows batch programming techniques to code the technical specifications, apply business logic and produce automated reporting solutions.

•Extensive use of PROC SQL, SAS Macros, Unix and Windows Batch Shell for process automation.

•Routinely dealt with large internal and vendor data sets and performed performance tuning, query optimization and production support for SAS and Oracle 11g.

•Responsible for providing the consulting team with data preparation, data analysis and statistical modeling using SAS as a primary tool.

•Executed data crunching, wrote code, merged, purged and split data, and ran ad hoc queries and data cuts in SAS.

•Used SAS procedures (FREQ, SUMMARY), SAS functions and user-defined macros to test the accuracy and validity of data, and used SAS date informats, functions, statements and options to extract data from files sent in different layouts.

•Used the SAS DATA step, PROC SQL, PROC APPEND and PROC DATASETS, and extensively used DATA step merges to match claims data against the eligibility information sent by the customer; wrote macros and procedures to check data quality before and after the eligibility merge.

Environment: SAS, Oracle 10g/11g, SQL, XML, PL/SQL, MS Excel

Client: Bridgestone (Austin, TX) Nov 2013 – Jun 2014

Role: Business Intelligence Developer

Responsibilities:

•Hands-on experience creating solution-driven Tableau dashboards using different chart types including heat maps, geo maps, scatter plots, etc.

•Mainly designed and developed Tableau applications from scratch, to support reporting and Business Intelligence initiatives.

•Deployed the application on Tableau server and imported data from different sources like Oracle 10g, XML files and flat files using OLE DB/ODBC.

•Created interactive dashboards in Tableau Desktop and published them to Tableau Server, enabling storytelling with quick filters to retrieve on-demand information at the click of a button.

•Calculated the year-over-year (YoY) variance in sales of the current year over the previous year using advanced table calculations.

•Developed dashboards for third-party seller analysis, total brand sellers, and authorized vs. unauthorized sellers, including a dashboard providing insights into MAP (minimum advertised price) management.

•Designed and developed Seller and Customer snapshot, focusing on the active and inactive customers of each seller.

•Created analytical views for central SLT automation and Tableau report views for application teams, technology services and the product owners to analyze and handle daily enterprise jobs effectively.

•Created Custom maps to analyze sales in various regions using Custom Geocoding and Power tools for Tableau and designed a performance evaluation dashboard to display YTD, QTD and MTD performance of each business category.

•Extensively designed KPIs using custom shapes to help users quickly evaluate the status of a metric.

•Identified the outliers (top X and bottom X) of each business segment using the Rank table calculation.

•Calculated each seller's percentage contribution to total sales and profit using Pareto analysis.

•Created stored procedures and triggers to provide complete technical solutions.

•Worked on performance monitor counters for monitoring the Teradata environment. Cleaned the data, created fact and dimension tables, performed the calculations required for the reports in the SQL Server environment and carried out data modeling.

•Worked with all levels of development from analysis through implementation and support.

•Implemented best practices in all dashboards using extracts and context filters, and fixed time-consuming reports using the performance recording feature.

•Administered users, user groups and scheduled instances for reports in Tableau Server.

Environment: Tableau 8.3/9.0, Teradata, Oracle 10g, SQL Server, Teradata SQL Assistant, MS VISIO, Microsoft Office Suite, XML files.

Client: Tech Mahindra – Quest Diagnostics

(Offshore – Hyderabad, India) (Onsite – Princeton, NJ) June 2011 – Oct 2013

Role: Data Analyst

Responsibilities:

•As one of the youngest employees in the company to be promoted to team lead, led teams to gather business requirements, build BI applications to provide insights, and deliver business solutions based on the discoveries made.

•Analyzed data quantitatively and qualitatively with different tools like SAS, Tableau, R Programming, Talend, Informatica and Qlik Sense.

•Collaborated with the business community and other BI team members on the definition, construction and deployment of reports, dashboards, metrics and analytic components, as well as data migration design and testing.

•Scheduled weekly and monthly data refreshes on Tableau Server based on business changes to ensure that views and dashboards displayed the changed data accurately, and promoted them to production after editing.

•Experienced in troubleshooting and performance tuning of reports and resolving issues within Tableau Server and reports.

•Developed many Python applications for extracting data from different sources such as PostgreSQL and exporting it to the formats desired by clients.

•Operationalized the extracts generated by the Python applications so that they are automatically stored in AWS S3 and emails are sent to the respective clients at the configured frequency.

•Built complex views to transform data for reporting needs from various external data sources such as LYNX, IHS and SFDC.

•Strong experience in extracting, transforming and loading (ETL) data from various sources into data warehouses and data marts using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, Metadata Manager).

•Extensively worked on Informatica Designer components such as Source Analyzer, Transformation Developer, Mapplet Designer and Mapping Designer.

•Expertise in data cleansing at the staging level using Informatica PowerCenter.

•Designed and developed Informatica mappings and sessions based on business user requirements and business rules to load data from source flat files and Oracle tables to target tables.

•Worked on the design, development and testing of Informatica mappings. Created tables and stored procedures in SQL for data manipulation and retrieval, and performed database modifications using SQL, PL/SQL, stored procedures, triggers and views in Oracle.

Environment: Tableau, Qlik Sense, R Programming, Informatica, Amazon Redshift, SQL Server, Python, MS Excel.
