Data Analyst Business Intelligence

Location:
Hyderabad, Telangana, India
Salary:
100000
Posted:
November 08, 2023

Contact this candidate

Resume:

Shiva Pola

Sr. Data Analyst

Phone: 647-***-****

E-mail: ad0x8c@r.postjobfree.com

Reliable, innovative Data Analyst recognized for developing science-driven business methods and strategies that produce real value and minimize risk. Known as a self-starter who manages multiple project priorities, such as mining data and leveraging machine learning algorithms, in fast-paced Big Data and Cloud environments.

Diligent Power BI Developer and Sr. Data Analyst with 5+ years of IT experience across multiple skill sets, with working experience in data warehousing and Business Intelligence technologies and expertise in SQL Server development, the MSBI stack (T-SQL, SSIS, SSAS, SSRS), Azure, and Power BI for building, deploying, and managing applications and services through a global network of Microsoft-managed data centers. Specialize in the design, implementation, integration, maintenance, and testing of various web-based, enterprise, client/server, and distributed applications using Python, Django, Flask, and PyTorch.

Worked on data warehousing, data engineering, feature engineering, big data, ETL/ELT, and Business Intelligence. As a data analyst, specialize in Azure frameworks, the Hadoop ecosystem, Excel, Snowflake, relational databases, tools such as Tableau, Power BI, and Python, and Data DevOps frameworks/Azure DevOps pipelines.

SUMMARY

Data Analyst with a solid understanding of data modeling and evaluating data sources, and a strong understanding of Data Warehouse/Data Mart design, ETL, BI, OLAP, and client/server applications. Experience in data analysis, data migration, data cleansing, transformation, integration, data import, and data export using multiple ETL tools such as Informatica PowerCenter. Experience in testing and writing SQL statements: stored procedures, functions, triggers, and packages. Proficient in Power BI, Snowflake, Teradata, SQL Server, Oracle, and Python.

Specialize in data analysis using Python and in the design, implementation, integration, maintenance, and testing of various web-based, enterprise, client/server, and distributed applications using Python, Django, Flask, PyTorch, Node.js, Angular.js, DOJO, and jQuery, both on-premise and on AWS and Azure frameworks; Cloudera and the Hadoop ecosystem; Spark/PySpark/Scala; Databricks; Hive; Redshift; Snowflake; relational databases; tools such as Tableau, Airflow, DBT, and Presto/Athena; and Data DevOps frameworks/pipelines. Programming/scripting skills in Python.

A Data Warehouse Developer, Data Modeler, and Data Analyst with more than 4 years of experience in all phases of the software development life cycle (SDLC), including system analysis, design, data modeling, implementation, and support and maintenance of various applications in both OLTP and OLAP systems.

Experienced in developing reports and dashboards using data blending from multiple sources.

Proficient in building Analysis Services reporting models and developing visual reports, KPI scorecards, and dashboards using Power BI Desktop; connecting data sources, importing data, and transforming data for Business Intelligence; and applying analytical thinking to translate data into informative reports and visuals.

Extensive experience in Database Design using SQL and PL/SQL Programming using Stored Procedures, Functions, Packages, and Triggers.

Created different reports utilizing Power BI Desktop and the Power BI online service, and configured scheduled refresh.

Extract, Transform, and Load (ETL) source data into target tables to build data marts.
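For illustration, the extract/transform/load flow into a data-mart target table can be sketched in a few lines. This is a minimal sketch, not the actual project code: sqlite3 (in-memory) stands in for the SQL Server/Oracle sources named elsewhere in this resume, and the sales_raw and sales_mart tables are hypothetical.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Hypothetical ETL step: staging table -> aggregated data-mart table."""
    cur = conn.cursor()
    # Extract: pull raw rows from the staging table.
    rows = cur.execute("SELECT region, amount FROM sales_raw").fetchall()
    # Transform: normalize region names and aggregate amounts per region.
    totals = {}
    for region, amount in rows:
        key = region.strip().upper()
        totals[key] = totals.get(key, 0.0) + amount
    # Load: write the aggregated rows into the data-mart target table.
    cur.execute("CREATE TABLE IF NOT EXISTS sales_mart (region TEXT PRIMARY KEY, total REAL)")
    cur.executemany("INSERT OR REPLACE INTO sales_mart VALUES (?, ?)", totals.items())
    conn.commit()
    return len(totals)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_raw (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales_raw VALUES (?, ?)",
                 [("east ", 100.0), ("EAST", 50.0), ("west", 75.0)])
loaded = run_etl(conn)
```

In a real SSIS or Informatica job the same three phases exist; they are just configured as pipeline components rather than hand-written code.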

Conducted Gap Analysis, and created Use Cases, workflows, screenshots, and PowerPoint presentations for various Data Applications.

Worked on complex SQL queries and PL/SQL procedures and converted them to ETL tasks.

Experience with the Microsoft development suite (SSIS and SSAS), as well as familiarity with standard data extraction, cleansing, integration, and aggregation techniques and best practices.

Experienced with both live connections and imported data for creating reports.

Created reports in the preview portal utilizing SSAS Tabular via the Analysis Services connector.

Proficient in creating different visualizations in reports using custom visuals such as bar charts, pie charts, line charts, cards, slicers, and maps. Also used different transformations inside the Power Query Editor to clean up the data.

Proficient in Data Analysis, Cleansing, Transformation, Data Migration, Data Integration, Data Import, and Data Export through the use of ETL tools such as Informatica.

Hands-on experience on Tableau Desktop, Tableau Reader, and Tableau Server.

Handling BI capabilities: reporting, analysis, dashboards, and business event management.

Hands-on Experience with Framework Manager Modeling (Physical Layer, Business Layer, Packages) and Complex Report building with Report Studio.

Expertise in developing and testing complex reports (list, crosstab, charts, maps, drill-through, master-detail, and cascading) and dashboards using Report Studio.

Hands-on learning with different ETL tools to get data in shape where it could be connected to Tableau through Tableau Data Extract.

Experience with writing SQL queries, Oracle PL/SQL Stored Procedures, Functions, Cursors and Triggers.

Have a clear understanding of Business Intelligence and Data Warehousing concepts emphasizing ETL and Life Cycle Development.

Configured applications using the ServiceNow tool used in ITIL management. Understanding of ITIL V3, with technical knowledge of the ServiceNow platform as well as experience delivering medium- to large-scale ServiceNow implementations.

Involved in the entire data science/analysis project life cycle and actively involved in all phases, including data cleaning, data extraction, and data visualization with large data sets of structured and unstructured data; created ER diagrams and schemas.

EDUCATION:

Bachelor's, SR Engineering College (2013–2017)

Major- Computer Science and Engineering

TECHNICAL SKILLS

Analytical Techniques

Hypothesis testing, Predictive analysis, Machine Learning, Regression Modelling, Logistic Modelling, Time Series Analysis, Decision Trees, Neural Networks, Support Vector Machines (SVM), Monte Carlo methods, and Random Forests.

Analytical tools

RapidMiner, Google Analytics, IBM Watson, RStudio, SAS/STAT, Google Ads, Azure Data Lake Analytics, SAS Enterprise Miner, PyCharm, Jupyter Notebook, NLP, MATLAB, ggplot, WEKA, Databricks, Jenkins.

Data Visualization Tools

Tableau, QlikView, Qlik Sense, Datawrapper, Microsoft Power BI, Excel, Visio, Looker

Data modeling

Entity Relationship Diagrams (ERD), snowflake schema, star schema

Languages

SQL, U-SQL, HiveQL, C, R, Python.

Database Systems

SQL Server 10.0/11.0/13.0, Oracle, MySQL 5.1/5.6/5.7, Teradata, DB2, Amazon Redshift

NoSQL Databases

HBase, Apache Cassandra, MongoDB, Redis

ETL Tools

Microsoft SSIS, Talend Open Studio, DataStage 11.x, Informatica PowerCenter 9.0, Informatica IDQ, Collibra, Kafka, Flume

Others: TOAD, Oracle SQL Developer, MS Office, FTP, Control-M, SQL Assistant, Rally, JIRA, GitHub, JSON, Hadoop, SSRS, ServiceNow

Data Analysis

Python libraries: Pandas, NumPy, SciPy, scikit-learn, NLTK, Plotly, Matplotlib

PROFESSIONAL EXPERIENCE

Client: Canada Life, Toronto, Ontario Apr 2021 – Present

Sr. Data Analyst

Canada Life provides insurance and wealth management products and services in Canada, the United Kingdom, the Isle of Man, and Germany, and in Ireland through Irish Life.

Project Description:

Developed the BI analytics strategy. The project involved developing dashboards in Azure environments for the Metrics and Compliance department.

Key Contributions:

I was involved in data analysis, data profiling, data integration, migration, data governance, metadata management, Master Data Management, and configuration management. The project involved developing dashboards in Tableau environments for the Metrics and Compliance department.

Responsibilities:

Worked with various versions of Tableau, Tableau Server, Tableau Public, Tableau Reader, and Tableau Online.

Extensive experience in building dashboards using Tableau.

Ability to introduce Tableau to clients to produce different views of data visualizations and present dashboards on web and desktop platforms to End-users and help them make effective business decisions.

Experience with the creation of users, groups, projects, workbooks, and the appropriate permission sets for Tableau server logons and security checks.

Created incremental refreshes for data sources on the Tableau server.

Involved in Requirements gathering/analysis, Design, Development, Testing, and Production role of Reporting and Analysis projects.

Able to creatively stretch the capabilities of Tableau to customize and develop visually striking, dynamic visualizations.

Created Tableau Dashboards with interactive views, trends, and drill-downs along with user-level security.

Provided Production support to Tableau users and Wrote Custom SQL to support business requirements.

Deep, technical knowledge of both Tableau Desktop and Tableau Server from an administrative and consultative standpoint.

Strong Oracle SQL skills with good knowledge of RDBMS concepts.

Work directly with Quality Assurance associates to ensure all deliverables are accurate and of the highest quality.

Designed and developed ETL workflows and datasets in Alteryx.

Processed data in Alteryx for Tableau reporting.

Experience in creating different visualizations using Bars, Lines and Pies, Maps, Scatter plots, Gantts, Bubbles, Histograms, Bullets, Heat maps, and Highlight tables.

Involved in Trouble Shooting, Performance tuning of reports, and resolving issues within Tableau Server and Reports.

Designed and developed an ETL solution using SSIS and Azure Data Factory, and created CAS jobs using Microsoft internal tools such as Geneva Analytics Data Studio.

Environment: Hadoop, Tableau, Amazon Redshift, DataStage 7.5, Python, Alteryx, Teradata, Oracle 12c, SQL Server 2008, TOAD, PL/SQL, JavaScript, Azure, Azure DevOps

Thermo Fisher Scientific, Mississauga, Canada Mar 2019 – Dec 2020

Data Analyst

Thermo Fisher Scientific is an American supplier of scientific instrumentation, reagents and consumables, and software services, based in Waltham, Massachusetts.

Project Description:

This project’s objective was to extract data from OLTP systems into DSS systems. The DSS database was designed for business analysis of statistical data collected through web transactions. My role was to develop applications and implement data processing projects to handle data from various RDBMS and streaming sources.

The project also involved planning, designing, creating, and developing workflows to initiate, manage, and remediate production incidents; deploying automated workflows; testing and validating applications and workflows; ensuring and maintaining compliance when making changes and implementing code; and updating documentation to cover department-level processes.

Key Contributions:

Involved with data profiling for multiple sources and answering complex business questions by providing data to business users.

Responsible for the development of workflow analysis, requirement gathering, data governance, data management, and data loading.

Have set up data governance touchpoints with key teams to ensure data issues were addressed promptly.

Responsible for facilitating UAT (User Acceptance Testing), PPV (Post Production Validation), and maintaining Metadata and Data dictionary.

Responsibilities:

Worked with Business users to gather requirement specifications for new dashboards.

Communicating with the users about their requirements, converting the requirements into functional specifications, and developing advanced visualizations.

Worked with various versions of Tableau, Tableau Server, Tableau Public, Tableau Reader, and Tableau Online.

Developed Tableau visualizations and dashboards using Tableau Desktop.

Developed Tableau workbooks from multiple data sources using Data Blending.

Designed and developed ETL workflows and datasets in Alteryx to be used by the BI Reporting tool.

Analyzed huge volumes of data and devised simple and complex Hive and SQL scripts to validate data flow in various applications.

Deep, technical knowledge of both Tableau Desktop and Tableau Server from an administrative and consultative standpoint.


Created Merge and Append queries and utilized aggregation to improve query performance.

Grouped data to create queries and developed the framework using Python.

Used tiles and Power View filters, and connected slicers to more than one table.

Worked on SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS). Worked with the database team to gather data required for the business reports.


Environment: MS SQL Server 2012/2014, Business Intelligence Development Studio (BIDS), SQL Server Integration Services (SSIS 2014), SQL Server Analysis Services (SSAS 2014), Azure, SQL Server Reporting Services (SSRS 2014), Power Pivot, Windows, MS Excel, SQL Server Profiler, Tableau

Zensar Technologies, Pune, India Feb 2017 – Feb 2019

Data Analyst

Description:

The client is a bank that offers deposit products, mortgages, and business and consumer loans, as well as securities brokerage, insurance, trust, and private banking services. It is an American multinational independent investment bank and financial services company providing financial services to individuals, corporations, and municipalities through its subsidiary companies, which engage primarily in investment and financial planning. My role was to enrich and transform data into internal data models powering search, data visualization, and data analytics.

Responsibilities:

Responsible for setting up a Python REST API framework using Django.

Created Django forms to display database models and Django views to display form details.

Created pie charts and bar graphs to display trends from data analysis.

Used the BeautifulSoup Python library for web scraping to extract important data from HTML and XML tags.
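As an illustration of the tag-extraction pattern this bullet describes, here is a minimal sketch. The stdlib html.parser is used as a stand-in so no third-party install is assumed; with BeautifulSoup itself the equivalent would be roughly soup.find_all("span", class_="price"). The sample HTML and the "price" class are hypothetical.

```python
from html.parser import HTMLParser

class PriceExtractor(HTMLParser):
    """Collects the text of every <span class="price"> tag (hypothetical schema)."""
    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs.
        if tag == "span" and ("class", "price") in attrs:
            self._in_price = True

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "span":
            self._in_price = False

html = '<ul><li><span class="price">19.99</span></li><li><span class="price">4.50</span></li></ul>'
parser = PriceExtractor()
parser.feed(html)
```

BeautifulSoup layers a much more convenient query API over the same event-driven parsing shown here.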

Used PyQt to implement a GUI for the user to create, modify, and view reports based on client data.

Developed views and templates with Python to create a user-friendly website interface, and maintained data with JSON data standards for quick responses and RESTful CRUD operations.

Wrote a program to use REST API calls to interface with the backup server, and parsed output reports from an Excel file in Python to monitor backup usage.

Used AJAX for better end-user interactivity, easier navigation, and compact payloads.

Developed merge jobs in Python to extract and load data into a MySQL database using CRUD operations.

Involved in the development of Web Services using SOAP for sending and getting data from the external interface in the XML format.

Involved in source code management with GitHub using Git push and pull operations, and created a local Git repository so that source code could be managed locally.

Responsible for designing, developing, testing, deploying, and maintaining the web application.

Maintained program libraries, user manuals, and technical documentation.

Managed large datasets using Pandas DataFrames and MySQL.
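A small illustration of the DataFrame-based data management this bullet refers to, assuming pandas is available; the orders data below is made up:

```python
import pandas as pd

# Hypothetical order records, standing in for rows pulled from MySQL.
orders = pd.DataFrame({
    "customer": ["A", "B", "A", "C"],
    "amount": [100.0, 50.0, 25.0, 75.0],
})
# Aggregate spend per customer: the kind of summary that would be written
# back to MySQL or handed to a reporting tool.
summary = orders.groupby("customer", as_index=False)["amount"].sum()
```

The same groupby/sum pattern scales to much larger datasets, with MySQL serving as the durable store on either side of the analysis.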

Wrote and executed various MySQL database queries from Python using the Python MySQL Connector and the MySQLdb package.
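The query pattern this bullet refers to looks roughly like the sketch below. sqlite3 stands in for MySQL Connector/MySQLdb, since all three follow the DB-API 2.0 shape; the users table is hypothetical, and note that MySQL drivers use %s placeholders where sqlite3 uses ?.

```python
import sqlite3

# In-memory database as a stand-in for a MySQL connection.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
# Parameterized insert: the driver handles quoting, preventing SQL injection.
cur.execute("INSERT INTO users (name) VALUES (?)", ("alice",))
conn.commit()
# Parameterized select, fetching a single row.
row = cur.execute("SELECT name FROM users WHERE id = ?", (1,)).fetchone()
```

Swapping the connect call for mysql.connector.connect(...) and the placeholders for %s gives the MySQL equivalent.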

Thorough knowledge of various front-end tools such as HTML, CSS, JavaScript, XML, jQuery, AngularJS, and AJAX.

Environment: Power BI, SQL Server 2012, Azure, Google BigQuery, Tableau, Excel, MS SQL Server 2014/2012, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), SQL Server Management Studio (SSMS), and Oracle 11g.


