Post Job Free

Data analyst

Location:
Los Angeles, CA
Salary:
$65
Posted:
May 10, 2019

Contact this candidate

Resume:

Faham Nehal

********@*****.***

732-***-****

SUMMARY

Over 7 years of extensive experience as a Data Analyst/BI Analyst in Power BI/Tableau/SSIS/SSRS/QlikView, with exposure to Database Administration.

Experience in creating Stored Procedures/T-SQL coding, performance tuning, User Defined Functions, and physical and logical database design.

Expertise in creating and developing rich, polished Power BI dashboards.

Excellent knowledge of the entire Software Development Life Cycle (SDLC) and methodologies such as the Waterfall model and Agile Scrum, from business requirement gathering and analysis through testing and deployment.

Good experience in the Information Technology field as a SQL Server Developer, with strong expertise in SQL Server development and designing Stored Procedures/Transact-SQL coding.

Hands on Experience in Installing, Configuring, Managing, Monitoring and Troubleshooting SQL Server 2008/2005/2000.

Expert in creating complex Stored Procedures, User Defined Functions, and effective Triggers to facilitate efficient data manipulation and data integrity.

Extensively worked on system analysis, design, development, testing and implementation of projects and capable of handling responsibilities independently as well as a proactive team member.

Experience in designing Star and Snowflake schemas and database modeling using the Erwin tool.

Experience in OLTP/OLAP environment and Data Warehouse.

Good knowledge of establishing database connections for Python by configuring the MySQL-Python package.

Utilized Power BI to design and deploy analytical reports/dashboards from tabular models, assisting with KPI analyses.

Extensive experience in SQL Server Analysis Services OLAP Cubes, and Data Mining.

Designed and developed matrix and tabular reports with drill down, drill through using SSRS.

Experience in writing complex SQL queries involving multiple tables with inner and outer joins.

Expert in Data Extraction, Transforming and Loading (ETL) using various tools such as SQL Server Integration Services (SSIS), DTS, Bulk Insert and BCP.

Expertise in developing Parameterized, Chart, Graph, Linked, Dashboard, and Scorecard reports on SSAS cubes using MDX, and Drill-down, Drill-through, and Cascading reports using SSRS.

Proficient in Database performance optimization, debugging and tuning using the Query Analyzer, SQL Profiler and SQL Server Debugger.

Experience in handling heterogeneous data sources, including Oracle and Teradata databases and CSV and XML files, using SSIS.

Auto-generated analysis reports using SQL Server, ACCESS, Business Objects and Crystal Reports and Excel pivot tables.

Excellent verbal and written communication skills with a strong focus on customer facing interaction, customer service, and presentation.

Experience in using Team Foundation Server (TFS) and Visual source safe (VSS) as a tool for repository of codes and version control, branching and merging into production.

Proficient with automated testing tools (WinRunner, LoadRunner, and QTP) for testing client/server and web-based applications.

Experience in Migrating from SQL Server 2000 to SQL Server 2005 and SQL Server 2005 to SQL Server 2008.

Hands on expertise with ETL process using DTS/SSIS to load and process high volumes of data for front-end, web and reporting applications.

Excellent experience in Performance Tuning, Monitoring and Optimization of T-SQL blocks in both Development and Production environment for SQL server database.

Worked very closely with customers, end users, and power users to collect requirements, provide training, and resolve business process related issues. Proficient in using SQL Profiler, Execution Plan, and Performance Monitor.

Extensive experience in using various tasks, containers, sources, transformations, and destinations in SSIS while transferring data from various source systems into destinations.

Used Task Factory components and tasks such as Upsert Destination, Update Batch transformation, and Surrogate Key transformation, among others, based on requirements.

Extensively used SSIS Import/Export Wizard for performing the ETL operations.

Proficient in creating different Package Configurations such as SQL Sever, XML configuration and Parent package variable.

Implemented different logging in SSIS packages to log events during run time.

Experience in using the BI xPress tool to apply an auditing framework and to deploy and manage SSIS packages.

Proficient in creating QV data models, QV objects, reports, graphical interactive UI and dashboard using different data sources like SQL Server, Oracle, QVD files, Access, text files, excel documents etc. as per the user requirement.

Strong understanding of Dimensional Modeling techniques, multidimensional database schemas such as Star and Snowflake, Fact and Dimension tables, Section Access and Set Analysis in QlikView, and DW concepts.

Proficiency in creating different types of reports, such as List, Cross-Tab, Conditional, Drill-down, Master-Detail, Chart, Summary, and Form reports, and formatting them using the QlikView reporting tool.

Performed different types of loads: Incremental Load, Optimized Load, and Binary Load.

Experience in implementing various levels of File/Document and Data level Security.

Proficient in creating ad-hoc reports and complex dashboards in Tableau.

Experience in report writing using SQL Server Reporting Services (SSRS), creating various types of reports such as table, matrix, and chart reports, and web reporting by customizing URL Access.

Excel under pressure in deadline-driven environments, with superior interpersonal communication and technical documentation skills.
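The summary above mentions complex multi-table queries with inner and outer joins. A minimal sketch of that idiom follows, using Python's sqlite3 as a self-contained stand-in for SQL Server; the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical schema; sqlite3 stands in for a SQL Server source.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0);
""")

# A LEFT OUTER JOIN keeps customers with no orders; an inner join would drop them.
rows = cur.execute("""
SELECT c.name, COALESCE(SUM(o.amount), 0) AS total
FROM customers c
LEFT JOIN orders o ON o.customer_id = c.id
GROUP BY c.name
ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 350.0), ('Globex', 0)]
```

The COALESCE wrapper is the usual way to turn the NULL produced by an unmatched outer-join row into a reportable zero.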

TECHNICAL SKILLSET

Programming Languages

C, C++, SQL, T-SQL, PL/SQL

RDBMS

MS SQL Server 2000/2005/2008/2012, Oracle 9i/10g, Microsoft Access, DB2

ETL Tools

SQL Server Integration Services (SSIS) 2005/2008/2012, DTS

Reporting Tools

SQL Server Reporting Services (SSRS) 2000/2005/2008, QlikView 11, Tableau

Database Tools

SQL Server Management Studio (SSMS), SQL Server Business Intelligence Studio (BIDS), Enterprise Manager, SQL Profiler, Teradata, Query Analyzer

Operating Systems

Windows 98, Windows 2000 Advanced Server/Professional, Windows Server 2003, Windows NT, Windows XP Professional/Standard, Windows Vista Business/Ultimate, Windows 7 Ultimate.

EDUCATION

Master's in Healthcare Administration - Western Kentucky University

Bachelor's in Dental Surgery - Baqai Medical University

PROFESSIONAL EXPERIENCE

City National Bank, LA, CA May 2018 – Present

Senior Data Analyst/Power BI Developer

This project plans, executes, and finalizes non-technical projects according to timelines and milestone events and within budget. This includes acquiring resources, capacity planning, and coordinating and motivating all project team members, contractors, and/or consultants to complete project deliverables flawlessly. Project Managers help define project objectives, scope, and requirements, and ensure high levels of development and project execution throughout the project life cycle.

Responsibilities

Perform numerous data pulling requests using SQL for analysis.

Requirements Elicitation, Analysis, Communication, and Validation according to Six Sigma Standards.

Substantial report development experience utilizing SQL Server Reporting Services (SSRS), Cognos Impromptu, and Microsoft Excel.

Generate reports using advanced Teradata techniques such as RANK and ROW_NUMBER.

Involved in requirement gathering and database design and implementation of star schema, dimensional data warehouse using Erwin.

Work on different packages with various transformations such as Lookup, Fuzzy Lookup, Derived Column, Join, Merge Join, Conditional Split, and Slowly Changing Dimension.

Design, develop, and maintain business intelligence data visualizations using Microsoft Power BI.

Perform data visualization, dashboard design/implementation, reporting, self-service BI, extraction, transformation, migration, integration, and other analytics.

Develop the Power BI model used for financial reporting of Daily Cash and P&L.

Collected and interpreted business requirements from various groups to produce actionable tasks.

Create complex measures using Data Analysis Expressions (DAX) query.

Install and configure Enterprise gateway and Personal gateway in Power BI service.

Configure automatic refresh and scheduled refresh in the Power BI service.

Create Parameterized Report, Dashboard Report, Linked report and Sub Report by Year, Quarter, Month, and Week.

Create Drill Down Reports, Drill Through Report by Region.

Write calculated columns and measure queries in Power BI Desktop to support data analysis.

Develop high level matrix data model based on business requirements using multiple data sources.

Create Workspace and content packs for business users to view the developed reports.

Perform data quality analysis and data migration from CSV to MySQL.

Design Power BI data visualization using cross tabs, maps, scatter plots and other available visualization templates.

Troubleshoot ETL packages using SSIS functionality such as breakpoints, checkpoints, and event handlers.

Identify/document data sources and transformation rules required to populate and maintain data warehouse content.

Develop Stored Procedures to generate various Drill-through reports, Parameterized reports, Tabular reports and Matrix reports using SSRS.

Develop dashboard reports using Reporting Services, Report Model and ad-hoc reporting using Report Builder.

Create SQL Server reports using SSRS 2008; identify data sources and define them to build data source views.

Work on querying data, creating on-demand reports using Report Builder in SSRS, and sending the reports via email.

Perform testing for data products (Tableau reports and Teradata data infrastructure).

Develop proof of concept to replicate the processes and reports using cloud technologies.

Lead the data correction and validation process, using data utilities to fix mismatches between different shared business operating systems.

Design and develop matrix and tabular reports with drill down, drill through and drop down menu option using SSRS.

Create Ad-Hoc Reports, Summary Reports, Sub Reports, and Drill-down Reports using SSRS.

Involved in Master data analysis, design, Interfaces analysis, Data Analysis, Data Quality, Data Architecture tasks.

Perform Data Analysis and Data Validation by writing complex SQL queries using Teradata SQL Assistant.

Involved in mentoring specific projects in application of the new SDLC based on the Agile Unified Process, especially from the project management, requirements and architecture perspectives.

Worked on Data mapping, logical data modeling, used SQL queries to filter data within the Oracle database tables.

Perform Business Process Modeling using Visio. Perform customer Data Analysis.

Revised the Architecture document to reflect standards for ETL and ELT processes, ODS loading, the Netezza data warehouse environment and extraction via interfaces, reports and querying.

Use SDLC (System Development Life Cycle) methodologies like the RUP and the waterfall.

Supported Aspera SmartTrack, since rebranded as CA Software Asset Manager (SAM).

Assist in the analysis, design, coding and development of building a new schema (staging tables, views, SQL files).

Extensively used Teradata Advanced Techniques in writing the scripts.

Facilitate creation of the site taxonomy, mapping current metadata to new PRISM elements.

Create automation scripts using Python.

Create an abstraction layer (function libraries) to support automation scripts in Python.

Author various Use Cases, Activity diagrams, and Sequence diagrams using Rational RequisitePro, and use UML methodology to define Data Flow Diagrams (DFDs).

Used Pivot Tables to summarize records and tables, and used VLOOKUPs to find corresponding values in large Excel sheets.

Create technical design documentation for the data models, data flow control process, metadata management.

Created tables, views, and database links to view the information extracted from SQL files.

Worked on tables (Set, Multiset, Derived, Volatile, Global Temporary) and views using SQL scripts.

Built a Financial Analysis Business Intelligence Solution for a public sector client using Hyperion Essbase, Hyperion EIS, Hyperion Studio, Informatica, Oracle DB, and OBIEE.

Responsible for indexing the tables in the data warehouse.
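The Teradata windowing techniques mentioned in this role (RANK, ROW_NUMBER) can be sketched as follows; sqlite3 is used as a self-contained stand-in for Teradata, and the transaction table is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE txn (region TEXT, amount REAL);
INSERT INTO txn VALUES
  ('East', 500.0), ('East', 300.0), ('West', 700.0), ('West', 200.0);
""")

# ROW_NUMBER() numbers transactions within each region by amount, descending --
# the same windowing idiom used in Teradata reporting queries.
rows = cur.execute("""
SELECT region, amount,
       ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
FROM txn
ORDER BY region, rn
""").fetchall()
print(rows)
# [('East', 500.0, 1), ('East', 300.0, 2), ('West', 700.0, 1), ('West', 200.0, 2)]
```

RANK() differs from ROW_NUMBER() only in how it treats ties: tied amounts share a rank, and the next rank is skipped.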

Environment: MS Power BI Desktop, Crystal Reports, Power BI Reporting Server, SQL Server 2016, Python, Microsoft Power Pivot, Power Query.

Sun Trust Bank Atlanta, GA Feb 2017 – May 2018

Data/BI Analyst

SunTrust is a financial services company ranked among the 20 largest U.S. banking companies. It offers a variety of banking products and financial solutions, such as personal banking, small business and retail banking, home financing and equity loans, asset management, wealth management, corporate and investment banking, and credit card services, to its customers. Personal Banking involves opening and maintaining checking, savings, and recurring deposit accounts and personal loans. The SunTrust Web Banking project I worked on is an online banking application that enables customers to access their checking and savings accounts and credit card accounts through the Internet.

Responsibilities

Drafted functional/non-functional requirement documents from JAD meetings.

Involved in the analysis of Report design requirements and actively participated and interacted with Project Lead, Technical Manager, and Lead Business Analyst to understand the Business requirements.

Involved in Logical & Physical Data Modeling. Database Schema design and modification of Triggers, Scripts, Stored Procedures in Sybase Database Servers.

Worked on data modeling and produced data mapping and data definition documentation.

Prepare Project Milestones, Tasks, Estimates, RDD, BRD, Conceptual architecture, Process Models, Entity Relationship Diagrams, Data Flows, Data Integration, Source to Target Mapping, MDM business/technical Meta Data, Facts/ Dimensions/ Hierarchies, Logical/Physical Star schema Data Models, Report mockups, Data Governance, Data Quality, SOX controls framework, Glossary.

Wrote Python scripts to parse JSON documents and load the data into the database, and also used Python scripts to update content in the database and manipulate files.

Source Data Analysis, Source to Target data mapping, Data Integration, Define Data Governance requirements.

Participated in project review meetings and gathering Data Analysis Documents.

Developed Power BI model used for financial reporting and Customer behavior.

Expertise in writing complex DAX functions in Power BI and Power Pivot.

Wrote calculated columns and measure queries in Power BI Desktop.

Developed Power BI reports for historical data analysis.

Conducted gap analysis, documented functional requirements and use cases, wrote functional design (MD50) for PDH extensions, and assisted technical design and development.

Utilized corporation developed Agile SDLC methodology. Used ScrumWork Pro and Microsoft Office software to perform required job functions.

Conducted a budget planning of the project through walkthroughs and meetings involving various leads from Development, QA and Technical Support teams.

Migrated DTS packages from SQL Server to SSIS packages.

Worked on Cube structure optimization for MDX query performance in Analysis Services 2012/2008(SSAS).

Wrote complex stored procedures, views, triggers, and user-defined functions to implement business logic structures.

Added error handling techniques to SQL code to identify run time issues.

Created Calculations, KPI's and named sets utilizing multi-dimensional expressions (MDX).

Customized hierarchies, calculated measures utilizing data analysis expressions (DAX).

Secured SSAS MDX and Tabular cubes at cell and row levels respectively.

Designed multiple reports with SSRS report designer for business analyses.

Utilized Power BI to design and deploy analytical reports/dashboards from tabular models, assisting with KPI analyses.

Create complex SQL queries extracting millions of raw records into easy-to-understand tables with trend interpretation.

Perform data modeling to predict transaction volume trends, identifying suspicious payment method misuse.

Conduct ongoing and historical data comparisons by categorized functions and classification analysis using Python.

Automated the data querying/analysis process with Python and VBA, reducing time spent by 70% and improving accuracy by 40%.

Visualize complex data at senior managers' request for reporting metrics, using analysis tools such as Spotfire and Tableau.

Generate workflows, an issue-tracking log, and a document library for the customer review process using VBA.

Use big data tools such as Spark to validate a text-mining business model for matching fuzzy customer names.

Conduct customer background research based on transaction information through search engine and authority API.

Understood the business process, performed as-is analysis, and was the prime resource for any processes at macro and micro levels.

Assist other project teams with data analysis, data mining and data profiling as needed.

Worked on requirements statements, use cases, process flows, site maps, taxonomy/ontology analysis, and wire frames.

Liaison between the marketing and technology departments (on-shore/off-shore) for Agile and Waterfall projects: Hortonworks Hadoop (Big Data), Business Intelligence/Analytics, Data Warehouse, Master Data Management, MicroStrategy, Tableau, Teradata, QlikView, and Omniture projects.

Reviewed the data model and reporting requirements for Cognos Reports with the Data warehouse/ETL and Reporting team.

Conducted requirements walkthrough session with the development, QA and business teams respectively. Analyze data with Informatica Analyst and update Master Data Management (MDM).

Developed data conversion programs for membership, claims, and benefit accumulator data - converted thirteen corporate acquisitions. Developed data field mappings. Provided programming and support for claims processing functions and auto-adjudication.

Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.

Worked on all Phases of SDLC with Waterfall & Agile Scrum methodologies.

Designed conceptual and logical data models of Data Warehouse and tables.

Identify Facts, Dimensions and Hierarchies, Source to Data Warehouse Data Mapping, MDM.

Prepared Logical Data Models containing Entity Relationship Diagrams, Data Flow Diagrams, and supporting documents describing the relationships between the data elements, to analyze and document the business data requirements.

Prepared test data sets and performed data testing using PL/SQL scripts. Also used MS Excel for data mining, data cleansing, data mapping, data dictionaries, and data analysis.

Mapped process flows; assessed as-is processes through user interviews, data collection, and analysis; designed and evaluated to-be process solutions.

Authored progress and completion reports, which were then submitted to project management on a weekly basis.

Participated in writing data mapping documents and performing gap analysis to verify the compatibility of existing system with new business requirements.
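The JSON-parsing-and-loading scripts mentioned in this role might look roughly like the sketch below; the payload, the accounts table, and the sqlite3 target are illustrative stand-ins for the actual source documents and database:

```python
import json
import sqlite3

# Hypothetical payload; in practice the JSON came from source documents/feeds.
payload = '[{"id": 1, "balance": 120.5}, {"id": 2, "balance": 80.0}]'
records = json.loads(payload)

conn = sqlite3.connect(":memory:")  # stand-in for the target database
cur = conn.cursor()
cur.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")

# executemany with named parameters maps each parsed dict onto a row.
cur.executemany(
    "INSERT INTO accounts (id, balance) VALUES (:id, :balance)", records
)
conn.commit()

total = cur.execute("SELECT SUM(balance) FROM accounts").fetchone()[0]
print(total)  # 200.5
```

Named placeholders keyed to the JSON field names keep the mapping between document fields and table columns explicit.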

Environment: MS Power BI Desktop, Power BI Reporting Server, SQL Server 2012, MS Office 2016, Microsoft Power Pivot, Python, Power Query, Excel with Power Pivot, Power View, TFS, SharePoint, Power BI, MS Azure

Progressive Insurance – Mayfield, OH Oct 2015 – Jan 2017

Data Analyst

Progressive is a leading auto insurance company based in the Cleveland suburb of Mayfield Village, Ohio. In addition to auto insurance, Progressive offers the following types of insurance to customers throughout the country: boat/personal watercraft, commercial auto, homeowners, motorcycle, and RV insurance. The project dealt with managing login and user management, database creation, tuning, backup planning, and generating reports using Power BI for the data containing information about the personal insurance policies of the customers.

Responsibilities

Coordinated with the stakeholders and project key personnel to gather functional and non-functional requirements during JAD sessions.

Responsible for preparing plans and data sets for data migration from a legacy system to the newly proposed system.

Assisted the PMO Leadership Team with the coordination and administration of the Project Management Office, including but not limited to the following activities: managing and adjusting project timelines, IT development work efforts for standardization of project implementations, administration duties, monitoring deadlines, and activity and issue tracking.

Involved in Data Analysis, Data Mapping, and Data Modeling, and performed data integrity and data portability testing by executing SQL statements.

Assist other project teams with data analysis, data mining and data profiling as needed.

Studied the existing business process and created AS-IS workflow to illustrate the existing system.

Built reports and report models using SSRS to enable end user report builder usage.

Design, development, implementation, and roll-out of MicroStrategy Business Intelligence applications.

Interacted with database developers for formulating the ER diagrams and data flow diagrams.

Created various reports such as ad-hoc reports, drill-down, drill-through, and parameterized reports using Reporting Services (SSRS) for internal teams.

Worked on Master Data Management (MDM) for maintaining the customer information and also for the ETL rules to be applied.

Created Use case Diagrams, Activity Diagrams, Sequence Diagrams and ER Diagrams in MS Project.

Involved in Data Modeling of both the Logical Design and Physical Design of the data warehouse and data marts in Star Schema and Snowflake Schema methodology.

Created daily and monthly reports using SQL and UNIX for Business Analysis.

Utilized ODBC for connectivity to Teradata & MS Excel for automating reports and graphical representation of data to the Business and Operational Analysts.

Performed extensive data modeling to differentiate between the OLTP and Data Warehouse data models.

Test automation experience for GUI and application interface regression testing using WinRunner, QTP, and Mercury Quality Center, and scripting (TCL, Python, and Perl); also experience with DB/SQL systems (Oracle, etc.).

Create and maintain data model/architecture standards, including master data management (MDM).

Requirements Analysis: performed Gap-Fit on each MDM requirement and provided a high-level estimate of the level of effort required to meet each requirement.

Performed manual testing to check the data flow of the application using SQL.

Worked on loading data from flat files to Teradata tables using SAS Proc Import and FastLoad Techniques.

Provide business intelligence analysis to decision-makers using an interactive OLAP tool.

Designed and Developed the Business Objects Universes which suit the standard, analytical and ad-hoc reporting requirements of the Business Objects users.

Collaborated with corporate accountants to analyze Financial Statements for Risk Estimation according to GAAP principles.

Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

Used Star Schema and Snowflake Schema for data marts / Data Warehouse.

Gathered requirements and modeled the data warehouse and the underlying transactional database.

Facilitated Joint Requirement Planning (JRP) sessions with SMEs to understand the requirements pertaining to Loan Origination through Loan Processing.

Created and reviewed the acceptance test cases with the Product Management and Development teams.

Ensured that all artifacts were in compliance with corporate SDLC policies and guidelines.
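The flat-file-to-table loads described in this role (done at the time with SAS Proc Import and Teradata FastLoad) can be sketched in Python; the file contents, the policies table, and the sqlite3 target are hypothetical stand-ins:

```python
import csv
import io
import sqlite3

# Hypothetical flat file; io.StringIO stands in for a file on disk.
flat_file = io.StringIO("policy_id,premium\nP-100,450.0\nP-101,615.0\n")
reader = csv.DictReader(flat_file)

conn = sqlite3.connect(":memory:")  # stand-in for the target database
cur = conn.cursor()
cur.execute("CREATE TABLE policies (policy_id TEXT PRIMARY KEY, premium REAL)")

# DictReader yields one dict per row, which executemany maps onto named params.
cur.executemany(
    "INSERT INTO policies VALUES (:policy_id, :premium)", reader
)
conn.commit()

count = cur.execute("SELECT COUNT(*) FROM policies").fetchone()[0]
print(count)  # 2
```

Streaming the reader straight into executemany avoids materializing the whole file in memory, which matters for large extracts.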

Environment: MS SQL Server 2008, Visual Studio 2008, Windows Server 2008, T-SQL, QlikView 11/10, Tableau, MS Excel, Microsoft SQL Server Integration Services (SSIS), Microsoft SQL Server Reporting Services (SSRS), TFS.

Wells Fargo, Charlotte, NC Jun 2012 – May 2015

Data Analyst

Wells Fargo is the fourth largest bank in the U.S. by assets and the largest bank by market capitalization. The Loan Origination System – Mortgage Banking Division provides a variety of products and services to individuals, including mortgages for home purchase, home equity loans, refinancing, reverse mortgages, and specialized mortgage loans. The Loan Origination System automates every task involved in a mortgage business, from pre-approving an applicant to the formal closing of a mortgage loan. The LOS automates movement of records within the organization for approval along the hierarchy, governed by the organization's business rules and guidelines.

Responsibilities

Created complex SQL queries, stored procedures, functions, triggers, joins and SSIS/DTS packages.

Responsible for performance tuning and optimization of SQL queries and indexes.

Created the design documents for the entire project including table structures and design logic of stored procedures, packages and Reports.

Analyzed existing risk models and internal credit ratings on the customers by comparing the critical financial index.

Implemented database service/testing systems, such as a bank transaction data testing system, and assisted model validation.

Organized financial data sets using appropriate experimental design and hypothesis validation.

Controlled financial criminal behavior by analyzing transaction account information, using SQL Server for querying.

Managed the risk of co-lending banks by analyzing the capital distribution (focusing on long-term debt and outstanding stock) and liquidity/credit risk, eliminated the side effect of co-lending activity.

Created and managed database schema objects such as tables, views, indexes, and stored procedures, and maintained referential integrity for the lookup tables.

Created Stored Procedures in T-SQL to fill datasets from the MNS Database and generate Media Rates Reports in Excel.

Responsible for fine-tuning of the database, troubleshooting, user administration, memory management, and running DBCC commands.

Perform daily database backup & restoration and monitor the performance of Database Server.

Debugged Stored Procedures and improved Query Performance while Retrieving Millions of Records by creating Indexes.

Responsible for developing processes, automation of maintenance jobs, tuning SQL Server, locks and indexes configurations, administering SQL Server security, SQL Server automatic e-mail notification and SQL Server backup strategy and automation.

Configured and maintained Report Manager and Report Server for SSRS.

Designed the usage summary, performance, Audit/Log, Job Status and detailed reports for ETL users in SSRS.

Experience in creating ad-hoc reports using Report Builder 1.0 and Report Builder 2.0.

Involved in Migration of SSRS 2005 to SSRS 2008.

Involved in extracting the data using SSIS from OLTP to OLAP.

Involved in creating tables, stored procedures, functions, and triggers at the back end using SQL Server.

Designed high level ETL architecture for overall data transfer from the source server to the Enterprise Services Warehouse which encompasses server name, database name, accounts, tables and direction of data flow, Column Mapping, Data dictionary and Metadata.

Transformed data from various data sources using OLE DB connection by creating various SSIS packages.
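Transactional error handling of the kind this role relied on (T-SQL TRY/CATCH with rollback around a batch) has a close Python analogue; this sketch uses sqlite3 and a hypothetical ledger table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ledger (id INTEGER PRIMARY KEY, amount REAL NOT NULL)")

# Wrap the batch in an explicit transaction so a runtime error rolls back
# every row, mirroring T-SQL's BEGIN TRY ... BEGIN CATCH ... ROLLBACK pattern.
try:
    with conn:  # commits on success, rolls back on exception
        conn.execute("INSERT INTO ledger VALUES (1, 100.0)")
        conn.execute("INSERT INTO ledger VALUES (2, NULL)")  # violates NOT NULL
except sqlite3.IntegrityError as exc:
    print("batch rolled back:", exc)

count = conn.execute("SELECT COUNT(*) FROM ledger").fetchone()[0]
print(count)  # 0 -- the first insert was rolled back along with the failed one
```

The key property is atomicity: a partial batch never persists, which is exactly what the TRY/CATCH-with-ROLLBACK idiom guarantees in stored procedures.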

Environment: SQL Server 2000/2005 Enterprise Edition, SQL, Enterprise Manager, SSRS 2005, SSIS 2005, Spotlight, MS SharePoint, MS Access 2000, Windows 2003/2000 platform, Oracle, VB.NET, ASP.NET.


