
Data Analyst Modeling

Location:
Saint Paul, MN
Posted:
May 05, 2024


Resume:

Niha Sanobar Syed

Senior Data Analyst

********@*****.*** 614-***-****

PROFESSIONAL SUMMARY

Around 8 years of experience in data mining with large structured and unstructured datasets, data acquisition, data validation, predictive modeling, and data visualization.

Experienced in using SAS, Tableau and Power BI for data cleaning, data visualization, risk analysis and predictive analytics.

Strong knowledge of and hands-on experience with SDLC methodologies and Business Process Models (BPM) such as Agile and Waterfall.

Extensive experience building reports and dashboards in Microsoft Power BI using Azure Data Warehouse and Microsoft Azure Analysis Services; involved in performance tuning of reports and resolving issues within databases and cubes.

Knowledge of retrieving data from different databases such as Oracle, SQL Server, MySQL, MS Access, DB2, and Teradata.

Extensive knowledge of enterprise data warehousing, including data modeling, data architecture, data integration (ETL/ELT), and business intelligence.

Experienced in dimensional data modeling: relational modeling, ER modeling, star and snowflake schemas, fact and dimension tables, and conceptual, logical, and physical data models.

Good experience in production support: identifying root causes, troubleshooting, and submitting change controls.

Strong analytical and problem-solving skills and a quick learner; a committed team player capable of working to tight project delivery schedules and deadlines.

Experienced in writing design documents, system administration documents, test plans, and test scenarios/test cases, and in documenting test results.

Responsible for architecture design, data modeling, and implementation of Big Data platforms and analytics applications.

Experienced in data scrubbing/cleansing, data quality, data mapping, data profiling, and data validation in ETL.

Sound knowledge of the SDLC process; involved in all phases of the software development life cycle: analysis, design, development, testing, implementation, and maintenance of applications.

Worked with and extracted data from various database sources, including Oracle, SQL Server, DB2, and Teradata.

Involved in writing UNIX shell scripts for the Teradata ETL tool and for data validation.

Excellent at preparing required project documentation and regularly tracking and reporting project status to all stakeholders.

Excellent at creating project artifacts, including specification documents, data mapping documents, and data analysis documents.

An excellent team player and technically strong, with the ability to work with business users, project managers, team leads, architects, and peers, maintaining a healthy project environment.

TECHNICAL SKILLS

ETL Tools: SSIS, SSMS, SSRS, Informatica, TOAD

Databases: MS SQL Server, MS Access, DB2, Oracle, Teradata

Operating Systems: Linux, UNIX, Windows

Programming Languages: SQL, VB Script, Python

Software: Microsoft Office (Excel, Word, PowerPoint, Access), SQL Reporting

Reporting Tools: Tableau, Power BI, Qlik Sense, QlikView, SAP BI/BO

Project Management Tools: JIRA, GitHub, Confluence, SharePoint

Methodology: Agile, Waterfall

PROFESSIONAL EXPERIENCE

Humana, Kentucky (Remote) Jan 2022 – Present

Description: Enterprise Data Warehouse (RO) Project – reporting optimization that gives the business summary and detailed views of Medicare and Medicaid services across the different policies and applications provided.

Tools Used: TOAD for Data Point/Oracle, Power BI reports, Qlik Sense and QlikView reports, Microsoft Azure

Roles & Responsibilities:

Communicating with the business to collect and understand requirements

Leading the offshore QA team and collecting status updates

Creating a capacity sheet to allocate resources against the timeline

Creating test plans and test cases and uploading them to the Azure dashboard

Coordinating with the development team and contributing to grooming and sprint planning

Executing SQL queries and test cases in the DEV, QA, and Prod environments

Raising bugs and following them through to closure

Maintaining the test results document

Performing test automation using the Tosca tool

Performing compliance testing for the healthcare domain

Ameriprise, Minneapolis Aug 2021 – Jan 2022

Description: Advice Insights Project – data profiling and creating advice insights for Ameriprise customers to help them make better financial decisions.

Tools Used: Amazon AWS, HP – ALM, Jira, Python

Roles & Responsibilities:

Communicating with the business to collect and understand requirements

Data profiling – collecting, examining, and analyzing data and creating useful summaries for the business (see the sketch after this list)

Creating the Advice Insight design document: data curation, pseudocode, data mapping, and data modeling

Handing off the Insight design document to the development team

Performing content data testing and data lake validation via the AWS Data Lake once code is delivered by the developers

Executing all test cases and scenarios and raising any defects in Jira and HP QC/ALM

Attending stand-up calls and scrum meetings (Agile methodology)

Presenting the insights to the business through e-meetings
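
A minimal sketch of the kind of profiling query described above. The customer_accounts table and its columns are hypothetical, named here only for illustration; actual sources and columns would differ by engagement.

-- Profile one table: row count, null rate, distinct values, and value range.
SELECT
    COUNT(*)                                                  AS total_rows,
    SUM(CASE WHEN account_balance IS NULL THEN 1 ELSE 0 END)  AS null_balances,
    COUNT(DISTINCT account_type)                              AS distinct_account_types,
    MIN(account_balance)                                      AS min_balance,
    MAX(account_balance)                                      AS max_balance,
    AVG(account_balance)                                      AS avg_balance
FROM customer_accounts;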

Moody’s Analytics, New York (Remote) Nov 2020 – Aug 2021

Description: HDS Fermat Project – testing a web application for a financial services client to facilitate a bidirectional, on-demand connection and data transfer between the client’s existing Oracle RDBMS databases and their new Big Data analytics platform (Hadoop), built on a modern tech stack (Java, Spring frameworks, and a rich UI).

Tools Used: Oracle SQL Developer, DBeaver, Hadoop Hive (Hue), Amazon AWS, HP ALM, Jira, Microsoft Teams

Roles & Responsibilities:

Positive attitude, appreciation for technology, and a strong desire to accomplish assigned tasks

Extensive experience in ETL, DWH, and report testing, automated testing, performance testing, and code review tools

Preparing test plans, test strategy, test cases, and other necessary documentation

Debugging web applications to find root causes

Working with developers to resolve technical issues and fix bugs

Analyzing and providing recommendations for enhancements and improvements

Working with the Cucumber BDD framework

Working closely with onshore/client business and technical teams and end users

Following organization and project software development processes

Working with Agile methodologies

Working with Hadoop and Hive (Hue), rather than SQL Server, to interact with the database (see the sketch after this list)
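
A minimal sketch of the kind of source-to-target check used when validating data transferred from Oracle into Hadoop/Hive. The trades table and its columns are hypothetical; HiveQL-compatible syntax is assumed.

-- Hive side: row count and a simple checksum per load date.
SELECT load_date,
       COUNT(*)      AS row_count,
       SUM(notional) AS notional_total
FROM trades
GROUP BY load_date;

-- The same aggregates are run against the Oracle source, and the two
-- result sets are compared to confirm a complete, accurate transfer.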

Wells Fargo, India Nov 2018 – Nov 2020

Description: Market Risk QA – generating and validating Value at Risk (VaR) daily for all underlying trades. Trade and portfolio information from 100+ sources is staged and transferred before being used to generate general and stressed VaR, along with incremental and specific risk. The generated VaR is read by an in-house UI for the corresponding Market Risk Officer’s (MRO) review.

Tools Used: SQL Server Management Studio, SSIS packages, Autosys jobs, HP ALM, Jira

Roles & Responsibilities:

Scheduling and running batches of jobs for loading, transformation, and valuation using Autosys.

Using UNIX shell scripting, i.e., the command line, to run Autosys jobs.

Validating transactional data loading, transformation, and report extraction with prepared SQL queries (see the sketch after this list).

Verifying trade and portfolio information for VaR calculation.

Performing functional testing of new features and regression testing.

Validating UI results and functionality before MRO review.

Developing test scenarios, test cases, and test scripts.

Executing tests, including defect reporting, defect tracking, and defect retesting for progression and regression testing.

Started implementing Selenium automation with Java batch jobs.

Working with Agile methodologies.

Data risk management and identity management.
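
A minimal sketch of the kind of load-validation query described above. The staging and target table names, columns, and date are hypothetical.

-- Any trade present in staging but missing from the target for a given
-- business date indicates a failed or incomplete load.
SELECT s.trade_id
FROM stg_trades s
LEFT JOIN dw_trades t
       ON t.trade_id = s.trade_id
      AND t.business_date = s.business_date
WHERE s.business_date = '2020-11-02'
  AND t.trade_id IS NULL;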

Fifth Third Bank, India Dec 2015 – Nov 2018

Description: Based on the customer's relationship with Fifth Third Bank (an SLK client), PEGA, a decision-making tool, would offer services/products to the customer, which a banker would then call and explain (e.g., debit/credit cards or eligibility criteria for different types of loans). The customer then has a set number of days within which to opt in to the services to receive the benefits.

Tools Used: AQT, HP ALM, Microsoft Office, SAP BO and Tableau dashboards, JIRA, Big Data (Hive, Hadoop)

Roles & Responsibilities:

Extracting data from the source DB and comparing it with master data in the target DB.

Understanding the business requirements.

Writing test plans, test cases, and SQL queries to fetch data from the DB through the AQT tool, and comparing source data with target data through the Beyond Compare tool (macro).

Performing table structure validation, data validation, unique index validation, count validation, distinct column validation, boundary value analysis, and equivalence partitioning (see the sketch after this list).

Test execution and defect tracking; preparing the System Understanding Document.

Working with Agile methodologies and with the Informatica tool for ETL data integration, since it can connect to and fetch data from heterogeneous sources and process the data.
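
A minimal sketch of the count, distinct-column, and unique-index validations named above. The source and target tables and columns are hypothetical.

-- Count validation: source and target row counts should match.
SELECT (SELECT COUNT(*) FROM src_customers) AS source_count,
       (SELECT COUNT(*) FROM tgt_customers) AS target_count;

-- Distinct column validation: compare distinct values of a key column on
-- both sides; a mismatch points to dropped or mutated rows.
SELECT COUNT(DISTINCT customer_segment) AS distinct_segments
FROM tgt_customers;

-- Unique index validation: any key appearing more than once violates uniqueness.
SELECT customer_id, COUNT(*) AS occurrences
FROM tgt_customers
GROUP BY customer_id
HAVING COUNT(*) > 1;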


