
Data Analyst Modeling

Location:
Chicago, IL
Posted:
March 30, 2025


Resume:

Afreen

Email: *************@*****.***

Phone: 630-***-****

Professional Summary:

• Over 5 years of IT professional experience as a Data Analyst.

• Strong Data Warehousing, Data Marts, Data Analysis, Data Organization, Metadata and Data Modeling experience on RDBMS platforms.

• Extensive knowledge in designing, developing and implementing Data Marts and data structures using Stored Procedures, Functions, data warehouse tables, Views, Materialized Views and Indexes at the database level with Oracle PL/SQL.

• Excellent hands-on experience in relational data modeling.

• Experience in creating views for reporting purposes involving complex SQL queries with sub-queries, inline views, multi-table joins, WITH clauses and outer joins as per functional needs.

• Working experience with Teradata, Ab Initio, Business Objects, Crystal Reports, PL/SQL, SAS, MS Excel and MS Access.

• Proficient in Teradata SQL coding using the Teradata BTEQ utility; working knowledge of Teradata Parallel Transporter (TPT) and TPump.

• Hands-on experience in UNIX shell scripting.

• Experience with data archival to SAS data sets and flat files.

• Experience in creating datasets using SAS PROC SQL from flat files (.csv, caret-delimited, etc.) through a UNIX dev box; wrote macros and functions in SAS 9.2 and SAS 9.3.

• Strong experience with Base SAS, SAS/STAT, SAS/ACCESS, SAS/GRAPH, SAS Macros, SAS ODS and PROC SQL in Windows environments.

• Maintained ethical standards with data by ensuring database integrity and compliance with legislative, regulatory and accrediting requirements.

• Experienced in performance tuning and optimization to increase the efficiency of scripts on large databases for fast data access, conversion and delivery.

• Experience creating design documentation related to system specifications including user interfaces, security and control, performance requirements and data conversion.

• Extensively worked in Agile delivery environments and all phases of the Software Development Life Cycle (SDLC).

• Team player able to work independently with minimal supervision; innovative and efficient, good at debugging, with a strong desire to keep pace with the latest technologies.

• Excellent communication and presentation skills, with good experience communicating and working with various stakeholders.

Technical Skills:

ETL Tools: Informatica, SSIS, DataStage, Ab Initio

Data Modeling: Erwin, Power BI

Databases: Teradata, Oracle, MS SQL Server, MS Access, DB2

OLAP Tools: MicroStrategy OLAP Suite, Cognos, Business Objects

Languages: C#.NET, ASP.NET, ADO.NET, SQL, PL/SQL, UNIX Shell Scripts

Operating Systems: Windows, UNIX, Sun Solaris, Linux

Testing Tools: HP ALM and Quality Center

Domain: Banking, Finance, Retail

Professional Experience:

Client: Broadway Bank, San Antonio, TX. Apr 2023 – Present
Role: Data Analyst

Responsibilities:

• Interacted with clients to gather business and system requirements which involved documentation of processes based on the user requirements.

• Analyzed the pertinent client data and worked with the Analytics team to inspect, cleanse, transform and model data and provide valuable feedback to the client.

• Performed Data Analysis using visualization tools such as Tableau, Spotfire, and SharePoint to provide insights into the data.

• Implemented Multidimensional and Tabular cubes to perform interactive analysis.

• Configured Azure platform offerings for web applications and business intelligence using Power BI, Azure Data Factory, etc.

• Conceptualized and designed extraction, transformation and loading of client data from multiple sources using SQL Server Integration Services (SSIS).

• Created DDL scripts for implementing data modeling changes. Created Erwin reports in HTML or RTF format depending upon the requirement.

• Published the data model in the model mart. Skilled in system analysis, E-R/dimensional data modeling, database design and implementing RDBMS-specific features.

• Created the conceptual model for the data warehouse using the Erwin data modeling tool. As part of the team, conducted logical data analysis and data modeling JAD sessions and communicated data-related standards, covering dimensional modeling (Ralph Kimball approach, Star/Snowflake schemas), data marts, OLAP, fact and dimension tables, physical and logical data modeling and Oracle design.

• Assisted in mining data from the SQL database that was used in several significant presentations.

• Assisted in offering support to other personnel who were required to access and analyze the SQL database.

• Performed data analysis and data profiling using complex SQL on various sources systems including Oracle and Teradata.

• Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.

• Responsible for development of workflow analysis, requirement gathering, data governance, data management and data loading.

• Set up data governance touch points with key teams to ensure data issues were addressed promptly.

• Responsible for facilitating UAT (User Acceptance Testing), PPV (Post Production Validation) and maintaining Metadata and Data dictionary.

• Responsible for source data cleansing, analysis and reporting using pivot tables, formulas (VLOOKUP and others), data validation, conditional formatting, and graph and chart manipulation in Excel.

• Actively involved in data modeling for the QRM Mortgage Application migration to Teradata and developed the dimensional model.

• Created views for reporting purposes involving complex SQL queries with sub-queries, inline views, multi-table joins, WITH clauses and outer joins as per the functional needs in the Business Requirements Document (BRD).

• Conducted data cleaning, manipulation, modification and combination using a variety of SAS steps and functions. Analyzed, tested, documented and maintained SAS programs and macros to generate SAS datasets, spreadsheets, data listings, tables and reports.

• Responsible for generating financial business reports using SAS Business Intelligence tools (SAS/BI); developed ad-hoc reports using SAS Enterprise Guide.

Environment: Agile, Teradata, BTEQ, FastLoad, FastExport, MultiLoad, Oracle, UNIX Shell Scripts, SQL Server, SAS, PROC SQL, MS Office Tools, MS Project, Windows XP, MS Access, Pivot Tables.

Client: Citizens Insurance, Jacksonville, FL. Feb 2022 – Mar 2023
Role: Data Analyst

Responsibilities:

• Performed daily validation of business data reports by querying databases and reran missing business events before the close of the business day.

• Analyzed functional and non-functional categorized data elements for data profiling and mapping from source to target data environments. Developed working documents to support findings and assigned specific tasks.

• Worked on claims data and extracted data from various sources such as flat files, Oracle and Mainframes.

• Gathered Business requirements by interacting with the business users, defined subject areas for analytical data requirements.

• Created datasets using SAS PROC SQL from flat files.

• Optimized complex queries for data retrieval from large databases.

• Performed root cause analysis of data discrepancies between different business systems by examining business rules and data models, and provided the analysis to the development/bug-fix team.

• Led the data correction and validation process, using data utilities to fix mismatches between different shared business operating systems.

• Conducted downstream analysis of the tables involved in data discrepancies and arrived at solutions to resolve them.

• Performed extensive data mining of attributes in business tables, providing consolidated analysis reports and resolutions on an ongoing basis.

• Performed data analysis and data profiling using complex SQL.

• Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.

• Verified data quality after every deployment and performed extensive analysis of data variance pre- and post-implementation.

• Utilized data to prepare and conduct quality control with SAS.

• Provided programming parameters to technical staff for all data extracts.

• Partnered with internal teams to prepare analytics and validate results with SAS.

• Conducted data cleaning, manipulation, modification and combination using a variety of SAS steps.

• Developed and maintained an MS Access database.

• Created an MS Access database to collect data from multiple systems and provide lists, status reports and management overview reports.

• Managed all data related aspects of multiple directory projects through various stages of production.

• Defined periodic extraction and formatting of data from multiple sources and built the basic queries and reports required by the customer.

• Analyzed critical data elements (CDEs) in business processes, data conversion, data movement and data integrity checks before delivering data to operations and financial analysts for upload to databases, in accordance with IM policy compliance.

• Extensively involved in User Acceptance Testing (UAT) and Regression testing.

• Involved in Data Reconciliation Process while testing loaded data with user reports.

• Documented all custom and system modifications.

• Worked with offshore and other environment teams to support their activities.

• Responsible for deployment on test environments and supporting business users during User Acceptance Testing (UAT).

Environment: DataStage, Oracle 10g, DB2, Sybase, TOAD, Cognos, SQL Server, TSYS Mainframe, SAS PROC SQL, SQL, PL/SQL, ALM/Quality Center 11, QTP 10, UNIX, Shell Scripting, XML, XSLT.

Client: Molina Healthcare, Long Beach, CA. Nov 2020 – Jan 2022
Role: Data Analyst

Responsibilities:

• Responsible for gathering data migration requirements.

• Identified problematic areas and conducted research to determine the best course of action to correct the data.

• Analyzed reports of data duplicates or other errors to provide ongoing appropriate inter-departmental communication and monthly or daily data reports.

• Monitored select data elements for timely and accurate completion.

• Identified, analyzed and interpreted trends or patterns in complex data sets.

• Monitored data dictionary statistics.

• Developed logical and physical data models that capture current state/future state data elements and data flows using Erwin.

• Extensively used Erwin for forward and reverse engineering, following corporate naming-convention standards and using conformed dimensions whenever possible.

• Involved in data mapping and data cleanup.

• Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.

• Enabled a smooth transition from the legacy to the newer system through the change management process.

• Archived old data by converting it into SAS data sets and flat files.

• Analyzed and maintained SAS programs and macros to generate SAS datasets.

• Created Technical Design Documents, Unit Test Cases.

• Involved in test case and data preparation, execution and verification of the test results.

• Reviewed PL/SQL migration scripts.

• Coded PL/SQL packages to perform Application Security and batch job scheduling.

• Created user guidance documentation.

• Created a reconciliation report for validating migrated data.

• Analyzed problems and solved issues with current and planned systems as they relate to the integration and management of order data.

• Planned project activities for the team based on project timelines using a Work Breakdown Structure.

Environment: UNIX, Shell Scripting, XML Files, XSD, XML Spy, SAS PROC SQL, Cognos, Oracle 10g, Teradata, Sybase, Mercury Quality Center, Toad, Autosys.

Client: The Home Depot, Atlanta, GA. Sep 2019 – Oct 2020
Role: Data Analyst

Responsibilities:

• Analyzed problems and solved issues with current and planned systems as they relate to the integration and management of order data.

• Checked data flow with source-to-target mapping of the data.

• Created a data matrix for mapping the data to business requirements.

• Performed data profiling to cleanse the data in the database and raised the data issues found.

• Created and reviewed mapping documents based on data requirements

• Engaged in logical and physical design and transformed logical models into physical models through forward engineering in Erwin.

• Performed small enhancements (data cleansing/data quality).

• Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.

• Involved in data mapping and data cleanup.

• Enabled a smooth transition from the legacy to the newer system through the change management process.

• Created datasets using SAS PROC SQL from flat files.

• Extracted, transformed and loaded the data into databases using Base SAS.

• Involved in Test case/data preparation, execution and verification of the test results.

• Reviewed PL/SQL migration scripts.

• Coded PL/SQL packages to perform Application Security and batch job scheduling.

• Created user guidance documentation.

• Created a reconciliation report for validating migrated data.

Environment: UNIX, Shell Scripting, XML Files, XSD, XML Spy 2010, SAS PROC SQL, Oracle, Teradata, Sybase, Toad.
