Post Job Free

Data Analyst Sql Server

Location:
Fort Worth, TX
Posted:
August 01, 2025

Contact this candidate

Resume:

PARVEEN JALEEL . J

Data Analyst

Phone: +1-404-***-****

Email: *******.***********@*****.***

SUMMARY:

•11+ years of progressive IT experience in Business Requirements Analysis, Data Analysis, Data Warehouse and Database Testing, ETL Development, and Data Modeling.

•Good experience in evaluating business systems for user needs, business modeling and document processing.

•Strong background in designing various Logical and Physical Data Models using Erwin, MS Visio, ER Studio, Power Designer and Toad Data Modeler.

•Extensive work experience on Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) system environment.

•Expert knowledge of the SDLC (Software Development Life Cycle); involved in all project phases and familiar with Agile, Scrum, and Waterfall methodologies.

•Extensive knowledge and expertise in Metadata Management Systems, Data Integration, Data Mapping, Information Gathering, Data Cleansing, Data Manipulation and Performance Tuning techniques for optimal performance in dimensional and relational database environments.

•Solid experience in Relational Modeling, Dimensional Modeling, Conceptual Modeling, Logical Modeling, Physical Modeling, Data Warehousing, Fact Tables, Dimension Tables, Star Schema and Snowflake Schema as per enterprise standards.

•Experience in developing PL/SQL Scripts, stored procedures, Triggers, Views, and Indexes.

•Experience in Extract, Transform and Load (ETL) data from spreadsheets, database tables and other sources using SQL Server Integration Services (SSIS) and SQL Server Reporting Service (SSRS) for managers and executives.

•Strong understanding and usage of various types of dimensions, such as Junk, Slowly Changing, Role-Playing, and Degenerate Dimensions.

•Experienced in working with RDBMS like Oracle 10g/9i/8i, Microsoft SQL Server 2008 R2 and Teradata.

•Proficient in designing of Star and Snowflake schemas with a very good understanding of fact and dimensional tables.

•Good knowledge of working with different types of data sources, such as flat files, Excel, Oracle, Sybase, and SQL Server.

•Worked closely with ETL/report developers, database administrators, business analysts, and support teams.

•Well versed in Conceptual, Logical/Physical, Relational and Multi-dimensional modeling, Data analysis for Decision Support Systems (DSS), Data Transformation and Reporting.

•Exposure to both the Ralph Kimball and Bill Inmon data warehousing approaches.

•Good experience with Normalization (1NF, 2NF and 3NF) and De-normalization techniques for improved database performance.

•Working experience in Dimensional Data Modeling, Star Schema/Snowflake Schema, and Fact & Dimension Tables.

•Experience in building reports using SQL Server Reporting Services and Crystal Reports.

•Experience in performing Reverse Engineering of Physical Data Models from databases and SQL scripts.

•Experience in using SSIS in solving complex business problems.

•Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS) tools.

•Strong working Experience with Agile, Scrum, Kanban and Waterfall methodologies.

•Captured, validated and published metadata in accordance with enterprise data governance policies and MDM taxonomies.

•Extensive experience in development of T-SQL, Oracle PL/SQL Scripts, Stored Procedures and Triggers for business logic implementation.

•Hands on experience in analyzing data using Hadoop Ecosystem including HDFS, Hive, Streaming, Elastic Search, Kibana, Kafka, HBase, Zookeeper, PIG, Sqoop, and Flume.

•Good Knowledge on SQL queries and creating database objects like stored procedures, triggers, packages and functions using SQL and PL/SQL.

•Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects.

•Experience in generating DDL (Data Definition Language) Scripts and creating Indexing strategies.

•Strong presentation and communication skills, with good leadership qualities.

TECHNICAL SKILLS:

Programming Languages

C, C++, SQL, Python, R, Java.

Scripting Languages

MS-DOS, Bash, Korn.

Reporting Tools

SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.

Concepts

Predictive analysis, Decision trees, NLP, Data mining, Neural Networks, Computer Vision, Big Data, Machine Learning, Snowflake

Technologies

MS Excel, Tableau, PowerBI, Google Looker, SQL, NoSQL, AWS Redshift, Jira, TensorFlow, OpenCV, Keras, Flask, KNN

OLAP Tools

Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports.

Data Modeling

Erwin Data Modeler, Erwin Model Manager, ER Studio v17, and Power Designer.

Application/Web Servers

JBoss, Glassfish 2.1, WebLogic, Web Sphere, Apache Tomcat Server.

MS-Office Package

Microsoft Office (Word, Excel, PowerPoint, Visio, Project).

Visualization tools

Tableau Desktop, Python, Pandas, NumPy, Datorama

ETL Tools / Tracking tool

Informatica, SSIS, SSAS, SSRS / JIRA.

Database Development

T-SQL and PL/SQL, Microsoft Hyper-V Servers

Databases

Teradata R12 R13 R14.10, MS SQL Server, DB2, Netezza

BI/Reporting Tools

MS Power BI, Power BI Service, Google Analytics, Google Data Studio, MicroStrategy 9.x/8.x, IBM Cognos 10/8.x, Pentaho, OBIEE 11g, Oracle BI Publisher, SAP Business Objects XIR2, SSRS, SAS

RDBMS

Oracle 10g/9i/8i/7.x, MS SQL Server, UDB DB2 9.x, Teradata, MS Access 7.0

Environment

Windows (95, 98, 2000, NT, XP), UNIX

Utilities

SQL Server Management Studio, gamin 3, IBM Data Studio 4.1.2, Toad, SQL Developer, SQL plus, Rapid SQL, Teradata SQL Assistant 16, Teradata Studio 16, NZSQL, SQL*Loader, DB Visualizer, Navicat

EXPERIENCE:

Client: CTS/Verizon. USA

Role: Data Analyst Feb 2023-Present

Description: CTS mobility is a National Value-Added Reseller and Integrator of Mobility Solutions for SMB, Enterprise & Government markets. Since 2004, the company has emerged as a category leader by creating innovative solutions.

Responsibilities:

Collaborated with Subject Matter Experts and Product Owners to define data requirements and source-to-target mapping necessary to support advanced analytics initiatives within M&SC (Manufacturing and Supply Chain), including the interfaces between applications, databases and platforms.

Created a data warehouse and integrated data from different systems (Salesforce and SAP) into it. Worked with multiple teams to combine the different databases into a single universe.

Made Power BI reports more interactive and engaging by using storytelling features such as bookmarks, selection panes, and drill-through filters; also created custom visualizations using the R programming language.

Applied machine learning in Python to translate text from multiple languages into English for analysis.

Utilized Power BI to create various analytical dashboards that help business users gain quick insights into the data.

Published and maintained workspaces in Power BI Service, scheduled data refreshes, and maintained the apps and workbooks.

Used DAX functions to create measures and calculated fields to analyze data for logical visualization, and used Power Query to transform the data.

Converted the Boulder tables from Excel to Power BI with improved graphs and visualizations to support business understanding and improvement.

Collected plant data in Excel sheets from different regions around the world and combined them in Power BI to determine product profit and loss.
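
Combining per-region extracts into a single profit-and-loss view, as described above, is at heart a concatenate-then-aggregate step. A minimal Python sketch with invented product and cost figures (the actual work was done in Power BI):

```python
from collections import defaultdict

# Each list stands in for one region's Excel extract:
# (product, revenue, cost) -- all figures are made up.
emea = [("ProductA", 1200.0, 900.0), ("ProductB", 800.0, 950.0)]
apac = [("ProductA", 600.0, 400.0)]

# Concatenate the regional extracts, then aggregate profit per product.
profit = defaultdict(float)
for product, revenue, cost in emea + apac:
    profit[product] += revenue - cost

print(dict(profit))  # {'ProductA': 500.0, 'ProductB': -150.0}
```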

Performed COPQ and DPPM analysis in Power BI and helped Global Quality Leaders make business-driven decisions.

Translated data requirements to produce comprehensive, detailed data maps and canonicals.

Identified supply chain problem areas and opportunities to improve service, reduce cost, and increase inventory turns

Performed ETL on multiple platforms such as SAP HANA and Teradata; familiar with Confidential Legacy, SAP, and other backend environments used to pull data into the data warehouse.

Created and maintained apps and reports in Salesforce for customer issue resolution, and maintained various CRM processes.

Used SQL queries to create interactive Dashboard in Salesforce Einstein Analytics Studio and analyzed CRM data.

Created and maintained a SharePoint page for Lean Six Sigma using Power Query for plant data, and audited various company projects for discrepancies.

Used BOXI (Business Objects) for creating reports for Legacy Systems.

Mapped and documented data across different systems; IBM Lotus Notes was used primarily for mapping the legacy systems and customer feedback resolution.

Queried, maintained, and extracted data from Teradata, and cleaned and maintained the Teradata production environment for performance.

Environment: Power BI Desktop, Power BI Service Pro, Teradata, Salesforce, Einstein Analytics Studio, Business Objects, SharePoint, SAP HANA, SAP Logon, QlikView, IBM Lotus Notes, SQL, SQL Server, MS Access, Microsoft Office Suite (Word, PowerPoint, Excel)

Client: Amazon USA

Role: Data Analyst Oct 2022-Jan 2023

Description: Amazon is guided by four principles: customer obsession rather than competitor focus, passion for invention, commitment to operational excellence, and long-term thinking. We are driven by the excitement of building technologies, inventing products, and providing services that change lives.

Responsibilities:

Participated in daily scrum meetings with the team to discuss project challenges and progress

Involved in requirement gathering from business users and IT teams to understand and express business processes and business logic

Extracted data for reporting from multiple data sources like SQL Server, SQL Server Analysis Services, Azure SQL Database, Azure SQL Data Warehouse, Salesforce, etc.

Extracted data from data warehouse server by developing complex SQL statements using stored-procedures and common table expressions (CTEs) to support report building
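
The CTE-based extraction pattern mentioned above can be sketched with a hypothetical orders table; SQLite stands in for SQL Server here, and every name is illustrative rather than taken from the actual warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
INSERT INTO orders VALUES (1,'East',100),(2,'East',200),(3,'West',50);
""")

# The CTE pre-aggregates per region; the outer query then filters the
# aggregate -- the same shape used to stage data for a report.
query = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM orders
    GROUP BY region
)
SELECT region, total FROM region_totals WHERE total > 120;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('East', 300.0)]
```

In production this logic would typically live in a stored procedure on the warehouse server rather than in application code.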

Implemented Tableau for visualizations and views including scatter plots, box plots, heat maps, tree maps, donut charts, highlight tables, word clouds, reference lines, etc.

Generated complex reports in Tableau using multiple functionalities including context filters, hierarchies, parameters, LOD Expressions, time intelligence, etc.

Built interactive, storytelling dashboards in Tableau workbooks using dashboard actions

Published Tableau workbooks to Tableau Server and set up scheduled report refreshes

Used Data Analysis Expressions (DAX) to create measures and calculated fields to analyze data and to create time intelligence.

Extensive use of DAX (Data Analysis Expressions) functions for the Reports and for the Tabular Models.

Used Data Analysis Expressions (DAX) to create custom calculations in PowerPivot for Microsoft Excel workbooks and Analysis Services tabular model projects.

Created PowerPivot models and PowerPivot reports, published reports to SharePoint, and deployed models to a SQL Server SSAS instance.

Assisted in creating SQL database maintenance logs and presenting any issues to the database architects.

Worked on Power BI reports using multiple types of visualizations including line charts, doughnut charts, tables, matrix, KPI, scatter plots, box plots, etc.

Made Power BI reports more interactive and engaging by using storytelling features such as bookmarks, selection panes, drill-through filters, etc.

Created and administered workspaces for each project on Power BI Service and published the reports from Power BI Desktop to the Power BI Service workspace

Utilized Power BI to create various analytical dashboards that help business users gain quick insights into the data

Installed the Data Gateway and configured refresh schedules for Power BI reports

Created SSIS packages for ETL processes by using control flow and data flow components

Maintained and tuned performance for slow-running SSIS packages

Responsible for accurately reporting on the financial aspects of the company's operations by rendering reports on a daily/weekly/monthly basis

Evaluated returns and risks for various types of investments, including stocks, fixed-income assets, hedge funds, real estate and commodities, and evaluated risk tolerance for customers

Created ad-hoc reports such as YTD, YOY, MTD, and MOM analyses for the income statement and cash flow upon request from business units
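
The YOY/MOM measures referenced above all reduce to a ratio against the corresponding prior period. A toy illustration with made-up revenue figures (the real reports used warehouse data):

```python
# Hypothetical monthly revenue figures -- not real client data.
revenue = {
    "2022-11": 120_000,
    "2022-12": 150_000,
    "2023-11": 138_000,
    "2023-12": 180_000,
}

def pct_change(current: float, prior: float) -> float:
    """Percentage change of the current period versus the prior one."""
    return (current - prior) / prior * 100

# Year-over-year: same month, previous year.
yoy_dec = pct_change(revenue["2023-12"], revenue["2022-12"])
# Month-over-month: previous month, same year.
mom_dec = pct_change(revenue["2023-12"], revenue["2023-11"])

print(round(yoy_dec, 1), round(mom_dec, 1))  # 20.0 30.4
```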

Implemented report integration among Power BI, Tableau, SSRS and Python reports.

Environment: Power BI Desktop, Power BI Service, Project Online, SQL Server 2016/2017, SSRS, SSIS, SSAS, DAX, Tableau, Microsoft Azure, Python, SharePoint, Visual Studio, Scrum/Agile, BI Development Studio (BIDS), MDX, XML, SQL Profiler, TFS, MS Office, Windows 10/7, Data Analyzer.

Client: Precision System Design Inc. USA

Role: Data Analyst Jan 2021-Sep 2022

Description: Precision brings an unparalleled combination of Technical Expertise, Integrity and Responsive Service to both our clients and candidates.

Responsibilities:

Worked on data analysis, primarily identifying data sets, source data, source metadata, data definitions, data formats, data validation, data cleansing, data verification, and data mismatches.

Created various profiles using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) from existing sources, and shared those profiles with business analysts for their analysis in defining business strategies, such as assigning matching scores for different criteria.

Actively participated in HL7 working group activities and implemented the MITA Medicaid Eligibility Architecture based on HL7.

Used SQL and SAS skills to perform ETL from DB2 and Teradata databases and created SAS datasets, SAS macros, Proc/data steps and SAS formats as required.

Operationalized Collibra with Rochade, the former being used for Data Governance and the latter for Data Lineage.

Drove feature engineering for the AI/ML models for demographics, vitals, labs, medications, clinical events, healthcare utilization, and comorbidities, and for HIT Interoperability FHIR Server(s) and FHIR API management.

Responsible for envisioning and architecting the enterprise-level Identity and Access Management Infrastructure initiative, addressing the need for compliance with the HIPAA regulations pertaining to the security and privacy of patient health information. Imported large amounts of data (Enrollment, Medical Claims, Pharmacy Claims) from various healthcare groups into the SCIO SAS system.

Involved in the requirements gathering and analysis.

Created and deployed SSIS packages using various transformations such as Slowly Changing Dimension, Multicast, Merge Join, Lookup, Fuzzy Lookup, Conditional Split, Aggregate, Derived Column, and Data Conversion Transformations.
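
SSIS transformations are configured in the designer rather than coded, but the Lookup and Derived Column steps named above amount to logic like the following (the customer table and tier rule are invented for illustration):

```python
# Rows arriving from the (hypothetical) source of the data flow.
source = [
    {"cust_id": 1, "amount": 250.0},
    {"cust_id": 2, "amount": 90.0},
]

# Lookup transformation: a reference table keyed by customer id.
customer_lookup = {1: "Acme Corp", 2: "Globex"}

def transform(row: dict) -> dict:
    out = dict(row)
    # Lookup: enrich the row from the reference table.
    out["customer_name"] = customer_lookup.get(out["cust_id"], "UNKNOWN")
    # Derived Column: compute a new column from existing ones
    # (the 100.0 threshold is an invented business rule).
    out["tier"] = "high" if out["amount"] >= 100.0 else "low"
    return out

result = [transform(r) for r in source]
print(result[0]["customer_name"], result[1]["tier"])  # Acme Corp low
```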

Designed the Data model and Load strategy to get data from different systems.

Explored and executed SSIS packages in Business Intelligence Development Studio (BIDS).

Configured and maintained Report Manager and Report Server for SSRS.

Created Sub-Reports, Drilldown-Reports, Summary Reports, and Parameterized Reports in SSRS.

Creation of SSRS Subscription and Snapshot reports.

Created Tableau Dashboards with interactive views, trends and drill downs.

Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.

Performed performance testing for the main business areas.

Used the Trillium tool to perform data quality checks.

Assisted the test lead in creating the test plan and strategy for the tasks to mitigate risks to system quality.

Created test scenarios, test cases, and expected results for module and system testing.

Reviewed the test cases and documents prepared by other team members.

Logged defects and tracked them to resolution.

Prepared validation documents such as the test summary and the test process approach document.

Performed T-SQL tuning and optimization of queries for reports that had long execution times, using MS SQL Profiler, Index Tuning Wizard, and SQL Query.
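
One common tuning step behind the bullet above is adding a covering index so a report query can seek instead of scanning the whole table. SQLite's EXPLAIN QUERY PLAN makes the effect visible (the sales table is hypothetical; on SQL Server the equivalent check is the execution plan):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")

report_query = "SELECT SUM(amount) FROM sales WHERE region = 'East'"

# Without an index, the plan is a full-table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + report_query).fetchall()

# A covering index on the filter column plus the aggregated column
# lets the optimizer seek the index instead of scanning the table.
conn.execute("CREATE INDEX ix_sales_region ON sales (region, amount)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + report_query).fetchall()

print(plan_before[-1][-1])  # a SCAN of the sales table
print(plan_after[-1][-1])   # a SEARCH using ix_sales_region
```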

Responsible for creating fact and dimension tables and building cubes using SSAS.

Evaluated database performance and performed maintenance duties such as Tuning, Backup, Restoration and Disaster Recovery.

Created indexes, constraints and rules on database objects.

Ensured best practices are applied and integrity of data is maintained through security and documentation.

Produced technical specification reports by gathering information from end users.

Environment: Windows XP Professional, SQL Server 2008 R2, SQL Server 2012, Business intelligence Studio.

Client: Allied Consultants Inc.USA

Role: Data Analyst Sep 2018- Dec 2020

Description: Allied Consultants, Inc. (Allied) helps organizations determine what technology they need and how best to employ it. We provide strategic consulting as well as customized tools, programs and solutions to ensure that clients achieve their business and information technology goals.

Responsibilities:

Designed & Created Test Cases based on the Business requirements (Also referred Source to Target Detailed mapping document & Transformation rules document).

Involved in extensive data validation using SQL queries and back-end testing

Used SQL for Querying the database in UNIX environment

Developed separate test cases for ETL process (Inbound & Outbound) and reporting

Involved with Design and Development team to implement the requirements.

Developed and Performed execution of Test Scripts manually to verify the expected results

Design and development of ETL processes using Informatica ETL tool for dimension and fact file creation

Involved in Manual and Automated testing using QTP and Quality Center.

Conducted black-box testing (Functional, Regression, and Data-Driven) and white-box testing (Unit and Integration, covering positive and negative scenarios).

Tracked and reviewed defects, and analyzed and compared results, using Quality Center.

Participated in the MR/CR review meetings to resolve issues.

Defined the Scope for System and Integration Testing

Prepared and submitted summarized audit reports and took corrective actions

Involved in Uploading Master and Transactional data from flat files and preparation of Test cases, Sub System Testing.

Documented and published test results; troubleshot and escalated issues

Preparation of various test documents for ETL process in Quality Center.

Involved in Test Scheduling and milestones with the dependencies

Performed functionality testing of email notifications for ETL job failures, aborts, and data issues.

Identified, assessed, and communicated potential risks associated with the testing scope, product quality, and schedule

Created and executed test cases for ETL jobs to upload master data to repository.

Responsible for understanding, and training others on, the enhancements and new features developed

Conducted load testing and provided input into capacity planning efforts.

Provided support to the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using LoadRunner

Created and executed test scripts, cases, and scenarios to determine optimal system performance according to specifications.

Modified the automated scripts from time to time to accommodate the changes/upgrades in the application interface.

Tested the database to check field size validation, check constraints, stored procedures and cross verifying the field size defined within the application with metadata.

Creation of SSRS Subscription and Snapshot reports.

Used SQL Server Agent for scheduling jobs and alerts

Took an active part in creating OLAP and ROLAP cubes and used MDX queries to retrieve data from the cubes.

Used various transformation tasks to create ETL packages for data conversion.

Involved in creating SSIS packages to migrate large amounts of data.

Involved in using SSAS data sources in SSRS reports.

Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, PERL, Shell Scripting

Client: Ilensys Technologies Hyderabad India

Role: Data Analyst Apr 2013-Jun 2016

Description: iLenSys Technologies is a global engineering company providing specialized and innovative design services and solutions in the field of Mechanical Engineering, Electrical and Electronics, Automation, Quality Assurance and Regulatory Affairs, Environmental Compliance, Product Obsolescence Management.

Responsibilities:

Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.

Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.

Setting up of environments to be used for testing and the range of functionalities to be tested as per technical specifications.

Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.

Responsible for different Data mapping activities from Source systems to Teradata

Created the test environment for Staging area, loading the Staging area with data from multiple sources.

Responsible for analyzing various data sources such as flat files, ASCII Data, EBCDIC Data, Relational Data (Oracle, DB2 UDB, MS SQL Server) from various heterogeneous data sources.

Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text)
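
Producing the same rows in comma-, tab-, or pipe-delimited form, as listed above, is a one-parameter change in most tooling; with Python's csv module (the rows themselves are placeholders):

```python
import csv
import io

# A tiny placeholder result set.
rows = [["id", "name"], ["1", "Widget"]]

def render(delimiter: str) -> str:
    """Serialize the rows with the requested field delimiter."""
    buf = io.StringIO()
    csv.writer(buf, delimiter=delimiter, lineterminator="\n").writerows(rows)
    return buf.getvalue()

print(repr(render(",")))   # 'id,name\n1,Widget\n'
print(repr(render("|")))   # 'id|name\n1|Widget\n'
print(repr(render("\t")))  # tab-delimited variant
```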

Performed ad hoc analyses as needed

Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.

Executed the SAS jobs in batch mode through UNIX shell scripts

Created remote SAS sessions to run the jobs in parallel mode to cut off the extraction time as the datasets were generated simultaneously

Involved in code changes for SAS programs and UNIX shell scripts

Reviewed and modified SAS Programs, to create customized ad-hoc reports, processed data for publishing business reports.

Responsible for creating test cases to ensure data originating from the source lands in the target properly and in the right format.

Tested several stored procedures and wrote complex SQL using CASE, HAVING, CONNECT BY, etc.
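
Two of the constructs named above, CASE and HAVING, can be shown in one small query; SQLite stands in for Oracle here, so CONNECT BY (Oracle's hierarchy syntax) is only noted in a comment, and the claims data is invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (id INTEGER, status TEXT, amount REAL);
INSERT INTO claims VALUES
  (1,'open',100),(2,'closed',400),(3,'open',250),(4,'closed',50);
""")

# CASE buckets rows inside an aggregate; HAVING filters the groups
# after aggregation.  (CONNECT BY is Oracle-specific hierarchical
# syntax with no SQLite equivalent; recursive CTEs are the portable
# alternative.)
query = """
SELECT status,
       SUM(CASE WHEN amount >= 200 THEN 1 ELSE 0 END) AS large_claims,
       SUM(amount) AS total
FROM claims
GROUP BY status
HAVING SUM(amount) > 400;
"""
result = conn.execute(query).fetchall()
print(result)  # [('closed', 1, 450.0)]
```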

Involved in Teradata SQL development, unit testing, and performance tuning, and ensured testing issues were resolved using defect reports.

Configured and maintained Report Manager and Report Server for SSRS.

Created Sub-Reports, Drilldown-Reports, Summary Reports, and Parameterized Reports in SSRS.

Designed dashboards using filters, parameters, and action filters in Tableau.

Performed performance testing for the main business areas.

Used the Trillium tool to perform data quality checks.

Assisted the test lead in creating the test plan and strategy for the tasks to mitigate risks to system quality.

Created test scenarios, test cases, and expected results for module and system testing.

Reviewed the test cases and documents prepared by other team members.

Logged defects and tracked them to resolution.

Prepared validation documents such as the test summary and the test process approach document.

Performed T-SQL tuning and optimization of queries for reports that had long execution times, using MS SQL Profiler, Index Tuning Wizard, and SQL Query.

Responsible for creating fact and dimension tables and building cubes using SSAS.

Environment: Informatica 7.1, Data Flux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files.

Client: Fusion Software Solutions Pvt. Ltd. Hyderabad India Nov 2010-Mar 2013

Role: Business Analyst

Description: At Fusion Plus Solutions Inc, we believe that it’s an exceptional company - a company of people proud of the work they do and the solutions they provide. By understanding what drives our specialty industries, becoming involved in our communities on a professional and personal basis, following a disciplined process.

Responsibilities:

Conducted Joint Application Development (JAD) sessions with business people, stakeholders, SMEs, and the development and QA teams to gather high-level requirements, and held discussion sessions with the Development and QA teams about the feasibility of the requirements within the system.

Gathered Business and Functional Requirements based on Waterfall SDLC methodology.

Created Use Cases Diagrams, Use Case Narratives, Activity Diagrams, Class Diagrams, and Collaboration Diagrams using UML modeling tools like MS Visio.

Developed/maintained all project communication deliverables including road map, planning list, change request, time line, scope and requirements.

Responsible for gaining a good understanding of user needs and accurately representing them in a well-documented software functional specifications document (FSD)

Gathered Business Requirements and interacted with users, designers, developers, project managers, and QA to get a better understanding of the Business Processes.

Documented the Business Requirements Document (BRD) and the Functional Requirements Document (FRD)

Maintained a Requirements Traceability Matrix (RTM) to ensure that all Functional Requirements are addressed at the Use Case Level as well as the Test Case Level.

For the downstream applications, Created System Requirements Specifications (SRS) Data Requirements (DRD), Universe Requirements Document (URD), Source to Target Mapping (STTM) documents for Data modeling and Data Lineage purposes.

Worked on Business Process Flows, UML modeling, Use Cases.

Assisted in designing test plans and test cases for User Acceptance Testing (UAT) to improve the overall quality of the system.

Environment: IBM Websphere, Oracle 9i, PL/SQL, IBM Business Process Modeler, UML, MS Visio, MS Project, Adobe Acrobat, JavaScript.


