
Data Analyst / Data Modeler / Business Analyst

Location:
Chantilly, VA
Posted:
June 17, 2025

Resume:

PROFESSIONAL SUMMARY:

With over ** years of experience in data analytics, data modeling, and business intelligence, I am a results-driven professional skilled in transforming complex data into actionable insights. My expertise spans end-to-end data solutions, from extraction and transformation to advanced visualization, using tools such as SQL, Python, Tableau, and Qlik Sense. I have extensive experience designing and implementing robust data models, including star and snowflake schemas, and leading large-scale data migration and ETL projects on major cloud platforms (AWS, Azure, and GCP). My background includes leading cross-functional teams to deliver data-driven solutions while ensuring compliance with key regulations such as HIPAA, GDPR, and BSA/AML. Adept at integrating emerging technologies such as artificial intelligence (AI) and machine learning into analytics workflows, I have a proven track record of optimizing processes, improving efficiency, and driving digital transformation across the healthcare and financial services industries. My strong analytical mindset, combined with deep technical knowledge and business acumen, allows me to bridge the gap between data and strategic decision-making.

Technical Skills

Data Management

Database Design & Management, Data Analysis, Data Quality Assessment, Salesforce CRM Administration, Data Modeling, Star Schema Modeling, Snowflake Schema, MS Visio.

RDBMS

SQL Server 2000/05/08, Oracle 10g/9i, Teradata 15/14/12, Snowflake

Computer Science

System Administration, MS Office, G Suite, Slack

Tools

SQL Server Integration Services (SSIS), Teradata FastLoad, Teradata BTEQ, Informatica Cloud, GCP, AWS, Azure, S3, EC2, AMI, EBS, EMR, Lambda, SNS, SQS, GitHub, Splunk, Databricks, Kibana, Jira.

Languages

Structured Query Language (SQL), T-SQL, R, Python

Operating System

Windows, Unix, Linux

Project Management Tools

MS Office tools (WBS, PERT, Gantt charts), Visio, Access, Excel, etc.

SDLC/Design Methodology

Agile, Scrum, Kanban, and Waterfall.

Requirements/Document Tools

Rational RequisitePro, Rational ClearQuest, and MS SharePoint.

Modeling and Design Tools

Rational Rose, Microsoft Visio.

Testing Tools

Mercury WinRunner, LoadRunner, Quality Center, QTP.

Databases

SQL Server, MySQL, PL/SQL, Oracle 9i/10g, Access.

Operating Systems

Windows 95/98/NT/2000/XP/Vista/7/10 & macOS

PROFESSIONAL EXPERIENCE:

Truist Bank, Atlanta, GA (Remote) Feb 2022 – Present

Role: Sr. Business Analyst/ Data Analyst

Responsibilities:

●Facilitated GAP and SWOT analyses and led JAD sessions, performed data modeling with ERwin, and conducted testing while collaborating with cross-functional teams to develop solutions and ensure data governance compliance.

●Utilized MS Visio to create UML diagrams and workflow drawings that document data flows and system processes and aid understanding.

●Wrote complex SQL queries to extract, reorganize, and validate data throughout the ETL process. Extracted data from OLTP systems using the Canonical Message Model for ETL, created and populated tables in the Snowflake DW, built ETL workflows on AWS (Python Lambda functions, Glue, Athena SQL), and carried out ETL testing with Informatica PowerCenter.

●Utilized MS Project to schedule project timelines and monitor ETL development and integration progress.

●Developed ETL workflow orchestration for HDFS and big data with Databricks using Spark SQL, and integrated GCP services (Cloud Data Fusion, Cloud Composer) to improve data pipelines.

●Created STTM documents for business transformation and data validation, identified gaps, and mapped documents for tables and corresponding DDL files to ensure data flow between systems.

●Built ETL pipelines to consolidate workforce data from Kronos and Learning Management Systems (LMS) into Snowflake.

●Used MS Excel to map and manage source-to-target system data, and Power BI to visualize transformation rules and mappings.

●Created STTM data mappings between various environments and data warehouses (RD to RDS, Glue, Snowflake) and other data marts such as SAS, Cognos, and MicroStrategy.

●Performed data profiling on source databases to optimize source-to-target mapping, and reverse engineered source databases using ERwin to refine data structures.

●Performed functional and integration testing for the database and application, supporting the development and testing phases of the SDLC.

●Built a Python framework with JSON workflows, transformed data from AWS S3 into the data lake and Snowflake, and used machine learning models such as random forest on Spark for predictive analysis. Presented model findings and performance to stakeholders through MS PowerPoint.

●Used Pandas and Apache Airflow to conduct data quality checks, optimizing workflows and enhancing data accuracy through Python-scripted data validation and cleansing (a sketch of this pattern follows this section). Built Power BI dashboards to monitor data quality statistics and improvements over time.

●Used SharePoint to ensure team members collected all documents in the central task location. Created use case diagrams, UML diagrams, and sequence diagrams using MS Visio.

●Worked on data export and data migration using tools like Informatica. Automated and scheduled Informatica jobs using UNIX shell scripting configured with cron jobs.

●Designed, developed, and maintained DataFlux jobs to support various data integration and cleansing tasks.

●Spearheaded the development and implementation of advanced machine learning models for fraud detection and anti-money laundering (AML) initiatives, enhancing risk management strategies.

●Built metadata for production use (Store Domicile); stored and retrieved data from OneLake using Databricks.

●Experienced with requirements analysis, development, testing, execution, maintenance, documentation, and archival of SAS reports.

●Registered DSETs in Collibra per data governance policy and ensured data was ingested into various banking platforms including Redshift, S3 (Glue), Delta Lake, and Snowflake.

●Identified primary indexes and skew ratios for populating data, and ensured data extraction from sources such as Oracle, SQL Server, and flat files per requirements.

●Utilized AWS services, including S3, Redshift, and SageMaker, for scalable data management and AI/machine learning model deployment, and created monitoring approaches using PagerDuty.

●Worked with technical teams to design and implement business solutions using Adobe Experience Cloud platforms, ensuring alignment with business requirements.

●Defined key identifiers for mapping interfaces and managed data repositories using GitHub and CloudFormation on EC2.

●Worked on validation of the PlainID implementation and the CyberArk upgrade; validated the PlainID tool in a non-production environment.

●Worked with MultiLoad and BTEQ to update large tables with many rows, and performed multiple DML operations on small tables.

●Worked on data architecture, comparing AS-IS and TO-BE states for gap analysis, and on pre-trade and post-trade risk management with regulatory compliance based on capital markets requirements.

●Worked on internal audits focusing on BSA/AML and OFAC Sanctions compliance, identifying key risk areas and recommending improvements.

●Developed and implemented risk assessment frameworks to evaluate the effectiveness of regulatory controls.

●Conducted thorough analysis and documentation of complex business processes, identifying inefficiencies and proposing solutions.

●Conducted in-depth analysis of Equity, Fixed Income, FX, and Derivatives data, ensuring accurate and timely Cognos reporting. Managed and maintained security reference data and account reference data attributes, enhancing data quality and consistency.

●Worked on master data management, and metadata management tools to create and maintain comprehensive data catalogs, dictionaries, and glossaries.

●Worked in ServiceNow creating and resolving Change Management and Incident Management tickets, with AYS ticket integration via Slack messaging and monitoring tools like PagerDuty.

●Monitored a continuous data streaming platform (SDP) daily and fixed missing loads. Played a key role working with on/offshore teams ingesting data from source systems into the AWS cloud and Snowflake.

●Created data models using the ERwin data modeler tool and worked on deploying Snowflake tables.

●Worked with data systems like Kibana and Elasticsearch. Worked on the ELK stack (Elasticsearch, Logstash, and Kibana), creating indexes and dashboards and checking error logs shipped via Logstash into Kibana in real time.

●Hands-on experience with the ServiceNow tool for creating change orders and incident reports, and with data sources for various reporting and analytical needs. Familiar with fine-tuning SQL queries to optimize performance.

●Strong analytical, problem-solving, and multitasking skills. Experience using the JIRA ticketing tool. Expertise in data analysis, business analysis, data extraction, and data marts.

●Developed reports and dashboards using tools like Power BI, SAP BO, and Tableau. Integrated interactive capabilities into Power BI dashboards to allow drill-down to data and viewing of key metrics.

●Worked on Qlik Sense and Tableau dashboards to create scorecards using bar graphs, scatter plots, and Gantt charts, along with stacked-bar functionality.

●Created various KPI dashboards using trend lines and log axes, based on groups and sets, to deliver detailed summary reports with granular-level details.

●Created JIRA tickets and ensured stories were created based on priority. Played a vital role in backlog refinement and conducted retrospectives. Used MS Project to plan sprints and track story completion within the Agile setting.

●Customized Oracle ERP workflows, reports, and dashboards to optimize processes and improve decision-making. Developed test plans and test cases to validate Oracle ERP configurations and customizations.

Environment: Snowflake 3.27.2, Teradata, Teradata SQL Assistant, R, Python, Databricks, SQL Workbench, JSON, Power BI, GraphQL, MS Excel Macros, MS Visio, MS Project, Kibana, SAS, AWS, AWS EC2, AWS S3, SQL, SAP, Tableau, Adobe AEM, Unix, Oracle ERP, SAP FICO, PL/SQL, GitHub, BTEQ, DE vex, G Suite, Jira, ServiceNow, Nebula.
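
For illustration, the following is a minimal sketch of the kind of Pandas batch data-quality check described in the Airflow bullet above. The DataFrame and column names (employee_id, hours_worked) are hypothetical stand-ins, not actual Truist schemas; in practice a function like this would run inside an Airflow task.

```python
# Hypothetical batch data-quality check; column names are illustrative.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Summarize basic quality metrics for one load batch."""
    return {
        "row_count": len(df),
        "duplicate_keys": int(df.duplicated(subset=["employee_id"]).sum()),
        "null_hours": int(df["hours_worked"].isna().sum()),
        "negative_hours": int((df["hours_worked"] < 0).sum()),
    }

if __name__ == "__main__":
    batch = pd.DataFrame(
        {"employee_id": [101, 101, 102], "hours_worked": [40.0, 40.0, None]}
    )
    print(run_quality_checks(batch))
    # {'row_count': 3, 'duplicate_keys': 1, 'null_hours': 1, 'negative_hours': 0}
```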

Anthem, Chicago, IL Oct 2018 – Dec 2021

Role: Data Quality Analyst/ Data Modeler

Responsibilities:

●Worked on creating policy briefs, reports, and periodic management reports, ensuring data accuracy and database integrity. Utilized SQL queries to validate data accuracy and created management reports in MS Excel and Power BI.

●Created Software Development Life Cycle (SDLC) artifacts such as requirements specification reviews for blue-chip application development and defect reporting in Cognos. Used MS Project to manage project tasks, timelines, and deliverables during the SDLC.

●Worked on the MDM hub, ensuring path components for application integrations and an ETL methodology supporting data extractions from the complex EDW using Informatica. Wrote complex SQL queries to read and transform data for the MDM hub.

●Served as SME for Facets, working with the Facets data model and extracting data as needed for modules such as billing, provider, claims, and membership.

●Worked on various healthcare platforms and standards including Medicare, Medicaid, Medigap, HIPAA, EDI transactions, EHR/EMR implementation, and HIX.

●Worked on SharePoint and MOSS, Team Foundation Systems (TFS), and Visual Studio Team Systems (VSTS).

●Processed EDI 837P, 837I, 834, and 837D transactions, verified 837 transactions were converted correctly to XML file format, and verified the claims data loaded to Facets for further processing.

●Involved in API testing using Postman to verify the reliability, functionality, performance, and security of services as they calculate, transact, and communicate with one another efficiently.

●Tested web applications at various stages to ensure they were user-friendly and met standards. Validated reports and files according to HIPAA-enforced X12 standards.

●Worked with all stakeholders to obtain all requirements (Functional/Non-Functional/Security). Involved in HIPAA EDI Transactions and mainly focused on PA and Eligibility Transactions.

●Performed functional testing of SOAP and RESTful web services using the SoapUI tool. Worked with designers, UX/UI, strategy, SEO, and analytics teams to gather initial requirements and ensure user stories reflected desired features and functionality.

●Utilized technical writing skills to create effective documentation and training materials. Used Facets support systems to enable inbound/outbound HIPAA EDI transactions in support of the 834, 835, 837, and 270/271 transactions.

●Developed documents including BRDs, FRDs, data specification documents, technical specification documents, file transfer documents, data mapping documents, etc. Experienced in writing JavaScript for DML operations with MongoDB.

●Utilized MS Visio to create process flow diagrams and data flow diagrams of these documents and MS Excel for data mapping documents.

●Engaged in the HL7 working group and implemented MITA Medicaid Eligibility Architecture based on HL7. Created profiles using Informatica Data Explorer (IDE) and masked data as part of NPI/PCI de-identification.

●Implemented MDM tool to support customer master data management by removing duplicates, standardizing data, and incorporating tools to remove incorrect data and filter data as per requirements.

●Responsible for conversion of data in Data Mapping and writing transformation rules. Designed interactive dashboards in Tableau using drill-downs, prompts, filters, and variables. Presented dashboard views and functionality using MS PowerPoint to stakeholders.

●Worked in importing and cleansing data from various sources like Teradata, flat files, and SQL Server.

●Created tests for forms on target servers integrated with AEM Forms.

●Used advanced Excel functions to generate spreadsheets and pivot tables. Used MS Excel for reporting and data analysis needs. Organized quarterly projection data gathered by senior analysts in an orderly way for executive management.

●Gathered healthcare data and was responsible for collecting, cleaning, and analyzing healthcare data using SQL queries and R scripts.

●Implemented and enforced data security measures in Azure, ensuring compliance with healthcare standards and regulations.

●Monitored and managed access controls, encryption, and data masking to safeguard sensitive information within the Azure environment.

●Worked on developing SQL and stored procedures on MySQL. Reported bi-monthly data analysis findings to upper management for use in organizational decisions.

●Involved in extensive data validation by writing several complex SQL queries (a reconciliation sketch of this kind follows this section). Responsible for various data mapping activities from source systems to Teradata.

●Worked on creating relational and dimensional models for clinical EMR records and ensured the MDM data model integrated with WebSphere Process Server.

●Worked with the Agile team to enhance maturity assessments of Data Governance and Collibra. Ensured HL7 and FHIR data migration.

●Worked on Facets output generation and Facets data modeling, ensuring functional and technical requirements within Facets.

●Created data models and produced data mapping documents and metadata in DataFlux for the Enterprise Data Warehouse (EDW).

●Worked on the EDI all-payers process, ensuring data integrity and payer contacts. Ensured data governance and stewardship, ETL, and ODS with analytical, segmentation, and predictive modeling.

●Worked on developing SSIS packages with transformations such as Lookup, Conditional Split, Data Flow Task, and Execute Package Task to generate underlying data.

●Worked with Informatica Data Quality (IDQ) tools and with the Facets team to ensure the HIPAA claims validation and verification process.

Environment: SQL Server, Teradata, SQL Server Integration Service (SSIS), SQL Server Reporting Services (SSRS), SQL Server Management Studio, JavaScript, Azure, Excel, MS Visio, MS Project, R, Python.
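
As an illustration of the source-to-target validation queries described above, the following is a minimal row-count reconciliation sketch. Here sqlite3 stands in for the actual Teradata/SQL Server connections, and the table names (claims_stg, claims_dw) are hypothetical.

```python
# Hypothetical source-to-target reconciliation; sqlite3 and the table
# names are stand-ins for the real Teradata/SQL Server environment.
import sqlite3

def count_rows(conn: sqlite3.Connection, table: str) -> int:
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims_stg (claim_id INTEGER)")
conn.execute("CREATE TABLE claims_dw (claim_id INTEGER)")
conn.executemany("INSERT INTO claims_stg VALUES (?)", [(1,), (2,), (3,)])
conn.executemany("INSERT INTO claims_dw VALUES (?)", [(1,), (2,)])

src, tgt = count_rows(conn, "claims_stg"), count_rows(conn, "claims_dw")
if src != tgt:
    print(f"Validation failed: source={src} rows, target={tgt} rows")
# The same pattern extends to checksum and column-level comparisons.
```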

Goldman Sachs, NY Jan 2015 – Sept 2018

Role: Business System Analyst/ Data Analyst

Responsibilities:

●Followed Agile methodology in analyzing, defining, and documenting the application to support functional and business requirements, coordinating these efforts with functional architects. Utilized MS Project to maintain sprints, track user stories, and visualize project timelines with Gantt charts.

●Actively experienced in Analysis, Detail Design, Development, Bug fixing, and Enhancement in Agile methodology.

●Supported internal audit and risk management functions by gathering and analyzing data related to BSA/AML compliance. Assisted in the development of compliance policies and procedures in line with BSA/AML and OFAC regulations.

●Migrated gathered business requirements from the portal to Adobe AEM, covering functionality including multi-site and multi-channel management and digital asset management.

●Utilized AEM to update website content and implement enhancements requested by project stakeholders.

●Experienced working with Spark/Python to improve performance and optimize existing algorithms in Hadoop using SparkContext, Spark SQL, Spark MLlib, DataFrames, pair RDDs, and Spark on YARN.

●Developed Scala and Python scripts and UDFs, using both DataFrames/SQL and RDDs in Spark, for data aggregation and queries and for writing results back into the S3 bucket (a sketch of this pattern follows this section).

●Created and maintained data model and architecture standards, including master data management (MDM).

●Used AWS services with a focus on big data architecture, analytics, enterprise data warehouse, and business intelligence solutions to ensure optimal scalability, flexibility, availability, and performance, and to provide meaningful, valuable information for better decision-making.

●Worked as SME for risk management covering both market and credit risk, emerging/equity markets, fixed income, and money markets.

●Managed Amazon Web Services such as EC2, S3, ELB, Auto Scaling, SNS, SQS, AMI, IAM, DynamoDB, Elasticsearch, and Virtual Private Cloud (VPC) through the AWS Console and API integration.

●Worked on comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships. Worked in data mining, data cleansing, data profiling, data modeling, data quality and data warehousing.

●Involved in writing Data Mapping Documents for the system and involved in documenting the ETL process and writing SQL Queries for the retrieval of the data.

●Worked across the investment banking industry, project finance, the credit card industry, mortgage loan servicing, ACH, wire transfers, commercial banking, retail banking, capital management, and credit risk fundamentals.

●Utilized Loan IQ's data capabilities to extract, analyze, and report on key loan-related metrics, providing valuable insights for decision-making and regulatory compliance.

●Collaborated with cross-functional teams to understand business needs, translating them into data-driven solutions within the Loan IQ system to support strategic initiatives and improve overall loan management processes.

●Maintained the Collibra data catalog to ensure data accuracy and metadata management aligned with data governance policies and standards.

●Worked with the project manager to document and manage change requests, and worked with SMEs to define multifamily mortgage business rules. Made heavy use of Databricks for Spark auto-scheduling and SMTP triggering.

●Worked on creating Tableau, Qlik Sense, and Grafana dashboards to produce different views of data visualizations, and presented dashboards on web and desktop platforms to end users to help them make effective business decisions.

●Worked with Relational Database Management Systems (RDBMS) and Rapid Application Development methodology.

●Created data pipelines for ingestion, aggregation, and loading of event data from the AWS S3 bucket into Hive external tables at the HDFS location, serving as feeds for Tableau/Grafana dashboards.

●Designed roles and groups for users and resources using AWS Identity and Access Management (IAM) and configured/created new IAM roles based on requirements.

●Supported applications in production, noting interruptions or bugs and performing problem-solving exercises to determine root causes and ensure continued use of the application.

●Troubleshot production incidents requiring detailed analysis of issues in web and desktop applications, Autosys batch jobs, and databases.

●Extracted and analyzed data from Oracle ERP systems to generate insights and support decision-making.

●Developed and maintained Oracle ERP reports and dashboards to monitor KPIs and track business performance.

●Developed Spark SQL logic based on the Teradata ETL logic, pointing the output delta back to newly created Hive tables as well as the existing Teradata dimension, fact, and aggregate tables.

●Assisted with planning and testing of application, configuration, and database changes, installation of upgrades and patches, and updates to production support documentation.

●Worked on microservices in a fully automated continuous integration system using Git, Jenkins, MySQL, and custom tools developed in Python and Bash. Excellent knowledge of J2EE architecture, design patterns, and object modeling using various J2EE technologies and frameworks, with comprehensive experience in web-based applications using J2EE frameworks like Spring, Hibernate, Struts, and JMS.

Environment: Agile, Java/J2EE, AWS EC2, EMR, Lambda, Security Groups, S3, EBS, Direct Connect, VPC, CloudWatch, IAM, SAS, RDS, CPM, NAS, Chef, Jenkins, Microservices, Oracle, SQL, OneLake, Snowflake, Hadoop, Big Data, JMS, REST Web Services, Servlets, JUnit, Splunk, ANT, Git, and Windows.
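
The following is a minimal PySpark sketch of the DataFrame-aggregation-and-S3-write-back pattern described in the Spark bullets above. The app name, column names, and bucket path are hypothetical, and running it requires a Spark installation with S3 (hadoop-aws) connectivity configured.

```python
# Hypothetical PySpark aggregation written back to S3; all names and the
# bucket path are illustrative, not actual production resources.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trade-aggregation").getOrCreate()

trades = spark.createDataFrame(
    [("EQ", 100.0), ("EQ", 250.0), ("FX", 75.0)],
    ["asset_class", "notional"],
)

daily_totals = trades.groupBy("asset_class").agg(
    F.sum("notional").alias("total_notional"),
    F.count("*").alias("trade_count"),
)

# Write Parquet back to the (hypothetical) S3 bucket, the same feed
# pattern used for the Hive external tables behind the dashboards.
daily_totals.write.mode("overwrite").parquet("s3a://example-bucket/daily_totals/")
```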

Education: Bachelor's in Computer Science, I-Global University, VA, 2016


