
Data Engineer North Carolina

Location:
Charlotte, NC
Salary:
$110,000
Posted:
June 10, 2025


Data Engineer

Name: Anusha Munama

Location: Charlotte, North Carolina

Email: *************@*****.***

Phone: +1-980-***-****

PROFILE SUMMARY

A results-driven software professional with 12+ years of progressively responsible experience in data analysis, design, implementation, administration, and support of OLTP, OLAP, Business Intelligence, and data modeling solutions, using SQL Server 2022, ETL tools (SSIS and Azure Data Factory), SSAS Tabular Models, and reporting tools such as SSRS and Power BI.

Skillful in designing ETL (Extraction, Transformation, Loading) jobs and working with Operational Data Store concepts, data marts, and OLAP (Online Analytical Processing) technologies.

Experienced in data warehousing methodologies and concepts, including star schemas, snowflake schemas, dimensional modeling, Slowly Changing Dimensions (SCD Type II), and reporting tools.

Extensively involved in all phases (system analysis, design, development, testing, and implementation) of the software development life cycle (SDLC), with experience in Agile, Scrum, and project management methodologies.

Experienced in designing, developing, optimizing, and administering SSAS Tabular Models.

Expert-level skills in data mapping, normalization, batch jobs, data migration, data collation, data cleansing, Entity Relationship Design (ERD) modeling, and application-oriented design.

Extensively used native SQL Server tools, including execution plans, the Index Tuning Wizard, Performance Monitor, SQL Profiler, DMVs, Activity Monitor, and Event Viewer, for performance analysis, resolving deadlocks, and troubleshooting cubes.
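
For illustration, a minimal sketch of the kind of DMV query used in this sort of blocking analysis (a generic example, not tied to any specific client system):

-- Currently executing requests that are blocked, with their wait details
-- and the statement text, pulled from the dynamic management views.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,                      -- milliseconds spent waiting so far
       t.text AS sql_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;        -- only sessions blocked by another session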

Experienced in migrating data warehouses from on-premises SQL Server infrastructure to Snowflake to enhance scalability and performance.

Developed complex ETL packages to move large volumes of data between diverse sources and destinations (SQL Server, Excel, CSV, XML, Oracle, Salesforce), loading the data into staging databases and data warehouse target tables using the transformations available in SQL Server Integration Services (SSIS).

Generated multiple enterprise reports using SSRS and Power BI from OLTP and OLAP sources, with features such as drill-down, drill-through, and multi-value parameters.

Experienced in deploying and scheduling jobs in Autosys, and in monitoring jobs and creating alerts and email notifications in the Autosys job scheduler.

Experienced in implementing Azure Data Factory pipeline components such as linked services, datasets, activities, and Blob Storage access.

Experienced in performance tuning and query optimization, data validation using hash-value comparisons, and database consistency checks. Excellent skills in documenting different kinds of metadata management.

Working knowledge of Azure CI/CD and Azure ARM templates to support deployment activities.

Collaborated with cross-functional teams to integrate data-driven solutions into business processes, ensuring seamless implementation.

Used the version control systems Git and TFS to access repositories, and the Jira and Confluence tools for task tracking.

Excellent written and oral communication and team skills; self-motivated, with the ability to interact with all levels, including senior management, business teams, and customers.

Technical Expertise:

Database Tools: MS SQL Server 2022, Snowflake

ETL Tools: MS SQL Server Integration Services (SSIS), Azure Data Factory (ADF)

IDEs: SQL Server Management Studio (SSMS), SSDT, Visual Studio

Reporting Tools: MS SQL Server Reporting Services (SSRS), Power Pivot, Power View, Power BI

Languages: T-SQL, U-SQL, SnowSQL, C#, Python

Cloud Computing: Azure Data Factory (ADF), Azure SQL DWH, Azure Storage

Analytics Tools: SSAS (Multidimensional and Tabular Models), cubes

Applications: MS Excel, MS Word, MS PowerPoint

EDUCATION

Bachelor's degree in Chemistry, Acharya Nagarjuna University, Guntur, India, 2008.

Master's degree in Biotechnology, SPMVV University, Tirupati, India, 2010.

PROFESSIONAL EXPERIENCE

Client: RXO Inc, North Carolina, USA (On-Site)

Role: Business Intelligence Lead

Duration: January 2024 to Present

Description: RXO is a freight and logistics company that offers multi-modal, adaptive services and technology to shippers and carriers.

Responsibilities:

Partnered with business stakeholders to gather reporting requirements and translate them into BI solutions.

Led a team of offshore developers, providing technical guidance, holding code review sessions, and ensuring code quality and adherence to project timelines.

Developed and maintained ETL pipelines using SSIS to extract, transform, and load data from various sources such as Transportation Management Systems (TMS), Warehouse Management Systems (WMS), and RXO Connect, ensuring seamless integration into centralized data warehouses.

Collaborated directly with Finance, AP/AR, and Treasury stakeholders to understand business needs and translate them into actionable data models and reports across the source-to-pay and order-to-cash processes.

Led discussions with business users to clarify data ownership, validation rules, and transformation logic, ensuring clean and reliable financial reporting outputs.

Implemented complex SSIS workflows for error handling and package logging that store logging and audit financial data results in SQL tables and flat files for OnError, OnWarning, and OnTaskFailed event types.
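
A minimal sketch of what such a logging target and its insert statement can look like (table and column names are hypothetical; the parameter markers map to SSIS system variables wired up in the package event handlers):

-- Hypothetical logging table for SSIS OnError / OnWarning / OnTaskFailed events
CREATE TABLE dbo.PackageEventLog (
    LogID        INT IDENTITY(1,1) PRIMARY KEY,
    PackageName  NVARCHAR(260) NOT NULL,
    EventType    NVARCHAR(50)  NOT NULL,   -- OnError, OnWarning, OnTaskFailed
    SourceName   NVARCHAR(260) NULL,       -- task or component that raised the event
    ErrorCode    INT           NULL,
    MessageText  NVARCHAR(MAX) NULL,
    EventTime    DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME()
);

-- Statement issued from an Execute SQL Task inside each event handler; the ?
-- placeholders are mapped to SSIS system variables such as System::PackageName,
-- System::SourceName, System::ErrorCode, and System::ErrorDescription.
INSERT INTO dbo.PackageEventLog (PackageName, EventType, SourceName, ErrorCode, MessageText)
VALUES (?, ?, ?, ?, ?);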

Designed Tabular models to support financial performance analysis at various levels for Accounts Payable and Receivable, covering customer, supplier, invoice, receipt, fixed assets, and ledger data.

Developed granular, audit-compliant financial reports in Power BI that enabled visibility into invoice-to-payment tracking, customer collections and receipts, supplier banking and remittance details, GL balances and journal entry movements, and monthly and quarterly financial closes.

Developed and configured data sources and data source views, dimensions, hierarchies and levels, measure groups, the Dimension Usage tab (defining relationships between dimensions and measure groups), calculated measures, KPIs, and storage partitions in OLAP cubes.

Defined measure groups and dimensions by analyzing the data warehouse to develop multidimensional cubes and tabular models in SSAS, and developed MDX queries for faster retrieval of historical data from cubes.

Used Power BI to demonstrate to the team new forms of Data visualization and advanced reporting methods.

Environment: SQL Server 2022, T-SQL, SQL Server Integration Services (SSIS) 2022, SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), Snowflake, Tabular Models (Cube), Power BI.

Client: AvidXchange, NC, USA (On-Site)

Role: Sr. Data Engineer

Duration: October 2023 to December 2024

Description: AvidXchange is a leading provider of accounts payable automation software and payment solutions for middle-market businesses and their suppliers, boosting efficiency, accuracy, and speed.

Responsibilities:

Involved in requirements gathering, analysis, design, development, and implementation of various applications on MS SQL Server and the Microsoft ETL tool SSIS.

Led the development and deployment of data models to optimize marketing strategies, resulting in a 20% increase in customer engagement.

Built scalable solutions for invoice data normalization, coding consistency, and vendor master data alignment to ensure clean and actionable financial datasets.

Developed new stored procedures and functions, modified existing ones, and tuned them in T-SQL to achieve optimal performance using execution plans.
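
A simplified sketch of the shape such a tuned procedure might take (object names are hypothetical; the tuning step shown is one common outcome of reading an execution plan):

-- Hypothetical procedure: returns open invoices for one supplier. Tuned by
-- inspecting the actual execution plan and adding a covering index so the
-- query seeks on the predicate columns instead of scanning the table.
CREATE OR ALTER PROCEDURE dbo.usp_GetOpenInvoices
    @SupplierID INT
AS
BEGIN
    SET NOCOUNT ON;

    SELECT InvoiceID, InvoiceDate, Amount
    FROM dbo.Invoice
    WHERE SupplierID = @SupplierID
      AND StatusCode = 'OPEN';
END;
GO

-- Covering index supporting both the predicate and the selected columns
CREATE INDEX IX_Invoice_Supplier_Status
ON dbo.Invoice (SupplierID, StatusCode)
INCLUDE (InvoiceID, InvoiceDate, Amount);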

Developed and maintained end-to-end ETL pipelines using SSIS and Azure Data Factory (ADF) to integrate high-volume Accounts Payable (AP) and payment transaction data across multiple business systems.

Created 30 to 40 complex stored procedures using temporary tables, CTEs, and functions to implement business logic and load about 10 years of historical data into history and audit tables.
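
A condensed sketch of the temp-table-plus-CTE pattern described above (schema and object names are hypothetical):

-- Hypothetical sketch: stage changed rows in a temp table via a CTE,
-- then append them to a history table for audit.
CREATE OR ALTER PROCEDURE dbo.usp_LoadInvoiceHistory
AS
BEGIN
    SET NOCOUNT ON;

    WITH ChangedInvoices AS (
        SELECT s.InvoiceID, s.SupplierID, s.Amount, s.ModifiedDate
        FROM staging.Invoice AS s
        LEFT JOIN audit.InvoiceHistory AS h
               ON h.InvoiceID = s.InvoiceID
              AND h.ModifiedDate = s.ModifiedDate
        WHERE h.InvoiceID IS NULL          -- only rows not yet in history
    )
    SELECT * INTO #NewHistory FROM ChangedInvoices;

    INSERT INTO audit.InvoiceHistory (InvoiceID, SupplierID, Amount, ModifiedDate, LoadedAt)
    SELECT InvoiceID, SupplierID, Amount, ModifiedDate, SYSUTCDATETIME()
    FROM #NewHistory;
END;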

Implemented SSIS, ADF Pipelines to automate ETL processes, resulting in a 30% reduction in data processing time.

Conducted data migration projects, including transitioning data from legacy systems to modern SQL Server environments, ensuring data consistency and minimal disruption.

Designed and implemented a secure data access model in Azure Blob Storage, enhancing data security and compliance.

Used various SSIS transformations such as Conditional Split, Derived Column, Lookup, Merge, and Union for data scrubbing and data validation checks during staging, before loading the data into the data warehouse.

Supported integrations with the Avid Pay Network and other external financial systems, optimizing digital payment flows and enabling accurate electronic funds transfer (EFT) records.

Worked as a team in defining measures and dimensions by creating cubes in SSAS, and was responsible for developing MDX queries for faster retrieval of historical data from cubes.

Worked extensively through agile development methodology by dividing the application into iterations.

Executed pre-validation and post-validation SQL scripts during SQL conversions to avoid data loss and to verify that the count and redemption amount for each source matched the data loaded into the EDW.
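
A hypothetical sketch of such a post-validation script, comparing per-source row counts and redemption totals between staging and the EDW (all table and column names are made up for illustration):

-- Any rows returned indicate a count or amount mismatch to investigate.
SELECT s.SourceSystem,
       s.SrcRowCount,  t.EdwRowCount,
       s.SrcRedemptionAmt, t.EdwRedemptionAmt
FROM (
    SELECT SourceSystem,
           COUNT(*)           AS SrcRowCount,
           SUM(RedemptionAmt) AS SrcRedemptionAmt
    FROM staging.Redemptions
    GROUP BY SourceSystem
) AS s
FULL OUTER JOIN (
    SELECT SourceSystem,
           COUNT(*)           AS EdwRowCount,
           SUM(RedemptionAmt) AS EdwRedemptionAmt
    FROM edw.FactRedemption
    GROUP BY SourceSystem
) AS t
    ON t.SourceSystem = s.SourceSystem
WHERE ISNULL(s.SrcRowCount, -1)      <> ISNULL(t.EdwRowCount, -1)
   OR ISNULL(s.SrcRedemptionAmt, -1) <> ISNULL(t.EdwRedemptionAmt, -1);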

Documented all ETL projects and workflows using BI Documenter, including supporting troubleshooting documents for each project.

Designed and published Power BI dashboards for executive and finance stakeholders, covering key metrics like invoice lifecycle, payment throughput, transaction yield, and cash flow visibility.

Environment: SQL Server 2022, T-SQL, SQL Server Integration Services (SSIS) 2022, SQL Server Reporting Services (SSRS), C#, SQL Server Analysis Services (SSAS), Snowflake, Erwin, Power BI.

Company: Amnet Digital, India

Client: CDW Corporation, USA

Role: Data Engineer

Duration: May 2019 - April 2022

Description: The client is a leading multi-brand provider of information technology solutions to banking, finance, government, education, and healthcare customers in the United States, the United Kingdom, and Canada.

Responsibilities:

Interacted with clients to analyze finance and banking business needs, provide software product design, and develop the technical roadmap for the solution.

Designed, developed, and deployed database objects and structures, and ensured their stability, reliability, and performance, using SSIS, SQL Server tools, and the Snowflake database.

Created banking data warehouse dimensional models and OLTP database designs with star and snowflake schemas, with exposure to modern relational database capabilities.

Developed ETL pipelines to automate data processing workflows and load data from legacy systems into Snowflake data objects, improving data accuracy and reporting efficiency.

Used the ETL tool to develop jobs for extracting, cleaning, transforming, and loading data into the enterprise data warehouse (EDW), working with a variety of data sources including OLE DB Oracle sources, ODBC sources, and flat file sources.

Used CA Scheduler and Autosys for job scheduling in QA and production, and provided support by tracking failed jobs, resolving them, and resubmitting them in production.

Automated ETL workflows using SSIS scheduling tools to ensure timely data processing and delivery.

Conducted thorough data profiling to analyze data structures, identify patterns, and assess data quality, facilitating informed decision-making for data integration.

Configured workflow dependencies and triggers to streamline data processing pipelines and reduce manual intervention.

Developed ADF pipelines to push and pull data using REST APIs and other sources from different vendors.

Implemented Snowflake as a cloud data warehouse solution, including data migration, schema design, and integration with existing data systems.

Built SSIS packages to load data from different files and databases using Lookup, Fuzzy Lookup, Derived Column, Conditional Split, Aggregate, Pivot, Slowly Changing Dimension, and other transformations from different sources.

Monitored long-running queries and applied performance tuning to reduce execution time, and created SSIS packages with error handling.

Conducted and automated SSIS ETL operations to extract data from multiple sources, transform inconsistent and missing data into consistent and reliable data, and load it into the multidimensional data warehouse using Slowly Changing Dimension (SCD Type II) logic to preserve historical data alongside current data.
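
For illustration, a condensed T-SQL sketch of the SCD Type II pattern (dimension and staging names are hypothetical; the actual loads described above used SSIS's Slowly Changing Dimension transformation):

-- Step 1: expire the current dimension row when tracked attributes change.
UPDATE d
SET d.RowEndDate = SYSUTCDATETIME(),
    d.IsCurrent  = 0
FROM dw.DimCustomer AS d
JOIN staging.Customer AS s
  ON s.CustomerKey = d.CustomerKey
WHERE d.IsCurrent = 1
  AND (s.CustomerName <> d.CustomerName OR s.Region <> d.Region);

-- Step 2: insert a new current row for new keys and for keys just expired.
INSERT INTO dw.DimCustomer (CustomerKey, CustomerName, Region, RowStartDate, RowEndDate, IsCurrent)
SELECT s.CustomerKey, s.CustomerName, s.Region, SYSUTCDATETIME(), NULL, 1
FROM staging.Customer AS s
LEFT JOIN dw.DimCustomer AS d
       ON d.CustomerKey = s.CustomerKey AND d.IsCurrent = 1
WHERE d.CustomerKey IS NULL;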

Created and maintained ADF workflows to automate the data ingestion process from various sources into Snowflake, improving data processing efficiency.

Environment: Microsoft SQL Server 2022, SSIS 2022, Power BI, XML, HTML, ETL, OLTP, SQL Server Profiler, Windows Server

Organization: Hitachi Consulting Pvt Ltd, India

Client: Schroder’s Bank, UK

Role: Lead SQL Developer

Duration: January 2015 - April 2019

Description: The client is one of the leading banking and financial services providers in the UK.

Responsibilities:

Interacted with various onshore business analysts, management, and IT units to analyze business and functional requirements and convert business requirements into technical use cases.

Responsible for preparing high-level and low-level design documents for change requests and bug fixes.

Maintained business intelligence models to design, develop, and generate both standard and ad-hoc reports using SSMS, SSIS Pipelines, SSRS, and Microsoft Excel.

Managed the migration of legacy data from on-premises SQL databases to Azure SQL Database and Snowflake, ensuring data integrity, optimizing cloud-based data access, and improving the scalability and performance of millions of rows of finance data.

Worked closely with the DBA and Solution Architect, providing significant input on normalization/de-normalization of database tables and building referential integrity for the RDBMS.

Performed data validation by computing checksums or hash values for data sets and comparing them against precomputed values to detect changes or discrepancies and maintain financial data integrity.
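
A minimal T-SQL sketch of this hash-comparison approach (table and column names are hypothetical):

-- Compute a per-row hash over the business columns and compare it against
-- the hash captured at extract time; any rows returned changed in transit.
SELECT t.AccountID
FROM finance.GLBalance AS t
JOIN staging.GLBalanceHash AS h
  ON h.AccountID = t.AccountID
WHERE HASHBYTES(
        'SHA2_256',
        CONCAT(t.AccountID, '|', t.PeriodEnd, '|', t.BalanceAmt)
      ) <> h.RowHash;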

Worked with client and server tools such as SQL Server Enterprise Manager and Query Analyzer to administer SQL Server, and created job alerts using Autosys.

Responsible for integrating data into a centralized database, with a dynamic design and a user-friendly web application, to keep track of direct bill payments from various clients across the nation.

Organization: Quess Corp, India

Client: Deloitte Consulting India Pvt Ltd

Role: SQL Developer

Duration: August 2013 - December 2014

Responsibilities:

Analyzed finance requirements and reviewed user stories and functional requirements from the client.

Wrote stored procedures for both online and batch requests, handling business logic and functionality of various modules, with conceptual, logical, and physical database designs.

Worked with various upstream and downstream systems and customers, interfacing systems for data extraction, ETL, analytics, and reporting needs.

Extensively used T-SQL to construct stored procedures, common table expressions (CTEs), user functions, indexes, user profiles, and relational database models.

Developed SQL scripts to load custom data into development, test, and production instances using Import/Export.
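
For illustration, a hypothetical load script of this kind using BULK INSERT (the file path and table name are made up):

-- Bulk-import a custom data file into a target instance, then verify the
-- row count before promoting the data.
BULK INSERT dbo.CustomRates
FROM 'D:\loads\custom_rates.csv'
WITH (
    FIRSTROW = 2,            -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK
);

SELECT COUNT(*) AS LoadedRows FROM dbo.CustomRates;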

Identified issues and developed a procedure to correct them, improving the quality of critical tables by eliminating the possibility of duplicate data entering the data warehouse.

Developed, code-reviewed, bug-fixed, tested, and deployed SSIS packages using SQL Server 2014/2012/2008 R2 Business Intelligence Development Studio.


