Data Operations Production Support

Location:
Raleigh, NC, 27603
Posted:
May 28, 2025

Resume:

Summary:

*+ years of IT experience in ETL and Snowflake development, with a focus on managing and optimizing data environments.

Proficient in developing and implementing SnowSQL procedures to optimize data operations and enhance performance.

Configured and monitored data replication for high availability and disaster recovery in Snowflake.
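
For illustration, a minimal sketch of a Snowflake replication setup; the account and database names are hypothetical:

  -- On the source account: allow replication to the DR account
  ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.dr_account;

  -- On the target account: create a replica and refresh it
  CREATE DATABASE sales_db AS REPLICA OF myorg.prod_account.sales_db;
  ALTER DATABASE sales_db REFRESH;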

Hands-on experience in designing and managing virtual warehouses according to business requirements.

Able to analyze and resolve performance issues in Snowflake data pipelines to maintain smooth data flow.

Experienced in cost optimization and resource monitoring to improve operational efficiency.

Provided production support and collaborated with cross-functional teams to deliver solutions aligned with business needs.

Experienced Data Operations Engineer with a focus on production support for cloud-based data platforms using Snowflake and Matillion.

Skilled in monitoring, troubleshooting, and maintaining ETL pipelines, ensuring high data availability and system reliability.

Proficient in tools such as Azure DevOps, Git, ServiceNow, Control-M, and CyberArk for managing jobs, tickets, and secure operations.

Strong understanding of incident management, job scheduling, and operational documentation to support business-critical data workflows.

Experienced in conducting Proof-of-Concept (POC) implementations to evaluate and integrate new Snowflake features.

Skilled in utilizing Tableau for creating dynamic dashboards and visualizations to deliver actionable insights for stakeholders.

Strong understanding of ETL and Data Warehousing concepts, ensuring efficient data transformation and integration.

Mentored team members on Snowflake operational procedures to enhance team productivity.

Engaged with business teams to define requirements and deliver customized Snowflake solutions that met project objectives.

Created and maintained documentation for Snowflake procedures, ensuring consistency and clarity for team collaboration.

Expertise in creating custom views that leverage Snowflake metadata for internal use.

Experienced in using Matillion for designing, configuring, and optimizing ETL workflows to ensure efficient data integration and transformation.

Proficient in leveraging Python for scripting, data transformation, and automating workflows in ETL processes and Snowflake environments.

Extensive experience in writing and optimizing complex SQL queries for data manipulation, reporting, and performance tuning in Snowflake and other databases.

Experience:

Client: Paycor

Snowflake Developer: Aug 2024 – Apr 2025

Responsibilities:

Extracted and loaded payroll and financial data from AWS S3 into Snowflake using Matillion ETL, ensuring efficient data processing and integration.
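
Matillion generates the load components, but the underlying Snowflake operation is roughly a COPY INTO from an external stage; a minimal sketch with illustrative bucket, stage, and table names (the storage integration is assumed to be pre-configured):

  CREATE OR REPLACE STAGE payroll_stage
    URL = 's3://example-bucket/payroll/'
    STORAGE_INTEGRATION = s3_int;  -- assumed existing integration

  COPY INTO payroll_raw
    FROM @payroll_stage
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);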

Integrated Snowflake with Tableau, enabling real-time visualization of pay trends and department-wise expenses to aid decision-making.

Optimized SQL queries, reducing execution time and enhancing overall system performance.

Created and managed Snowflake Stored Procedures and User-Defined Functions (UDFs) for custom business logic.
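
A minimal sketch of this pattern in Snowflake SQL; the payroll names and logic are illustrative, not the actual business rules:

  -- SQL UDF encapsulating a reusable calculation
  CREATE OR REPLACE FUNCTION net_pay(gross NUMBER, deductions NUMBER)
    RETURNS NUMBER
    AS 'gross - deductions';

  -- Stored procedure (Snowflake Scripting) that rebuilds a summary table
  CREATE OR REPLACE PROCEDURE refresh_payroll_summary()
    RETURNS VARCHAR
    LANGUAGE SQL
  AS
  $$
  BEGIN
    CREATE OR REPLACE TABLE payroll_summary AS
      SELECT department,
             SUM(net_pay(gross_amount, total_deductions)) AS total_net_pay
      FROM payroll
      GROUP BY department;
    RETURN 'payroll_summary refreshed';
  END;
  $$;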

Developed Python scripts for data transformation, task automation, and orchestration in Snowflake ETL workflows.

Implemented data masking, role-based access control, and object-level permissions to ensure data security.
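
For example, a column-level masking policy combined with a role grant; all names here are hypothetical:

  -- Mask SSNs for everyone except the payroll admin role
  CREATE OR REPLACE MASKING POLICY mask_ssn AS (val STRING) RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() = 'PAYROLL_ADMIN' THEN val ELSE 'XXX-XX-XXXX' END;

  ALTER TABLE employees MODIFY COLUMN ssn SET MASKING POLICY mask_ssn;

  -- Object-level permission for analysts (they see masked values only)
  GRANT SELECT ON TABLE employees TO ROLE PAYROLL_ANALYST;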

Designed and maintained data models (Snowflake schema) to support analytics.

Designed and implemented Snowflake data warehouses to support business intelligence and analytics related to payroll, payments, and paystubs.

Integrated Snowflake with third-party tools such as Tableau and ETL platforms for seamless reporting and data processing.

Configured and managed resource monitors while integrating workflows with Matillion to track usage and prevent cost overruns.

Monitored compute resource usage with Snowflake resource monitors, preventing cost overruns and optimizing cloud expenditures.
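
A minimal sketch of the monitor configuration; the quota and warehouse names are illustrative:

  CREATE OR REPLACE RESOURCE MONITOR monthly_etl_monitor
    WITH CREDIT_QUOTA = 100
    FREQUENCY = MONTHLY
    START_TIMESTAMP = IMMEDIATELY
    TRIGGERS ON 80 PERCENT DO NOTIFY
             ON 100 PERCENT DO SUSPEND;

  ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_etl_monitor;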

Built custom views leveraging Snowflake’s metadata for analytics and reporting purposes.
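
For instance, a view over Snowflake's ACCOUNT_USAGE metadata to track table growth; the schema and view names are hypothetical (reading ACCOUNT_USAGE requires a role with access to the SNOWFLAKE database):

  CREATE OR REPLACE VIEW admin.table_growth AS
    SELECT table_catalog, table_schema, table_name, row_count, bytes
    FROM snowflake.account_usage.tables
    WHERE deleted IS NULL;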

Worked with business stakeholders to understand data requirements and deliver scalable Snowflake solutions.

Managed stored procedures and user-defined functions, ensuring data security and compliance with financial regulations.

Conducted proof-of-concept (POC) implementations to test and adopt new Matillion features, improving automation and task orchestration for payroll data processing.

Environment: Matillion, Snowflake, SnowSQL, Python, SQL, Tableau.

Client: Veterans Sourcing Group

Data Operations Engineer: Aug 2022 – Jul 2024

Responsibilities:

Monitored and maintained Matillion and Matillion DPC ETL jobs to ensure consistent and reliable data pipeline execution.

Performed daily health checks and status reviews on Snowflake data warehouses and pipelines to ensure operational stability.

Used ServiceNow for managing incidents, tracking service requests, and resolving issues within defined SLAs.

Handled job scheduling, monitoring, and failure recovery using Control-M, ensuring timely completion of critical data workflows.

Utilized Azure DevOps (ADO) and Git for reviewing and tracking changes to ETL job configurations and pipeline deployments.

Managed credentials and secure connections through CyberArk, ensuring compliance with security protocols across environments.

Conducted routine validation and reconciliation of data outputs using Excel, ensuring accuracy and consistency in production reports.

Created and maintained operational documentation in Word, including SOPs, runbooks, and support logs.

Collaborated with development, QA, and infrastructure teams to troubleshoot job failures and provide resolution support for production incidents.

Ensured minimal downtime by proactively monitoring alerts, logs, and job metrics across platforms.

Responded to on-call production alerts and incidents, ensuring 24/7 data pipeline reliability across Snowflake and Matillion environments.

Executed and validated patch releases, configuration changes, and credential updates without impacting ongoing ETL processes.

Coordinated with stakeholders to escalate critical issues and provide timely communication during high-impact incidents.

Participated in audit and compliance checks by maintaining clean ticketing trails in ServiceNow and proper job execution logs in Control-M.

Environment: Snowflake, Matillion, Matillion DPC, Azure DevOps (ADO), Git, Control-M, ServiceNow, CyberArk, SQL.

Client: Cognizant

ETL Developer: Jun 2020 – Jul 2022

Responsibilities:

Migrated ETL workflows from Oracle to Snowflake using Matillion, ensuring seamless data transition and optimized performance.

Integrated data from heterogeneous systems such as databases, APIs, and flat files to create a unified data view for analytics and reporting.

Optimized SQL queries and ETL workflows in Matillion, improving execution speed and resource utilization.

Designed ETL pipelines to analyze user login/logout behavior and navigation patterns, providing key insights into user engagement and business growth.

Implemented data validation rules to ensure accuracy and consistency during data movement.
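
One common validation pattern is a source-to-target reconciliation query run after each load; the table names here are illustrative:

  -- Flag row-count drift between staging and target
  SELECT s.cnt AS source_rows, t.cnt AS target_rows
  FROM (SELECT COUNT(*) AS cnt FROM staging.orders) s
  CROSS JOIN (SELECT COUNT(*) AS cnt FROM analytics.orders) t
  WHERE s.cnt <> t.cnt;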

Automated recurring ETL tasks using Matillion's job scheduling features for seamless data integration.

Designed robust error-tracking mechanisms to log and resolve ETL job failures efficiently.

Partnered with data architects and analysts to define requirements and deliver data pipelines tailored to business needs.

Maintained ETL process metadata for better understanding, traceability, and debugging of data pipelines.

Created detailed documentation for ETL processes, including data flow diagrams and troubleshooting guides.

Supported the development and maintenance of data warehouses, ensuring seamless data integration and storage for reporting and analytics.

Conducted POCs to migrate workflows from Oracle to Snowflake, ensuring seamless data transition and performance optimization.

Environment: Matillion, SQL, Snowflake, Python, Oracle.

Client: Federal Home Loan Bank of New York

Data Engineer: Feb 2019 - May 2020

Responsibilities:

Extracted raw data from AWS S3 and transformed it in MySQL, producing clean and structured datasets for analytics.

Maintained and optimized relational databases like MySQL to support data processing and storage needs.

Utilized Python for automating data extraction, preprocessing, and validation, ensuring high-quality datasets for analytics.

Monitored and optimized ETL pipelines to analyze user login/logout behavior and navigation trends, providing insights into platform engagement and business growth.
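
A sketch of the kind of session-duration query behind this analysis, in MySQL-style SQL with hypothetical table and column names:

  SELECT user_id,
         DATE(login_time) AS activity_date,
         COUNT(*) AS sessions,
         AVG(TIMESTAMPDIFF(MINUTE, login_time, logout_time)) AS avg_session_minutes
  FROM user_sessions
  WHERE logout_time IS NOT NULL
  GROUP BY user_id, DATE(login_time);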

Integrated AWS S3 with on-premises databases, enabling seamless data movement and real-time availability for business users.

Worked closely with Data Analysts and Data Scientists to understand data requirements and provide reliable datasets.

Implemented data masking, encryption, and access controls to enhance data security and governance standards.

Set up monitoring tools to track data pipeline performance and identify bottlenecks.

Documented pipeline architecture, ETL workflows, and troubleshooting guides to facilitate team understanding and maintenance.

Conducted POCs for migrating legacy data systems to modern platforms like Snowflake.

Designed and managed ETL pipelines processing high-volume data, ensuring efficient transformation and deduplication before analytics processing.
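
For example, deduplication can be expressed with a window function that keeps the latest record per key (MySQL 8+; names are illustrative):

  CREATE TABLE events_dedup AS
  SELECT event_id, user_id, event_time, payload
  FROM (
    SELECT e.*,
           ROW_NUMBER() OVER (PARTITION BY event_id ORDER BY event_time DESC) AS rn
    FROM raw_events e
  ) ranked
  WHERE rn = 1;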

Developed performance-tuned ETL processes to manage large-scale data processing, improving efficiency and reducing execution time.

Integrated AWS S3 storage solutions with data pipelines, ensuring seamless access and availability for analytical workloads.

Environment: MySQL, Python, AWS S3, SQL.

Client: Appsika Soft Tech Private Limited

Data Analyst (Library Punch-In/Punch-Out Dashboard): Jan 2018 – Jan 2019

Responsibilities:

Designed and implemented a dashboard to track and visualize punch-in/punch-out data for a library.

Highlighted visitor trends and usage patterns using Python in Jupyter Notebook.

Performed data cleansing on the raw data to ensure accuracy and consistency.

Applied a masking policy to anonymize sensitive data, ensuring compliance with data privacy standards while maintaining usability for analytics.

Utilized tools like Excel and Tableau to create an interactive dashboard displaying key metrics such as peak hours, daily visitor counts, and average visit durations.
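
The underlying metrics can be computed with simple aggregate queries before loading into Tableau; a sketch with hypothetical names:

  -- Peak hours and average visit duration from punch-in/punch-out records
  SELECT HOUR(punch_in) AS hour_of_day,
         COUNT(*) AS visits,
         AVG(TIMESTAMPDIFF(MINUTE, punch_in, punch_out)) AS avg_visit_minutes
  FROM library_visits
  GROUP BY HOUR(punch_in)
  ORDER BY visits DESC;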

Analyzed visitor patterns to identify peak usage times and optimize resource allocation in the library.

Documented the project workflow and dashboard features to facilitate ease of understanding for stakeholders and future enhancements.

Environment: MySQL, Python, Excel, Jupyter Notebook, Tableau.

Education: Master’s degree in Computer Science.

Certification:

Matillion ETL Foundations, Credly URL:
https://www.credly.com/badges/42ce45f2-6fd6-44ec-9d2d-1bcd7bf063d6

Data Productivity Cloud Foundations, Credly URL:
https://www.credly.com/badges/14ad48a0-1813-403a-bd30-492e4ee53e6d

SnowPro Core Certification:
https://achieve.snowflake.com/fb8a0e7a-e746-4bc1-affc-5b95a3bb2d53#acc.NeQXjmbI


