
Senior Data Engineer & BizOps Platform Lead

Location:
Aubrey, TX
Posted:
April 30, 2026

Rohit Rudra

Little Elm, TX | Ph: 620-***-**** | *************@*****.*** | LinkedIn

PROFESSIONAL SUMMARY

Data Engineer and Senior BizOps Engineer with over 8 years of experience designing, building, and supporting large-scale batch processing systems and data pipelines across Hadoop and Apache NiFi ecosystems. Specialized in production support, with a recent transition into platform-level responsibilities including SRE tasks such as pipeline deployments, OS patching, PCI compliance, and vulnerability resolution. Strong collaborator focused on automation, data pipeline reliability, and cross-functional coordination in enterprise environments.

TECHNICAL SKILLS

Big Data & ETL Tools: Apache NiFi, Apache Airflow, Batch Manager, Dumbo Framework, Hive, Impala, Hue, Apache Spark

Languages: SQL, PL/SQL, Python, Unix Shell Scripting

Databases: Oracle, SQL Server, Netezza, Hadoop HDFS

Tools: Jenkins, Vault, Venafi, Bitbucket (Stash), IntelliJ, Splunk, Cloudera Manager, DOMO, Remedy, Rally (ALM), Chef Automate

OS: Unix, Linux, Windows

PROFESSIONAL EXPERIENCE

Client: Mastercard – O'Fallon, MO

Senior BizOps Engineer / ETL Hadoop Developer May 2019 – Present

Developed and maintained robust ETL batch pipelines using Batch Manager and Dumbo frameworks to support high-volume data processing across business units.

Engineered and optimized Hive and Impala queries for large-scale data transformation and reporting, improving data availability and processing times.

Led platform-level responsibilities including NiFi pipeline deployments, OS patching, server certificate renewals, and PCI compliance efforts.

Resolved production issues in Apache NiFi pipelines while ensuring SLA compliance and minimizing downtime.

Debugged Spark and Batch Manager logs to analyze and remediate data delays and job failures.

Automated pre- and post-patching procedures with Chef Automate and scripting, improving system reliability.

Collaborated with developers and QA to implement data validation strategies and ensure data integrity.

Managed certificate deployments across multiple environments and maintained cluster-level compliance posture.

Performed onboarding for new users and managed access to production systems.

Used Splunk to monitor data pipelines and trigger alerts for proactive issue resolution.

Conducted table partition management, backloads, and handled various ingestion patterns (initial, incremental, daily).

Executed MinIO bucket creation and sizing for scalable data storage.

Supported weekend on-call rotations and streamlined handovers to reduce incident response time.

Client: Optum – Minneapolis, MN

Informatica Developer Nov 2018 – Apr 2019

Collaborated with business SMEs to define and implement data cleansing rules using Informatica Data Quality (IDQ), improving data accuracy across critical systems.

Developed and executed IDQ plans to identify and resolve data quality issues, and presented cleansing results to operational companies (OpCos).

Built complex Informatica mappings, mapplets, sessions, and workflows to process data from diverse sources including Teradata, Oracle, and flat files.

Utilized Teradata FastLoad and MultiLoad utilities to efficiently ingest and stage large datasets.

Applied performance tuning techniques such as Pushdown Optimization to enhance ETL throughput.

Participated in design reviews, code walkthroughs, and performance evaluations to ensure scalable and maintainable ETL architecture.

Documented all cleansing logic and data profiling insights to support traceability and future audits.

Client: OnBlick – Irving, TX

Data Science Analyst / Data Analyst Sept 2017 – Oct 2018

Built ETL pipelines and automated workflows with Python and Shell scripting.

Created technical documentation and improved Informatica performance.

Backed up production Netezza databases to ensure data availability and recoverability.

Developed and debugged ETL mappings using Informatica.

Tuned performance for data integration jobs and handled production support.

EDUCATION

Master of Science in Technology

Pittsburg State University, KS – May 2017

Bachelor of Technology

JNTUH, India – May 2015
