Bala Manohar
Plano, TX +1-331-***-**** ***********.**@*****.***
Professional Summary
Data Warehouse Engineer with 5+ years of experience maintaining and improving data warehouse architecture, tools, and processes, with a strong backend focus on Python, Shell Scripting, Oracle, and Linux-based environments. Proven expertise in implementing and managing Linux-based infrastructure, enhancing ETL and database load/extract processes, and automating manual workflows for continual process improvement. Strong working knowledge of relational databases including Oracle, orchestration tools including Apache Airflow, and agile methodologies, with a focus on architecture improvement and scalable data warehousing solutions.
Education
Lewis University Romeoville, IL
Master of Science in Business Analytics and Operations Management

Loyola Academy Hyderabad, India
Bachelor of Science in Mathematics, Statistics and Computer Science

Technical Skills
Languages: Python, Shell Scripting, SQL, Perl
Data Engineering: Data Warehousing, ETL Pipelines, Database Load/Extract, Informatica, Apache Airflow
Databases: Oracle, Oracle Exadata, SQL Server, PostgreSQL
Infrastructure: Linux Environment, Unix File Systems, Shell Jobs & Scripts, Process Automation
Cloud/Tools: Azure, Git, CI/CD Pipelines
Domain: Healthcare Analytics, Enterprise Data Warehouse, Data Governance, Agile Methodology

Professional Experience
AT&T Plano, TX
Data Engineer – Data Warehouse & Backend Infrastructure May 2024 – Present
– Implemented, configured, and managed Linux-based processes and infrastructure for data warehousing, enhancing shell scripts, jobs, and toolsets to improve pipeline reliability and operational efficiency.
– Identified and implemented system and architecture improvements to optimize data warehouse performance, scalability, and maintainability across enterprise data environments.
– Enhanced ETL and database load/extract processes using Python and Shell Scripting, automating manual workflows and improving data delivery speed and accuracy for downstream consumers.
– Developed and maintained Oracle-based data warehouse solutions, applying practical knowledge of Unix file systems, permissions, pipes, and standard tools to support backend data infrastructure.
– Orchestrated data pipeline workflows using Apache Airflow with Python, ensuring reliable scheduling, monitoring, and alerting across batch data processing jobs.
– Collaborated with cross-functional stakeholders to gather requirements, resolve data-related technical issues, and deliver architecture improvements aligned with agile practices.

Cigna Healthcare Chicago, IL
Data Engineer – Oracle & ETL Pipeline Development Nov 2022 – Aug 2023
– Maintained and improved data warehouse architecture using Oracle and Python, enhancing ETL and database load/extract processes to support healthcare claims and membership data workflows.
– Implemented and managed Linux-based infrastructure and shell scripting processes, automating manual data workflows and improving operational efficiency across the data warehouse environment.
– Enhanced ETL pipelines using Informatica and Python to extract, transform, and load large healthcare datasets, ensuring data accuracy, completeness, and auditability for regulatory reporting.
– Applied practical knowledge of Unix file systems and shell scripting to configure and manage data warehousing jobs, scripts, and backend processes in a Linux environment.
– Partnered with stakeholders across clinical and business teams to identify system improvements, resolve data issues, and deliver scalable data warehouse solutions following agile methodology.

Skillsoft Hyderabad, India
Data Engineer – Warehouse Automation & Process Improvement Dec 2018 – Jan 2022
– Designed and developed ETL and database load/extract processes using Python, Shell Scripting, and SQL to ingest and transform large datasets into relational data warehouse environments.
– Enhanced Linux-based toolsets, shell scripts, and automated jobs to improve data pipeline reliability, reduce manual processing, and optimize data delivery across the warehouse infrastructure.
– Applied working knowledge of Unix file systems including mount types, permissions, standard tools, and pipes to manage and maintain backend data warehousing processes effectively.
– Orchestrated batch data pipeline workflows using Apache Airflow with Python, maintaining scheduling, monitoring, and error handling across production data warehouse jobs.
– Championed automation and continual process improvement, identifying inefficiencies in existing workflows and implementing scalable backend solutions aligned with agile practices.