Data Engineer A Team

Location:
Campbell, CA
Posted:
September 10, 2025

Murari Kandula

Email: ****************@*****.***

Mobile: +1-408-***-****

Data Engineer

PROFESSIONAL SUMMARY:

Over 5 years of experience in data engineering within Agile/Scrum teams, combining strong analytical thinking and problem-solving skills with effective prioritization; a team player who can influence and guide teams.

Expert in writing and analyzing complex PL/SQL queries and stored procedures, connecting the dots across applications to build an end-to-end (E2E) view and identify priorities with minimal supervision.

Designed high-performance data solutions with attention to detail and innovative thinking while managing multiple projects simultaneously and providing effort and cost estimates.

Developed real-time ingestion pipelines that support decision-making and operational alerting.

Implemented metadata and access governance to ensure secure data usage and compliance, communicating clearly with both technical and non-technical audiences across the organization.

Built cloud-native ETL workflows enabling hybrid data processing for enterprise business intelligence and reporting.

Managed storage across multiple platforms, optimizing storage tiering and access performance.

Developed modular, reusable Python and SQL components for ETL pipelines, enabling scalable data cleansing and transformation.
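
A minimal sketch of one such reusable cleansing component, assuming pandas and hypothetical column names (customer_id, order_ts):

    import pandas as pd

    def cleanse(df: pd.DataFrame, key: str = "customer_id") -> pd.DataFrame:
        """Reusable cleansing step: trim strings, parse timestamps, dedupe."""
        out = df.copy()
        for col in out.select_dtypes(include="object"):
            out[col] = out[col].str.strip()           # normalize string columns
        out["order_ts"] = pd.to_datetime(out["order_ts"], errors="coerce")
        return (out.dropna(subset=[key, "order_ts"])  # drop unusable rows
                   .drop_duplicates(subset=[key, "order_ts"]))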

Engineered ACID-compliant Delta Lake architectures that manage historical and current data for analytics and regulatory compliance.

Integrated Kafka, Event Hubs, and Pub/Sub to build real-time data streams for event-driven architectures.
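
A minimal consumer sketch using the kafka-python client; the topic and broker names are hypothetical:

    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "orders",                                    # hypothetical topic
        bootstrap_servers="broker:9092",             # hypothetical broker
        group_id="orders-etl",
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    for msg in consumer:
        event = msg.value                            # one decoded event
        # route the event to downstream processing or alerting here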

Automated deployment pipelines with Git, Jenkins, and Azure DevOps for version control, CI/CD, and secure promotion across environments.

Built optimized Snowflake and BigQuery data models, focusing on partitioning, clustering, and caching.
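
A sketch of the partitioning-plus-clustering pattern on the BigQuery side, using the google-cloud-bigquery client; the dataset, table, and column names are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()  # uses default application credentials
    client.query("""
        CREATE TABLE IF NOT EXISTS analytics.sales (
          sale_id  STRING,
          store_id STRING,
          sale_ts  TIMESTAMP,
          amount   NUMERIC
        )
        PARTITION BY DATE(sale_ts)   -- prunes scans to relevant days
        CLUSTER BY store_id          -- co-locates rows for hot filters
    """).result()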

Designed ingestion frameworks that unify heterogeneous datasets from APIs, databases, and file systems into data lakes and warehouses.

Orchestrated complex workflows with Airflow and Cloud Composer, handling conditional logic and retries.
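
A minimal sketch of a DAG with retries and a conditional branch, assuming Airflow 2.4+ and hypothetical DAG and task names:

    from datetime import datetime, timedelta
    from airflow import DAG
    from airflow.operators.empty import EmptyOperator
    from airflow.operators.python import BranchPythonOperator

    def choose_path():
        # Hypothetical rule: full reload on Mondays, incremental otherwise.
        return "full_load" if datetime.utcnow().weekday() == 0 else "incremental_load"

    with DAG(
        dag_id="daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={"retries": 3, "retry_delay": timedelta(minutes=5)},
    ):
        branch = BranchPythonOperator(task_id="branch", python_callable=choose_path)
        branch >> [EmptyOperator(task_id="full_load"),
                   EmptyOperator(task_id="incremental_load")]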

Modeled dimensional data with facts and dimensions and developed reporting dashboards using Power BI and Google Data Studio, enabling real-time visualization and KPI tracking for business teams.

Monitored pipeline health and performance with Google Cloud Monitoring (formerly Stackdriver) and Azure Monitor, establishing alerts and dashboards; experienced with Oracle Exadata and Oracle Database 10g and above.

Collaborated across data science, engineering, and business teams to deliver trusted data products with verified data quality and compliance.

TECHNICAL SKILLS:

Databases - Azure SQL DB, PostgreSQL, Oracle, MySQL, MongoDB, Oracle Exadata

Languages - Python, SQL, T-SQL, Scala, Shell Scripting, PL/SQL

Others - Microsoft Office Suite

PROFESSIONAL EXPERIENCE:

Costco Aug 2023 – Present

Data Engineer

Responsibilities:

Designed scalable data pipelines using Azure Data Factory and Databricks, processing retail datasets for real-time reporting and downstream analytics, while connecting systems across applications to maintain an end-to-end view.

Integrated Azure Event Hubs and Azure Stream Analytics to stream customer behavior data in near real time, enabling advanced analytics and immediate insight into transactional and behavioral trends.

Built reusable PySpark jobs and analytical model pipelines in Databricks for rapid data preparation, reducing data scientist onboarding time and improving machine learning model performance.

Structured data lakes in ADLS Gen2 with raw, curated, and presentation layers; tuned Spark workloads with partitioning, Delta Lake compaction, and caching to optimize runtime and performance.
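
A minimal sketch of that layering and tuning pattern in PySpark with Delta Lake on Databricks; the mount paths and column names are hypothetical:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Promote raw data to the curated layer, partitioned by date.
    (spark.read.format("delta").load("/mnt/raw/sales")
          .write.format("delta")
          .partitionBy("sale_date")
          .mode("overwrite")
          .save("/mnt/curated/sales"))

    # Compact small files, co-locate a hot filter column, then cache hot data.
    spark.sql("OPTIMIZE delta.`/mnt/curated/sales` ZORDER BY (store_id)")
    spark.read.format("delta").load("/mnt/curated/sales").cache()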

Developed CI/CD pipelines in Azure DevOps to automate deployment of data workflows, configuration assets, and linked services across QA, UAT, and production environments.

Ensured secure data access by integrating Azure Key Vault, applying row-level masking, and configuring role-based access controls to comply with enterprise governance and compliance policies.

Created and maintained Power BI dashboards tracking pipeline health, business KPIs, and operational performance across retail departments, enabling data-driven insight and proactive intervention.

Partnered with cross-functional teams to gather business requirements and architect scalable, cost-effective, and secure data processing pipelines using ADF and Databricks.

Modernized legacy ETL by migrating SSIS workflows into ADF, improving reliability, reducing technical debt, and enabling flexible pipeline configuration with parameterized datasets.

Scheduled orchestrations with ADF triggers and monitored pipeline health with Azure Monitor and Log Analytics, ensuring high availability and proactive issue resolution.

Cigna Sep 2022 – Jul 2023

Junior Data Engineer

Responsibilities:

Built end-to-end ingestion pipelines in Azure Data Factory (ADF) to move policy and claims data from on-premises SQL Server to Azure Synapse Analytics, enabling centralized reporting and analytics workloads.

Developed scalable transformation logic with Mapping Data Flows to cleanse, join, and enrich healthcare data, improving reporting accuracy and compliance readiness for critical insurance metrics and KPIs.

Engineered parameterized pipelines and datasets with dynamic content and JSON templates, promoting reusable, modular design and reducing configuration time across multiple healthcare data sources.

Secured ADF linked services by integrating Azure Key Vault, ensuring credentials, secrets, and keys were encrypted, rotated, and managed according to enterprise data protection policies.

Modeled star schema structures in T-SQL, created fact and dimension tables in Azure SQL Database, and implemented SCD Type 1/2 logic for historical data management.
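
A minimal sketch of the classic two-step SCD Type 2 pattern, submitted from Python via pyodbc; the table and column names (dbo.dim_member, stg.member, row_hash) are hypothetical:

    import pyodbc

    SCD2 = """
    -- Step 1: close out current rows whose attributes changed.
    UPDATE d
       SET d.effective_to = SYSUTCDATETIME(), d.is_current = 0
      FROM dbo.dim_member AS d
      JOIN stg.member AS s ON s.member_id = d.member_id
     WHERE d.is_current = 1 AND d.row_hash <> s.row_hash;

    -- Step 2: insert new versions for changed and brand-new members
    -- (changed members no longer have a current row after step 1).
    INSERT INTO dbo.dim_member
        (member_id, row_hash, effective_from, effective_to, is_current)
    SELECT s.member_id, s.row_hash, SYSUTCDATETIME(), NULL, 1
      FROM stg.member AS s
      LEFT JOIN dbo.dim_member AS d
        ON d.member_id = s.member_id AND d.is_current = 1
     WHERE d.member_id IS NULL;
    """

    conn_str = "DRIVER={ODBC Driver 18 for SQL Server};..."  # placeholder
    with pyodbc.connect(conn_str) as cn:
        cn.execute(SCD2)
        cn.commit()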

Used Git integration within ADF to enable version control, branching strategies, and automated CI/CD promotion of pipelines across development, staging, and production environments.

Participated in agile ceremonies, including sprint planning, backlog grooming, and daily stand-ups, providing development estimates and aligning deliverables with business priorities.

Created metadata-driven ADF frameworks that construct pipelines and orchestrations dynamically, accelerating the onboarding of new data sources and improving operational efficiency.
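
A framework-agnostic sketch of the metadata-driven idea, where source definitions live in config and drive a generic copy routine (in ADF itself this would be a control table feeding a ForEach over a parameterized Copy activity); all names are hypothetical:

    import json

    def run_copy(query: str, sink: str) -> None:
        """Stand-in for a parameterized copy step (hypothetical helper)."""
        print(f"copy {query!r} -> {sink}")

    SOURCES = json.loads("""[
      {"name": "claims",  "query": "SELECT * FROM dbo.claims",  "sink": "raw/claims"},
      {"name": "members", "query": "SELECT * FROM dbo.members", "sink": "raw/members"}
    ]""")

    for src in SOURCES:
        run_copy(src["query"], src["sink"])   # one new source = one config row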

Tuned SQL queries and optimized ADF pipeline parallelism settings to reduce data movement time and resolve performance issues across staging and transformation workflows.

Designed Power BI dashboards connected to Synapse Analytics, displaying key metrics including claims volume, provider performance, and member health trends for business analysis.

Goldman Sachs Jun 2020 – Aug 2022

Data Analyst

Responsibilities:

Analyzed high-volume trading and financial transaction data, generating reports and insights that supported trading desk performance, risk analysis, and regulatory compliance across multiple business units and regions.

Built interactive Power BI dashboards monitoring trade execution speed, risk exposure, and market movement trends, giving front-office traders and compliance teams real-time visual analytics.

Created complex SQL stored procedures in Oracle and PostgreSQL to transform raw trading data into structured reporting tables used in compliance and regulatory audits across multiple jurisdictions.

Automated Excel-based reports with Python and Pandas, reducing manual intervention, standardizing reporting logic, and increasing accuracy and delivery speed for recurring analytics.
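
A minimal sketch of that automation pattern, assuming pandas and hypothetical file and column names:

    import pandas as pd

    trades = pd.read_csv("trades.csv", parse_dates=["trade_date"])
    summary = (trades
               .groupby(["desk", pd.Grouper(key="trade_date", freq="D")])
               .agg(volume=("quantity", "sum"),
                    notional=("notional", "sum"))
               .reset_index())

    # One workbook, two sheets: raw detail plus the daily desk summary.
    with pd.ExcelWriter("daily_report.xlsx") as xl:
        trades.to_excel(xl, sheet_name="detail", index=False)
        summary.to_excel(xl, sheet_name="summary", index=False)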

Interfaced with trading teams, compliance officers, and operations stakeholders to gather requirements and convert them into report models, KPI definitions, and SQL-based data products.

Applied rigorous data validation logic and consistency checks to maintain data integrity and minimize discrepancies across multi-source dashboards and regulatory reporting packages.
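
A small sketch of such checks in pandas, with hypothetical rules and column names:

    import pandas as pd

    def validate(df: pd.DataFrame) -> list[str]:
        """Return human-readable data-quality failures (hypothetical rules)."""
        issues = []
        if df["trade_id"].duplicated().any():
            issues.append("duplicate trade_id values")
        if df["notional"].lt(0).any():
            issues.append("negative notional amounts")
        if df["trade_date"].isna().any():
            issues.append("missing trade dates")
        return issues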

Performed root-cause variance analysis to resolve data mismatches across upstream systems, improving confidence in financial KPIs used by auditors and trading leads.

Developed optimized data models and reporting-layer schemas for risk dashboards, reducing report query time and improving scalability of analytical services.

Maintained metadata repositories, SQL script libraries, and business glossaries to promote standardized reporting practices and support data discoverability for downstream users.

Delivered ad-hoc analysis of trading desk KPIs, assisting portfolio managers with strategy refinement and resource allocation through insights into volumes, profit margins, and execution efficiency.

EDUCATIONAL DETAILS:

Master's in Computer Science - Southeast Missouri State University, USA

Bachelor of Technology - Computer Science - NRI Institute of Technology, India


