
Azure Data Engineer with 8+ Years Experience

Location:
Hillsboro, OR, 97124
Salary:
70
Posted:
April 01, 2026

Contact this candidate

Resume:

Shravani Challa

Data Engineer

*************.******@*****.*** 647-***-****

https://www.linkedin.com/in/shravani-challa-89b025206/

Professional Summary:

Results-driven Azure Data Engineer with 8+ years of experience designing, developing, and optimizing ETL/ELT pipelines and data lakehouse solutions. Skilled in Databricks, PySpark, SQL, and Azure Data Factory, with hands-on experience implementing data governance, data quality frameworks, and CI/CD. Proven track record of enabling analytics and BI through clean, scalable, high-performing data pipelines in financial and enterprise environments.

Over 5 years of experience managing and leading mid-level professional teams, driving performance improvements through data-driven insights and effective leadership. Adept at developing and implementing performance management metrics tailored to claims organizations, improving operational efficiency, identifying trends, and optimizing business outcomes.

Conversant with all phases of the Software Development Life Cycle (SDLC), especially Agile/Scrum, including business process analysis, requirements gathering and analysis, detailed design, development, testing, and post-implementation support.

Extensive experience handling large databases, including performance tuning and query optimization.

TECHNICAL SKILLS:

Cloud & Data Platforms: Azure Data Factory, Azure Databricks, Azure Data Lake (ADLS)

Languages & Frameworks: Python, PySpark, SQL, Spark Streaming

ETL & BI Tools: SSIS, Informatica, Talend, Power BI

Version Control: GitHub, GitLab

Governance & Security: Unity Catalog, Delta Lake, Role-Based Access Control (RBAC)

Databricks Expertise: Delta Tables, Delta Live Tables, Table Triggers, Databricks Asset Bundles, Lakehouse Architecture

Professional Experience:

Longview Systems, Toronto, ON, Aug 2024 – Present
Data Engineer

Responsibilities:

Designed and maintained Azure Databricks ETL/ELT pipelines using PySpark to process large-scale financial and operational datasets.

Implemented Lakehouse Architecture using Delta Lake and Unity Catalog, enabling full lineage, governance, and 99.9% schema consistency.

Integrated data from ADLS, SQL Server, and APIs to build analytics-ready Delta tables for finance, regulatory, and leadership reporting.

Optimized cluster sizing and SQL/PySpark transformations, reducing compute cost by 30% and improving job runtime by 45%.

Built CI/CD automation using Databricks Asset Bundles and GitLab, enabling 100% repeatable deployments across environments.

Partnered with Finance/BI teams to design curated datasets, resulting in 35% faster Power BI refresh times and improved report accuracy.

Automated reconciliation and validation scripts, eliminating 90% of manual checks and cutting data errors by 80%.

Developed optimized stored procedures and partition strategies, improving query performance by 30–50%.

Conducted code reviews using GitHub pull requests, improving code quality, enforcing standards, and ensuring alignment across ETL development teams.

Strong problem-solving skills with the ability to troubleshoot complex pipeline failures and production issues in real time.

Adept at translating business requirements into scalable, secure, and maintainable technical solutions.

Supported junior developers and analysts through mentoring, code reviews, and best practice guidance.

Known for being dependable, flexible, and proactive in solving problems and supporting team goals.

Encouraged open dialogue, feedback, and cross-team collaboration to ensure smooth project execution.

Strong understanding of the Software Development Life Cycle (SDLC), including requirements gathering, design, development, testing, deployment, and post-production support.

Actively participated in Agile ceremonies (daily standups, sprint planning, backlog grooming, retrospectives) to improve sprint efficiency and deliver predictable outcomes.

Collaborated with product owners, business analysts, and QA teams to deliver features aligned with business priorities.
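The automated reconciliation checks described above can be sketched in plain Python. This is a minimal, hypothetical version (key and aggregate names are illustrative assumptions, not the actual pipeline code) that compares per-key aggregates between a source extract and the loaded Delta table:

```python
# Hypothetical source-vs-target reconciliation check (illustrative names).
# Compares per-key aggregates, e.g. daily transaction totals, and flags
# keys that are missing on either side or differ beyond a tolerance.

def reconcile(source_totals: dict, target_totals: dict, tolerance: float = 0.0):
    """Return a list of (key, source_value, target_value) mismatches;
    an empty list means the load reconciles cleanly."""
    mismatches = []
    for key in sorted(set(source_totals) | set(target_totals)):
        src = source_totals.get(key)
        tgt = target_totals.get(key)
        if src is None or tgt is None or abs(src - tgt) > tolerance:
            mismatches.append((key, src, tgt))
    return mismatches
```

In practice a check like this would run as a post-load job, with the aggregates computed via PySpark or SQL rather than passed in as dictionaries.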

ADP India Pvt Ltd, Hyderabad, IN, Apr 2019 – Nov 2022
Sr. Member Technical

Responsibilities:

Developed and optimized SSIS/Talend pipelines to integrate multi-country payroll data.

Led cloud migration effort, moving on-prem SQL Server workloads to ADLS Gen2 + Azure Synapse, improving scalability and performance by 40%.

Automated ingestion & transformation using ADF + Databricks, reducing manual effort by 85% and improving pipeline reliability to 99% success rate.

Applied GDPR-compliant data governance, implementing validation rules and schema checks that reduced data issues by 70%.

Created Informatica mappings (Lookups, Aggregators, Joins, Filters, Update Strategy, Union) to load fact/dimension models.

Rebuilt ETL pipelines using stored procedure–based sources, improving load flexibility and reducing dependency issues by 30%.

Built BI reports and regulatory dashboards using SSRS and Report Builder, improving visibility for global payroll teams.

Used GitHub for version control of Informatica mappings, ADF pipelines, SQL scripts.

Supported branching, pull requests, and version tagging to enable consistent, traceable, and automated deployments across environments.

Experience working in hybrid (cloud + on-prem) environments commonly used in Canadian enterprises.

Focused on documentation, governance, and best practices, ensuring solutions meet organizational and compliance standards.

Skilled in preparing technical documentation, runbooks, data dictionaries, and design specifications for operational teams.

Demonstrated strong communication and documentation skills, delivering clear status updates, architecture diagrams, and technical explanations for non-technical stakeholders.

Partnered with cross-functional teams including Finance, BI, Product, and Compliance, ensuring data solutions meet business and regulatory requirements.

Conducted knowledge-sharing sessions and walkthroughs to align teams and maintain transparency throughout the project lifecycle.

Comfortable engaging with senior stakeholders to gather requirements, clarify business rules, and present solution proposals.
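As one illustration of the validation rules and schema checks described above, here is a minimal pure-Python sketch; the field names and rules are hypothetical assumptions for illustration, not the actual payroll schema:

```python
# Hypothetical record-level validation of the kind applied before loading
# payroll data: required fields, expected types, and a basic format rule.
import re

EXPECTED_FIELDS = {"employee_id": int, "country": str, "gross_pay": float, "email": str}
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_record(rec: dict) -> list:
    """Return a list of human-readable issues for one record (empty = clean)."""
    issues = []
    for field, expected_type in EXPECTED_FIELDS.items():
        if field not in rec:
            issues.append(f"missing field: {field}")
        elif not isinstance(rec[field], expected_type):
            issues.append(f"bad type for {field}: expected {expected_type.__name__}")
    # Format rule only applies when the field is present and a string.
    if isinstance(rec.get("email"), str) and not EMAIL_RE.match(rec["email"]):
        issues.append("invalid email format")
    return issues
```

Checks like these would typically run inside the ADF/Databricks pipeline before records reach curated zones, with failing records routed to a quarantine table for review.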

TECHSPERTS SOFTWARE SOLUTIONS, Hyderabad, IN, Oct 2015 – Mar 2019
Software Engineer

Responsibilities:

Designed and developed Informatica and Azure Data Factory pipelines for migration of legacy systems into Azure Data Lake.

Implemented parameterized ETL pipelines and optimized partitioning, reducing pipeline runtime by 35%.

Supported Terraform-based infrastructure provisioning and integrated with CI/CD for automated releases.

Improved data observability with logging, lineage, and metadata tracking, reducing issue resolution time by 40%.

Migrated Excel/CSV data into SQL Server with SSIS and built PL/SQL/SQL scripts for analytics and operational reporting.

Designed OLAP cubes, KPIs, and dashboards that improved executive decision-making and reduced report turnaround by 50%.

Maintained Informatica mappings, ADF pipelines, and SQL scripts in GitHub repositories, ensuring traceability, rollback safety, and streamlined collaboration across teams.

Projects:

Cloud Migration (2TB): Led migration of financial data from on-prem SQL to ADLS & Databricks with zero data loss, achieving 40% faster query performance.

Data Quality Framework: Built Python-based validation modules for schema & threshold checks, reducing data quality issues by 70%.

CI/CD Automation: Implemented DevOps pipelines for Databricks notebooks via Azure DevOps/Git, enabling one-click deployments.

Databricks Optimization: Applied caching, partition tuning, and autoscaling, cutting pipeline runtime by 40%.

Education:

PG Diploma in Cloud Computing at Loyalist College, Toronto, Ontario, Canada.

Bachelor’s in Information Technology, JNTU, India.

Badges and Certifications:

Databricks Certified Data Engineer Associate

https://credentials.databricks.com/804fef83-aa14-4ff4-b408-1c2c7c246307#acc.1Ejk25Cp

Academy Accreditation: Databricks Fundamentals

https://credentials.databricks.com/4d06ba2a-edb3-4b6d-a507-9ed58ce77abf?utm_source=whatsapp&utm_medium=social

Academy Accreditation: Generative AI Fundamentals

https://credentials.databricks.com/5b9a6aa3-28aa-4e3c-b748-ec08d39e10d7#acc

Academy Accreditation: Azure Databricks Platform Architect

https://credentials.databricks.com/b340148a-b1c3-44cc-bac8-32c0f70579d2#acc.u7hIPMFY
