
Business Intelligence Data Warehousing

Location:
Irvine, CA, 92618
Posted:
June 10, 2025


NAGENDER REDDY

ETL Developer

+1-949-***-****

********.*****@*****.***

linkedin.com/in/nagender-reddy-81175b2a5

Career Objective:

To work in a quality environment where I can apply my knowledge and skills to develop elegant, high-quality software while keeping pace with ever-changing trends and technologies. Experienced in the design, development, implementation, and administration of database systems for both OLTP and data warehousing applications across business domains including financial, pharmaceutical, and manufacturing.

Professional Summary

Skilled and motivated IT professional with 12+ years of experience designing, developing, and implementing data integration, data warehousing, business intelligence, and ETL solutions using Informatica PowerCenter/Informatica Cloud (IICS/IDMC), DBT, SnapLogic & MicroStrategy.

Experience working with various Informatica Cloud (IICS) services: Cloud Data Integration, Mass Data Ingestion, Data Quality, Data Profiling, Customer 360 (MDM), and Business 360.

Experienced in designing and implementing Directed Acyclic Graphs (DAGs) to automate data workflows and schedule tasks; a minimal Airflow sketch appears at the end of this summary.

Strong Data Warehousing ETL experience using Informatica 9.1/8.6.1/8.5/8.1/7.1 PowerCenter Client tools - Mapping Designer, Repository Manager, Workflow Manager/Monitor and Server tools.

Familiar with integrating Airflow with different data sources, including databases, data warehouses, and cloud storage solutions like AWS S3, Google Cloud Storage, and Azure Blob Storage.

Strong expertise in SSIS, with proven experience migrating DTS packages to SSIS.

Implemented workflow automation in both IICS and PowerCenter.

Good exposure to Salesforce and Azure integrations.

Worked on both development and operational support, handling P1 and P2 issues.

Worked closely with data analysts, business analysts, and other stakeholders to understand data requirements and deliver solutions.

Proficient in using Terraform for Infrastructure as Code (IaC) on AWS, including creating, managing, and maintaining resources such as EC2 instances, S3 buckets, VPCs, and RDS databases.

Extracted data from production data stores, built data structures, and developed departmental reports for performance and response analysis using Oracle, SQL, MS Access, MS Excel, and PL/SQL.

Hands-on experience developing reports and dashboards in Tableau, including the design and development of Tableau visualization solutions.

Proficient in developing security metrics programs and creating dashboards and visuals in Tableau.

Experience with MuleSoft ESB 4.x and Anypoint CloudHub.

Implemented robust error handling and retry mechanisms to ensure the reliability and resiliency of data pipelines; retry settings appear in the sketch below.

Good understanding of common data formats such as CSV, XML, and JSON.

Experienced in requirement analysis, development, and issue resolution.

Strong experience writing UNIX shell scripts and SQL scripts for development, ETL process automation, error handling, and auditing.

Excellent analytical and problem-solving skills; a self-starter with a positive attitude, willing to learn new concepts and accept challenges.
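
To ground the DAG and retry bullets above, here is a minimal, hedged Airflow sketch in Python. The dag_id, schedule, task names, and callables are hypothetical illustrations, not code from any project listed here.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull rows from a source system.
    print("extracting source data")

def load(**context):
    # Placeholder: write transformed rows to the warehouse.
    print("loading into warehouse")

default_args = {
    # Retries make the pipeline resilient to transient failures,
    # per the error-handling bullet above.
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="nightly_warehouse_load",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",             # daily at 02:00 (Airflow 2.4+ argument)
    catchup=False,
    default_args=default_args,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load               # DAG edge: extract runs before load

The retries/retry_delay pair in default_args is Airflow's built-in way to rerun transiently failing tasks without custom retry code.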

Technical skills:

Environment: Windows, Linux/Unix, Cloud-based Azure and AWS

ETL Tools: Informatica PowerCenter (8.x/9.x), Pentaho DI, SnapLogic, MSBI, Talend DI, Tableau Desktop.

Integration as a Service: Informatica Cloud IICS, MFT, MuleSoft

Database Tools: SQL Assistant, SQuirreL SQL, AQT, TOAD, SQL Developer, SQL*Plus, and SQL Server Management Studio.

FTP/Terminal Tools: PuTTY, UltraEdit, and WinSCP.

Reporting Tools: BIRT, SSRS, Tableau, Power BI, Cognos, MicroStrategy

ERP/CRM: Oracle EBS, Salesforce.com (SFDC), and Vtiger CRM.

Databases: MySQL, Oracle, MS SQL Server, PostgreSQL, MongoDB.

Tools: TOAD, SQL Developer, Aurora, SFDC Workbench.

Project Management Tools: Atlassian Jira, Confluence, and Bitbucket.

Project Summary:

Sapphire Software Solutions

Client: CHRISTUS Health Jan 2024 – Present

Role: ETL Developer

Stryker is one of the world's leading medical technology companies and, together with its customers, is driven to make healthcare better. The company offers a diverse array of innovative products and services in Orthopedics, Medical and Surgical, and Neurotechnology and Spine that help improve patient and hospital outcomes.

Responsibilities:

Involved in business case gathering and design discussions with tech leads.

Reviewing functional requirement documents with team and tech leads.

Implemented IICS code based on requirement documents and prepared test cases for unit testing.

Converted the Dash database from SQL Server to AWS Aurora: created the database, migrated the data, and rewrote the reports, procedures, functions, and triggers in Aurora. Tuned ETL and SQL performance by creating indexes, partitioning tables, and rewriting SQL.

Developed different types of mappings, mapping configuration tasks, data synchronization tasks, and task flows in IICS.

Created mappings and mapplets in Informatica PowerCenter to transform data according to business rules.

Developed code following IICS best practices, with error handling, to improve performance and minimize defects.

Developed Tableau Server audit reports to view and analyze server insights (server usage, data extract failures, etc.).

Documented Informatica mappings in an Excel spreadsheet.

Performed data validation and integrity checks before delivering data to operations, business, and financial analysts, using Oracle, SQL Server, MS Excel, and MS Access.

Understood technical issues and identified architecture and code modifications to support changing user requirements across multiple Data Services jobs and applications.

Experience in debugging execution errors using Data Services logs (trace, statistics, and error) and by examining the target data.

Wrote UNIX shell scripts and pmcmd commands to FTP files from a remote server and to back up the repository and folders; a sketch of the pmcmd pattern follows this list.

Utilized data integration/ETL tools (Informatica, SnapLogic) to extract, transform, and load large volumes of data into the data warehouse to enable efficient data analysis and reporting.

Performed performance tuning of SnapLogic pipelines in the production environment.

Worked with different connectors like Salesforce, Oracle, and MS SQL.

Reviewed others' code and walked the operations team through deployments.

Generated Tableau dashboards with quick/context/global filters, parameters, and calculated fields in Tableau 9.x reports.

Participated in hypercare support for production releases.
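
As referenced in the pmcmd bullet above, a hedged Python sketch of starting an Informatica workflow via pmcmd (an alternative to a raw shell script). The flags are standard pmcmd options; the integration service, domain, folder, and workflow names are hypothetical.

import subprocess

def start_workflow(folder: str, workflow: str) -> None:
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "IS_PROD",     # integration service (hypothetical)
        "-d", "Domain_PROD",  # domain name (hypothetical)
        "-uv", "PM_USER",     # user name read from an environment variable
        "-pv", "PM_PASS",     # password read from an environment variable
        "-f", folder,
        "-wait",              # block until the workflow finishes
        workflow,
    ]
    # check=True raises CalledProcessError on a non-zero pmcmd exit code,
    # so a failed workflow stops the calling job instead of passing silently.
    subprocess.run(cmd, check=True)

start_workflow("FIN_DW", "wf_daily_load")  # hypothetical folder/workflow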

CIEL HR / Deloitte Ind Pvt Ltd Jan 2018 – Sep 2023

Role: Sr ETL Developer

The HSP MediTrac application is the central system for member and provider data and claims. This project implements HSP to replace the legacy systems in transactions, workflows, business processes, etc. HSP provides medical claims processing software and healthcare information systems to meet the needs of Health Insurance companies and medical providers.

Responsibilities:

Gathered ETL requirements from the business team and prepared high-level design documents.

Debugged mappings using the Debugger to resolve code issues.

Involved in dimensional modeling (star and snowflake schemas), creating facts and dimensions.

Exported/imported interfaces, packages, and their scenarios across environments.

Implemented interfaces to load source data into the staging area and staging data into the dimension and fact tables of the reporting data mart.

Created UNIX shell scripts for the Informatica ETL tool to automate sessions.

Designed end-to-end ELT process flow and implemented it using SSIS packages.

Performed unit and system testing of ELT packages.

Involved in tuning the mappings with audit, error design, and reprocessing strategies.

Created a UNIX program to preprocess data in files before loading them into data warehouse tables.

Analyzed session log files in Operator Navigator to resolve mapping errors and managed session configuration.

Involved in IICS performance tuning of the mappings/sessions.

Created reports using SSRS 2008/2010/2012 with Report Manager, Report Builder, and BIDS.

Resolved Tableau dashboard development and performance issues.

Extensively used transformations such as Source Qualifier, Filter, Aggregator, Expression, Connected and Unconnected Lookup, IICS Sequence Generator, Router, and Update Strategy.

Developed Power Maps using Excel 2013 for Reporting.

Performed Tableau Server status reporting and data extract management.

Developed applications in Visual Basic .NET with EDI files and SQL Server.

Designed SSRS reports using different data regions like Tables, Matrices, Lists, Sub-reports, and Charts. Created drill-down and drill-through reports for easy navigation within the reports.

Utilized data integration/ETL tools (Informatica, SnapLogic) to extract, transform, and load large volumes of data into the data warehouse to enable efficient data analysis and reporting.

Used SSIS to build high-performance data integration solutions, including extraction, transformation, and load packages for data warehousing; extracted data from XML files and loaded it into the database (a minimal sketch of this XML-to-database pattern follows this list).
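
As referenced above, a self-contained sketch of the XML-to-database load pattern using only the Python standard library. The file name, element tags, and table are hypothetical, and SQLite stands in for the actual target database.

import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect("warehouse.db")  # stand-in for the real target DB
conn.execute("CREATE TABLE IF NOT EXISTS claims (claim_id TEXT, amount REAL)")

root = ET.parse("claims.xml").getroot()  # hypothetical source file
rows = [
    (c.findtext("ClaimId"), float(c.findtext("Amount", default="0")))
    for c in root.iter("Claim")          # hypothetical element name
]
conn.executemany("INSERT INTO claims VALUES (?, ?)", rows)  # bulk load
conn.commit()
conn.close()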

Client: Faber Sindoori Management Services Pvt. Ltd Jun 2012 – Dec 2017

Role: System Administrator

Responsibilities:

Extracted structured and unstructured data from source systems, transforming it to fit dimensional models. Data source formats include relational databases, flat files, and non-relational structures such as XML.

Implemented and enhanced different Knowledge Modules in mappings for loading and integrating the data from sources to targets.

Debugged sessions by utilizing session logs.

Created SQL Agent jobs and monitored the daily update loads.

Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.

Performed unit testing at various levels of the ETL process; experienced in configuring SQL Agent on SQL Server, creating new jobs, and testing them.

Worked with various upstream and downstream customers to interface systems and processes for data extraction and ETL.

Responsible for designing, developing, and automating ETL processes using Informatica PowerCenter.

Point of contact for all data processes, data mappings, data dictionaries, and data pulls.

Deployed and scheduled reports using subscriptions and data-driven subscriptions.

Version-controlled all ETL components, including Informatica workflows, DB scripts, parameter files, and configuration files, in TortoiseSVN and CA Software Change Manager.

Prepared Linux/Unix scripts to archive and compress files, create folders, change permissions, and modify parameter/configuration files; a Python rendering of this housekeeping follows this list.

Wrote queries, created stored procedures, and composed complex T-SQL joins to address various reporting operations and ad hoc data requests.
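
As referenced above, a minimal Python rendering of the archive/compress housekeeping those shell scripts performed. The directory paths, the 30-day retention window, and the permission mode are hypothetical.

import os
import tarfile
import time
from pathlib import Path

SRC = Path("/data/etl/outbound")     # hypothetical landing directory
ARCHIVE = Path("/data/etl/archive")  # hypothetical archive directory
CUTOFF = time.time() - 30 * 86400    # archive files older than 30 days

ARCHIVE.mkdir(parents=True, exist_ok=True)
with tarfile.open(ARCHIVE / "outbound_old.tar.gz", "w:gz") as tar:
    for f in list(SRC.iterdir()):    # snapshot the listing before deleting
        if f.is_file() and f.stat().st_mtime < CUTOFF:
            tar.add(f, arcname=f.name)  # compress the aged file
            f.unlink()                  # remove the original after archiving

os.chmod(ARCHIVE / "outbound_old.tar.gz", 0o640)  # tighten permissions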

Academic Details:

Bachelor of Technology in Information Technology, Jawaharlal Nehru Technological University, Hyderabad, 2011.


