Email ID: **************@*****.***
Contact: +1-646-***-****
Chaitanya Tangellamudi
Data Engineer
CAREER SUMMARY
●10+ years of strong IT experience in Cloud/BI/ETL/data warehousing and data integration, with expertise in the design, development, and implementation of database systems.
●Strong hands-on experience with Azure Data Factory.
●Implemented data movement from on-premises and Salesforce sources to Azure Data Lake Storage using Azure Data Factory.
●Hands-on experience with Azure Data Factory activities such as Copy Data and Data Flow.
●Created and managed datasets in Azure Data Factory.
●Cloned repositories and connected projects to Azure DevOps in Visual Studio 2022.
●Checked in code and created pull requests to deploy code to production without merge conflicts using Azure DevOps.
●Developed custom reports of various types, including drill-down, drill-through, sub-reports, ad-hoc reports, and distributed reports, in multiple formats using SSRS.
●Created ad-hoc reports with Report Builder and programmed Reporting Services in SSRS.
●Involved in developing the Technical specification documents from the Functional specification documents.
●Experience in Reporting Services, Power BI (dashboard reports), and SSRS using MS SQL Server.
●Generated ad-hoc reports in Excel and shared them with decision makers using Power BI.
●Designed and developed Power BI graphical and visualization solutions from business requirement documents and plans for creating interactive dashboards.
●Generated ad-hoc reports in Power BI for decision makers to support strategic planning.
●Good knowledge of importing data into Power BI from various sources such as SQL Server, Azure SQL DB, and MS Excel.
●Developed analysis reports and visualizations using DAX functions such as table, aggregation, and iterator functions.
●Enhanced existing Snowflake tables and views for new reporting requests.
●Created materialized views to improve query performance for complex reports.
●Experience using regex, window functions, and semi-structured data functions to write queries that meet complex reporting needs.
●Good Experience in Query Optimization to increase report performance.
●Created SSIS packages with complex transformations such as Lookup, Derived Column, Conditional Split, Multicast, Sort, Union All, and Cache Transform.
●Developed, monitored and deployed SSIS packages.
●Experience with Informatica 10.x/9.x, including designing mappings, configuring the Informatica server, and creating and scheduling workflows and sessions.
●Hands on experience in creating Design specifications, Test Plans, Test Scripts, managing Testing Specifications and conducting User Acceptance Testing (UAT) and Training.
●Experience in ETL Performance Tuning/Optimization.
●Expertise in OBIEE data modeling, including alias tables, aggregate tables, dimensional hierarchies, level-based measures, and time series functions, as well as data visualization components including dashboard prompts, pages, links, images, embedded content, text, folders, and guided navigation links.
●Involved in troubleshooting, resolving, and escalating data-related issues and validating data to improve data quality.
●A well-organized, goal-oriented, highly motivated effective team member with excellent analytical, troubleshooting, and problem-solving skills. Ability to work individually and meet tight deadlines.
●Actively participated in decision making in business and BA meetings and regularly interacted with the business to gain a good understanding of requirements.
TECHNICAL SKILLS
Languages / Web:
Microsoft SQL Server, PL/SQL, Oracle, Python, Snowflake
Software / Tools:
SSIS, SSRS, Visual Studio 2022/2017, OBIEE 10.x/11.x, Tableau 8.x/8.1, Informatica Power Center 10.2, Power BI
Cloud Technologies
Microsoft Azure Data Factory (V2)
Configuration Tools
Azure DevOps, GitHub
RDBMS:
Oracle 11g/10g/9i, SQL Server 2008/2012/2016
Environments:
Windows 9x/NT/2000/XP/2003/2007, Unix, Linux, DB2
PROFESSIONAL EXPERIENCE
Rabo Agrifinance, St Louis
April 2021 – Present
Data Engineer
Responsibilities:
●Built pipelines to copy data from source to sink in Azure Data Factory.
●Created dependencies between activities such as Copy Data and Data Flow in Azure Data Factory.
●Created datasets that link to different data sources.
●Created linked services that connect to sources and sinks in Azure Data Factory.
●Moved data from Salesforce to Azure Data Lake Storage, saving the files in Parquet format.
●Created a pipeline and reused it in another pipeline using the Execute Pipeline activity.
●Created global parameters and used them in pipelines.
●Created alerts for success and failure criteria using an Azure Log Analytics workspace.
●Created, managed, and delivered server-based reports using SQL Server Reporting Services (SSRS).
●Identified and defined the Datasets for the report generation and wrote the queries for the drill down reports, sub reports.
●Created ad-hoc reports with Report Builder and programmed Reporting Services in SSRS.
●Involved in developing the Technical specification documents from the Functional specification documents.
●Created reports and dashboards using Power BI.
●Conducted data cleansing and transformation tasks using Power Query to prepare data.
●Scheduled and monitored SQL Server Agent jobs.
●Involved in Query Optimization to increase report performance.
●Actively participated in decision making in business and BA meetings and regularly interacted with the business to gain a good understanding of requirements.
●Cloned the repositories and connected the project to Azure DevOps in Visual Studio 2022.
●Checked in code, created pull requests, and deployed code to production and other environments without merge conflicts using Azure DevOps.
●Created internal and external Snowflake stages and transformed data during load.
●Created Snowpipe for continuous data loading.
●Used regex, window functions, and semi-structured data functions to write queries that meet complex reporting needs.
Environment: Azure Data Factory (V2), MS SQL Server, SSRS, Visual Studio 2017/2022, T-SQL, PL/SQL, SQL Server Management Studio, DAX Studio, Snowflake, Power BI
HIH Inc, IL
January 2021 – April 2021
SQL Reporting Developer
Responsibilities:
●Worked with the Functional Teams to comprehend their requirements to design and develop reports that meet business expectations.
●Identified and defined the Datasets for the report generation and wrote the queries for the drill down reports, sub reports.
●Created ad-hoc reports with Report Builder and programmed Reporting Services.
●Involved in developing the Technical specification documents from the Functional specification documents.
●Involved in mapping multiple data sources to a single target database by analyzing the business requirements that are specified in the functional documents.
●Created, managed, and delivered server-based reports using SQL Server Reporting Services (SSRS).
●Created Sub-Reports, Drilldown-Reports, Summary Reports, and Parameterized Reports in SSRS.
Environment: MS SQL Server, SSIS, SSRS, TFS, Visual Studio 2017, T-SQL, PL/SQL, SQL Server Management Studio, Power BI
Kroger Specialty Pharmacy, Lake Mary, Florida
May 2018 – January 2021
SSRS/SSIS Developer
Responsibilities:
●Worked with the Functional Teams to comprehend their requirements to design and develop Jobs that meet business expectations.
●Identified and defined the Datasets for the report generation and wrote the queries for the drill down reports, sub reports.
●Created ad-hoc reports with Report Builder and programmed Reporting Services.
●Involved in developing the Technical specification documents from the Functional specification documents.
●Involved in mapping multiple data sources to a single target database by analyzing the business requirements that are specified in the functional documents.
●Created, managed, and delivered server-based reports using SQL Server Reporting Services (SSRS).
●Created Sub-Reports, Drilldown-Reports, Summary Reports, and Parameterized Reports in SSRS.
●Developed analysis reports and visualization using DAX functions like table function, aggregation function and iteration functions.
●Created reports using Import, DirectQuery, and live connections to various data sources.
●Developed SSIS Packages to extract data from various data sources including Access database, Excel spreadsheets, and flat files into SQL server.
●Involved in the initial load for the project in Different environments.
●Created various SSIS packages for the ETL functionality of the data and importing data from various tables.
●Responsible for ETL operations using SQL Server Integration Services; trained and supported users and performed incremental data loading.
●Designed the Target Schema definition and Extraction, Transformation and Loading (ETL) using SSIS.
●Extensively used transformations, control flow tasks, data flow tasks, containers, and event handlers.
●Developed common modules for error checking and different methods of logging in SSIS.
●Extensively used SSIS Import/Export Wizard, for performing the ETL operations.
●Performed data analysis and reporting using SSIS transformations such as Data Conversion, Conditional Split, Bulk Insert, Merge, and Union All.
Environment: MS SQL Server 2012/2008 R2, SSIS, SSRS, TFS, Visual Studio 2017, T-SQL, PL/SQL, SQL Server Management Studio, Power BI Desktop
Dell, India
July 2014 – July 2015
Informatica and BI Senior Developer
This project mainly deals with Dell’s human resource management systems, including SABA, TALEO, WORKFORCE, and COMPENSATION. It records all assignment events from hire/start date through termination, covering recruitment and candidates, learning management, compensation, appraisals, and salary reviews. The goal is to fully integrate the enterprise into a common platform (PeopleSoft 9.1), replacing the direct architecture with common processes across the enterprise, and to align with enterprise services that link end-to-end service delivery, Human Resources, Finance, and reporting tools and processes in a globally integrated platform.
Responsibilities:
●Worked as Senior ETL Developer/BI Developer and implemented OBI solutions for DELL International.
●Responsible for managing tasks and deadlines for both onsite and offshore ETL teams.
●Participated in business requirement gathering through interactive sessions with Business Teams and DB Architect teams.
●Translated business requirements for reporting metrics, KPIs, and standard analytics into Business Intelligence solutions available to end users.
●Prepared High level & low-level ETL flow design and technical documents.
●Reviewed Informatica Code, Unit Test Cases & Results with the Developers.
●Designed and customized Informatica mappings according to changes and requirements from the business team.
●Created mappings employing various transformations, including Filter, Joiner, Lookup, SQL Transformation, and SQL overrides, to load data from multiple databases into warehouses, performing full and incremental loading and processing.
●Extracted and customized pre-built BI solutions using Oracle BI Apps 7.9.6.4 (HR Analytics).
●Built new DAC subject areas & execution plans to run data loads on daily/weekly basis.
●Took necessary backups of Informatica repository and DAC repository.
●Performed unit testing of these Workflows from source to staging and staging to warehouse.
●Analyzed and fixed defects raised by the QA team.
●Built the OBIEE data model repository (RPD), including the physical, logical, and presentation layers required for BI reporting, along with the variables and initialization blocks needed to implement security.
●Built ad hoc BI reports and dashboards using OBIEE.
●Managed caching, RPD merging, and MUDE setup.
●Migrated the web catalog (Webcat) between environments, including refreshing GUIDs.
Environment: Informatica 9.0.1, Oracle 11g, Linux, DAC, Oracle BIAPPS 7.9.6.4, OBIEE 11g, Toad.
CALIX, Bodhtree Consulting Ltd, India
May 2013 – June 2014
Informatica and BI Developer
Calix implemented the Single Source of Truth (SSOT) project to provide a single source of data for all reporting needs. As part of the SSOT project, Calix designed the Sales Booking Lines data model. The scope of this project includes cleaning up the Sales Bookings model attributes (renaming and removing duplicate columns and adding metadata descriptions) to improve the model's usability.
Responsibilities:
●Understood business requirements and mapped them to the existing data warehouse schema and OBIEE model.
●Made the necessary schema changes as part of the clean-up process.
●Created new mappings and modified existing mappings to load data per the new schema.
●Prepared and executed unit and integration test cases to verify the changes.
●Improved the performance of mappings and workflows and performed performance testing.
●Cleaned up the OBIEE repository and made the necessary data model and configuration changes.
●Modified pre-built dashboards, reports, and other BI objects to support the cleaned-up data model.
●Built ad hoc BI reports and dashboards using OBIEE.
●Managed caching, RPD merging, and MUDE setup.
●Migrated the web catalog (Webcat) between environments, including refreshing GUIDs.
Environment: Informatica 8.5.1, SQL Server 2008, Oracle, Linux, SQL Developer, OBIEE 11g.
Solitaire Softech Pvt Ltd, INDIA
February 2009 – May 2013
SQL BI Developer
Responsibilities:
●Involved in various phases of the Software Development Life Cycle (SDLC); developed use case diagrams, object diagrams, class diagrams, and sequence diagrams to represent the detailed design phase using Rational Rose.
●Promoted new schema changes and features to Development, Staging/QA, and Production servers.
●Configured Log Shipping between Primary and Secondary servers.
●Created database, tables, Indexes and views.
●Created linked server between SQL servers.
●Dropped and recreated the indexes on tables for performance improvements.
●Designed DTS packages to refresh data on Development and QA/Staging.
●Wrote T-SQL to implement stored procedures and functions for various tasks.
●Created stored procedures and triggers to enforce rules automatically and update related tables.
●Involved in creating SSIS jobs to automate the reports generation, cube refresh packages.
●Involved in creating Tablix Reports, Matrix Reports, Parameterized Reports, Sub reports using SQL Server Reporting Services.
Environment: SQL Server, T-SQL, Oracle, PL/SQL, SSIS, SSRS, Visual Studio, SQL Server Data Tools, SQL Server Management Studio, SQL Developer, TOAD.
EDUCATION
●Bachelor of Technology in Computer Science Engineering – JNTU Hyderabad – 2008
●Master’s in Information Systems – TAMIU – 2016
CERTIFICATIONS
●Microsoft Certified Azure Fundamentals, Azure DP-200.
●Oracle Business Intelligence Foundation Suite 11g Certified Implementation Specialist
●Oracle Database 11g Certified
●Sun Certified Java Professional (SCJP)
●ITIL Certified