Data Analytics & BI Engineer with SQL & ETL Expertise

Location:
Wylie, TX
Posted:
February 26, 2026

Resume:

Reshma Reddy Chintakuntla

Wylie, Texas, ***** *****.*********@*****.*** +1-571-***-****

PROFESSIONAL SUMMARY

9+ years of work experience across multiple industries, technologies, and domains, including Business & Data Analysis, Database Development, and Business Intelligence Development.

Experienced in Oracle PL/SQL - Packages, Stored Procedures, Functions, Cursors, Triggers, Views, Materialized Views, table partitioning, and external tables.

Experience in data warehousing tools, data mapping, unit testing, migration, conversions, and process documentation.

Worked on SQL Server Management Studio (SSMS), Integration Services (SSIS), Analysis Services (SSAS), and Reporting Services (SSRS) to analyze data and deploy business reports.

Experience in performing system/data analysis and reporting using BI reporting tools (Tableau, Power BI, Business Objects, Looker).

Skillful in Data Analysis using SQL on Oracle, MS SQL Server, Netezza, Snowflake & Teradata.

Experienced in PL/SQL Performance Tuning using Bulk Collections and other optimization techniques

Extensive knowledge of Tableau Server, Tableau Reader, and Tableau Public: able to create dashboard reports, aggregate data, slice and dice data, connect to external data sources for analysis, create data options, apply data interpretation techniques, and pivot data to meet report specifications.

Published Power BI Desktop reports created in Report view to the Power BI service and Report Server.

Experienced in utilizing ETL tools: Talend Data Integration, SQL Server Integration Services (SSIS), Pentaho, Workato and developed slowly changing dimension (SCD) mappings using Type-I, Type-II, and Type-III methods.
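
To illustrate the SCD pattern named above, here is a minimal Type-II sketch in Python (the resume's tools implement this inside SSIS/Talend mappings; the table shape, column names, and dates below are hypothetical, chosen only to show the expire-and-insert logic):

```python
from datetime import date

# Type-II slowly changing dimension (SCD) upsert: when a tracked attribute
# changes, close out the current row and append a new version rather than
# overwriting history. HIGH_DATE marks a still-open row.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dim_rows, incoming, today):
    """dim_rows: list of dicts with keys id, city, eff_from, eff_to, current.
    incoming: dict with keys id, city (a fresh extract of one entity)."""
    for row in dim_rows:
        if row["id"] == incoming["id"] and row["current"]:
            if row["city"] == incoming["city"]:
                return dim_rows          # attribute unchanged: nothing to do
            row["eff_to"] = today        # expire the existing version
            row["current"] = False
    # insert the new version as the current row
    dim_rows.append({"id": incoming["id"], "city": incoming["city"],
                     "eff_from": today, "eff_to": HIGH_DATE, "current": True})
    return dim_rows
```

Type-I would instead overwrite `city` in place, and Type-III would keep the prior value in an extra `previous_city` column.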

Experience in UNIX shell scripting, including file monitoring scripts, data manipulation scripts, and data load scripts.

TECHNICAL SKILLS

Operating Systems : UNIX, LINUX, Windows, MS-DOS

Programming Languages : C, T-SQL, PL/SQL, MySQL, Java, C#, Python

Databases : SQL Server, Oracle 12c/11g/10g, Netezza, Teradata, Snowflake, PostgreSQL

Reporting Tools : SQL Server Reporting Services (SSRS), Tableau, ThoughtSpot & Power BI

Web Technologies : HTML5, XML, JavaScript, VBScript, Web Services

Microsoft Tools : SSIS, SSAS, SSRS, Excel (Pivot Tables, VLOOKUP), Word, Access, MS Project, MS Visio

ETL Tools : SSIS, Talend, Pentaho, Workato, Coalesce

EDUCATION:

Master's in Information Technology, University of Mary Hardin-Baylor, Texas, May 2015

Bachelor's in Electronics and Communications, Anurag Group of Institutions, India, May 2012

CERTIFICATION:

Querying Microsoft SQL Server 2012/2014 - 70-461

WORK EXPERIENCE:

S2 Capital, Dallas, Texas Oct 2023 – Oct 2025

Role: Data Engineer

Responsibilities:

Developed and maintained scalable data pipelines and ETL processes for loading data into Snowflake from diverse sources using tools like Pentaho, Workato and Coalesce.

Designed and implemented data transformation scripts in Python within Workato, converting Workday file extracts from JSON to CSV for loading into S3.
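
A JSON-to-CSV conversion of the kind described above can be sketched with the Python standard library alone (the record shape here is made up; real Workday extracts would have their own fields):

```python
import csv
import io
import json

def json_records_to_csv(json_text):
    """Convert a JSON array of objects into CSV text suitable for staging."""
    records = json.loads(json_text)
    # Union of all keys, so records with missing fields still fit one header.
    fieldnames = sorted({k for r in records for k in r})
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(records)   # missing keys become empty cells
    return out.getvalue()
```

In a pipeline this string would then be written to an object in S3 for the downstream load.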

Automated data loading workflows using Snowflake stages, file formats, and stored procedures in Workato.

Created complex transformation logic in SQL and materialized views to support BI reporting requirements in Looker.

Configured external authentication mechanisms including key pair authentication for secure automated data loading.

Supported Looker reporting platforms by ensuring high-performance data delivery from Snowflake.

Developed, debugged and optimized PostgreSQL queries for reporting and app development.

Created, inserted, and extracted JSON data stored in columns of a Postgres database.
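
The pattern of JSON stored in a database column can be shown self-containedly; sqlite3 stands in for Postgres here (where one would use a `jsonb` column with the `->`/`->>` operators), and the table and payload are hypothetical:

```python
import json
import sqlite3

# Store JSON text in a column, then extract a field from it client-side --
# the in-database analogue in Postgres would be payload->>'customer'.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, payload TEXT)")
conn.execute("INSERT INTO orders (id, payload) VALUES (?, ?)",
             (1, json.dumps({"customer": "acme", "total": 42.5})))

row = conn.execute("SELECT payload FROM orders WHERE id = 1").fetchone()
payload = json.loads(row[0])
print(payload["customer"])
```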

Developed and deployed automated Workato recipes to ingest, transform, and load data from AWS S3, Workday into Snowflake.

Designed and developed end-to-end ETL pipelines in Pentaho Data Integration (PDI) to extract, transform, and load data into Snowflake from various sources including flat files, databases and cloud storage AWS S3.

Integrated Pentaho with Snowflake using JDBC connections and key pair authentication, ensuring secure and automated data workflows.

Created dimension and fact tables using Coalesce visual builder and pushed changes to Snowflake with Git-integrated deployment.

Designed and deployed scalable data transformation pipelines using Coalesce to automate ELT processes in Snowflake, improving pipeline maintainability and transparency.

Created custom connectors in Workato to integrate with non-native applications and ensure seamless data exchange.

Generated Tableau dashboards with quick filters, parameters and sets to handle views more efficiently.

Environment: Snowflake, PostgreSQL, AWS S3, AWS Glue, Pentaho, Workato, Coalesce, Looker, Workday, Python, Tableau

TransCore, Plano, Texas Feb 2023 – May 2023

Role: SQL Developer/ Report Developer

Responsibilities:

Created stored procedures to generate HTML output.

Built efficient SSIS packages for processing fact and dimension tables with complex transforms and type 1 and type 2 changes.

Customized reports by adding filters, calculations, summaries, and functions; created reports that retrieve data using stored procedures that accept parameters.

Created Parameterized Queries, generated Tabular reports, sub reports, cross tabs, drill down reports using expressions, functions, sorting the data, defining data sources and subtotals for the SSRS reports.
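
The core idea behind the parameterized queries mentioned above is that parameters are bound as data, never concatenated into SQL text. A minimal sketch with the standard-library sqlite3 module (the table and values are invented for illustration):

```python
import sqlite3

# Placeholder binding: the driver keeps user input as data, which is the
# same principle behind parameter-driven SSRS report queries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 100.0), ("South", 250.0), ("North", 50.0)])

def region_total(region):
    # The ? placeholder is bound safely; no string concatenation into SQL.
    cur = conn.execute("SELECT SUM(amount) FROM sales WHERE region = ?",
                       (region,))
    return cur.fetchone()[0]

print(region_total("North"))  # 150.0
```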

Optimized slow performance queries and improved performance of long running views and stored procedures.

Identified slow-performing views, stored procedures, and queries by running SQL Profiler while browsing the application.

Worked on complex SQL queries for validating the data against different kinds of reports.

Developed solution-driven views and dashboards using different chart types, including pie charts, bar charts, tree maps, circle views, line charts, area charts, and scatter plots in Power BI.

Environment: SSRS, T-SQL, SSIS, SQL Profiler, SSMS, Power BI, JIRA

The Vitamin Shoppe, Secaucus, NJ Jan 2022 – Jan 2023

Role: Data Engineer

Responsibilities:

Designed SSIS packages to provide CRM data to the CrowdTwist environment.

Created custom reports using CRM data.

Involved in error handling, debugging, and error logging for production support in SSIS.

Monitored and scheduled existing and new jobs in the production environment.

Managed data quality in CRM.

Used SQL to analyze, evaluate and create reports on sales to identify steps to increase sales productivity and results.

Designed queries to draw relevant customer information for development of financial reports utilized in forecasting, trending and result analysis.

Developed Data Quality checks for source and target data sets.

Worked with Engineering teams to collect required data from internal and external systems.

Developed, debugged and optimized PostgreSQL queries for reporting and app development

Developed Tableau data visualizations using cross tabs, heat maps, scatter plots, geographic maps, pie charts, and bar charts.

Utilized Tableau Server to publish and share reports with the Marketing team.

Responsible for creating fields, combined fields, sets, hierarchies using Tableau Desktop.

Implemented data blending to blend related data from different data sources

Environment: SSIS, SQL Server, PostgreSQL, Talend, Snowflake, CRM Aptos, Tableau Desktop, Tableau Server

Wells Fargo, Dallas, TX Sep 2021 – Jan 2022

Role: Software Developer

Responsibilities:

Developed, deployed, and monitored SSIS Packages

Involved in performance tuning of T-SQL queries and stored procedures.

Used Autosys scheduling tool to run the SSIS packages

Created source to target documents as per technical requirements.

Created Clustered and Non-clustered Indexes for Query Optimization and involved in Database Tuning.

Optimized query performance by removing unnecessary columns, eliminating redundant and inconsistent data, and normalizing tables.

Responsible for Deploying, Scheduling Jobs, Alerting and Maintaining SSIS packages.

Developed and modified triggers, packages, functions and stored procedures for data conversions and PL/SQL procedures to create database objects dynamically based on user inputs.

Wrote SQL, PL/SQL, SQL Plus programs required to retrieve data using cursors and exception handling.

Created custom calculated columns and parameters in Power Query with M code.

Worked with data models with complex relationships in Power BI and connected different data sources to Power BI Desktop.

Created partitioned tables and partitioned indexes to improve the performance of the applications.

Performed performance tuning for Oracle RDBMS using explain plans and hints.

Environment: SSIS, SQL Server, Autosys, TFS, Teradata, Oracle PLSQL, Power BI.

BioPharm Communications, New Hope, PA Dec 2020 – Sep 2021

Role: Senior Data Analyst

Responsibilities:

Developed SQL queries using SnowSQL.

Used COPY commands to bulk load the data.
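
The bulk-load pattern referenced above centers on Snowflake's `COPY INTO` statement. The sketch below only builds the statement text; the table, stage, and file-format names are placeholders, and a real job would execute the result through a Snowflake connection:

```python
# Build a Snowflake COPY INTO statement for loading staged files into a
# table. All object names here are hypothetical; ON_ERROR is one of the
# standard options (abort the whole load on the first bad record).
def build_copy_into(table, stage, file_format):
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        "ON_ERROR = 'ABORT_STATEMENT'"
    )

sql = build_copy_into("analytics.orders", "s3_landing/orders", "csv_std")
print(sql)
```

Snowpipe automates the same statement pattern, triggering the load when new files arrive in the stage.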

Redesigned views in Snowflake to increase performance.

Created internal and external stages and transformed data during load.

Created Talend jobs to copy files from one server to another, utilizing Talend FTP components.

Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, and tMSSQLInput.

Used AWS S3 buckets to store files, ingested the files into Snowflake tables using Snowpipe, and ran deltas using data pipelines.

Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.

Built complex queries and wrote stored procedures using PL/SQL.

Designed, developed, tested and maintained tableau functional reports based on user requirements.

Developed Tableau data visualizations using measures and dimensions from various databases via data blending.

Developed Tableau data visualizations using cross tabs, heat maps, scatter plots, geographic maps, pie charts, bar charts, and density charts.

Environment: PL/SQL, T-SQL, Toad, GitHub, SQL Developer, Java, Talend 7.1.1, DBeaver 21.0.4, Snowflake, Data Modeler, Tableau Desktop, Tableau Server

University of Pennsylvania, Philadelphia, PA May 2019 – Aug 2020

Role: ETL Developer

Responsibilities:

Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.

Worked on various Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator etc.

Worked extensively on SQL Queries for validating the records.

Extracted data from various sources, including relational databases (Oracle, SQL Server) and flat files.

Worked with key Oracle performance features such as system optimization, the Query Optimizer, statistical functions, and DB functions.

Created table functions, indexes, table partitioning, materialized views, query rewrite, and analytical functions.

Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote Java code to capture global map variables and used them in the job.

Designed, developed and improved complex ETL structures to extract, transform and load data from multiple data sources into data warehouses and other databases based on business requirements.

Troubleshooting, debugging & fixing Talend specific issues, while maintaining the health and performance of the ETL environment.

Created Context Variables and Groups to run Talend jobs against different environments (Dev, Test, and Prod).

Worked on Joblets (reusable code) and Java routines in Talend; heavily involved in performance tuning of long-running ETL jobs.

Environment: PLSQL, T-SQL, Toad, GitHub, JIRA, Java, XML files, Talend 7.1.1.

Verizon, Basking Ridge, NJ Nov 2018 – May 2019

Role: MSBI Developer

Responsibilities:

Created the automated processes for the activities such as database backup processes and SSIS Packages run sequentially using SQL Server Agent job.

Developed and extended SSAS Cubes, Partitions, KPIs, Perspectives, Actions, Data Mining Models, deploying and processing SSAS objects, wrote MDX queries.

Worked on SQL Server Integration Services (SSIS) to integrate and analyze data from multiple heterogeneous information sources (Oracle, Teradata, flat files and Excel)

Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.

Partitioned the fact tables and materialized views to enhance the performance.

Wrote numerous BTEQ scripts to run complex queries on the Teradata database; used volatile tables and derived queries to break complex queries into simpler ones.

Performance tuned Teradata SQL queries to optimize various ETL processes.

Created incremental refreshes for data sources on Tableau server.

Generated various dashboards to Tableau Server using different data sources such as Teradata, Oracle, SSAS, MS Excel etc.

Created Schedules and Extracted the data into Tableau Data Engine.

Scheduled data refresh on Tableau Server for weekly and monthly increments based on business changes to ensure that the views and dashboards displayed the changed data accurately.

Wrote UNIX shell scripts to update data in ThoughtSpot daily.

Environment: SQL Server, Tableau, SSIS, SSAS, JIRA, Teradata, ThoughtSpot, T-SQL, PL/SQL.

Fannie Mae, Reston, VA Oct 2017 – Jul 2018

Role: Data Analyst

Responsibilities:

Created complex PL/SQL scripts using cursors and XML extraction to produce the desired output and load the data into ThoughtSpot.

Involved in periodic performance tuning and unit testing of PL/SQL code using appropriate joins, indexes, and partitions.

Wrote back-end PL/SQL code to implement business rules through triggers, cursors, procedures, functions, and packages using the SQL*Plus editor or TOAD.

Created answers, pinboards, and various dashboards, and wrote TQL scripts to create tables in ThoughtSpot.

Used Tableau Desktop as a decision-making tool to identify the appropriate business model for initiating a customer risk analysis project; monitored project flow through Gantt charts built with Tableau and the MS Project schedule.

Created visually impactful dashboards in Excel and Tableau for data reporting with respect to customer, policy information and analyzed to identify the key metrics.

Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.

Created relationships, actions, data blending, filters, parameters, hierarchies, calculated fields, sorting, groupings, and live and in-memory connections in both Tableau and Excel.

Worked on shell scripting to load data automatically using Autosys.

Wrote UNIX shell scripts to process files daily: renaming files, extracting data, unzipping files, and removing junk characters before loading them into base tables in ThoughtSpot.

Implemented various UNIX scripts to generate data in CSV and TXT formats.

Used PowerShell scripting to automate processes, migration, reporting, monitoring and troubleshooting.

Environment: SQL Server 2014/2016, T-SQL, Oracle, Netezza, XML, ThoughtSpot, Tableau (Desktop/Server), JIRA, UNIX, Autosys, SharePoint, SQL*Loader, TOAD

Microsoft Corporation, Seattle, WA Aug 2015 – Sep 2017

Role: MSBI Developer

Responsibilities:

Developed SSIS, SSRS / T-SQL Code and scheduled jobs for Jobs Monitoring Automation.

Identified slow-running queries and optimized stored procedures to improve performance.

Created different visualizations in Power BI per the requirements.

Created calculated measures in tabular model using DAX Queries according to end user requests and reporting needs.

Published Power BI reports and dashboards to the Power BI server and scheduled dataset refreshes for live data.

Published Power BI Desktop reports created in Report view to the Power BI service.

Managed relationships in Power BI data models using Star and Snowflake schemas.

Created calculated columns and measures in Power BI and Excel using DAX expressions, depending on the requirement.

Responsible for deploying the SSIS packages and scheduling the package through jobs in all tiers Including dev, test and production.

Created classes/modules in C# using Visual Studio .NET.

Performed tuning of tables using normalization and by creating indexes.

Used Microsoft Azure as the back end for rendering Power BI reports and dashboards.

Used Indexing strategies and query optimization techniques to optimize Stored Procedures and long running queries.

Performed code reviews, customization, and maintenance of code across applications using TFS.

Environment: SQL Server 2012/2014/2016, T-SQL, SSIS, SSRS, C#.NET, VB.NET, Excel, Power Pivot, Power BI Desktop, Power BI Web, TOAD, TFS, Visual Studio 2010/2013, Azure Database, SharePoint, XML, HTML


