
Data Engineer - Snowflake, DBT, AWS, ETL, SSIS, SQL, Oracle

Location: Dublin, CA
Posted: July 13, 2024

PUJA

+1-973-***-****

********@*****.***

California, US

Snowflake / DBT / AWS / Generative AI / Sr. ETL (SSIS) Developer / SQL / Oracle Developer

PROFESSIONAL EXPERIENCE SUMMARY:

Over 13 years of IT experience as a Snowflake/DBT, SQL Server, and ETL (SSIS) developer in the analysis, design, development, testing, deployment, and maintenance of SQL Server technologies.

Certification in Generative AI (LLMs, FM models, Amazon generative AI business skills, Amazon Q)

Expertise in Oracle Database versions 11g, 12c, 18c, 19c, and 21c

Expert in the MS SQL Server suite of products, including SSRS and SSIS across SQL Server 2008/2012/2014/2016/2017/2019/2022, as well as Power BI and MySQL

Proficient in building ETL (Extract, Transform, Load) data pipelines

Strong experience with data and AI solutions on Azure Data Factory and on AWS platform services such as Redshift, RDS, Kinesis, Kibana, Airflow, Lambda, and EC2, using PySpark and Python

Proficient in T-SQL and MySQL queries using stored procedures, dynamic SQL, joins, views, user-defined functions, triggers, cursors, CTEs, DML, and DDL
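
As a small illustration of this skill set, a minimal T-SQL sketch of a stored procedure built around a CTE; all table and column names here are hypothetical:

    -- Hypothetical example: monthly order totals computed via a CTE
    CREATE PROCEDURE dbo.usp_GetMonthlyOrderTotals
        @Year INT
    AS
    BEGIN
        SET NOCOUNT ON;

        WITH MonthlyTotals AS (
            SELECT
                MONTH(OrderDate) AS OrderMonth,
                SUM(OrderAmount) AS TotalAmount
            FROM dbo.Orders
            WHERE YEAR(OrderDate) = @Year
            GROUP BY MONTH(OrderDate)
        )
        SELECT OrderMonth, TotalAmount
        FROM MonthlyTotals
        ORDER BY OrderMonth;
    END;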

Good experience with code deployment through version control and CI/CD tools such as GitHub, Octopus, and TeamCity.

Experience in DTS Migration and Metadata Management.

Experience migrating DTS packages to SSIS with the Package Migration Wizard, and in storage management.

SKILLS

Generative AI, LLMs, ML/FM models, data warehousing, SQL, DBT, business intelligence, Snowflake, Jira, GitHub, Copilot, Octopus, ETL, Oracle, AWS S3, AWS Redshift, Lambda, RDS, big data, AWS Kinesis, Kibana, Airflow, EC2, PySpark, Azure, SSRS, TeamCity, TFS, Azure DevOps, Azure fundamentals, CSV data extraction, data integration, data integrity, data mining, data modeling, data validation, Python, database design, Tidal, Autosys, DIS, DAX, DBMS, Erwin, ETL tools, Excel, flat files, Git, JavaScript, JSON, Microsoft SQL Server, Microsoft Visual Studio, MS Access

Certification: Generative AI (LLMs, FM models, Amazon generative AI business skills, Amazon Q)

PROFESSIONAL EXPERIENCE:

LPL Financial Holdings, Remote

Sr. Snowflake Dbt, ETL, Oracle, SQL Developer 10/2021 – Present

Responsibilities:

Heavily involved in the data pipeline for calculating and managing financial metrics like AUM (Assets Under Management), NNA (Net New Assets), and revenue.

Building and maintaining data pipelines to collect and store data related to user activity, performance, and other metrics on the LPL Practice Hub platform.

Developing stored procedures and implementing an ETL data pipeline to clean, transform, and analyze the collected data, moving it from Oracle to MS SQL.
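
A minimal T-SQL sketch of the staging-to-target upsert such a load step might use; the staging and target tables are hypothetical placeholders:

    -- Hypothetical upsert from a staging table populated from the Oracle extract
    MERGE dbo.AccountMetrics AS target
    USING staging.AccountMetrics AS source
        ON  target.AccountId  = source.AccountId
        AND target.MetricDate = source.MetricDate
    WHEN MATCHED THEN
        UPDATE SET target.MetricValue = source.MetricValue
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (AccountId, MetricDate, MetricValue)
        VALUES (source.AccountId, source.MetricDate, source.MetricValue);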

Creating, scheduling, and monitoring DIS jobs for the monthly data load, due by the 3rd business day of the following month, across environments (DEVINT, LDQA, PROD).

Code development and deployment across environments, with source control in Team Foundation Server / Azure DevOps repositories, using TeamCity and Octopus.

Repository migration from TFS to GitHub for LPL_DIS_Controller_SQL and Oracle code.

Documenting and securing approval of BPUG; handling Practice Hub releases with pre- and post-release data validation; and running on-demand data loads to different environments.

Enhancements and bug fixes for the Advisor NNA metric. Additional features: by-quarter revenue reporting, AUM breakdown, and ROA. Analysis of incorrect NNA for Forethought Life Ins Co; implementation and data validation for the NNA Bundle and BOI/JNL; AUM detail stats updates going forward; enhancement of NNA export fields with historical data load (data retention back to 2019).

End-to-end development and implementation of the FIS OPT OUT client preference.

Retiring legacy and redundant applications on CW, such as Practice Metrics and the Practice Management Dashboard.

Cloud migration to the Advisor Performance Business Warehouse using Snowflake and DBT.
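
For illustration, a minimal dbt model of the sort such a migration might produce, assuming the source table is declared in the project's sources YAML; model, source, and column names are hypothetical:

    -- models/advisor_aum_daily.sql (hypothetical dbt model)
    {{ config(materialized='incremental') }}

    select
        advisor_id,
        as_of_date,
        sum(asset_value) as total_aum
    from {{ source('practice_hub', 'advisor_assets') }}
    {% if is_incremental() %}
    -- only process rows newer than what is already in the target table
    where as_of_date > (select max(as_of_date) from {{ this }})
    {% endif %}
    group by advisor_id, as_of_date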

Ensuring the accuracy and consistency of the data used in AUM, NNA, and revenue calculations; implemented data quality checks and monitoring processes to identify and rectify issues.

Supported the Athena and Practice Hub application migration from MapR to Snowflake through HVR load.

Design, develop, and maintain ETL processes using AWS Glue/EMR to extract, transform, and load source data (S3 Parquet and CSV files into AWS Redshift).
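
A minimal sketch of the Redshift side of such a load; the bucket, table, and IAM role names are hypothetical, and the transform logic itself would live in the Glue/EMR job:

    -- Hypothetical load of curated Parquet and CSV extracts from S3 into Redshift
    COPY analytics.sales_facts
    FROM 's3://example-bucket/curated/sales_facts/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
    FORMAT AS PARQUET;

    COPY analytics.dim_customer
    FROM 's3://example-bucket/curated/dim_customer/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
    FORMAT AS CSV
    IGNOREHEADER 1;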

Proficiency in creating and formatting different report types such as crosstab, conditional, drill-down, top-N, summary, form, OLAP, and sub-reports.

Experience developing statistical, text-analytics, and data-mining solutions to various business problems, generating data visualizations, and defining and developing report structures with data definitions, report layouts, and Report Definition Language.

Environment: SQL Server 2019/2022, Visual Studio 2019/2022, Oracle 18c/19c/21c, GitHub, Octopus, TeamCity, DBT, Snowflake, Jira, Copilot, ETL, AWS (S3, Redshift, RDS, Kinesis, Lambda, EC2), Kibana, Airflow, PySpark, Azure, SSRS, TFS/Azure DevOps, CSV data extraction, data integration, data integrity, data mining, data modeling, data validation, Python, database design, Tidal, Autosys, DIS, DBMS

Newell Brands, Hoboken, NJ

BI/ETL Developer / SQL Server, SSIS Developer 11/2018 - 07/2021

Description: Used SSIS packages to address internal inefficiencies in managing the payment process structure of supply chain management, giving the business metric-driven insight to take corrective action. Provided the flexibility to extract data across disparate payment methods and perform payment and invoice matching through ETL processing.

Responsibilities:

Created new databases and objects such as tables, stored procedures, constraints, and indexes to maintain the referential integrity of data, using T-SQL on SQL Server 2016/2017.

Worked on extracting, transforming, and loading (ETL) data from Excel, flat files, and MySQL to MS SQL Server using SSIS.

Implemented data pipelines and workflows on AWS; expert in S3, AWS Glue/EMR, Redshift, and RDS.

Handled performance tuning and optimization of SSIS and MDX, with strong analytical and troubleshooting skills for quick issue resolution in large-scale, globally distributed production environments.

Made extensive use of transactions to wrap the DML statements executed in stored procedures, rolling back whenever a stored procedure errored out.
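
A minimal T-SQL sketch of that pattern; the procedure and table names are hypothetical:

    -- Hypothetical pattern: wrap the DML in a transaction, roll back on error
    CREATE PROCEDURE dbo.usp_ApplyPaymentBatch
        @BatchId INT
    AS
    BEGIN
        SET NOCOUNT ON;
        BEGIN TRY
            BEGIN TRANSACTION;

            UPDATE dbo.Invoices
            SET Status = 'PAID'
            WHERE BatchId = @BatchId;

            INSERT INTO dbo.PaymentAudit (BatchId, ProcessedAt)
            VALUES (@BatchId, SYSDATETIME());

            COMMIT TRANSACTION;
        END TRY
        BEGIN CATCH
            IF @@TRANCOUNT > 0
                ROLLBACK TRANSACTION;
            THROW;  -- surface the original error to the caller
        END CATCH
    END;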

Strong understanding of relational databases, including Microsoft SQL Server: writing SQL queries to extract and manipulate data within SSIS packages, and applying data modeling concepts to design efficient data transformations within the ETL process.

Developed unit test plans for SSIS packages, modified existing SSIS packages to meet changes specified by the users, and scheduled jobs to execute the SSIS packages that update the database on a daily/weekly/monthly basis.

Extensive use of Team Foundation Server to implement version control for MSBI (SSIS/SSRS/SSAS) projects.

Environment: ETL, Oracle, AWS S3, SQL Server 2017/2016/2014, Visual Studio 2019/2017, Python 3.7/3.8, MS Excel 2013, T-SQL, MDX, MySQL, VBScript, Windows 7, Power BI, DAX, M query, Team Foundation Server.

Alameda Health System, Oakland, CA

MS BI Developer/ETL(SSIS) Developer 03/2016 - 10/2018

Responsibilities:

Created various database objects (tables, indexes, views, stored procedures, triggers), correlated sub-queries, and dynamic SQL queries, and implemented referential integrity constraints to enforce data integrity and business rules. Extensively used T-SQL and MySQL in constructing user functions, views, indexes, user profiles, relational database models, data dictionaries, and data integrity checks.

Experience in T-SQL, MySQL, and DDL/DML; performed most SQL Server Enterprise Manager and Management Studio functionality using T-SQL scripts and batches.

Implemented complex business logic with user-defined functions and indexed views; created user-defined data types and clustered and non-clustered indexes. Experience in error handling through TRY...CATCH statements and common table expressions (CTEs).
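
For illustration, a minimal sketch of an indexed view; table and column names are hypothetical, and SQL Server requires SCHEMABINDING, COUNT_BIG(*), and a non-nullable column under SUM for the view to be indexable:

    -- Hypothetical indexed view: aggregate materialized by a unique clustered index
    CREATE VIEW dbo.vw_OrderTotalsByCustomer
    WITH SCHEMABINDING
    AS
    SELECT
        CustomerId,
        COUNT_BIG(*)     AS OrderCount,
        SUM(OrderAmount) AS TotalAmount  -- OrderAmount assumed NOT NULL
    FROM dbo.Orders
    GROUP BY CustomerId;
    GO

    CREATE UNIQUE CLUSTERED INDEX IX_vw_OrderTotalsByCustomer
        ON dbo.vw_OrderTotalsByCustomer (CustomerId);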

Excellent at creating SSIS packages that integrate data over OLE DB connections from heterogeneous sources (Excel, CSV, Oracle, flat files, text-format data) using the transformations SSIS provides, such as Data Conversion, Conditional Split, Bulk Insert, Merge, and Union All.

Extensively used Joins and sub-queries for complex queries involving multiple tables from different databases.

Expert in designing complex reports, such as reports with cascading parameters, drill-through reports, parameterized reports, and report models, using SQL Server Reporting Services (SSRS) based on client requirements.

Performance tuning and optimization of procedures and queries. Designed SSIS packages in SSIS 2014 and used its latest features for deployment and configuration.

Used .NET code to implement advanced file operations and download JSON files. Optimized the database by creating various clustered and non-clustered indexes and indexed views.

Involved in Requirements gathering, Analysis, and design of all the client requirements.

Experience in managing and automating control flow, data flow, events, and logging programmatically using the Microsoft .NET Framework for SSIS packages.

Used various SSIS transformations such as Conditional Split and Derived Column for data scrubbing and data validation checks during staging, before loading the data into the data warehouse.

Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, and geographical maps.

Environment: MS SQL Server 2016/2017, SQL Server Integration Services, SQL Server Reporting Services, Java, DTS, MS Access, MS Excel, Windows Server 2008, T-SQL, PL/SQL, .NET, VBA, Visual Studio 2015/2017

Calsoft Inc, Pune, India

SQL, SSRS Developer 09/2011 - 06/2014

Responsibilities:

Developed SQL procedures, triggers, and other database objects. Fine-tuned and checked the performance of existing objects in SQL Server and implemented transactional logic for them.

Designed and developed database objects such as tables, stored procedures, triggers, rules, defaults, and user-defined data types. Actively involved in normalization and de-normalization of the database. Wrote triggers and stored procedures to capture updated and deleted data from OLTP systems. Created packages in SSIS with error handling.

Worked with different methods of logging in SSIS.

Experience in creating complex SSIS packages using proper control-flow and data-flow elements. Created packages to load data from various sources such as flat files, Excel, and OLE DB in SSIS 2012.

Deployed packages to different servers using package configurations. Excellent report-creation skills with SQL Server Reporting Services (SSRS): created sub-reports, drill-down reports, and crosstab reports.

Created Subscriptions and data-driven subscriptions and managed security for reports on SSRS Report Manager.

Designed and optimized database solutions with Microsoft SQL Server 2014.

Major responsibilities included gathering requirements from users, writing stored procedures for reports, and developing reports using Crystal Reports.

Created reports with formatting options such as grouping, sorting, drill-down, and parameter prompts. Created workflows using SharePoint 2010 Designer.

Extensive knowledge of designing reports, scorecards, and dashboards using SQL Server Reporting Services and PerformancePoint.

Created reports using stored procedures and was involved in scheduling subscriptions for the reports. Designed and implemented stored procedures and triggers for automating tasks.

Responsible for creating rules, defaults, tables, views, clustered & non-clustered indexes, user-defined data types, and user-defined functions depending upon Business Analyst requests.

Environment: SQL Server 2012/2014, Visual Studio 2013, Python, MS Excel, T-SQL, VBScript, Windows 7, Team Foundation Server

EDUCATION:

Master of Engineering - Savitribai Phule Pune University, Pune, India

Bachelor of Engineering - Shivaji University, India


