


Vinitha Arathala

Senior Software Engineer

Contact: +1-469-***-**** | Email: adzeor@r.postjobfree.com

PROFESSIONAL SUMMARY

Senior Data Engineer with extensive experience across multiple Data Warehousing and Data Migration implementations. Seeking a data engineer position where I can leverage existing skills and experience while continuing to learn new ones.

TECHNICAL SKILLS

Snowflake
Informatica PowerCenter 10.5.1/10.4.1/10.2.1/9.6
Informatica Developer 10.5.1
SQL
IBM DB2
MS SQL Server
Oracle PL/SQL
SOAP & REST web services
XML, JSON
Unix shell scripting, JavaScript
Scheduling with Control-M & Autosys
ServiceNow (SNOW)
BMC Remedy
ETL Framework
Agile / Waterfall
Dimensional Modelling
Star & Snowflake Schema
C, C++, Java, .NET
HTML, XML, CSS, JavaScript
TOAD, SQL Developer, AQT
Data Mapping
ETL Pipeline
Data Validation
Data Verification
Data Migration
Production Support
Requirement Gathering
ETL and DB Design
Leadership

PROFESSIONAL EXPERIENCE

Christus Health, Irving, TX.

Data Engineer (Snowflake Developer) 12/2021-02/2023

Description: As part of the Shared Services Group, I was responsible for providing EDI solutions for Plan Domain business needs. Projects I supported during my tenure include Automation of Small Business, FDR (Federated Data Repository), and IOP; I also supported multiple applications under the Plan Domain, such as PSIL and PSA.

Responsibilities

Bulk loaded data from the external stage (AWS S3) and the internal stage into Snowflake using the COPY command.

Loaded data into Snowflake tables from the internal stage using SnowSQL.

Used the COPY, LIST, PUT, and GET commands to validate internal stage files.

Imported and exported data between the internal stage (Snowflake) and the external stage (AWS S3).
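For illustration, a minimal sketch of this staging-and-COPY pattern in SnowSQL; the stage, table, and file names are hypothetical:

    -- Upload a local file to a named internal stage (PUT runs from the SnowSQL client).
    PUT file:///tmp/members.csv @plan_int_stage AUTO_COMPRESS=TRUE;

    -- Inspect what landed in the stage before loading.
    LIST @plan_int_stage;

    -- Bulk load the staged file into the target table.
    COPY INTO plan_domain.members
      FROM @plan_int_stage/members.csv.gz
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
      ON_ERROR = 'CONTINUE';

    -- Loading from an external S3 stage uses the same COPY syntax.
    COPY INTO plan_domain.members
      FROM @plan_s3_stage/members/
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);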

Wrote complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.

Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns.
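A minimal FLATTEN query of the kind described, with hypothetical table and column names:

    -- Expand a JSON array held in a VARIANT column into one row per element.
    SELECT c.claim_id,
           f.value:code::STRING          AS procedure_code,
           f.value:amount::NUMBER(10,2)  AS line_amount
    FROM   claims c,
           LATERAL FLATTEN(INPUT => c.payload:line_items) f;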

Used Snowpipe for continuous data ingestion from the S3 bucket.
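A sketch of such a pipe, assuming a pre-defined external stage; all names are hypothetical:

    -- Snowpipe: continuously ingest new files as they arrive in the S3 stage.
    CREATE OR REPLACE PIPE plan_domain.members_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO plan_domain.members
        FROM @plan_s3_stage/members/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);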

Developed Snowflake stored procedures for executing branching and looping logic.
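For illustration, a small Snowflake Scripting procedure with branching and looping; all object names are hypothetical:

    CREATE OR REPLACE PROCEDURE load_batches(max_batches INTEGER)
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    DECLARE
      total INTEGER DEFAULT 0;
    BEGIN
      -- Branch: nothing to do for a non-positive batch count.
      IF (max_batches <= 0) THEN
        RETURN 'nothing to do';
      END IF;
      -- Loop: move each staged batch into the final table.
      FOR b IN 1 TO max_batches DO
        INSERT INTO members_final
          SELECT * FROM members_stg WHERE batch_id = :b;
        total := total + SQLROWCOUNT;
      END FOR;
      RETURN 'loaded ' || total || ' rows';
    END;
    $$;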

Created clone objects using zero-copy cloning.
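For example, a zero-copy clone used to stand up a safe copy for development or testing (hypothetical names):

    -- The clone is created instantly and shares storage with the source
    -- until either side changes.
    CREATE OR REPLACE TABLE members_dev CLONE plan_domain.members;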

Performed data validations through the Information Schema.

Performed data quality issue analysis using SnowSQL by building analytical warehouses on Snowflake.
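An illustrative Information Schema check of this kind, with a hypothetical schema name:

    -- Flag tables that loaded zero rows after a batch run.
    SELECT table_name, row_count
    FROM   information_schema.tables
    WHERE  table_schema = 'PLAN_DOMAIN'
      AND  row_count = 0;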

Experience with AWS cloud services: EC2, S3, EMR, RDS, Athena, and Glue

Cloned Production data for code modifications and testing

Perform troubleshooting analysis and resolution of critical issues

Environment: Snowflake, Informatica PowerCenter 10.5.1, SQL Server, AWS, SQL, Oracle.

Signify Health, Irving, TX 03/2019 - 08/2021.

Senior ETL Developer

Description: The Advantage system handles the enrollment process, allowing clients to enroll their employees into an HSA or FSA. Its second major function is the Claims Reimbursement Process: when customers incur an eligible medical expense, they submit a claim to be reimbursed from their account. Advantage is intended to provide improved flexibility, scalability, and easy maintenance in handling member enrollment and claims reimbursements.

Responsibilities

Involved in all SDLC phases, collecting and documenting requirements. Prepared technical design specifications for data extraction, transformation, and loading.

Extensively used Informatica PowerCenter 10.4 as the ETL tool to extract data from sources such as MS SQL Server, flat files, Oracle, and IBM DB2, and load it into the target database.

Designed and developed ETL processes using DataStage designer to load data from MS SQL Server, Flat Files (Fixed Width) and XML files to staging database and from staging to the target Data Warehouse database.

Used PowerCenter transformations, namely Expression, Aggregator, Sorter, UDT, Joiner, Lookup, Stored Procedure, Rank, and Update Strategy, in the ETL coding.

Designed and developed PowerCenter jobs with proper job sequences, dependencies, and releases using CONTROL-M.

Implemented a slowly changing dimension mechanism using the Snowflake stream object.
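A minimal sketch of stream-driven change capture feeding a dimension merge (a Type 1 overwrite is shown; all names are hypothetical):

    -- The stream records inserts/updates on the source table since the last read.
    CREATE OR REPLACE STREAM member_changes ON TABLE members_src;

    -- Consume the stream and apply the changes to the dimension.
    MERGE INTO dim_member d
    USING member_changes s
      ON d.member_id = s.member_id
    WHEN MATCHED AND s.METADATA$ACTION = 'INSERT' THEN
      UPDATE SET d.name = s.name, d.plan_code = s.plan_code
    WHEN NOT MATCHED AND s.METADATA$ACTION = 'INSERT' THEN
      INSERT (member_id, name, plan_code)
      VALUES (s.member_id, s.name, s.plan_code);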

Involved in performance tuning and optimization of PowerCenter mappings using features such as pipeline and source partitioning and caching to manage very large volumes of data.

Created UNIX Shell Scripts for pre/post session commands for automation of loads.

Used various Joins for data retrieval from multiple tables based on the requirements.

Created and used indexes on tables for better performance of complex queries.

Wrote complex SQL queries with correlated subqueries and nested loops for faster data retrieval from multiple tables.
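An illustrative correlated subquery of this kind, with hypothetical tables:

    -- For each member, return only the most recent claim.
    SELECT c.member_id, c.claim_id, c.claim_date, c.amount
    FROM   claims c
    WHERE  c.claim_date = (
             SELECT MAX(c2.claim_date)
             FROM   claims c2
             WHERE  c2.member_id = c.member_id);  -- correlation with the outer row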

Created mapplets and reused them in different mappings.

Developed and scheduled workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager, and monitored the results in Workflow Monitor.

Implemented Error Processing for Informatica Mappings and Workflows.

Used TOAD to develop and debug Oracle PL/SQL functions, procedures, and packages.
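A small Oracle PL/SQL procedure of the sort developed here; the table and column names are hypothetical:

    CREATE OR REPLACE PROCEDURE flag_high_claims(p_limit IN NUMBER) AS
    BEGIN
      -- Mark members having any claim above the given limit for review.
      UPDATE members m
         SET m.review_flag = 'Y'
       WHERE EXISTS (SELECT 1
                       FROM claims c
                      WHERE c.member_id = m.member_id
                        AND c.amount > p_limit);
      COMMIT;
    END flag_high_claims;
    /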

Involved in scheduling Power center jobs using CONTROL-M.

Involved in the production deployment process and provided hypercare support for the jobs in the production environment for three months.

Resolved all issues in a timely manner for each incident logged in ServiceNow during the hypercare period.

Environment: Informatica PowerCenter 10.4.1, TOAD, MS SQL Server (AQT), Oracle, Snowflake, WINSCP, XML files, Flat Files, Unix Shell scripting, CONTROL-M, ServiceNow, Windows.

Comerica Bank

Data Migration Lead 12/2017 - 01/2019

Description: Comerica Bank is a financial services company strategically aligned into three major business segments: the Business Bank, the Retail Bank, and Wealth Management. The bank's primary businesses include deposits, lending, credit cards, and trust and investment services. A cash back program is a rewards program offered by a company to customers who frequently make purchases.

Responsibilities

Participated in all phases including Requirement Analysis, Client Interaction, Design, Coding, Testing and Documentation.

Created a star schema with slowly changing dimensions of Type 1, Type 2, and Type 3.
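For illustration, a hypothetical Type 2 dimension of the kind such a star schema uses; each change closes the current row and opens a new one:

    CREATE TABLE dim_customer (
      customer_sk    INTEGER      NOT NULL,  -- surrogate key
      customer_id    VARCHAR(20)  NOT NULL,  -- natural (business) key
      name           VARCHAR(100),
      segment        VARCHAR(30),
      effective_from DATE         NOT NULL,
      effective_to   DATE,                   -- NULL marks the current version
      current_flag   CHAR(1)      DEFAULT 'Y'
    );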

Worked extensively on different types of transformations like Source qualifier, Expression, Aggregator, Router, Filter, Update strategy, Lookup, Sorter, Normalizer and Sequence generator.

Used mapplets for reusable logic and to simplify maintenance.

Used UNIX scripts to schedule the ETL Load.

Implemented Informatica standards and guidelines.

Designed, developed, debugged, and tested stored procedures, configuration files, and packages, implementing best practices to maintain optimal performance.

Worked closely with individuals at various levels to coordinate and prioritize multiple projects.

Worked on the production support team; troubleshot issues with the data warehouse and Informatica.

Provided On-Call Production Support for daily and weekly data loads.

Solved issues with Production databases, tuned mappings for fast loading of data.

Developed Scripts in Oracle to get data from Source to Data mart tables, and to schedule various workflows to meet the business requirements.

Used Workflow manager to create Workflows and Sessions, database connection management.

Extracted data from various sources, Transformed and Loaded into Data Warehouse Targets using Informatica.

Worked on Exception and Error handling.

Optimized query performance and session performance.

Performed object-level and folder-level ETL code migrations between the DEV, QA, and PROD repositories using Repository Manager.

Created and executed test cases for Informatica mappings, documented ETL design and test results, and created and maintained technical design documentation.

Involved in Unit testing for developed mappings.

Created Cognos reports.

Created user Conditions and Filters for various reports to specify the reporting parameters as per the requirements of the end users.

Created Merged Dimensions for Reports with Multiple Data providers.

Environment: Informatica PowerCenter 10.4.1, TOAD, MS SQL Server (AQT), WINSCP, XML files, Flat Files, Unix Shell scripting, CONTROL-M, ServiceNow, Windows.

Colruyt Group, Brussels, Belgium 11/2016 - 10/2017

Data Migration Lead, Lead ETL Developer

Description: The objective of this project was to create a data mart for the analysis of sales information in the company. This information was earlier stored in OBJECT STAR tables, and since OBJECT STAR has become obsolete, the entire data set was migrated to data marts via Informatica PowerCenter. The project involved designing and creating a data mart to fulfill the business study and analysis requirements.

Responsibilities

Created and delivered documentation such as the data migration strategy, data bridging solution, data mapping, data profiling and cleansing reports, and data validation and verification scenarios and reports.

Conducted data mapping sessions with the client, finalized transformation rules, and created ETL design documents.

Used DataStage to extract data from sources (e.g., flat files, VSAM files, DB2, Excel, and XML) and applied transformation logic to migrate legacy data from the staging area into the final DM database.

Finalized standards for ETL jobs to ensure performance and quality, and provided guidance to the ETL developers.

Tested ETL programs and performed rejection analysis by writing SQL queries in DB2.
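An illustrative rejection-analysis query of this sort, with hypothetical DB2 tables:

    -- Find staged rows that never reached the target, i.e. rejected records.
    SELECT s.record_id, s.customer_no, s.load_ts
    FROM   stg_sales s
           LEFT JOIN dm_sales t ON t.record_id = s.record_id
    WHERE  t.record_id IS NULL;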

Developed Unix scripts to automate data profiling, data validation, and similar tasks in the staging area and in the cloud.

Conducted reviews of ETL code, unit test cases, and results with the developers.

Involved in detailed technical design and provided knowledge transfer (KT) to the QA team.

Environment: Informatica PC, UNIX, DB2

Colruyt Group, Brussels, Belgium 08/2015 - 10/2016

L1 Support Analyst

Responsibilities

Involved in production support of ETL interfaces and jobs.

Raised tickets to the runtime support team to address issues with failed jobs.

Communicated with the client to understand the business impact of issues and provide status updates on tickets.

Clarified the technical implementation of functional items to the client in screen-sharing sessions.

Resolved issues with Informatica jobs and data-related issues in the lynx production and development environments.

Tuned the performance of Informatica jobs.

Verified values in Cognos reports against data in the Oracle DB by writing PL/SQL code and SQL blocks.
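A sketch of the kind of verification query used here, with hypothetical names; the totals are then compared against the Cognos report output:

    -- Recompute a report aggregate directly from the source table.
    SELECT region, SUM(amount) AS db_total
    FROM   sales_fact
    GROUP  BY region
    ORDER  BY region;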

Environment: Informatica PowerCenter, UNIX, Oracle, Erwin

EDUCATION

Bachelor’s in Computer Science and Engineering

Sri Venkateshwara College of Engineering & Technology, Chittoor, Andhra Pradesh, India.


