Umesh Garg

ETL Developer

848-***-****

acx6cm@r.postjobfree.com

SUMMARY:

Over 9 years of extensive experience in Information Technology with special emphasis on Data Warehousing and ETL processes using Talend Data Integration 6.1 (DI), Informatica Power Center 9.x and SQL Server Integration Services (SSIS), across domains including Banking and Financial Services, Energy and Utilities, and Telecom.

Excellent knowledge of OLTP and OLAP systems and of developing database schemas – Star and Snowflake schemas (Dimensions and Facts).

Excellent experience in Extraction, Transformation and Loading of data into Data Warehouses and downstream applications from heterogeneous sources using ETL tools – Informatica Power Center 9.x/8.x, Talend Open Studio, Talend Integration Suite and SSIS.

Expertise with frequently used Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput and more).

Expertise in developing complex mappings in Informatica using various transformations such as Connected and Unconnected Lookup, Router, Filter, Expression, Normalizer, SQL, Stored Procedure, Aggregator, Joiner, Union and Update Strategy.

Expertise in the MSBI stack: SQL Server Integration Services (SSIS), SSAS, SSRS and SQL Server 2005/2008/2012.

Expertise in designing and developing complex ETL packages, with deep insight into SSIS architecture, including Data Flow Task components, Control Flow elements, Containers, Connection Managers, Runtime Events and Log Providers.

Hands-on experience in several key areas of Enterprise Data Warehousing, such as Change Data Capture (CDC) and Slowly Changing Dimensions (SCD Type I and Type II).

Excellent experience working with Informatica Data Validation Option (DVO).

Strong experience in performance tuning, debugging and error handling of mappings, sources and targets, including identifying and fixing bottlenecks for better performance.

Extensive experience in creating reusable Mapplets and Worklets, and in using Assignment, Event Wait, Email and Command tasks in Workflows.

Excellent knowledge of RDBMS concepts and constructs, along with creation of database objects such as Tables, User Defined Data Types, Indexes, Stored Procedures and User Defined Functions.

Extensive experience in writing Transact-SQL (DDL/DML) queries.

Hands-on experience in SQL query performance tuning and troubleshooting performance issues.

Experience in Agile – Scrum model.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.x, Talend Open Studio, Talend Integration Suite 6.1 and SSIS 2012

Databases: SQL Server 2005/2008/2012, Sybase, MS Excel

Programming Skills: T-SQL, PL/SQL, C#, Unix Shell Scripting, PowerShell

Data Modeling Tools: Erwin

Source Control: Tortoise SVN, VSS, TFS 2012

Education and Certifications

Master of Computer Applications (2006) - Maharshi Dayanand University, Rohtak (74%)

Bachelor of Computer Applications (2003) - Punjab Technical University, Jalandhar (72%)

Microsoft .NET Framework 2.0 – Web Based Client Development certified (96.5%)

Brainbench Certifications in SQL Server, .NET Framework, C#, ASP.NET and Business Communication

Cognizant Certified Professional L1 Security Services

PROFESSIONAL EXPERIENCE:

Ecova Inc (Nagarro), Spokane, WA Mar 2015 – Present

Project Abstract:

Treasury is a system of multiple applications designed to efficiently manage and pay client bills for utilities such as electricity, water, telecom and waste. The project handles all processes related to client bills, including digitization, maintenance and financial management. Client funds are managed by handling all banking transactions related to utility bills; the system benefits clients by paying bills in advance and later recovering the amount from the client, which helps clients avoid late fee charges.

Role: ETL/Talend Developer

Responsibilities:

Participated in project Sprint Planning, Refinement and Retrospective sessions

Estimated story points based on business requirements

Coordinated with the Product Owner on a regular basis to discuss the business requirements.

Created data mapping, workflow and deployment documents

Created Talend jobs to copy files from one server to another, utilizing Talend FTP components

Created and managed Source to Target mapping documents for all Facts and Dimension tables

Migrated existing SSIS packages into Talend and designed new jobs to pull data from heterogeneous sources (XML, SQL Server, Sybase, MS Excel) using frequently used Talend components (tMap, tDie, tConvertType, tJoin, tBufferOutput, tFlowMeter, tSetGlobalVar, tHashInput, tHashOutput and many more)

Created Joblets and parent-child jobs in Talend

Designed & Developed the ETL Jobs using Talend Integration Suite by using various transformations as per the business requirements and based on ETL Mapping Specifications.

Created Implicit, local and global Context variables in the job.

Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.

Created Database objects and wrote complex T-SQL queries to extract data from SQL Server 2012.

Wrote T-SQL queries to validate data

Designed and implemented ETL for data loads from heterogeneous sources to SQL Server target databases, covering Fact tables and Slowly Changing Dimensions (SCD Type 1 and SCD Type 2); a T-SQL sketch of the Type 2 pattern follows below.
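The following is a minimal T-SQL sketch of the SCD Type 2 pattern referenced above. The table and column names (dbo.DimClient, dbo.StgClient, IsCurrent, StartDate, EndDate) are illustrative assumptions for this example, not the project's actual objects.

    -- Hypothetical staging and dimension tables; names are illustrative only.
    -- Step 1: expire the current dimension row when a tracked attribute has changed.
    UPDATE d
    SET    d.IsCurrent = 0,
           d.EndDate   = GETDATE()
    FROM   dbo.DimClient AS d
    JOIN   dbo.StgClient AS s ON s.ClientId = d.ClientId
    WHERE  d.IsCurrent = 1
      AND (s.ClientName <> d.ClientName OR s.BillingCity <> d.BillingCity);

    -- Step 2: insert a new current version for changed and brand-new clients
    -- (any staging row that no longer has a current dimension row).
    INSERT INTO dbo.DimClient (ClientId, ClientName, BillingCity, StartDate, EndDate, IsCurrent)
    SELECT s.ClientId, s.ClientName, s.BillingCity, GETDATE(), NULL, 1
    FROM   dbo.StgClient AS s
    LEFT JOIN dbo.DimClient AS d ON d.ClientId = s.ClientId AND d.IsCurrent = 1
    WHERE  d.ClientId IS NULL;

An SCD Type 1 attribute would instead be overwritten in place with a plain UPDATE, with no end-dating of the prior value.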

Environment: SQL Server, Talend, SSIS, TFS 2012, SVN, PL/SQL, T-SQL, WinSCP, Putty, XML, MS Excel

TTX (Nagarro), India May 2014 – Feb 2015

Project Abstract:

CBM stands for Component Condition Based Maintenance. CBM aims to enable TTX to identify car components that require attention due to a component's specific condition. This enables TTX to maintain current reliability levels and focus maintenance resources where and when they are needed, providing the highest value proposition to TTX.

Role: ETL Lead

Responsibilities:

As BI Team Lead, I have performed the following tasks:

Interaction with Business Analysts and Business Users to understand the existing application and new requirements

Validating technical feasibility of business requirements

Interfacing with Business Users and Technical SMEs to clarify requirements and review designs, test cases, UAT and deployment

Project milestone discussion and Customer approvals on the delivery dates

Design logical and physical data model to support CBM requirements

Dimensional Data model creation for the reporting system using Erwin 9.6

Discussion of Solutions/Approaches with client and architects. Preparation of Design Documents.

Review and sign-off from customer/stakeholders on design specifications and UAT

Document project requirements and translate them into functional and non-functional specifications for BI reports

Created Database objects and wrote complex T-SQL queries to extract data from SQL Server 2012.

Designed & developed ETL procedures, Data dictionaries, data mappings involving data analysis, data validations and data migrations.

Designed the Data Warehouse/ ETL processes using Informatica Power Center 9.6 to extract, transform and load data from SQL Server.

Created complex mappings using various transformations such as Rank, Joiner, Expression, Lookup (Connected/Unconnected), Aggregator, Filter, Update Strategy and Sequence Generator to implement the user requirements.

Implemented Type II Slowly Changing Dimensions Methodology to keep track of historical data.

Used session parameters and mapping variables/parameters, and created parameter files to allow flexible workflow runs based on changing variable values.

Created and scheduled workflows using Workflow Manager to load the data into the Target Database.

Performed Unit Testing and Integration Testing of Mappings and Workflows.

Environment: Informatica 9.x, Erwin, SQL Server 2012, T-SQL, PL/SQL

Amex, India Oct 2011 – Apr 2014

Project Abstract:

ARC-MIS is a migration project involving the migration of cron jobs (shell scripts/SAS code) to Informatica workflows. These workflows load data from heterogeneous sources (Sybase, Teradata, CSV) into a Data Warehouse on SQL Server. The ARC-MIS application provides end business users the functionality to view reports generated from the relational database and Cubes.

Role: Sr. ETL Developer

Responsibilities:

Interaction with the client to understand the existing application and new requirements

Analysis and Estimation of the requirements

Discussion of Solutions/Approaches with client and architects. Preparation of Design Documents

Used Informatica Workflow Manager to execute the workflows and Informatica Workflow Monitor to monitor the status of the workflows.

Developed standard and re-usable mappings and mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, lookup, and filter.

Configured the mappings to handle the updates to preserve the existing records using Update Strategy Transformation.

Validated the Informatica extraction, transformation and loading processes by writing SQL queries against business rules (see the T-SQL validation sketch at the end of this section).

Development of SSIS Packages and Database objects

Migrated SSAS 2005 Cubes to SSAS 2012 and did enhancements as required

Interaction with testing team for SIT and Business for UAT

Preparation of deployment documents and support during code releases

Worked on BI Roadmap to suggest value additions/improvements to client

Provide consultancy/training to other projects
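Below is a minimal T-SQL sketch of the kind of business-rule validation queries mentioned above. The schema and table names (stg.Transactions, dw.FactTransactions) and the specific rules are illustrative assumptions rather than the project's actual objects.

    -- Row-count reconciliation between the staging source and the warehouse target.
    SELECT (SELECT COUNT(*) FROM stg.Transactions)    AS SourceRows,
           (SELECT COUNT(*) FROM dw.FactTransactions) AS TargetRows;

    -- Business-rule check: flag fact rows with negative amounts or missing dates.
    SELECT TransactionId, BilledAmount, BillDate
    FROM   dw.FactTransactions
    WHERE  BilledAmount < 0 OR BillDate IS NULL;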

Environment: Informatica Power Center 8.6.1, SSIS 2012, SQL Server 2012, Sybase, Teradata, T-SQL, PL/SQL

JP Morgan Chase, India Jan 2011 – Sep 2011

JP Morgan Chase, Dallas, USA May 2010 – Dec 2010

JP Morgan Chase, India Oct 2008 – Apr 2010

Project Abstract:

Fee Billing comprises two intranet portals – FeeBilling and FeedHandler. The Fee Billing website contacts the Advantage Web database and internally contacts the Advantage database for some of the operations. FeedHandler is a job scheduling and management software application designed for use in distributed environments.

Role: ETL Developer

Responsibilities:

Team member of the Application Development team in this Project.

Involved in – creating functional specifications, writing requirement understanding document, test case writing and effort estimations.

Communicated with the Development and Business team to understand the business processes and collect the required information.

Developed SSIS packages which accepted data from MS Excel and SQL Server databases and loaded it into SQL Server.

Built SSIS packages to load data from XML into SQL Server, using different types of transformations (e.g. Lookup, Joiner, Update Strategy, Router) to load data into the SQL Server 2012 database. Wrote business logic in T-SQL queries and stored procedures (a sketch of the XML load appears at the end of this section).

Created matrix, tabular, drill-down, drill-through, sub-report, chart and multi-parameterized reports.

Developed parameter-driven, matrix and sub-reports, and integrated report hyperlink functionality to access external applications.

Created Tabular and Matrix Reports with all aggregate functions and expressions
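As a rough illustration of the XML-to-SQL Server loading mentioned above, the following T-SQL shreds an XML document into a relational table using the nodes()/value() methods. The XML layout and the dbo.Invoice table are assumptions made for the example; the project's real feed structure is not documented here.

    -- Sample XML payload (illustrative only).
    DECLARE @doc XML = N'
    <Invoices>
      <Invoice Id="101" Amount="250.00" BillDate="2010-06-15" />
      <Invoice Id="102" Amount="310.50" BillDate="2010-06-16" />
    </Invoices>';

    -- Shred each <Invoice> element into a row of the (hypothetical) dbo.Invoice table.
    INSERT INTO dbo.Invoice (InvoiceId, Amount, BillDate)
    SELECT i.value('@Id',       'INT')            AS InvoiceId,
           i.value('@Amount',   'DECIMAL(10,2)')  AS Amount,
           i.value('@BillDate', 'DATE')           AS BillDate
    FROM   @doc.nodes('/Invoices/Invoice') AS x(i);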

Environment: SSIS, SSRS, SQL Server 2008, T-SQL, PL/SQL, C#


