
Senior ETL / Informatica Developer / Data Analyst

Location: Plano, TX, 75075
Salary: Market rate
Posted: September 28, 2017

PROFESSIONAL SUMMARY:

Data Analyst / ETL Informatica expert with 7+ years of total IT experience. Operates as a Data Analyst / ETL expert on a wide variety of projects and is skilled in the analysis, design and implementation of package-enabled business transformation ventures.

6+ years of focused experience in Information Technology with a strong background in database development and strong ETL skills for data warehousing using Informatica.

Extensive experience in developing ETL applications and performing statistical analysis of data on Oracle, DB2, Teradata, Netezza, MySQL, PostgreSQL and SQL Server databases.

Superior SQL skills with the ability to write and interpret complex SQL statements; also skilled in SQL optimization, ETL debugging and performance tuning.

Experience in developing online transaction processing (OLTP), operational data store (ODS) and decision support system (DSS, e.g., data warehouse) databases.

Experience in Inmon and Kimball data warehouse design and implementation methodologies

Strong familiarity with master data and metadata management and associated processes

Hands-on knowledge of enterprise repository tools, data modeling tools, data mapping tools, data profiling tools, and data and information system life cycle methodologies.

Experience with dimensional modeling and architecture, implementing proper data structures for analytical reporting from an enterprise data warehouse.

Implemented Change Data Capture (CDC) with Informatica Power Exchange.

Used Informatica PowerExchange to access VSAM files; also worked on flat files, JSON and XML files.

Well versed in data quality features like Analyst and IDD, and in transformations like Key Generator, Standardizer, Case Converter, Match and Consolidation.

Applied Address transformation for Address Validation and Standardization.

Strong in implementing data profiling and documenting data quality metrics like accuracy, completeness, duplication, validity and consistency.
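
For illustration, a minimal sketch of the kind of profiling SQL behind such metrics, assuming hypothetical CUSTOMER_STG and STATE_REF tables:
```sql
-- Hypothetical profiling queries; table and column names are illustrative only.
-- Completeness: percentage of rows with a populated email address.
SELECT ROUND(100 * COUNT(email) / COUNT(*), 2) AS email_completeness_pct
FROM   customer_stg;

-- Duplication: business keys that appear more than once.
SELECT customer_id, COUNT(*) AS dup_count
FROM   customer_stg
GROUP  BY customer_id
HAVING COUNT(*) > 1;

-- Validity: rows whose state code is not in the reference list.
SELECT COUNT(*) AS invalid_state_cnt
FROM   customer_stg c
WHERE  NOT EXISTS (SELECT 1 FROM state_ref s WHERE s.state_cd = c.state_cd);
```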

Good skills in analyzing trend charts from scorecards to determine the thresholds to be considered in further development.

Good skills in understanding and developing business rules for the standardization, cleansing and validation of data in various formats.

Very strong knowledge of Informatica Data Quality transformations like Address Validator, Parser, Labeler, Match, Exception, Association, Standardizer and other significant transformations.

Extensively worked with Informatica PowerCenter transformations such as Expression, Joiner, Sorter, Filter, Router and others as required.

Very strong knowledge of the end-to-end process of data quality requirements and their implementation.

Good experience with data warehouse concepts like dimension tables, fact tables, slowly changing dimensions, data marts and dimensional modeling schemas.

Experience in data modeling, including dimensional modeling and E-R modeling, and with OLTP and OLAP in data analysis. Very familiar with SCD Type 1 and SCD Type 2 in snowflake and star schemas.
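
As a quick illustration of the SCD Type 1 pattern (overwrite in place, no history kept), a minimal sketch against hypothetical DIM_CUSTOMER and STG_CUSTOMER tables; a Type 2 sketch appears under the Wells Fargo project below:
```sql
-- SCD Type 1 (illustrative): overwrite changed attributes, no history kept.
-- Table and column names are hypothetical.
MERGE INTO dim_customer d
USING stg_customer s
ON (d.customer_id = s.customer_id)
WHEN MATCHED THEN
  UPDATE SET d.customer_name = s.customer_name,
             d.city          = s.city
WHEN NOT MATCHED THEN
  INSERT (customer_id, customer_name, city)
  VALUES (s.customer_id, s.customer_name, s.city);
```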

Experience in extracting, transforming and loading (ETL) data from various data sources into data marts and the data warehouse using Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, Workflow Monitor and Informatica Administration Console).

Strong experience in developing sessions/tasks, worklets and workflows using Workflow Manager tools: Task Developer, Workflow Designer and Worklet Designer.

Experience in performance tuning of Informatica mappings and sessions to improve performance on large-volume projects.

Experience in debugging mappings; identified bugs in existing mappings by analyzing the data flow and evaluating transformations.

Good experience in writing UNIX shell scripts and SQL scripts for development, ETL process automation, error handling and auditing.

Experience in AWS (Amazon Web Services), S3 buckets and Redshift (the AWS cloud data warehouse).

Expertise in the Agile software development process.

Extensive experience providing IT services in the retail, insurance, healthcare, financial and banking industries.

Strong knowledge of the Hadoop ecosystem (HDFS, HBase, Scala, Hive, Pig, Flume, NoSQL, etc.) and data modeling in a Hadoop environment.

TECHNICAL SKILLS:

ETL: Informatica PowerCenter 9.5.1, 9.0, 8.1.1
Data Profiling Tools: Informatica IDQ 9.5.1, 8.6.1
ETL Scheduling Tools: Control-M, ESP
RDBMS: DB2, Oracle 11g/12c, SQL Server 2008/2012, MySQL, PostgreSQL
Data Modeling: ER (OLTP) and Dimensional (Star, Snowflake Schema)
Data Modeling Tools: Erwin 9.3/7.5
UNIX: Shell scripting
Reporting Tools: Tableau 9, Cognos 8x/9x
Defect Tracking Tools: Quality Center
Operating Systems: Windows XP/2000/9x/NT, UNIX
Source Management: Bitbucket, Visual SourceSafe
Cloud Computing: Amazon Web Services (AWS), S3, Redshift
Programming Languages: C, C++, PL/SQL
Other Tools: Notepad++, Toad, SQL Navigator, Teradata SQL Assistant, JIRA, Rally

PROFESSIONAL EXPERIENCE:

ETL Informatica Developer at Wells Fargo, Plano, TX, Nov 2015 to Present

Wells Fargo & Company is an American international banking and financial services holding company. It is the world's second-largest bank by market capitalization and the third largest bank in the U.S. by assets.

Project: Integrated Customer Data Mart

The purpose of this project is to analyze customers' historical data for loan processing. The project enabled prompt reporting solutions covering customers' credit card utilization, sales analysis and identification of risky customers. It includes understanding the existing loan processing, reporting and associated processes, concerns and systems for the implementation of a new loan processing system.

Responsibilities:

Extensively Worked with Business Users to gather, verify and validate various business requirements.

Identified various source systems, connectivity, tables to ensure data availability to start the ETL process.

Worked as a data modeler, created the data model for the warehouse and was involved in the ODS and data mart data models.

Worked as Data analyst to analyze the source systems data.

Created Design Documents for source to target mappings. Developed mappings to send files daily to AWS.

Used UNIX scripting to apply rules on the raw data within AWS.

Used Redshift within AWS.
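
For context, a minimal sketch of the Redshift COPY pattern for loading files landed in S3; the bucket, table and IAM role names are hypothetical:
```sql
-- Illustrative only: load a delimited daily extract from S3 into a Redshift staging table.
COPY stg.customer_daily
FROM 's3://example-bucket/daily/customer_20170928.csv'              -- hypothetical path
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'     -- hypothetical role
DELIMITER '|'
IGNOREHEADER 1
DATEFORMAT 'auto';
```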

Created complex mappings using Unconnected and Connected Lookup, Aggregator and Router transformations to populate target tables in an efficient manner.

Created stored procedures to use Oracle-generated sequence numbers in mappings instead of using the Informatica Sequence Generator.
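
A minimal sketch of such a wrapper, assuming a hypothetical SEQ_CUSTOMER_KEY sequence; the function is then invoked from a Stored Procedure transformation in the mapping:
```sql
-- Illustrative PL/SQL wrapper; sequence and function names are hypothetical.
CREATE OR REPLACE FUNCTION get_customer_key RETURN NUMBER IS
  v_key NUMBER;
BEGIN
  SELECT seq_customer_key.NEXTVAL INTO v_key FROM dual;
  RETURN v_key;
END get_customer_key;
/
```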

Created complex mappings and implemented Slowly Changing Dimensions (Type 1, Type 2 and Type 3) for data loads.
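
By way of illustration, the Type 2 load logic amounts to expiring the current row and inserting a new version; a minimal SQL sketch with hypothetical DIM_CUSTOMER and STG_CUSTOMER tables (in the mapping itself this is done with Lookup, Expression and Update Strategy transformations):
```sql
-- Illustrative SCD Type 2 logic; table and column names are hypothetical.
-- Step 1: close out the current version of rows whose attributes changed.
UPDATE dim_customer d
SET    d.eff_end_dt  = TRUNC(SYSDATE) - 1,
       d.current_flg = 'N'
WHERE  d.current_flg = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND   (s.customer_name <> d.customer_name OR s.city <> d.city));

-- Step 2: insert a new current version for new or changed customers.
INSERT INTO dim_customer
  (customer_key, customer_id, customer_name, city, eff_start_dt, eff_end_dt, current_flg)
SELECT seq_customer_key.NEXTVAL, s.customer_id, s.customer_name, s.city,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flg = 'Y'
                   AND    d.customer_name = s.customer_name
                   AND    d.city = s.city);
```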

Created complex Mappings to implement data cleansing on the source data.

Used Mapping Variables, Mapping Parameters and Session Parameters to increase the re-usability of the Mapping.

Created source to target mappings, edit rules and validation, transformations, and business rules. Analyzed client requirements and designed the ETL Informatica mapping.

Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.

Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.

Validated and tested the mappings using Informatica Debugger, Session Logs and Workflow Logs.

Created detailed unit test plans and performed error checking and testing of the ETL procedures using SQL queries, filtering out the missing rows into flat files at the mapping level.
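
A minimal sketch of the kind of reconciliation query used in these unit tests, with hypothetical source and target table names:
```sql
-- Illustrative only: source rows that never arrived in the target.
SELECT s.order_id
FROM   src_orders s
LEFT JOIN tgt_orders t
       ON t.order_id = s.order_id
WHERE  t.order_id IS NULL;

-- Row-count reconciliation between source and target.
SELECT (SELECT COUNT(*) FROM src_orders) AS src_cnt,
       (SELECT COUNT(*) FROM tgt_orders) AS tgt_cnt
FROM   dual;
```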

Used the UltraEdit tool and UNIX commands to create, access and maintain the session parameter files, data files and scripts on the server.

Used CUCUMBER automated test tool to automate the unit tests for Informatica ETL.

Followed Acceptance Test Driven Development (ATDD) and Test Driven Development (TDD) practices and automated unit tests for Informatica ETL.

Scheduled the ETLs using ESP scheduler.

Environment: Informatica PowerCenter 9.6.1, Oracle 11g, Cognos 10.x, AWS, Redshift, DB2, flat files, SQL, PuTTY, UltraEdit-32, shell scripting, Toad, Quest Central, UNIX scripting, Windows NT

ETL Informatica Developer at State Farm, Bloomington, IL, May 2014 to Jun 2015

State Farm Bank wants to help its customers be informed consumers. Whether a customer is considering an auto loan, a home mortgage loan, a home equity loan or a line of credit, the bank's resources prove valuable in evaluating the loan process and other financing options.

To build an application for the informed consumer, we built an ODS and a Consumer Data Mart that integrate data from various internal and external data sources. This project enabled an improvement in overall customer satisfaction and performance by providing the reporting and analysis capabilities necessary to support State Farm's growth; Tableau 9.2 was used for reporting and visualization.

Responsibilities:

Extensively Worked with Data Modeler, Data Analysts and Business Users to gather, verify and validate various business requirements.

Identified various source systems, connectivity, tables to ensure data availability to start the ETL process.

Created Design Documents for source to target mappings

Created Workflows, Tasks, database connections, FTP connections using Workflow Manager.

Developed mappings to get the SAP source data

Extensively developed various mappings using different transformations like Source Qualifier, Expression, Lookup (connected and unconnected), Update Strategy, Aggregator, Filter, Router, Joiner, etc.

Used Workflow Manager for creating, validating, testing and running the workflows and sessions and scheduling them to run at specified times.

Created pre-session and post-session shell commands for performing various operations like sending an email to the business notifying them about any new dealer branches.

Provided architecture/domain knowledge to report developers for the creation of dashboards and reports.

Performed fine-tuning of SQL overrides for performance enhancements and tuned Informatica mappings and sessions for optimum performance.
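
For illustration, a typical override pattern: pushing the join and filter into the Source Qualifier SQL so the database does the heavy lifting instead of downstream Joiner and Filter transformations (table and column names are hypothetical):
```sql
-- Illustrative Source Qualifier override; names are hypothetical.
-- Joins and filters pushed to the database reduce the rows read into the mapping.
SELECT p.policy_id,
       p.policy_type,
       c.customer_id,
       c.customer_name
FROM   policy   p
JOIN   customer c ON c.customer_id = p.customer_id
WHERE  p.policy_status = 'ACTIVE'
AND    p.last_update_dt >= TRUNC(SYSDATE) - 1;   -- incremental: yesterday onward
```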

Extensively worked with Look up Caches like Shared Cache, Persistent Cache, Static Cache and Dynamic Cache to improve the performance of the lookup transformations.

Created source definitions and flat file target definitions using Informatica.

Used UNIX commands (Vi Editor) to perform the DB2 load operations.

Created detailed unit test plans and performed error checking and testing of the ETL procedures using SQL queries, filtering out the missing rows into flat files at the mapping level.

Scheduled the ETLs using ESP scheduler.

Environment: Informatica PowerCenter 8.6.1/8.1.1, Oracle 11g, Toad, Cognos 9, Tableau 9.2, UNIX, Autosys, SQL*Loader, PowerCenter Mapping Architect for Visio, Sybase PowerDesigner, IBM DB2, flat files, SQL, PuTTY, UltraEdit-32, shell programming, Quest Central

Informatica Developer at TMX Finance, Carrollton, TX, May 2012 to Apr 2014

TMX Finance is one of the largest and fastest-growing consumer specialty finance organizations in the United States. With brands that include TitleMax, TitleBucks, InstaLoan, and TMX Credit, TMX Finance provides a diversified product offering. TMX Finance's goal is to provide excellent service delivery to a demographic without access to traditional credit resources.

To achieve this goal, the source systems used at the TMX stores are continuously integrated with the Microsoft Great Plains system so that accounting can run the required reports. In this Accounting Integrations Redesign project, the existing process was redesigned to use Informatica for ETL instead of Scribe and SSIS jobs.

The new design sources the data from the multiple point-of-sale systems that TMX uses, and Informatica ETLs were designed to stage the data in SQL Server. From this staging database, the SmartConnect tool extracts the data and pushes it to the Great Plains system. This initiative started following the Agile software process instead of the waterfall methodology.

Responsibilities:

Involved in the requirements definition and analysis in support of Data Warehousing efforts.

Worked on ETL design and development, creating the Informatica source-to-target mappings, sessions and workflows to implement the business logic.

Created Informatica mappings with PL/SQL Procedures/Functions to build business rules to load data
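
A minimal sketch of the kind of PL/SQL business-rule function called from such a mapping; the rule, thresholds and names are hypothetical:
```sql
-- Illustrative business-rule function; names and thresholds are hypothetical.
CREATE OR REPLACE FUNCTION classify_loan_risk (p_balance       IN NUMBER,
                                               p_days_past_due IN NUMBER)
  RETURN VARCHAR2
IS
BEGIN
  IF p_days_past_due > 90 OR p_balance > 50000 THEN
    RETURN 'HIGH';
  ELSIF p_days_past_due > 30 THEN
    RETURN 'MEDIUM';
  ELSE
    RETURN 'LOW';
  END IF;
END classify_loan_risk;
/
```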

Used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Update Strategy and Router.

Extensive knowledge of and experience with Informatica Data Quality (IDQ 8.6.1) for data analysis, data cleansing, data validation, data profiling and matching/removing duplicate data.

Designed and developed Informatica DQ jobs and mapplets using different transformations like Address Validator, Match, Consolidation, rules, etc. for data loads and data cleansing.

Prepared technical specifications for the development of extraction, transformation and loading of data into various stage tables.

Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.

Validated and tested the mappings using Informatica Debugger, session logs and workflow logs.

Environment: Informatica PowerCenter 8.6.1, Informatica Data Quality (IDQ 8.6.1), SQL Server, Oracle 11g, flat files, MySQL, Teradata 13, WinSCP, Notepad++, Toad, Quest Central, UNIX scripting, Windows NT.

Informatica Developer at Cummins Inc., Columbus, IN, Jan 2010 to May 2012

The EDW program provides a single logical data repository that can be used to provide data services to downstream consuming applications, and delivers the necessary depth and breadth of information to the business through the Cummins-approved data delivery and reporting platform.

Responsibilities:

Assisted in preparing designs/specifications for data extraction, transformation and loading.

Developed Informatica mappings enabling the extraction, transport and loading of the data into target tables.

Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.

Prepared reusable transformations to load data from operational data source to Data Warehouse.

Wrote complex SQL Queries involving multiple tables with joins.
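
For illustration, the shape of such a multi-table query, using hypothetical EDW table names:
```sql
-- Illustrative multi-table join with aggregation; table names are hypothetical.
SELECT d.region_name,
       p.product_line,
       SUM(f.order_amount) AS total_sales
FROM   fct_orders  f
JOIN   dim_dealer  d ON d.dealer_key  = f.dealer_key
JOIN   dim_product p ON p.product_key = f.product_key
JOIN   dim_date    t ON t.date_key    = f.order_date_key
WHERE  t.calendar_year = 2011
GROUP  BY d.region_name, p.product_line
ORDER  BY total_sales DESC;
```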

Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.

Used debugger, session logs and workflow logs to test the mapping and fixed the bugs.

Analyzed the dependencies between the jobs and scheduled them accordingly using the Work Scheduler.

Improved the performance of the mappings, sessions using various optimization techniques.

Environment: Informatica 8.1, OBIEE, Erwin, Oracle 10g, SQL Server 2008, flat files, SQL, PuTTY, UltraEdit-32, shell programming, Toad, SQL Developer, UNIX scripting, Windows NT.


