


DEVENDRA PATEL

ETL / INFORMATICA DEVELOPER

Telephone# (754) 220-7517 | Email: acu2sj@r.postjobfree.com

Data Extraction | Database Development | Data Modeling

Highly accomplished IT professional with 7 years of experience in the analysis, design, development, testing, implementation and production support of ETL, OLAP, data warehouse, client/server and web applications for industries including Healthcare, Banking and Retail. Computer literate in database management, multiple operating systems, programming languages and software applications, with up-to-date knowledge of technology changes and their business implications. Informatica administration experience includes setting up environments on Linux and Windows, user and group creation, LDAP configuration, repository creation through the Admin Console URL, and repository migrations. Keen understanding of business priorities; a genuine team player committed to managing operations and projects flawlessly while contributing to revenue-producing activities.

Areas Of Expertise

Software Applications

Programming Languages

Operating Systems

Database Design / Programming

Debug Application Programs

Database Optimization

Data Flow Diagrams

Performance Tuning

Problem Solving

Strategic Planning

Multi-Project Management

Systems Development: Waterfall / Agile / Scrum Principles

Data Modeling & Analysis

Summary

Experience in different phases of Data Warehouse Life Cycle including requirements gathering, source system analysis, logical/physical data modeling, ETL design/development, project deployment, Business reporting, production support and warranty support.

Excellent at supporting business applications in production environments.

Experience in working with Business Analysts to study and understand requirements and translate them into ETL code during the requirements analysis phase.

Experience in coordinating with and managing off-shore teams, working collectively to create, assign, resolve and close tickets.

Expertise in Business Model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules, Time Series, Cache Management.

Strong in Designing, Development (coding), Testing, Implementation and Documentation as per Industry standards.

Knowledge of data profiling, design and development, ETL testing, and ILM (Information Lifecycle Management) concepts in dynamic environments.

Good understanding of Microsoft Reporting Services (SSRS), including report authoring, report management, report delivery and report security.

Worked extensively on Oracle client/server application tools and RDBMS.

Worked on integrating data from flat files, CSV files and XML files into a common reporting and analytical Data Model.

Optimized the performance of queries with modifications in T-SQL queries, removed unnecessary columns, eliminated redundant and inconsistent data, normalized tables, established joins, created indexes and Partitions whenever necessary.
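
As a hedged illustration of this kind of tuning (table, column and index names below are hypothetical, not from the project), an index on the filter/join columns plus a column-trimmed query:

    -- Index the columns used for filtering and joining instead of scanning the full table
    CREATE INDEX ix_claims_member_date ON claims (member_id, service_date);

    -- Select only the columns the report needs rather than SELECT *
    SELECT member_id, service_date, claim_amount
    FROM claims
    WHERE service_date >= '2016-01-01';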

Expert in database skills using SQL, TOAD, PL/SQL Developer for debugging applications.

Strong knowledge of Entity-Relationship concept, Facts and dimensions tables, slowly changing dimensions and Dimensional Modeling (Star Schema and Snow Flake Schema) and creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank).
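
For illustration, a minimal star-schema sketch of the kind of structure described here (all names hypothetical): a fact table keyed to its dimension tables.

    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,   -- surrogate key
        customer_id   VARCHAR(20),           -- natural/business key
        customer_name VARCHAR(100)
    );

    CREATE TABLE dim_date (
        date_key      INTEGER PRIMARY KEY,
        calendar_date DATE
    );

    CREATE TABLE fact_sales (
        customer_key  INTEGER REFERENCES dim_customer (customer_key),
        date_key      INTEGER REFERENCES dim_date (date_key),
        sales_amount  NUMERIC(12,2)
    );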

Hands on experience in identifying and resolving performance bottlenecks.

Well experienced in database development: stored procedures/packages, functions, table creation scripts and database triggers.

Well aware of Data warehousing methodologies, data extraction, transformation and loading fundamentals.

Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges especially with large data sets.

Experience in integration of various data sources like Oracle, Teradata, DB2, Sybase, SQL server, MS access and non-relational sources like flat files, CSV files and XML files into staging area.

Professional Strengths

Enthusiastic, knowledge-hungry self-starter, eager to meet challenges and quickly assimilate the latest technologies, skills, concepts and ideas.

Experienced in database management, SQL queries, multiple operating systems, programming languages and software applications, with up-to-date knowledge of technology changes and their business implications.

Proven relationship-builder with exceptional interpersonal, communication and presentation skills.

Goal oriented individual with strong passion and quick learning ability.

Ability to work effectively and efficiently under tight deadlines, high volumes and multiple interruptions.

Technical Skills

Data warehousing

Informatica PowerCenter 9.5/8.6/8.5/8.1/7.1 (Repository Manager, Repository Server Administration Console, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer), Oracle Warehouse Builder 10.2 (OLAP, OLTP, Star & Snowflake Schema, FACT & Dimension Tables, Physical, Relational & Logical Data Modeling, Normalization, Denormalization), MS Visio, DataStage 7.x, Erwin 4.0

BI Tools

Cognos 7, Business Objects XI, OBIEE 11g

Databases

Oracle 11g/10g/9i/8i, SQL Server 2008/2005/2000, Teradata 14/13

Languages

SQL, PL/SQL, UNIX Shell Scripting, Perl, Python

Operating System

Windows Vista/NT/2000/7/8.1/10, MS-DOS, Linux, Sun Solaris

Other Tools

MS Visual SourceSafe, PVCS, Autosys, Control M, CA Unicenter, Remedy, Tivoli

DB Tools

SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, SQL Trace

Microsoft Tools

MS Office, MS FrontPage, MS Outlook, MS Project, MS Visio, SSIS, SSAS, SSRS

Development

Object-Oriented Design, Technical Documentation, Agile Development, Quality Assurance, Solution Architecture, Data Quality Write up, Prod Transition Document

Packages

SQL*Plus, TOAD 8.6, MS Office, SCM tools, job schedulers

Professional Experience

INDEPENDENT HEALTH, Buffalo, NY May 2014 – Present

Informatica Developer

Independent Health is one of the largest healthcare organizations providing health care services. The goal of the project is to analyze operational data sources, define the data warehouse schedule and develop ETL processes. Transform business requirements into technical documents and explain business requirements in technology terms to the developers. Design various mappings for extracting data from sources including flat files, Oracle, Teradata, Sybase, SQL Server and IBM DB2. Perform tuning of programs, ETL procedures and processes. Debug and troubleshoot the Informatica application. Establish test plans to verify the logic of every mapping in a session.

Key Accomplishments:

Developed data flow diagrams to create mappings and test plans; these data flow diagrams ranged from OLTP systems to staging to the data warehouse.

Designed and developed Informatica Mappings from OLTP to Stage and Stage to DWH and effectively used Informatica Transformations like Source Qualifier, Joiner, Expression, Router, Aggregator, Connected and Unconnected Lookup, Normalizer, Update Strategy etc.

Developed complex mappings, mapplets using Informatica workflow designer to integrate data from varied sources like Teradata, Oracle, SQL Server, Flat files and loaded into target.

Worked on performance tuning by identifying bottlenecks in sources, targets and mappings; enhanced performance for Informatica sessions processing large data files by using partitions.

Worked with static, persistent and dynamic caches, index cache, data cache and target-based commit intervals to improve performance at the session level.

Designed and developed complex interactive OBIEE reports, including prompts, filters, different views (charts, tables and bins), drill-down capabilities and navigation links, and designed and developed custom dashboards with customized appearance.

Extensively performed Data Masking for preserving the referential integrity of the user data.
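
A rough sketch of the idea behind referential-integrity-preserving masking (the hash-based approach and names here are illustrative assumptions, not the project's actual masking rules): a deterministic transformation maps the same source value to the same masked value in every table, so joins still line up.

    -- Deterministic masking: identical inputs always produce identical masked values
    UPDATE member_stg
    SET ssn = TO_CHAR(ORA_HASH(ssn, 999999999));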

Built several UNIX shell scripts to run PL/SQL programs and scheduled them on Control M.

Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.

Worked with SQL Override in the Source Qualifier and Lookup transformation.

Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.

Experience in using SSIS tools like Import and Export Wizard, Package Installation, and SSIS Package Designer.

Created Unix Shell Scripts for ETL jobs, session log cleanup, dynamic parameter and maintained shell scripts for data conversion.

Prepared the unit test plan and created efficient unit test documentation along with unit test cases for the developed code.

Involved in the process design documentation of the DW Dimensional Upgrades.

Coordinated with the offshore team and the client and provided daily status reports to the client.

Validated the data in source and target systems using PL/SQL queries.
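
For example, validation of this kind typically reconciles row counts and spot-checks key differences between source and target (table and column names below are hypothetical):

    -- Row-count reconciliation between source and target
    SELECT (SELECT COUNT(*) FROM src_claims) AS src_rows,
           (SELECT COUNT(*) FROM dw_claims)  AS tgt_rows
    FROM dual;

    -- Keys present in the source but missing from the target
    SELECT claim_id FROM src_claims
    MINUS
    SELECT claim_id FROM dw_claims;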

Involved in migration of mappings and sessions from development repository to production repository.

Installed and Documented the Informatica Power Center setup on multiple environments.

Environment: Informatica PowerCenter 9.5.1, PowerExchange 9.5.1, AS-400 (DB2), Oracle 11g, Teradata 14, Sybase, OBIEE 10g, SQL, PL/SQL, SQL Server 2005/2K, AIX, Windows XP, Linux, UNIX, SSIS, Sun Solaris, PuTTY, SCM, Rally, Control M.

CIGNA, Hartford, CT October 2013 – April 2014

ETL Developer

CIGNA is a leading provider of insurance and other financial services throughout the world. The primary objective of this project is to develop the Tampa Data Mart for investments such as securities and bonds using the existing DWH. The data mart is designed so that adjustments can be applied directly by users through a GUI.

Key Accomplishments:

Involved in design & development of operational data source and data marts in Oracle.

Involved in analyzing source systems and designing the processes for Extracting, Transforming and Loading the data.

Involved in Data Model Design and used best approaches to configure Physical and Logical model.

Used various transformations such as Source Qualifier, Expression, Lookup, Sequence Generator, Aggregator, Update Strategy and Joiner while migrating data from heterogeneous sources such as Oracle, SQL Server and flat files to the target database.

Modified mappings to add change data capture capability for sources where data is added rapidly.

Worked extensively on bug fixing of existing mappings, performance tuning using best-practice techniques, and bringing existing objects in line with the standards set up for the project.

Created mapping variables and parameters for incremental loading.
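
A minimal sketch of that pattern (hypothetical table and variable names): the Source Qualifier SQL override filters on a mapping variable that is advanced after each successful run, so each session picks up only new or changed rows.

    -- Source Qualifier SQL override; $$LAST_EXTRACT_DATE is a mapping variable
    -- advanced each run (e.g. with SetMaxVariable) so only new/changed rows are read
    SELECT order_id, customer_id, order_amount, last_update_ts
    FROM src_orders
    WHERE last_update_ts > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')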

Handled Slowly Changing Dimensions (Type I, Type II, Type III) based on the business requirements.
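
As an illustrative SQL sketch of the Type 2 pattern only (in the mappings this was done with Lookup/Update Strategy logic; all names below are hypothetical): expire the current row when a tracked attribute changes, then insert the new version.

    -- Expire the current dimension row when the incoming attribute differs
    UPDATE dim_customer d
    SET d.current_flag = 'N',
        d.effective_end_date = SYSDATE
    WHERE d.current_flag = 'Y'
      AND EXISTS (SELECT 1 FROM stg_customer s
                  WHERE s.customer_id = d.customer_id
                    AND s.address <> d.address);

    -- Insert the new current version for changed and brand-new customers
    INSERT INTO dim_customer
        (customer_key, customer_id, address, effective_start_date, effective_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
    FROM stg_customer s
    WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                      WHERE d.customer_id = s.customer_id
                        AND d.current_flag = 'Y');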

Reviewed source data and recommend data acquisition and transformation strategy.

Wrote SQL stored procedures to implement business logic.
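
For instance, a stored procedure of this kind (Oracle PL/SQL; the adjustment rule and names are hypothetical) might apply a user-entered adjustment before positions are published to the mart:

    CREATE OR REPLACE PROCEDURE apply_adjustment (
        p_position_id IN NUMBER,
        p_adj_amount  IN NUMBER
    ) AS
    BEGIN
        -- Apply the adjustment and stamp the row
        UPDATE dm_position
        SET market_value = market_value + p_adj_amount,
            last_adjusted = SYSDATE
        WHERE position_id = p_position_id;
        COMMIT;
    END apply_adjustment;
    /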

Involved in conceptual, logical and physical data modeling and used star schema in designing the data mart.

Used Informatica PowerCenter Workflow Manager to design sessions and Assignment, Email and Command tasks to execute mappings.

Developed and coded applications in PL/SQL.

Created mapplets to reuse the transformation logic in several mappings.

Used PowerCenter Workflow Monitor to monitor the workflows.

Optimized mappings using transformation features like aggregator, filter, joiner, expression and lookups.

Created daily and weekly workflows and scheduled to run based on business needs.

Involved in creating complex and custom reports using SSRS with drill-down features, sub-reports and charts.

Worked closely with Production Control team to schedule shell scripts, Informatica workflows and PL/SQL code in Autosys.

Involved in developing test automation scripts using Perl/Python.

Environment: Informatica PowerCenter Designer 9.1, Informatica PowerExchange 9.1, Oracle 11g, DB2 6.1, MS Visio, TOAD, Teradata 13, UNIX (SunOS), PL/SQL, SQL Developer, SSRS, Perl, Python, Autosys.

T-Mobile, Bothell, WA April 2012 – September 2013

Informatica Consultant

T-Mobile is the second largest GSM mobile service provider in the USA. It stores all of its customer information, both post-paid and pre-paid, in a data warehouse. It receives data from various legacy systems and from Oracle Applications such as AR, AP and PO of the GL module. Data from Oracle Applications is extracted, transformed and loaded into Oracle, and data from the legacy system, received from a SQL Server database, is loaded using Informatica. Responsible for resolving Informatica and Teradata issues for different departments when tickets were opened. Contributed to data warehouse development in ETL, data migration and data conversion projects. Wrote analytical queries to generate reports and designed normal and materialized views.
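
A small sketch of the materialized-view pattern used for this kind of reporting (hypothetical names): a pre-aggregated view refreshed on demand so analytical queries avoid re-scanning the detail table.

    CREATE MATERIALIZED VIEW mv_daily_usage
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT subscriber_id,
           TRUNC(call_date)  AS call_day,
           SUM(call_minutes) AS total_minutes
    FROM call_detail
    GROUP BY subscriber_id, TRUNC(call_date);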

Key Accomplishments:

Developed complex Informatica mappings using various transformations: Source Qualifier, Normalizer, Filter, Connected and Unconnected Lookup, Update Strategy, Router, Aggregator and reusable Sequence Generator.

Created Oracle Stored Procedure, Function, Cursor, Package and Triggers.

Developed complex mappings using Informatica Power Center for data loading.

Enhanced performance for Informatica session using large data files by using partitions, increasing block size, data cache size and target based commit interval.

Developed various mappings for extracting the data from different source systems using Informatica, PL/SQL stored procedures.

Developed mappings for extracting data from legacy and Oracle Applications into our data warehouse.

Extensively used Informatica to load data from MS SQL Server into the data warehouse.

Developed mappings between multiple sources such as flat files and Oracle and multiple targets.

Strong experience in using Expression, Joiner, Router, Lookup, Update strategy, Source qualifier and Aggregator Transformations.

Created Informatica mappings with PL/SQL stored procedures/functions to incorporate critical business functionality to load data.

Created client/server Perl scripts to automate routine processes to check for new incoming data.

Developed Informatica mappings for Slowly Changing Dimensions Type I & II.

Created Mapplets for implementing the incremental logic, in identifying the fiscal quarter of the transaction and in calculating various requirements of business processes.

Successfully developed drill down reports and hierarchies, created cascading values using Cognos.

Extensively involved in troubleshooting the issues in UAT.

Extensively used almost all the transformations, which includes (Sequence Generator, Expression, Filter, Router, Sorter, Rank, Aggregator, Lookup (Static and Dynamic), Update Strategy, Source Qualifier, Stored Procedure, Joiner, Normalizer and XML Source Qualifier).

Involved in creating and running Sessions and Workflows using Informatica Workflow Manager and monitoring using Workflow Monitor.

Seamlessly migrated the code from Development to Testing to UAT to Production.

Extensively worked on creating and executing formal test cases, test plans for functional, system and regression testing.

Environment: Informatica PowerCenter Designer 9.1, Informatica PowerExchange 9.1, Informatica Repository Manager, Oracle 11g/10g, DB2 6.1, Erwin 5, TOAD, SAP Version 3.1.H, Teradata 13, UNIX (SunOS), PL/SQL, SQL Developer, Cognos, Tivoli.

WELLS FARGO, Charlotte, NC August 2010 – March 2012

ETL Informatica Developer

Wells Fargo is an American multinational banking and financial services holding company with operations around the world. Wells Fargo is the fourth largest bank in the U.S. by assets and the largest bank by market capitalization, and the second largest bank in deposits, home mortgage servicing and debit cards. Developed effective queries and performed tuning and optimization. Streamlined the writing of shell scripts and scheduled them in Autosys to run daily, weekly or monthly. Scheduled the workflows and monitored databases. Used Informatica PowerConnect to extract data from third-party databases. Oversaw troubleshooting and performance tuning of information systems. Used the Informatica scheduler to run workflows on the UNIX box and monitored the results.

Key Accomplishments:

Supported MyBEA cluster architecture, with three WebLogic domains, offering high-availability web applications, Web Services and EJB services to other applications in the MyBEA suite.

Involved in restarting the application WebLogic servers and in deploying J2EE files for the front-end GUI screens from the WebLogic Admin Console.

Involved in creation of Technical, Application Interface and User Guide documentation.

Designed and developed Informatica mappings using Informatica 8.5 to extract, transform and load data into target tables in Teradata.

Work directly with non-IT business analysts throughout the deployment cycle and provide Production Validation Testing and post-implementation support and investigation of data and report issues identified by customers.

Involved in triage and outage calls, ensuring the application was always available in the production environment.

Used Informatica Repository Manager to create new users and repositories.

Performed data extraction from OLTP to OLAP for decision making using SSIS.

Used Informatica Designer for mappings and coding it using reusable mapplets.

Scheduled session tasks comprising different mappings for data conversion and extraction in order to load data into the database.

Automated redundant and repetitive tasks using different tools and objects available in SQL Server.

Involved in the performance tuning of Informatica mappings/sessions for large data files by increasing block size, data cache size, and commit interval.

Migrated data from Flat Files to Oracle database using SQL*Loader/Manual ETL.
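
SQL*Loader loads of this kind are driven by a control file; a minimal example (file, table and column names are hypothetical) that appends CSV rows into a staging table:

    -- load_customers.ctl (hypothetical control file for the sqlldr utility)
    LOAD DATA
    INFILE 'customers.csv'
    APPEND
    INTO TABLE stg_customer
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (customer_id,
     customer_name,
     open_date DATE "YYYY-MM-DD")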

Used Informatica Workflow Manager to Design and create workflows based on the dependencies.

Involved in debugging the failed mappings and developing error-handling methods.

Created multiple job aids for recovering from Informatica workflow failures.

Used parameter files with parameters and variables to assign multiple DB connections to the sources.
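
For reference, a PowerCenter parameter file of this kind is a plain-text file with a section per folder/workflow/session assigning connection parameters and variables; the names and values below are hypothetical:

    [FIN_DW.WF:wf_load_accounts.ST:s_m_load_accounts]
    $DBConnection_Source=ORA_SRC_PROD
    $DBConnection_Target=ORA_DW_PROD
    $$LAST_EXTRACT_DATE=01/01/2012 00:00:00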

Implemented effective error handling strategies.

Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL Server 2008, IBM Series (DB2), MS Access, Teradata 12, Windows XP, TOAD, Tidal, Cognos 8.4.1, SQL Developer, Linux.

PNC BANK, Philadelphia, PA April 2009 – July 2010

Junior Informatica Developer

PNC Bank is one of the leading companies in the U.S. I was the Informatica developer on my team. We developed a new data mart consisting of star-schema design features: fact tables, conformed dimensions, link dimensions (fact-less dimensions) and hierarchy tables. The main reasons for building this data mart were to store data and make it accessible to support analytic functions and features (such as Balances, Accounts, Customers, Officers, Products, Collateral, Accruals, Payments, Interest Rates, FTP, demographics, segmentation and goals), to provide stored raw data metrics (e.g. balance), derived metrics (e.g. MTD Avg Balance), daily metrics, monthly metrics and EDW missing metrics, and to provide role-based security access controls (specifying user roles, admin roles, support roles and sensitive data roles).

Key Accomplishments:

Developed ETL jobs to extract information from Enterprise Data Warehouse.

Extensively used ETL to load data from different relational databases, XML and flat files.

Used Informatica Repository Manager to create repositories and users and to grant permissions to users.

Debugged the mappings extensively, hard-coding test data IDs to test the logic instance by instance.

Performed various transformations, aggregations and ranking routines on the data to be stored in the application reporting mart.

Handled the migration process across the Development, Test and Production environments.

Implemented Type 2 slowly changing dimensions to maintain dimension history and Tuned the Mappings for Optimum Performance.

Developed PL/SQL Procedures at the database level that were used in the mappings through Stored Procedure Transformation.

Used Informatica Designer to design mappings and coded them using reusable mapplets.

Developed workflow sequences to control the execution sequence of various jobs and to email support personnel.

Involved in unit testing and documenting the jobs and work flows.

Set Standards for Naming Conventions and Best Practices for Informatica Mapping Development.

Used database objects like sequence generators and stored procedures to handle complex logical situations.

Created various UNIX shell scripts for Job automation of data loads.

Created mappings that implement error-handling logic to create error/OK flags and an error message depending on the source data and the lookup outputs.

Extensively involved in the analysis and tuning of the application code (SQL).
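
Tuning of this kind typically starts from the execution plan; a minimal Oracle example (the query is hypothetical):

    -- Capture and display the execution plan for a slow statement
    EXPLAIN PLAN FOR
    SELECT a.account_id, SUM(t.amount)
    FROM accounts a
    JOIN transactions t ON t.account_id = a.account_id
    WHERE t.posting_date >= DATE '2010-01-01'
    GROUP BY a.account_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);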

Environment: Informatica PowerCenter 8.6, Oracle 10g, Oracle Data Integrator, SQL, PL/SQL, SQL Developer, Erwin, Oracle Designer, MS Visio, Autosys, Korn Shell, Quality Center 10, Linux, UNIX.

Education & Credentials

Bachelor of Science in Information Technology, Saurashtra University, India


