Sr. Informatica Developer

Location: TN, 37013
Posted: September 18, 2015


Mohamed Diiriye

Sr. Informatica Developer Mob: 817-***-****

Email: acrq4z@r.postjobfree.com

PROFESSIONAL SUMMARY:

8+ years of experience in Information Technology with a strong background in analyzing, designing, developing, testing, and implementing data warehouse solutions in domains such as Banking, Insurance, Health Care, Telecom and Wireless.

8+ years of experience working with SQL.

8+ years of data warehousing experience using Informatica PowerCenter 9.x/8.x/7.x and PowerExchange 9.x/8.x.

Expertise in database development using Oracle 11g/10g/9i/8i, SQL, PL/SQL stored procedures, functions, packages, and TOAD.

Experienced in installation, configuration, and administration of Informatica PowerCenter/PowerMart client/server.

Experienced in complete life cycle implementation of data warehouses.

Expertise in building Enterprise Data Warehouses (EDW), Operational Data Store (ODS), Data Marts, and Decision Support Systems (DSS) using Data modeling tool ERWIN and Dimensional modeling techniques (Kimball and Inmon), Star and Snowflake schema addressing Slowly Changing Dimensions (SCDs).

Expert knowledge in troubleshooting and performance tuning at various levels: source, mapping, target and session.

Experience in the concepts of building Fact Tables, Dimensional Tables, handling Slowly Changing Dimensions and Surrogate Keys.

Expertise in designing and developing complex mappings using Informatica PowerCenter transformations: Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, Sequence Generator and Slowly Changing Dimensions.

Working experience using Informatica Workflow Manager to create sessions and batches and to schedule workflows and worklets; developing complex mapplets, worklets, reusable tasks and reusable mappings; defining workflows and tasks; monitoring sessions; exporting and importing mappings and workflows; backup and recovery; and PowerExchange.

Expertise in unit testing, integration testing, system testing and data validation for developed Informatica mappings.

Data loading using UNIX Shell scripts, SQL*Loader.

Experience in dealing with various databases and file types, including Oracle 11g/10g/9i/8i, SQL Server 2008/2005, Teradata, DB2, Excel sheets, flat files and XML files.

Well experienced in error handling and troubleshooting using various log files.

Good exposure to development, testing, debugging, implementation, documentation and production support.

Experience in handling initial and incremental loads into target databases using mapping variables.

Developed effective working relationships with client teams to understand support requirements, developed tactical and strategic plans to implement technology solutions, and effectively managed client expectations.

Excellent written and verbal communication skills and analytical problem-solving skills in evaluating business and technical processes and issues; develops and implements system enhancements to facilitate overall operations.

Technical Skills:

Data Warehousing: Informatica PowerCenter 9.5/9.1/8.6/8.1/7.1, Informatica PowerExchange 9.x/8.x

Data Modeling: Ralph Kimball and Bill Inmon methodologies, star schema, snowflake schema, dimensional data modeling, fact tables, dimension tables

Databases/Files: Oracle 11g/10g/9i/8i, SQL Server 2008/2005, Teradata 14/13/V2R5, DB2, XML and flat files

Oracle Utilities: TOAD 9.x/8.x/7.x, Oracle SQL Developer, SQL*Plus, ad hoc SQL, SQL*Loader

Operating Systems: Windows 7/Vista/NT/2000/XP, UNIX/Linux

Reporting Tools: OBIEE 10g, Cognos 10.x/8.x

Modeling Tools: Erwin 9.1/7.5, ER/Studio

Scheduling Tools: Autosys, Control-M, UC4

Programming: SQL, PL/SQL, T-SQL, C, C# (.NET), HTML, ASP.NET

Scripting: UNIX shell scripting, Korn shell scripting, Perl scripting

Work Experience:

EFT Source, Nashville, TN Mar '14 to Present Role: Sr. Informatica Developer

Description: EFT Source provides turn-key card programs for financial institutions, from conceptual art to creative fulfillment services. The company continuously invests in the latest technology to deliver new and innovative products that help its clients stand out from the competition, maximize their bottom line, and offer personalized, convenient and secure services to their customers.

Responsibilities:

Extensively work with the data modelers to implement logical and physical data modeling to create an enterprise-level data warehouse.

Extensively use Informatica PowerCenter 9.5 to extract data from various sources and load it into the staging database.

Design and Develop Oracle PL/SQL Procedures and UNIX Shell Scripts for Data manipulations and Data Conversions.

Create and modify T-SQL stored procedures for data retrieval from the MS SQL Server staging area.

Work with pre- and post-session tasks, and extract data from the transaction system into the staging area.

Extensively work with Informatica tools (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, PowerExchange, Repository Server and Informatica Server) to load data from flat files and legacy data.

Extensively use transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.

Design the mappings between sources (external files and databases) to operational staging targets.

Involved in data cleansing, mapping transformation and loading activities.

Develop Informatica mappings and mapplets, and tune them for optimum performance, dependencies and batch design.

Use Informatica debugging techniques to debug the mappings, and use session log files and bad files to trace errors that occur while loading.

Implement Slowly Changing Dimension methodology for Historical data.
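
For context, a minimal SQL sketch of the Type 2 pattern referenced above; in the actual mappings this was built with Lookup/Update Strategy transformations, and the table, column and sequence names below (dim_customer, stg_customer, dim_customer_seq) are purely illustrative assumptions, not project objects.

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE dim_customer d
       SET d.eff_end_date = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Step 2: insert a fresh "current" version for changed or brand-new customers.
    INSERT INTO dim_customer
           (customer_sk, customer_id, address, status,
            eff_start_date, eff_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');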

Design mapping templates to specify the high-level approach.

Extensive hands-on work with XML import/export, deployment groups, query generation and migration using Informatica Repository Manager.

Use PowerExchange to source copybook definitions and then row-test the data from data files.

Perform Change Data Capture using PowerExchange.

Extensively work on unit testing of transformations and mappings.

Create Test cases and detail documentation for Unit Test, System, Integration Test and UAT to check the data quality.

Coordinate between Development, QA and production migration teams.

Generate reports using Business Objects for analysis.

Environment: Informatica PowerCenter 9.5, Informatica PowerExchange 9.5, Oracle 11g, Teradata 14, MS SQL Server 2012, UNIX shell scripting, OBIEE 10g, Autosys, TOAD, SQL Developer, ad hoc SQL, Windows 7, WinSCP, Linux

State of Tennessee, Nashville, TN Aug '12 to Jan '14 Role: Sr. Informatica Developer

Description: The State of Tennessee is the nation's leader in cost containment, program integrity, and coordination of benefits solutions for government-funded, commercial, and private entities. The project in particular was to integrate data from various source systems for verification and reporting purposes.

Responsibilities:

Involved in data model design, design specifications and documentation of the data warehouse using the ETL (Extraction, Transformation and Load) tool Informatica PowerCenter 9.1.

Analyzed existing database schemas and designed star schema models to support the users' reporting needs and requirements.

Upgraded Informatica PowerCenter 8.6.1 to Informatica PowerCenter 9.1.

Extensively used Informatica Power Center 9.1 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.

Involved in design and development of complex ETL mappings and stored procedures in an optimized manner.

Designed and developed a daily audit and a daily/weekly reconcile process ensuring the data quality of the data warehouse.
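
As a rough illustration of that reconcile step, a count-based source-to-target check; the table names (stg_claims, dw_claims, etl_reconcile_log) are hypothetical stand-ins, not the actual project tables.

    -- Compare staging and warehouse row counts for the load date and log the result.
    INSERT INTO etl_reconcile_log (load_date, table_name, src_count, tgt_count, status)
    SELECT TRUNC(SYSDATE),
           'CLAIMS',
           (SELECT COUNT(*) FROM stg_claims WHERE load_date = TRUNC(SYSDATE)),
           (SELECT COUNT(*) FROM dw_claims  WHERE load_date = TRUNC(SYSDATE)),
           CASE WHEN (SELECT COUNT(*) FROM stg_claims WHERE load_date = TRUNC(SYSDATE)) =
                     (SELECT COUNT(*) FROM dw_claims  WHERE load_date = TRUNC(SYSDATE))
                THEN 'MATCH' ELSE 'MISMATCH' END
      FROM dual;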

Found bottlenecks in Informatica mappings, optimized the mappings for best performance, and tuned the SQL queries as well.

Implemented partitioning and bulk loads for loading large volumes of data.

Involved in loading the data from Sources to RST (Raw Staging Tables) using Transformation and Cleansing Logic using Informatica.

Used Informatica debugging techniques to debug the mappings, and used session log files and bad files to trace errors that occurred while loading.

Based on the requirements, used various transformations like Source Qualifier, Normalizer, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner, Stored procedure transformations in the mapping.

Developed mapplets and worklets for reusability.

Implemented daily error tracking and correction process using Informatica and Tidal.

Involved in performance tuning of mappings, transformations and workflow sessions to optimize session performance.

Developed Informatica SCD Type I and Type II mappings. Extensively used almost all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, mapplets and others.

Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.

Created stored procedures and packages in Oracle PL/SQL to create and update several tables, such as the order processing information table and audit log tables.
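
A simplified sketch of the audit-logging piece described above; the procedure, table and sequence names (log_load_audit, etl_audit_log, etl_audit_seq) are illustrative assumptions rather than the actual project objects.

    -- Writes one audit row per load run with basic row counts and status.
    CREATE OR REPLACE PROCEDURE log_load_audit (
        p_job_name    IN VARCHAR2,
        p_rows_read   IN NUMBER,
        p_rows_loaded IN NUMBER,
        p_status      IN VARCHAR2
    ) AS
    BEGIN
        INSERT INTO etl_audit_log (audit_id, job_name, rows_read, rows_loaded,
                                   status, run_timestamp)
        VALUES (etl_audit_seq.NEXTVAL, p_job_name, p_rows_read, p_rows_loaded,
                p_status, SYSTIMESTAMP);
        COMMIT;
    END log_load_audit;
    /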

Worked on migration of the data warehouse between different versions.

Used the pmcmd command to start, stop and ping the Informatica server from UNIX, and created UNIX shell and Perl scripts to automate the process.

Extensively used Cognos 8.4 Framework Manager to create reports related to corporate business planning.

Extensive hands on Schema design, XML import/export, Scheduling, System management using Informatica.

Creating Test cases and detailed documentation for Unit Test, System, Integration Test and UAT to check the data quality.

Environment: Informatica PowerCenter 9.1, Oracle 11g, Informatica PowerExchange 9.1, Web Services, MQ Series, TOAD 10.6 for Oracle, PL/SQL, ER/Studio, Autosys, DB2, XML, Flat Files, Windows 7, Linux, Cognos 8.4.

McAfee Inc., Plano, TX Feb '11 to Jul '12 Role: Sr. Informatica Developer

Description: McAfee is the world's largest dedicated security technology company. Delivering proactive and proven solutions and services that help secure systems and networks around the world, it protects consumers and businesses of all sizes from the latest malware and emerging online threats. Backed by an award-winning research team, McAfee security technologies use a unique, predictive capability powered by McAfee Global Threat Intelligence, enabling home users and businesses to stay one step ahead of online threats. The project was executed in phases, building up from extraction and computation logic to dimension models and populating data using Informatica.

Responsibilities:

Studied and understood the warehouses and sources, functionally analyzed the application domains, participated in knowledge transfers from dependent teams to understand the business activities and application programs, and documented the understanding for internal team reference.

Interacted with functional/end users to gather requirements for the core reporting system, understand the features users expected from the ETL and reporting system, and successfully implement the business logic.

Studied the detailed requirements of the system's end users and their expectations of the applications.

Involved in data analysis for source and target systems, with a good understanding of data warehousing concepts: staging tables, dimensions, facts, star schema and snowflake schema.

Business process re-engineering to optimize the IT resource utilization.

Integration of various data sources like Oracle, SQL Server, Fixed Width and Delimited Flat Files, DB2, COBOL files & XML Files.

Identified the flow of information, analyzed the existing systems, evaluated alternatives and chose the "most appropriate" alternative.

Understood the components of a data quality plan and made informed choices between source data cleansing and target data cleansing.

Transformed data from various sources, such as Excel and text files, into the reporting database to build the analytical reporting system.

Initiated data modeling sessions to design and build or extend appropriate data mart models to support the applications' reporting needs.

Performed Change Data Capture using PowerExchange.

Used features like email notifications, scripts and variables for ETL process using Informatica Power Center.

Involved in data extraction from Oracle and flat files using SQL*Loader; designed and developed mappings using Informatica PowerCenter.

Developed Slowly Changing Dimension (SCD) Type 2 mappings for loading data into dimensions and facts.

Involved in data extraction from Oracle, flat files and XML files using SQL*Loader and freehand SQL.

Used TOAD to increase user productivity and application code quality while providing an interactive environment to support the user experience.

Created the Test cases and Captured Unit test Results.

Extensively used ETL to load data from flat files, both fixed-width and delimited, as well as from the relational database, Oracle 9i/10g.

Imported metadata from different sources such as relational databases, XML sources and Impromptu catalogs into Framework Manager.

Environment: Informatica PowerCenter 8.6, Oracle 11g, PL/SQL, MS SQL Server 2008, UNIX shell scripting, IBM DB2, Business Objects XI R3, ESP scheduler, TOAD, SQL Developer, Hummingbird, PowerExchange 8.6, Windows XP, WinSCP.

Edward Jones Investments, St. Louis, MO Jan '10 to Feb '11 Role: ETL Developer

Description: Edward Jones is a group of investment and financial services companies. The firm focuses solely on individual investors and small-business owners. It is a limited partnership owned only by its employees and retired employees and is not publicly traded. The projects covered various elements of the investments business; the task in particular was to integrate data from various source systems for reporting purposes.

Responsibilities:

Involved in understanding the business requirements and translating them into technical solutions.

Worked on preparing design documents and interacted with the data modelers to understand the data model and design.

Created new mappings and updated old mappings according to changes in business logic.

Involved in migrating project from UAT to Production.

Set up the local Informatica environment on the client machines, which included connectivity and access to the data sources and taking the necessary steps to set up the relational connectivity variables in the Workflow Manager.

Created XML targets based on various non XML sources.

Generated sequence numbers using Informatica logic without using the sequence generator.
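
The usual trick here is to seed a counter from the target's current maximum key and increment it per row (for example with a variable port in an Expression transformation); a rough SQL equivalent of the idea, with hypothetical table and column names:

    -- Generate surrogate keys without a Sequence Generator: offset a per-row
    -- counter by the target's current maximum key.
    INSERT INTO dim_account (account_sk, account_id, account_name)
    SELECT NVL((SELECT MAX(account_sk) FROM dim_account), 0)
             + ROW_NUMBER() OVER (ORDER BY s.account_id),
           s.account_id,
           s.account_name
      FROM stg_account s;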

Used SQL override to perform certain tasks essential for the business.

Worked with flat files, XML files and SQL Server tables as targets.

Implemented XML web services using C#.NET and ASP.NET.

Implemented Performance tuning in Mappings and Sources.

Used mapplets and reusable transformations to avoid redundant transformation logic and to promote modularity.

Created and Modified PL/SQL, T-SQL stored procedures for data retrieval.

Defined Target Load Order Plan for loading data into Target Tables.

Used mapping variables to achieve certain goals like creating file names dynamically.

Developed mappings to support incremental loading, using change data capture (CDC) tables as source tables for the incremental loads.
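
A minimal sketch of that incremental extract, assuming a hypothetical CDC source table (cdc_orders) and a control table (etl_control) that stores the previous high-water mark; in practice the mappings drove this through Informatica rather than hand-written SQL.

    -- Pull only rows changed since the last successful run.
    SELECT c.order_id,
           c.order_status,
           c.last_update_ts
      FROM cdc_orders c
     WHERE c.last_update_ts >
           (SELECT NVL(MAX(last_extract_ts), DATE '1900-01-01')
              FROM etl_control
             WHERE mapping_name = 'M_LOAD_ORDERS');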

Responsibilities included designing and developing complex Informatica mappings including Type-II slowly changing dimensions.

Unit testing of mappings and testing of various conditions.

Developed Unix Shell scripts for customizing the delivery of Flat files to the customer.

Involved in the documentation of the ETL process with information about the various mappings, the order of execution for them and the dependencies.

Environment: Informatica Power center 8.6, Oracle 10g, TOAD, SQL, PL/SQL, T-SQL, UNIX Shell Scripting, Autosys, Windows XP, SQL Server 2005 and XML.

Nokia, Irving, TX Oct '08 to Dec '09 Role: Informatica Developer

Description: Nokia is one of the major wireless companies in the U.S.A. and the foremost provider of GSM phones in the U.S. market. The company has huge volumes of data coming from different sources, generated from multiple cross-platform applications. The project involved design and development of a data warehouse for the sales department.

Responsibilities:

Actively involved in gathering requirements and acquiring application knowledge from Business Managers & Application Owners.

Prepared the High-level Design document to provide an overview of the technical design specification system required for Application Enhancements.

Involved in designing the process flow for extracting the data across various systems interacting.

Developed ETL routines using Informatica Power Center and created mappings involving transformations like Lookup, Aggregator, Ranking, Expressions, Mapplets, connected and unconnected stored procedures, SQL overrides usage in Lookups and source filter usage in Source qualifiers and data flow management into multiple targets using Routers.

Implemented dimension model (logical and physical data modeling) in the existing architecture using Erwin.

Designed the data model structure and E-R modeling with all the related entities and the relationships among the entities, based on the rules provided by the business manager, using Erwin.

Used workflow manager for session management, database connection management and scheduled the jobs to run in the batch process.

Developed Type 2 Slowly Changing Dimension mappings.

Involved in analyzing bugs and the performance of PL/SQL queries, and provided solutions to improve them.

Wrote and used UNIX shell scripts extensively for scheduling and pre-/post-session management.

Involved in the performance tuning process by identifying and optimizing source, target, mapping and session bottlenecks.

Involved in unit testing and system testing to check whether the data loaded into the target, extracted from different source systems according to the user requirements, was accurate.

Prepared the Standard Operating Procedure (Knowledge Transfer) document, which provides necessary information, required for the Maintenance and Operation of the application.

Provided data loading, monitoring, system support and general trouble shooting necessary for all the workflows involved in the application during its production support phase.

Environment: Informatica PowerCenter 8.1, Oracle 10g, PL/SQL, SQL Server 2005, T-SQL, SQL*Loader, Erwin 4.0, SQL*Loader 9i, Windows 2008, Perl & Korn scripts, SAP R3, Flat Files, TOAD, Solaris 10, Autosys, ODBC, Power Analyzer.

HealthCare USA, St. Louis, MO Mar '07 to Sep '08

Role: ETL Developer

Description: Samsung's emphasis on product innovation and R&D has given the company a competitive edge in the marketplace. The Samsung India Software operations work on major projects for Samsung Electronics in the areas of telecom, wireless terminals and infrastructure, networking and application software. The projects covered various elements of financial services; the task in particular was to integrate data from various source systems for maintaining a data warehouse and for reporting purposes.

Responsibilities:

Designed the mappings between sources (external files and databases) to operational staging targets.

Experience with high volume datasets from sources like DB2, Oracle and Flat Files.

Loaded data from various sources using different transformations like Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookups, Filter, Router, Expression, Rank, Union, Update Strategy and Sequence Generator.

Experience in writing PL/SQL scripts, Stored Procedures and functions and debugging them.

Responsible for migration of stored procedures into Informatica mappings to improve performance.

Involved in Performance Tuning of application by identifying bottlenecks in SQL, thus providing inputs to the application programmer, thereby correcting and implementing the right components.
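
A typical Oracle way to examine such a bottleneck is EXPLAIN PLAN with DBMS_XPLAN; the query below is only an illustrative example, not one of the project's actual statements.

    -- Capture and display the execution plan for a suspect query while tuning.
    EXPLAIN PLAN FOR
    SELECT d.customer_id, SUM(f.amount)
      FROM fact_payments f
      JOIN dim_customer d ON d.customer_sk = f.customer_sk
     GROUP BY d.customer_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);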

Created Session, Email and Workflow tasks to execute the mappings. Used Workflow Monitor to monitor the jobs, reviewed error logs generated for each session, and rectified any cause of failure.

Experience in ETL testing; created unit test plans and integration test plans to check whether the data loaded into the target, extracted from different source systems according to the user requirements, was accurate.

Set up the local Informatica environment on the client machines, which included connectivity and access to the data sources and taking the necessary steps to set up the relational connectivity variables in the Workflow Manager.

Used SQL override to perform certain tasks essential for the business.

Used mapplets and reusable transformations to avoid redundant transformation logic and to promote modularity.

Defined Target Load Order Plan for loading data into Target Tables.

Involved in the documentation of the ETL process with information about the various mappings, the order of execution for them and the dependencies.

Environment: Informatica PowerCenter 7.1, Oracle 10g, TOAD, Windows XP, PL/SQL, MS Excel, IBM UDB DB2, Erwin, UNIX, Sun Solaris.


