Data Manager

Location: Florida
Salary: market
Posted: June 05, 2016


Imran Khan Mohammed E-mail: acu3o9@r.postjobfree.com

Phone: 515-***-****

EXPERIENCE SUMMARY:

7+ years of experience in the IT industry involving software analysis, design, development, integration, maintenance, reporting, coding, bug fixing, creating specifications, production implementation, and testing.

Extensively used Source Qualifier, Connected and Unconnected Lookup, Normalizer, Router, Filter, Expression, Aggregator, Stored Procedure, Sequence Generator, Sorter, Joiner, Update Strategy, Union Transformations, Mapplets.

Experience in Mappings, Mapplets, Worklets, Reusable Transformations, Sessions/Tasks, Workflows and Batch Processes in Informatica Server.

Extracted data from multiple operational sources and loaded the staging area, Data Warehouse, and Data Marts using SCD (Type 1/Type 2/Type 3) loads.

Experience in Data warehousing, Data Extraction, Transformation and loading (ETL) data from various sources like Oracle, SQL Server, Microsoft Access, Microsoft Excel and Flat files into Data Warehouse and Data Marts using Informatica Power Center 9.0/8.6.0/8.1.1/7.x/6.x.

Experience in monitoring and scheduling of jobs using UNIX (Korn & Bourne shell) scripting.

Experience with high volume datasets from various sources like Oracle, Flat files, SQL Server and XML.

Involved in the migration process across Development, Test, and Production environments.

Extensive database experience using Oracle 11g/10g/9i, Sybase, MS SQL Server 2008/2005, MySQL, SQL, PL/SQL, SQL*Plus, Teradata.

Experienced in designing ETL procedures and strategies to extract data from heterogeneous source systems like Oracle 11g/10g, SQL Server 2008/2005, DB2 10, Flat files, XML, SAP R/3 etc.

Worked with Full Software Development Life Cycle (SDLC) involving Application Development and ETL/OLAP Processes.

Extensive knowledge in architecture design of Extract, Transform, Load environment using Informatica Power Mart and Power Center.

Worked extensively on Error Handling, Performance Analysis and Performance Tuning of Informatica ETL Components and Teradata Utilities, UNIX Scripts, SQL Scripts etc.

Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.

An expert in the ETL Tool Informatica which includes components like Power Center, Power Exchange, Power Connect, Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server Administration Console, IDE – Informatica Data Explorer, IDQ - Informatica Data Quality.

Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.

Experience in using DataStage utilities like pushdown optimization, CDC techniques, and partitioning; implemented Slowly Changing Dimensions Type 1 and Type 2 methodology for accessing the full history of accounts and transaction information.

Populated or refreshed Teradata tables using the FastLoad, MultiLoad, and FastExport utilities for user acceptance testing and for loading history data into Teradata (a FastLoad sketch follows below).
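For illustration, a minimal KSH sketch of a FastLoad job of the kind described above, loading a pipe-delimited history extract into an empty Teradata staging table. The TDPID, credentials, and all database, table, column, and file names are hypothetical.

```sh
#!/bin/ksh
# Hypothetical FastLoad job: bulk-load a pipe-delimited history extract
# into an empty Teradata staging table. Credentials come from environment
# variables; every object and file name here is a placeholder.
fastload <<EOF
LOGON tdprod/${TD_USER},${TD_PASS};

DATABASE stg;

BEGIN LOADING stg.account_hist
      ERRORFILES stg.account_hist_e1, stg.account_hist_e2;

SET RECORD VARTEXT "|";

DEFINE acct_id  (VARCHAR(18)),
       open_dt  (VARCHAR(10)),
       balance  (VARCHAR(20))
FILE = /data/history/account_hist.dat;

INSERT INTO stg.account_hist (acct_id, open_dt, balance)
VALUES (:acct_id, :open_dt, :balance);

END LOADING;
LOGOFF;
EOF
```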

Involved in developing strategies for the Extraction, Transformation, and Loading (ETL) mechanism using the DataStage tool.

Expertise in gathering, analyzing and documenting business requirements, functional requirements, and data specifications for Business Objects Universes and Reports.

Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.

Worked on complex queries to map the data as per the requirements.

Strong data modeling experience in ODS and Dimensional Data Modeling methodologies like Star Schema and Snowflake Schema. Designed and developed OLAP models consisting of multi-dimensional cubes and drill-through functionality for data analysis.

Proven record of success in the design, development, and implementation of software applications using object-oriented technology.

Good exposure to Mainframe Systems and knowledge in handling COBOL files.

Well versed in writing UNIX shell scripting.

Strong decision-making and interpersonal skills with result-oriented dedication towards goals.

TECHNICAL SKILLS:

Data Warehousing

Informatica Power Center 9.x/8.x/7.x, Repository Admin Console, Repository Manager, Designer, Workflow Manager, Workflow Monitor; Teradata SQL, Teradata (TD) Utilities (BTEQ, FastLoad, FastExport, MultiLoad).

Databases

Teradata 14/13/V12/V2R6/V2R5 (FastLoad, MultiLoad, TPump, BTEQ), Oracle 11g/10g/9i/8i/7.3, Sybase 12.0/11.x/10, DB2 UDB 8.0/7.0/6.0, MS SQL Server 2008/2005/2000/7.0/6.5/6.0, MS Access 2003/2000/97.

BI Tools

OBIEE 10.x, Business Objects.

Data Modeling

ERWIN 4.x/3.x, Ralph Kimball Methodology, Bill Inmon Methodology, Star Schema, Snowflake Schema, Extended Star Schema, Physical and Logical Modeling, Dimensional Data Modeling, Fact Tables, Dimension Tables, Normalization, Denormalization.

Programming

C, Visual Basic, UNIX Shell Scripting, SQL, T-SQL, PL/SQL, HTML, XML.

Environment

Windows XP/2000/98, Windows NT, UNIX (Sun Solaris 10, HP-UX, AIX), Linux, MS-DOS.

Other Tools

SQL*Plus, Toad, SQL Navigator, PuTTY, MS Office, MS Visio, Tidal Scheduler, JIRA, ServiceNow, Blue Zone 6.1.

Professional Experience:

Client: John Deere, Moline, IL Aug 2015 – Present

Role: Lead ETL Informatica Developer

Responsibilities:

Collaborated with Business Analysts for gathering the requirements and the Data Dictionaries/Mapping documents and designing of End to End solution for the project.

Worked on Informatica Power Center for extraction, transformation, and loading (ETL) of data from heterogeneous source systems and flat files, including fixed-length as well as delimited files.

Developed complex mappings, including SCD Type 1, Type 2, and Type 3 mappings, in Informatica to load data from various sources, using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, and SQL. Created complex mapplets for reuse.
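As an illustration of the Type 2 pattern such mappings implement, here is a minimal sketch in plain SQL driven from KSH through the DB2 command line processor (this project's target was DB2). The database, schema, table, and column names are all hypothetical, and the single tracked attribute is simplified for brevity.

```sh
#!/bin/ksh
# Hypothetical SCD Type 2 "expire and insert" pattern, expressed as SQL
# run via the DB2 CLP. Assumes an implicit connection as the current user.
cat > scd2_customer.sql <<'SQL'
CONNECT TO DWHDB;

-- Step 1: expire the current row for customers whose tracked attribute changed
UPDATE dw.customer_dim d
SET    current_flag = 'N',
       effective_end_dt = CURRENT DATE
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg.customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address <> d.address);

-- Step 2: insert a fresh current row for changed and brand-new customers.
-- Changed customers no longer have a 'Y' row after step 1, so the anti-join
-- picks them up along with genuinely new customers.
INSERT INTO dw.customer_dim
       (customer_id, address, effective_start_dt, effective_end_dt, current_flag)
SELECT s.customer_id, s.address, CURRENT DATE, DATE('9999-12-31'), 'Y'
FROM   stg.customer s
LEFT JOIN dw.customer_dim d
       ON d.customer_id = s.customer_id
      AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL;

CONNECT RESET;
SQL

db2 -tvf scd2_customer.sql
```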

Created complex workflows, with multiple sessions, worklets with consecutive or concurrent sessions and other tasks like Timer, Event Raise, Event Wait, Decisions and Email to transport the data to data warehouse tables using Informatica Workflow Manager.

Involved in unit testing, system testing, integration testing, and User Acceptance Testing to verify that the data extracted from different source systems was loading into the target according to the business requirements.

Modified all the existing Informatica Mappings by applying performance tuning techniques for better performance.

Created reusable email tasks in workflows to send post session and suspension emails as per user requirements.

Created and maintained folders in the repository manager to logically organize all the metadata in the repository including mappings, schemas and sessions.

Documented implementation steps to move the code from Test to Pilot and to Production. Worked with the DBA team on creation of scratch tables and fact tables and on reviewing SQL queries used in Source Qualifiers and Lookups.

Used the Tivoli scheduler for scheduling workflows; worked in an Agile environment, attending daily Scrum, bi-weekly Sprint, and Backlog Grooming meetings.

Extensively involved in documenting the ETL design document covering test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit and system testing; expected results; test data preparation and loading; and error handling and analysis.

Excellent Written, Communications and Analytical skills with ability to perform independently as well as in a team.

Led teams through design, development, unit testing, and deployment.

Environment: Informatica Power Center 9.5.1, Informatica Power Exchange, DB2, Flat Files, Windows, MS Outlook, Tivoli Scheduler, JIRA, COBOL Files, Word, Excel, and PowerPoint.

Client: eBay, PA May 2014 – July 2015

Role: Sr. Informatica ETL Developer

Responsibilities:

Developed and thoroughly tested Informatica mappings and workflows per the technical and functional specs provided, using Informatica 9.6.1.

Proficient in understanding business processes and requirements, translating them into technical requirements, and assisting developers with design work and analysis.

Involved in requirements analysis and played an architect role with business users in designing tables for the new warehouse.

Led the performance tuning of Informatica mappings, sessions, and workflows. Developed unit test cases and performed unit testing for all developed mappings.

Used KSH scripts to run Informatica workflows, passing parameters at runtime (see the pmcmd sketch below).
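A minimal sketch of such a KSH wrapper. The integration service, domain, folder, workflow, and parameter file names are hypothetical; the pmcmd startworkflow options shown are standard.

```sh
#!/bin/ksh
# Hypothetical wrapper: start an Informatica workflow via pmcmd with a
# run-specific parameter file and propagate its return code. Credentials
# come from environment variables.
PARAM_FILE=/infa/params/wf_daily_sales_$(date +%Y%m%d).par

pmcmd startworkflow \
    -sv INT_SVC_DEV -d Domain_Dev \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f SALES_DM \
    -paramfile "$PARAM_FILE" \
    -wait wf_daily_sales
rc=$?

if [ $rc -ne 0 ]; then
    echo "wf_daily_sales failed with return code $rc" >&2
    exit $rc
fi
```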

Strong in data warehousing concepts: fact tables, dimension tables, and Star and Snowflake schema methodologies.

Reviewed detailed ETL design documents for mapping specifications and DDL documents for table creation, key constraints, and data types.

Extracted data from various source systems such as flat files, Teradata staging tables, and reporting tables.

Experience working with business users, App DBAs, ETL admins, and Production Support.

Participated in daily Scrum meetings and weekly key meetings.

Designed mappings per the business requirements using transformations such as Source Qualifier, Expression, Aggregator, Lookup, Filter, Sequence Generator, Router, Union, Joiner, and Update Strategy.

Coordinated System/Integration/UAT testing with other teams involved in the project and reviewed the master test strategy.

Set up ODBC, relational, native, and FTP connections for Sybase, Teradata, and flat files.

Involved in performance tuning of workflows and mappings; partitioned mappings for optimal performance.

Extensively used parameter files to standardize the ETL and make it generic (an example layout is sketched below).
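A hedged sketch of what generating such a parameter file from KSH might look like. The folder, workflow, session, connection, and variable names are invented for illustration; the section-header format [Folder.WF:workflow.ST:session] is standard Informatica.

```sh
#!/bin/ksh
# Hypothetical generation of an Informatica parameter file. The \$ escapes
# keep the literal $ signs that Informatica connection variables ($...) and
# mapping parameters ($$...) require, while $(date ...) still expands.
PARAM_FILE=/infa/params/wf_daily_sales_$(date +%Y%m%d).par

cat > "$PARAM_FILE" <<EOF
[SALES_DM.WF:wf_daily_sales.ST:s_m_load_sales_fact]
\$DBConnection_Src=CONN_TD_STG
\$DBConnection_Tgt=CONN_TD_DW
\$\$LOAD_DATE=$(date +%Y-%m-%d)
\$\$SRC_FILE_DIR=/data/inbound/sales
EOF
```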

Hands-on experience with PL/SQL programming: triggers, stored procedures, functions, and packages. Strong understanding of Oracle architecture and of the design and implementation of slowly changing dimensions.

Developed UNIX shell scripts to automate the data warehouse loading.

Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data.

Environment: Informatica 9.6.1, Sybase, Teradata, Oracle SQL Developer, UNIX Shell Scripting, Flat Files, COBOL Files, Windows, MS Outlook, SQL Assistant, Control-M Scheduler, Putty.

Client: Cigna Health Care, Bloomfield, CT June 2013 – April 2014

Role: Teradata/Informatica ETL Developer

Responsibilities:

Responsible for creating workflows and worklets. Created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.

Performed extensive testing, wrote SQL queries to verify the loading of the data, performed unit testing at various levels of the ETL, and documented the results.

Used Informatica debugging techniques to debug the mappings, and used session log files and bad files to trace errors that occurred while loading.

Developed Informatica Mappings, Mapplets and Transformations to load data from different relational sources like Oracle, Teradata, DB2, and flat files sources into the data mart.

Developed Informatica mappings to load data from sources such as sequential files into target Teradata tables and sequential files.

Extensively worked on designing business views and writing SQL queries in Teradata catering to specific business requirements to generate data.

Handled documentation: preparing the migration checklist, fixing errors and documenting the changes made, and recording testing results.

Worked with the QA team to create test cases and validated the source and target tables.

Used CA ESP for scheduling the jobs in UAT and Production environment.

Used OBIEE for preparing reports and analysis.

As this was a single-resource project, took complete ownership of the code and the requirements, from preparing the TDD to code migration.

Worked on moving the code from DEV to QA to PROD and performed complete Unit Testing, Integration Testing, and User Acceptance Testing across the three environments while moving the code.

Environment: Informatica 9.1, Teradata, Toad, Oracle SQL Developer, UNIX Shell Scripting, Windows, MS Outlook, SQL Assistant, CA ESP, OBIEE.

Client: General Electric (GE), Cincinnati, OH Jan 2011 – May 2013

Role: Teradata/ETL Developer

Responsibilities:

Performed physical and logical design of the data warehouse dimensional model using ERWIN.

Involved in business requirements, technical requirements, high-level design, and detailed design process.

Worked on loading data from several flat file sources to staging using Teradata MLOAD and FLOAD.

Developed Parallel Routines and Custom Built-Operators to meet the business logic which was otherwise not possible with available active/passive stages.

Developed overriding SQL for DataStage jobs.

Performance tuning and optimization of database configuration and application SQL.

Performed unit and system test for the modified code and loaded shadow data marts for testing prior to production implementation.

Worked on UNIX Shell Scripting to schedule the loading process using Teradata utilities.

Involved in centralized data management for important entities: a single portal for multiple users across multiple organizations, enabling consistent data flow throughout the enterprise, with data validation and error checks to ensure input data is clean.

Loaded XML data into Teradata using XML import feature.

Acted as the lead developer for all the ETL jobs, reading data from the vendors and affiliates, and finally loading dimension, fact and other aggregate tables.

Worked with the users and testing teams to implement the business logic as expected.

Wrote several Teradata BTEQ scripts to implement the business logic (a sketch follows below).
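A minimal KSH sketch of what one such BTEQ script might look like. The logon string, database, table, and the business rule itself are placeholders; the .IF ERRORCODE check is the standard BTEQ way to abort with a nonzero return code on SQL failure.

```sh
#!/bin/ksh
# Hypothetical BTEQ job: apply a business rule in-database and quit with a
# nonzero return code if the SQL fails. Credentials come from environment
# variables; all object names are placeholders.
bteq <<EOF
.LOGON tdprod/${TD_USER},${TD_PASS}

DATABASE sales_dw;

-- Business rule: flag orders shipped after the promised date
UPDATE order_fact
SET    late_ship_flag = 'Y'
WHERE  ship_dt > promised_dt;

.IF ERRORCODE <> 0 THEN .QUIT 8

.LOGOFF
.QUIT 0
EOF
```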

Environment: Teradata V12, Teradata Utilities, Unix Servers 5380 / 5250, Oracle, BTEQ, MLOAD, FLOAD, SAS, ERWIN.

Client: Hallmark Cards Inc., Kansas City, MO Oct 2009 – Dec 2010

Role: ETL/Teradata Developer

Responsibilities:

Worked as a Data Warehouse Developer responsible for the timely, quality delivery of the ETL code and all related documents.

Implemented SCD type-1 and SCD type-2 load strategy for the Data Warehouse

Responsible for requirements gathering from clients; analyzed the functional specs provided by the data architect and created technical spec documents for all enhancements.

Worked with high-volume data, tuned and troubleshot mappings, and created documentation to support the application.

Performed impact analysis, identifying gaps and code changes to meet new and changing business requirements.

Analyzed legacy system data to identify the logical conditions, joins, filter criteria, etc., needed to gather data for conversion; identified gaps or flaws where the logic might fail and reported them to the source system for correction.

Involved in developing an ETL architecture based on Change Data Capture data acquisition methods using redo log files for inserts, updates and deletes of data from transaction database to load into Data Warehouse.

Worked with connected and unconnected Stored Procedure transformations for pre- and post-load sessions.

Tuned performance of Informatica sessions for large data files by increasing block size, data cache size and sequence buffer length.

Developed PL/SQL procedures, functions, and packages, and SQL scripts.

Understood requirements and created the BSD, and later developed a detailed TDD for the whole Operations data migration project.

Extensively worked in data Extraction, Transformation and loading from source to target system using BTEQ, FastLoad, and MultiLoad.

Developed OLAP applications using Cognos suite of tools and extracted data from the enterprise data warehouse to support the analytical and reporting for all Corporate Business Units.

Designed and developed PL/SQL procedures, packages, and triggers for automation and testing.

Involved in performance tuning on the source and target database for querying and data loading.

Developed MLoad scripts and shell scripts to move data from source systems to staging, and from staging to the Data Warehouse, in batch processing mode (see the sketch below).
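For illustration, a minimal MultiLoad upsert sketch of the staging-to-warehouse kind described above, driven from KSH. The log table, target table, layout, and file names are hypothetical, and the two-column layout is simplified.

```sh
#!/bin/ksh
# Hypothetical MultiLoad job: upsert a pipe-delimited staging extract into
# a warehouse table. Credentials come from environment variables; all
# object and file names are placeholders.
mload <<EOF
.LOGTABLE dw.ml_sales_log;
.LOGON tdprod/${TD_USER},${TD_PASS};

.BEGIN IMPORT MLOAD TABLES dw.sales_fact;

.LAYOUT sales_lay;
.FIELD sale_id  * VARCHAR(18);
.FIELD sale_amt * VARCHAR(20);

.DML LABEL upsert_sales
     DO INSERT FOR MISSING UPDATE ROWS;
UPDATE dw.sales_fact
SET    sale_amt = :sale_amt
WHERE  sale_id  = :sale_id;
INSERT INTO dw.sales_fact (sale_id, sale_amt)
VALUES (:sale_id, :sale_amt);

.IMPORT INFILE /data/staging/sales.dat
        FORMAT VARTEXT '|'
        LAYOUT sales_lay
        APPLY upsert_sales;

.END MLOAD;
.LOGOFF;
EOF
```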

Involved in writing BTEQ, FastLoad, and MultiLoad scripts for loading data into the target Data Warehouse.

Error handling and performance tuning in Teradata queries and utilities.

Data reconciliation in various source systems and in Teradata.

Involved in unit testing and preparing test cases.

Involved in peer-to-peer reviews.

Environment: Informatica 8.6, Teradata V2R6, Teradata Utilities (Multiload, FastLoad, FastExport, BTEQ, Tpump), SQL Server 2000, Oracle, FTP, CVS, Windows XP, UNIX, Pentium Server.

Client: Exilant Technologies, Hyderabad, India June 2008 – Sept 2009

Role: Informatica ETL Developer

Responsibilities:

Analyzed the source data coming from Flat files and worked with business users.

Worked with most of the DB client tools like SQL Navigator, TOAD, Data Browser, Teradata SQL Assistant, etc.

Applied technical writing skills to produce professional reports for implementation documentation and assessment.

Understood the components of a data quality plan and made informed choices between source-side and target-side data cleansing.

Performed data quality analysis to validate the input data based on the cleansing rules.

Used various transformations like Source Qualifier, Lookup, Update Strategy, Router, Filter, Sequence Generator, and Joiner on the extracted source data according to the business rules and technical specifications.

Optimized the mappings using various optimization techniques, and debugged existing mappings using the Debugger to test and fix them.

Scheduled, ran, and monitored sessions using Informatica Workflow Manager.

Extensively worked in the performance tuning of the programs, ETL Procedures and processes.

Wrote UNIX shell scripts for extracting parameters and automating FTP and SFTP processes (see the sketch below).
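A minimal sketch of such SFTP automation, assuming key-based authentication is already in place; the host, user, directories, and file names are invented.

```sh
#!/bin/ksh
# Hypothetical SFTP push: send an extract to a partner host using a batch
# script fed on stdin (-b -). Assumes key-based authentication.
SRC_DIR=/data/outbound
REMOTE_DIR=/inbound/etl

sftp -b - etluser@partner.example.com <<EOF
cd $REMOTE_DIR
put $SRC_DIR/customer_extract.dat
bye
EOF

if [ $? -ne 0 ]; then
    echo "SFTP transfer of customer_extract.dat failed" >&2
    exit 1
fi
```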

Redesigned some of the existing mappings in the system to meet new functionality.

Wrote, tested, and implemented various UNIX shell, PL/SQL, and SQL scripts to monitor the health of the database and system.

Gathered requirements and worked according to the CR (change request).

Developed code per client requirements.

Involved in developing backend code; altered tables to add new columns, constraints, sequences, and indexes per business requirements.

Performed DML and DDL operations per business requirements.

Created views and prepared business reports.

Resolved production issues by modifying backend code as and when required.

Used various joins, subqueries, and nested queries in SQL.

Environment: Informatica 8.6, Windows 98/NT/2000, Oracle 9i/8i, TOAD, Database Tools/Utilities.

Education:

Bachelor’s degree, Electronics and Communication Engineering


