
Data Developer

Location:
Houston, Texas, United States
Posted:
October 28, 2016


Asad Zahid

Informatica Developer

acw9zo@r.postjobfree.com

281-***-****

Professional Summary:

Over 7 years of IT experience, with a focus on and good understanding of the Banking, Finance, Telecommunications, Public, and Healthcare industries, including designing, developing, implementing, and supporting data warehouses, data marts, and data integration/ETL projects.

Good team player with excellent communication and interpersonal skills, able to perform individually, in a group, or as a team lead. Excellent problem-solving, analytical, and programming skills; good time management; a quick learner with the initiative to pick up new technologies and tools quickly.

Over 7 years of experience as an Informatica Developer in data integration, migration, and ETL processes using Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x, PowerExchange (CDC), and Informatica Data Quality, in both real-time and batch processes.

Extensive understanding of Informatica grid architecture and Oracle/Teradata architecture, and of how load and resources are distributed across the grid to maximize utilization of available resources and increase performance.

Ability to meet deadlines and handle multiple tasks, decisive with strong leadership qualities, flexible in work schedules and possess good communication skills.

Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows, and Worklets for data loads from various sources such as Oracle, flat files, Teradata, XML, SAP, DB2, SQL Server, and TIBCO.

Involved in the Technical Architecture Design process, gathering the high-level Business Requirements Document and Technical Specification Document from the technical architects and following the prescribed conventions.

Analyzed data through high-level audits and JAD sessions during business-requirement definition, evaluating granularity, historical consistency, valid values, and attribute availability.

Proven ability to interface and coordinate with cross-functional teams, analyze existing systems and business needs, and design and implement database and integration solutions.

Well acquainted with performance tuning of sources, targets, mappings, and sessions to overcome bottlenecks in mappings.

Developed complex mappings using Source Qualifier, Lookup, Joiner, Aggregator, Expression, Filter, Router, Union, Stored Procedure, Web Services, Transaction Control, and other transformations for Slowly Changing Dimensions (Type 1, Type 2, and Type 3) to keep track of historical data.

Day-to-day responsibilities included reviewing high-level documentation and TRDs, and identifying development objectives, defects, and fixes.

Implemented performance-tuning techniques at the application, database, and system levels using hints, indexes, partitioning, materialized views, external tables, procedures, functions, and EXPLAIN PLAN.

Automated Informatica through UNIX shell scripts for running sessions, aborting sessions, and creating parameter files. Wrote a number of Korn shell scripts to run various jobs; experienced in FTP file processing.
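The session-automation pattern above can be sketched as a small shell script; the service, domain, folder, and workflow names below are placeholders, not taken from any actual project.

```shell
#!/bin/sh
# Generate a workflow parameter file (all identifiers here are illustrative).
PARAM_FILE=wf_daily_load.param
cat > "$PARAM_FILE" <<'EOF'
[FOLDER_DAILY.WF:wf_daily_load]
$$LOAD_DATE=2016-10-28
$$SRC_DIR=/data/incoming
EOF

# Build the pmcmd start/abort commands; execute them only on a host where
# pmcmd is installed (-uv/-pv read the user and password from env variables).
START_CMD="pmcmd startworkflow -sv INT_SVC -d DOM_DEV -uv PM_USER -pv PM_PASS -f FOLDER_DAILY -paramfile $PARAM_FILE -wait wf_daily_load"
ABORT_CMD="pmcmd abortworkflow -sv INT_SVC -d DOM_DEV -uv PM_USER -pv PM_PASS -f FOLDER_DAILY wf_daily_load"
printf '%s\n%s\n' "$START_CMD" "$ABORT_CMD" > pmcmd_cmds.txt
echo "wrote $PARAM_FILE and pmcmd_cmds.txt"
```

A scheduler or wrapper script would then pick the right command out of `pmcmd_cmds.txt` (or call pmcmd directly) per job.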

Created database objects such as tables, views, synonyms, DB links, and indexes from the logical database design document; experienced in writing, testing, and implementing stored procedures, functions, and triggers using Oracle PL/SQL and Teradata.

Expertise in production support, maintenance, and enhancement projects; responsible for break-fixes in projects.

Dynamic career reflecting pioneering experience and high performance in system analysis, design, development, and implementation of relational database and data warehousing systems using IBM DataStage 8.0.1/7.x/6.x/5.x (InfoSphere Information Server, WebSphere, Ascential DataStage).

Experience in implementing Netezza, DB2, DB2/UDB, Oracle 10.2 / 9i / 8i / 8.0, SQL Server, PL/SQL, and SQL*Plus database applications.

Technical Skills:

Data Modeling

1NF/2NF/3NF; logical and physical modeling with Erwin, ER/Studio, MS Visio.

ETL Tools

Informatica PowerCenter 9.1/8.6.1/8.6.0/8.1.1/8.0/7.x/6.x, PowerExchange 8.x, IDQ (rules, data profiling, data cleansing, data parsing), DT Studio with structured and semi-structured data, IDE.

Databases

Oracle 8i/9i/10g/11g, MS SQL Server 2000/2005/2008/2012, Teradata (V2R5, 13), Teradata utilities (TTU: BTEQ, MLOAD, FLOAD, FastExport).

Big Data

Hadoop, MapReduce, Teradata Aster, Query It, Teradata Studio, HBase, Hive, PuTTY.

Programming Languages

SQL, SQL*Plus, PL/SQL, Teradata procedures, PowerShell/batch/UNIX/Perl shell scripting, Java, HTML, XML.

Reporting Tools

Teradata SQL Assistant, Toad 9.x, Oracle SQL Developer, SSMS, SSRS, Visual Studio, MS Excel, MS Word, AutoSys 11.3, IBM Cognos 8.

Environment

UNIX, Linux, IBM AIX, Windows XP/NT/2003/Vista/7

United Health Group, Minnesota May 2014 – Present

Role: Sr. ETL Developer

Responsibilities:

Created tabular and matrix reports in SSRS; wrote stored procedures and functions to build the SSRS reports.

Involved in the creation and deployment of reports using SSRS and in configuring the SQL Server Reporting Services server.

Used the SSIS tool suite and created mappings using various transformations such as OLE DB Command and Conditional Split.

Integrated with Salesforce.com (SFDC) to receive updates in real time.

Gathered requirements, estimated LOE, and created functional requirements for key Salesforce projects.

Configured Salesforce including but not limited to validation rules, workflows, custom labels, custom settings, profiles and permissions.

Integrated and administered Salesforce for Outlook (SFO) for a large end-user base.

Worked with the Talend solution implementation guidelines on a daily basis and helped the team integrate Informatica with big data.

As a Sr. ETL developer, integrated Informatica with HDFS and Teradata Aster.

Developed Hive and Pig scripts for data analysis on a daily basis.
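As a shape sketch, a daily analysis query of that kind might be generated and handed to `hive -f`; the database, table, and column names here are made up for the example.

```shell
#!/bin/sh
# Generate a small daily Hive aggregation script; every identifier below
# (etl_audit, daily_loads, load_date) is hypothetical.
cat > daily_counts.hql <<'EOF'
SELECT load_date, COUNT(*) AS row_cnt
FROM   etl_audit.daily_loads
WHERE  load_date = '2014-05-01'
GROUP BY load_date;
EOF
# On the cluster this would run as: hive -f daily_counts.hql
echo "generated daily_counts.hql"
```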

As a Sr. ETL developer, kept a high-level overview of all required TRDs and mapping documents, reviewing them from a development perspective to determine what development was required.

Involved in meetings with the business to discuss the business requirements, ETL specifications, and the calculation of all metrics, sub-metrics, and reports that Frontier generates for the business to review.

As the primary developer on the team, responsible for scheduling team meetings to discuss development, required changes, and timelines to meet all SLAs.

As part of the development team, understood the business requirements and was involved in code changes to the Oracle stored procedures and packages required for the different metrics that calculate revenue based on CLEC usage, retail, and DUF usage.

Developed Informatica workflows using Informatica PowerCenter 9.1/PowerExchange for the new order, trouble, and billing feeds; also responsible for the daily load of over 500 files.

Developed standard and reusable mappings and mapplets using transformations such as Expression, Lookup, and Joiner to extract, transform, and load data between different environments using the relational writer and FastExport.

Responsible for analyzing data coming from different sources (Oracle 11g, Teradata V13, XML, flat files) and databases using complex queries.

Independently wrote Oracle stored procedures and functions that calculate penalties at the end of the month.

Created and modified Teradata utility scripts (BTEQ, MLOAD, FLOAD) to load data from various data sources and legacy systems into the Teradata test and production environments.
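A minimal sketch of what generating such a BTEQ load script can look like; the logon string, file path, and table names are invented for the example, not taken from the actual project.

```shell
#!/bin/sh
# Write a BTEQ import script; all identifiers below are hypothetical.
cat > load_orders.bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.IMPORT VARTEXT ',' FILE = /data/incoming/orders.csv;
.QUIET ON
.REPEAT *
USING (order_id VARCHAR(10), amount VARCHAR(12))
INSERT INTO stg.orders (order_id, amount)
VALUES (:order_id, :amount);
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
echo "generated load_orders.bteq"
# On a Teradata client host this would run as: bteq < load_orders.bteq
```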

Used Teradata EXPLAIN and Visual Explain to analyze cost and improve query performance.

Created workflows for incremental loads and Slowly Changing Dimensions (SCD1, SCD2, SCD3) using the Lookup and Aggregator transformations for both real-time and batch processes.
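The SCD Type 2 logic behind such a workflow amounts to a close-and-insert pattern; written out as plain SQL with hypothetical table and column names, it looks roughly like this.

```shell
#!/bin/sh
# Write out the equivalent SQL for an SCD Type 2 load; dim_customer and
# stg_customer are invented names for illustration only.
cat > scd2_sketch.sql <<'EOF'
-- 1) Expire the current row when a tracked attribute has changed
UPDATE dim_customer
SET    eff_end_date = CURRENT_DATE, current_flag = 'N'
WHERE  current_flag = 'Y'
AND    customer_id IN (SELECT s.customer_id
                       FROM   stg_customer s
                       JOIN   dim_customer d
                         ON   d.customer_id = s.customer_id
                       WHERE  d.current_flag = 'Y'
                       AND    s.address <> d.address);

-- 2) Insert new and changed rows as the current version
INSERT INTO dim_customer
      (customer_id, address, eff_start_date, eff_end_date, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL;
EOF
echo "generated scd2_sketch.sql"
```

In PowerCenter the same decision (new row vs. changed row vs. unchanged) is made per record by the Lookup against the dimension plus an Update Strategy, rather than by set-based SQL.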

Developed real-time workflows that process messages and message queues from MQ Series, and web-service messages using the XML Parser and Web Services Consumer transformations.

Involved in writing various shell scripts in UNIX/Perl and automating Informatica workflows using cron jobs.
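Automation of this kind is typically driven by crontab entries shaped like the following (all paths and script names here are hypothetical):

```
# Start the nightly Informatica workflow wrapper at 02:30, sweep logs hourly
30 2 * * * /opt/etl/bin/run_wf.sh wf_daily_load >> /var/log/etl/wf_daily_load.log 2>&1
0 * * * *  /opt/etl/bin/clean_logs.sh
```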

Used hints, stored procedures, and complex queries in Toad 9.5.0 to help the team with database performance tuning.

After developing the Informatica workflows, responsible for importing and exporting the code to QA and production.

Involved in unit testing the Informatica workflows and worked with the testing team to help write test cases for the different metrics.

Used PowerExchange for Web Services to submit member assessments to the health plan and consume the results.

Worked with Oracle BI and Cognos 8.3.

Used PDI (Kettle) for ETL processes over Hadoop.

Used Pentaho with Hadoop to load data from Oracle.

Used the Spoon GUI to debug PDI processes.

Used DataStage Parallel Extender to load data, utilizing the available processors to improve job performance, with configuration management of system resources in the Orchestrate environment.

Environment: Informatica PowerCenter 9.1, Oracle 11g, Teradata V13, Teradata SQL Assistant, RHEL (awk, sed), Windows 2003 Server, Toad, SQL Developer.

Wells Fargo, San Francisco, CA Jan 2012 –April 2014

Role: Sr. ETL Developer

Responsibilities:

Generated periodic reports based on statistical analysis of data from various time frames and divisions using SQL Server Reporting Services (SSRS).

Migrated data from heterogeneous data sources and legacy systems (DB2, Access, Excel) to SQL Server databases using SQL Server Integration Services (SSIS) to overcome transformation constraints.

Designed DTS/SSIS packages to transfer data between servers, load data into databases, and archive data files from different DBMSs using SQL Enterprise Manager/SSMS in SQL Server 2000/2005 environments, and deployed the data. Scheduled jobs to perform these tasks periodically.

●Gathered requirements from business analysts for design and development of the system; developed transformation logic and designed various complex mappings in the Designer for data load and data cleansing.

●Involved in writing the Functional Design Specification (FDS), translating the business requirements document (BRD) into technical specifications, and creating/maintaining/modifying database design documents with detailed descriptions of logical entities and physical tables.

●Used a Master Data Management (MDM) process and tool for data transformation, data consolidation, and data governance in support of the business's decision support systems.

●Categorized dimension and fact tables by interviewing Oracle functional experts and business analysts, and evaluated granularity, historical consistency, and attribute availability.

●Extensively used Star schema and Snowflake schema methodologies in building and designing the logical data model with SCD1, SCD2, and SCD3.

●Created an environment of trust and open communication, provided the team with a vision of the project objectives, and motivated and inspired team members.

●Categorized transactions by department (sales, customers, etc.) using the transaction codes in the code master table, in order to apply the correct logic and filter out unnecessary transactions.

●Responsible for communicating with the offshore team so that time-critical work was handled by the onsite team and non-time-critical work was assigned offshore.

●Used Informatica PowerExchange with the CDC option to capture inserted, updated, deleted, and otherwise changed data, and to exchange data with different partners, each with their own format and database.

●Designed mappings using the Mapping Designer to load data from various sources (flat files, SAP, Teradata, VSAM) using transformations such as Source Qualifier, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter.

●Extensively used PowerCenter capabilities such as file lists, pmcmd, target load order, and concurrent lookup caches. Created and monitored workflows using the Workflow Manager and Workflow Monitor.

●Extensively worked with mapping variables, mapping parameters, workflow variables, and session parameters, and built parameter files to pass values through shell scripting.

●Used shortcuts (Global/Local) to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.

●Used the Debugger to test mappings and fix bugs, and found errors by fetching session logs from the Workflow Monitor.

●Extensively involved in identifying performance bottlenecks in targets, sources, mappings, and sessions, and successfully tuned them using persistent caches, Lookup transformations, and parallel processing for maximum performance.

●Performed error handling and performance tuning of Teradata queries and utilities for the existing complex scripts.

●Created AutoSys jobs to schedule sessions and workflows on Linux, and scheduled various reports using the Schedule Management tool/Event Studio.

●Developed unit tests, monitoring run times to ensure successful execution of the data-loading processes; involved in preparing high-level documents about mapping details.

Used shell scripts for better handling of incoming source files, such as moving files from one directory to another and extracting information from the log files on the Linux server.
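A minimal runnable sketch of that file-handling pattern; the directory names, file name, and `rows=` log-line format are all invented for the example.

```shell
#!/bin/sh
# Move arriving source files into an archive directory, pulling one piece of
# information out of each file's log line along the way. Everything here is
# illustrative: directory names, file names, and the rows= log format.
IN_DIR=./incoming
ARCH_DIR=./archive
mkdir -p "$IN_DIR" "$ARCH_DIR"

# Simulate an incoming source file that carries a status log line.
echo "2014-01-15 status=LOADED rows=120" > "$IN_DIR/orders_20140115.dat"

for f in "$IN_DIR"/*.dat; do
    [ -e "$f" ] || continue
    # Extract the row count after "rows=" with awk.
    rows=$(awk -F'rows=' '/rows=/ { print $2 }' "$f")
    echo "archiving $(basename "$f") (rows=$rows)"
    mv "$f" "$ARCH_DIR/"
done
```

Running it prints `archiving orders_20140115.dat (rows=120)` and leaves the file under `./archive`.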

Involved in preparing SDD (System Design Document) based on PRD (Project Requirement Document), TRD (Technical Requirement Document) and Market analysis team inputs.

Integrated with Hadoop HDFS using PowerExchange for Hadoop to load data.

Used the DataStage Netezza Enterprise stage to load data, utilizing the available processors to improve job performance, with configuration management of system resources in the Orchestrate environment.

Environment: Informatica PowerCenter 8.6, Teradata, Teradata SQL Assistant, IBM AIX (awk, sed, Korn shell), Windows 2003 Server, AutoSys 11.3

Banking Data Mart / SG Cowen & Co., NY July 2010 – Dec 2011

Role: ETL Developer

Responsibilities:

Responsible for analyzing functional specifications and preparing technical specifications, and for development, deployment, and testing according to business requirements.

Involved in the data profiling process to systematically examine the quality and scope of the data sources, in order to build a reliable ETL system requiring minimal transformation and human intervention before loading to the target tables.

Involved in the cleansing process to correct invalid values (e.g., ZIP code formats), ensure consistency across records, remove duplicates, and ensure complex business rules were enforced.

Worked on Informatica PowerCenter tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Extracted the data from the flat files and Relational databases into staging area and populated into data warehouse using SCD Type 2 logic to maintain the history.

Developed a number of complex Informatica mappings, mapplets, and reusable transformations to implement the business logic and load the data incrementally.

Employed the Source Qualifier, Lookup, Expression, Aggregator, Rank, Sequence Generator, and Joiner transformations in mappings to populate data into the target.

Tested the behavior of the mappings using the Load Test option and unit test cases, and used the Debugger to trace mapping flow through the various transformations.

Closely worked with the QA team during the testing phase and fixed the bugs that were reported.

Created & maintained tables, views, synonyms and indexes from Logical database design document and wrote stored procedures in PL/SQL for certain key business requirements.

Troubleshot connectivity problems; tracked errors by maintaining a separate log file.

Performance-tuned the mappings by consolidating logic and reducing the number of transformations used.

Maintained high-level documents including source name, target name, row counts in both source and target, transformations used, and session information.

Environment: Oracle 10g, Informatica PowerCenter 7.1, PL/SQL, UNIX, Windows 2003 Server, Toad 8.6.1.

Client: Allied Bank, Pakistan Jan 2009 – June 2010

Role: ETL Developer

Responsibilities:

●Understood the business needs by gathering information from business analysts, data modelers, and business users.

●Involved in preparing the physical design of inputs and outputs by documenting record layouts, source and target locations, and file/table sizing information.

●Extensively explored the physical model of the data warehouse, including all dimensions and facts (household, branch, account, etc.), to better understand the requirements and meet all SLAs.

●Used the Informatica Designer tools to design source definitions, target definitions, and transformations to build mappings, and documented how the data would be transformed.

●Monitored batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database using the cron scheduler.

●Supported technical teams involved with Oracle issues and operations; also developed UNIX shell scripts and PL/SQL procedures to extract and load data.

●Used Informatica command-line utilities such as pmcmd to communicate with the Integration Service to perform tasks such as startworkflow and abortworkflow.

●Performance-tuned the mappings using techniques such as indexes, stored procedures, functions, and materialized views to provide maximum efficiency.

●Developed UNIX shell scripts to move source files to an archive directory, maintain logs, and automate processes using command utilities such as sed, awk, and cut.

●Involved in unit testing to validate mappings and sessions before the testing phase started, and reported all bugs.

●Involved in the migration of the jobs from DB2 to Netezza.

Environment: Informatica PowerCenter 6.2, Oracle 9i, PL/SQL, Windows NT, UNIX.

Education:

Bachelor’s Degree in Computer Science and Engineering (Pakistan).


