Data Developer

Location: Houston, TX
Posted: August 15, 2019

Asad Zahid

ETL Informatica Developer

*********.*********@*****.***

310-***-****

Professional Summary:

Over 7 years of IT experience in the Banking, Finance, Telecommunications, Public Sector, and Healthcare industries, including designing, developing, implementing, and supporting data warehouses, data marts, data integration, and ETL projects.

Good team player with excellent communication and interpersonal skills, able to work independently, within a group, or as a team lead. Strong problem-solving, analytical, and programming skills, good time management, and a quick learner with the initiative to pick up new technologies and tools.

Over 7 years of experience as an Informatica Developer in data integration, migration, and ETL processes using Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x, PowerExchange (CDC), and Informatica Data Quality in both real-time and batch processes.

3+ years of experience using Talend Data Integration/Big Data Integration (6.1/5.x) / Talend Data Quality.

Good understanding of the Ralph Kimball and Bill Inmon methodologies.

Extensively worked on dimensional modeling (star/snowflake), data migration, Slowly Changing Dimensions, data cleansing, and data staging of operational sources using ETL processes, and provided data mining features for data warehouses.

Extensive experience with Change Data Capture (CDC).

Extensive understanding of Informatica grid architecture and Oracle, and of how load and resources are distributed across the grid to maximize use of available resources and increase performance.

Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities; flexible with work schedules; good communication skills.

Strong experience as an AWS Cloud Engineer (Administrator), working with AWS services including IAM, EC2, VPC, EBS, EFS, EIP, AMI, SNS, RDS, DynamoDB, CloudWatch, CloudTrail, Auto Scaling, S3, and Route 53.

Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads from various sources such as Oracle, Flat Files, Teradata, XML, SAP, DB2, SQL Server, Tibco.

Involved in the technical architecture design process by gathering high-level Business Requirement Documents and Technical Specification Documents from technical architects and following the prescribed conventions.

Analyzed data through high-level audits and JAD sessions during business requirement definition, evaluating granularity, historical consistency, valid values, and attribute availability.

Proven ability to interface and coordinate with cross-functional teams, analyze existing systems and business needs, design and implement database solutions, and provide integration solutions.

Well acquainted with performance tuning of sources, targets, mappings, and sessions to overcome bottlenecks in mappings.

Developed complex mappings using Source Qualifier, Lookup, Joiner, Aggregator, Expression, Filter, Router, Union, Stored Procedure, Web Services, Transaction Control, and other transformations for Slowly Changing Dimensions (Type 1, Type 2, and Type 3) to keep track of historical data.

Day-to-day responsibilities included reviewing high-level documentation and TRDs, identifying development objectives, and tracking defects and fixes.

Implemented performance tuning techniques at the application, database, and system levels using hints, indexes, partitioning, materialized views, external tables, procedures, functions, and explain plans.

Automated Informatica through UNIX shell scripts for running sessions, aborting sessions, and creating parameter files. Wrote a number of Korn shell scripts to run various jobs, with experience in FTP file processing; a brief sketch of such a script follows below.
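
For illustration, a minimal Korn shell sketch of this pattern; the service, domain, folder, workflow, and directory names are placeholders, not actual project values, and credentials are assumed to come from environment variables. It builds a run-date parameter file, starts a workflow with pmcmd, and reports a failed run (with the abort command noted for hung runs).

#!/bin/ksh
# Sketch only: build a parameter file, then run an Informatica workflow via pmcmd.
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE=/opt/infa/param/wf_daily_load_${RUN_DATE}.par

# Parameter file the workflow reads at run time
cat > "$PARAM_FILE" <<EOF
[FOLDER_DW.WF:wf_daily_load]
\$\$LOAD_DATE=$RUN_DATE
\$\$SRC_DIR=/data/incoming
EOF

# Start the workflow and wait for it to finish
pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" -p "$INFA_PWD" \
    -f FOLDER_DW -paramfile "$PARAM_FILE" -wait wf_daily_load
rc=$?
if [ $rc -ne 0 ]; then
    echo "wf_daily_load failed with return code $rc" >&2
    # A hung run can be killed the same way:
    # pmcmd abortworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" -p "$INFA_PWD" -f FOLDER_DW wf_daily_load
    exit "$rc"
fi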

Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalmap, tDie, etc.

Experience with web services using Talend components such as tSOAP, tREST, tWebService, and tWebServiceInput.

Experience moving data from traditional files to cloud databases.

Created database objects such as tables, views, synonyms, DB links, and indexes from logical database design documents; experienced in writing, testing, and implementing stored procedures, functions, and triggers using Oracle PL/SQL.

Technical Skills:

Data Modeling: 1NF/2NF/3NF, logical and physical modeling with Erwin, ER/Studio, MS Visio

ETL Tools: Informatica PowerCenter 10.2/9.1/8.6.1/8.6.0/8.1.1/8.0/7.x/6.x, PowerExchange 8.x, IDQ, DT Studio with structured and semi-structured data

Databases: Oracle 8i/9i/10g/11g, MS SQL Server 2000/2005/2008, Teradata (V2R5, 13)

Programming Languages: SQL, SQL*Plus, PL/SQL, Teradata procedures, PowerShell/Batch/UNIX/Perl shell scripting, Java, HTML, XML

Reporting Tools: Cognos 8.3/8.2/8.0, Business Objects XI, MS SQL Server Reporting Services 2005

Blue Cross Blue Shield (BCBS), San Francisco, CA Feb 2019 – Present

Role: Sr. ETL Developer

Monitored and cleansed data using Informatica Data Quality (IDQ): profiled source data with Informatica Analyst and created mappings in Informatica Developer to apply cleansing and business rules. Developed a number of graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, and Merge.

Responsibilities:

Involved in requirements gathering phase, low-level design, build and unit testing of ETL mappings.

Code optimization and performance tuning of the Informatica mappings. Adhering to the Informatica build standards and best practices.

Created mappings, workflows, and sessions; analyzed the sources and targets, transformed and mapped the data, and loaded it into the targets using Informatica.

Mainly used transformations such as Expression, Router, Joiner, Lookup, Update Strategy, and Sequence Generator.

Created reusable mapplets using the Mapplet Designer and used them in mappings.

Created the parameter files and list files used by the workflows.

Performed unit testing for every component worked on.

Performed data integration between different databases and to HDFS, Hive, and HBase using Talend Data Integration and Talend Big Data ETL tools.

Delivered major successive enhancements post-deployment as part of quality control.

Communicated with the designers and the quality analysis team on technical and functional issues.

Involved in code reviews for optimization and defect prevention.

Involved in peer reviews of the mappings.

Monitored all the jobs related to Informatica and UNIX.

Worked on production support issues and activities.

Actively optimized code performance through performance tuning and push-down optimization.

Created standard review documents and performed code reviews to check that the code maintained coding standards and met the business requirements.

Consistently tested data between source and target through minus queries and prepared unit test case documents; a brief validation sketch follows below.
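
For illustration only, a Korn shell sketch of a minus-query check run through SQL*Plus; the connection variables, table names, and columns are placeholders rather than actual project objects.

#!/bin/ksh
# Sketch: compare a staging table against its target in both directions.
sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<'EOF' > minus_check_customer.log
SET PAGESIZE 0
SET FEEDBACK ON

-- Rows in the source staging table that are missing or different in the target
SELECT customer_id, customer_name, status FROM stg_customer
MINUS
SELECT customer_id, customer_name, status FROM dw_customer;

-- Rows in the target that are not in the source
SELECT customer_id, customer_name, status FROM dw_customer
MINUS
SELECT customer_id, customer_name, status FROM stg_customer;
EOF

Any rows written to the log file are mismatches to be recorded in the unit test case document.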

Created and labeled deployment groups and prepared the corresponding release documents.

Created Talend jobs to copy files from one server to another, utilizing the Talend FTP components.

Experienced in using Informatica cloud REST API to access and perform Informatica cloud tasks.

Experienced in Developing, maintaining and enhancing Informatica Cloud Mappings, Task Flows, and processes (Data Synchronization, Data replication)

Anadarko Petroleum, Houston, TX Sep 2018 – Jan 2019

Role: Sr. ETL Developer

Worked extensively in the Informatica Designer to design a robust end-to-end ETL process involving complex transformations such as Source Qualifier, Lookup, Update Strategy, Router, Aggregator, Sequence Generator, Filter, Expression, Stored Procedure, External Procedure, and Transaction Control for the efficient extraction, transformation, and loading of data to staging and then to the data mart (data warehouse), verifying the complex logic for computing the facts.

Responsibilities:

• Worked extensively on Informatica tools such as Source Analyzer, Warehouse Designer, Transformation Designer, Mapplet Designer, and Mapping Designer.

• Used Informatica PowerCenter 10.1 to make changes to the existing ETL. Wrote PL/SQL procedures, called from the Stored Procedure transformation, to perform database actions such as truncating the target before load, deleting records based on a condition, and renaming tables.

• Created ETL mappings with complex business logic for high-volume data loads using transformations such as Aggregator, Sorter, Filter, Normalizer, SQL Transformation, Lookup, Joiner, Router, Update Strategy, Union, and Sequence Generator, and transformation language features such as expressions, constants, system variables, and data format strings.

• Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 10.x.

• Extensively worked on building workflows, worklets, sessions, command tasks etc.

• Designed and developed Informatica mappings including Type-I, Type-II and Type-III slowly changing dimensions (SCD).

• Created mappings using different IDQ transformations like Parser, Standardizer, Match, Labeler and Address Validator.

• Worked extensively on ETL performance tuning to tune the data loads; worked with DBAs on SQL query tuning.

• Performed design and analysis of business systems applications, system interfaces, databases, reporting, and business intelligence systems.

•Involved in writing SQL Stored procedures and Shell Scripts to access data from various sources.

•Performed System, Integration testing and supported User Acceptance Testing.

•Managed stakeholder communication and validated the fixed issues in order to ensure support availability as per agreed SLAs.

• Responsible for handling multiple (18) Informatica applications to design, implement, support, and operate the data integration infrastructure.

• Built custom transformations using AWS Glue, Lambda, and Kinesis, which helped reduce costs in the ETL tool and on-premises infrastructure.

Worked on planning, designing, and implementing standards, guidelines, and best practices.

Developed ETL mappings for various sources (.TXT, .CSV, XML) and loaded the data from these sources into relational tables with Talend Enterprise Edition.

Experienced in using Informatica cloud REST API to access and perform Informatica cloud tasks.

Experienced in Developing, maintaining and enhancing Informatica Cloud Mappings, Task Flows, and processes (Data Synchronization, Data replication)

Experienced in integrating Salesforce and web services using Informatica PowerExchange and Informatica Cloud.

• Conducted requirement planning sessions for the hands-on development of the implementation strategy and customized payroll layouts based on best practices and plan knowledge.

•Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.

• Used IDQ's standardized plans for address and name cleanups.

• Worked on IDQ file configuration on users' machines and resolved the issues.

Environment: Informatica PowerCenter 10.2, Oracle 11g, Teradata V13, Teradata SQL Assistant, RHEL (awk, sed), Windows 2003 Server, Toad, SQL Developer.

United Health Group, Minnesota Aug 2017 – Aug 2018

Role: Sr. ETL Developer

Worked with architects and business managers to understand the requirements and source systems in order to prepare design documents specifying the various ETL approaches, with pros and cons of each and a suggestion of the best approach. Worked extensively in the Informatica Designer to design a robust end-to-end ETL process involving complex transformations such as Source Qualifier, Lookup, Update Strategy, Router, Aggregator, Sequence Generator, Filter, Expression, Stored Procedure, External Procedure, and Transaction Control for the efficient extraction, transformation, and loading of data to staging and then to the data mart (data warehouse), verifying the complex logic for computing the facts.

Responsibilities:

As a Sr. ETL developer, maintained a high-level overview of all required TRDs and mapping documents and reviewed them from a development perspective to determine what development was required.

Involved in meetings with the business to discuss the business requirements, ETL specifications, and the calculation of all the metrics, sub-metrics, and reports that Frontier generates for the business to review.

As the primary developer on the team, responsible for scheduling team meetings to discuss development, required changes, and timelines to meet all SLAs.

As part of the development team, understood the business requirements and was involved in code changes to the Oracle stored procedures and packages required for the different metrics that calculate revenue based on CLEC, Retail, and DUF usage.

Developed the Informatica workflows required for the new Order, Trouble, and Billing feeds using Informatica PowerCenter 9.1 and PowerExchange, and was responsible for the daily load of over 500 files.

Developed standard and reusable mappings and mapplets using transformations such as Expression, Lookup, and Joiner to extract, transform, and load data between different environments using the relational writer and FastExport.

Responsible for analyzing the data coming from different sources (Oracle 11g, Teradata V13, XML, flat files) and databases using complex queries.

Independently wrote Oracle stored procedures and functions that calculate penalties at the end of the month.

Created and modified Teradata utility scripts (BTEQ, MLOAD, FLOAD) to load data from various data sources and legacy systems into the Teradata test and production environments; a small BTEQ-style sketch follows below.
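
As an illustration only, a Korn shell wrapper around a BTEQ step of this kind; the TDP id, credentials, and database/table names are assumed placeholders, with the password taken from an environment variable.

#!/bin/ksh
# Sketch: refresh a staging table in Teradata via BTEQ and check the return code.
bteq <<EOF > bteq_load_account.log 2>&1
.LOGON tdprod/etl_user,${TD_PWD}

DELETE FROM stg_db.stg_account;

INSERT INTO stg_db.stg_account
SELECT acct_id, acct_type, open_dt
FROM   legacy_db.account_src
WHERE  open_dt >= CURRENT_DATE - 1;

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF

if [ $? -ne 0 ]; then
    echo "BTEQ load failed, see bteq_load_account.log" >&2
    exit 1
fi

High-volume loads would use MLOAD or FLOAD control scripts in the same wrapper pattern.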

Used Teradata EXPLAIN and Visual Explain to analyze the cost and improve query performance.

Created workflows for incremental loads and Slowly Changing Dimensions (SCD1, SCD2, SCD3) using the Lookup and Aggregator transformations for both real-time and batch processes.

Developed real-time workflows that process messages and message queues from MQ Series and web service messages using the XML Parser and Web Service Consumer transformations.

Wrote and executed several complex SQL queries in AWS Glue for ETL operations on Spark DataFrames using Spark SQL.

Involved in writing various shell scripts using UNIX/Perl and automating the Informatica workflows using cron jobs; example schedule entries are sketched below.
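
Illustrative crontab entries for this kind of scheduling (the paths, script names, and workflow names are placeholders; the wrapper script would issue a pmcmd call like the one sketched earlier in this resume):

# Nightly order feed at 01:30, weekly billing extract Sunday 03:00
30 1 * * * /opt/etl/scripts/run_wf.ksh wf_order_feed >> /opt/etl/logs/wf_order_feed.log 2>&1
0 3 * * 0 /opt/etl/scripts/run_wf.ksh wf_billing_extract >> /opt/etl/logs/wf_billing_extract.log 2>&1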

Used hints, stored procedures, and complex queries in Toad 9.5.0 to help the team with performance tuning of the database.

After development of the Informatica workflows, responsible for importing and exporting the code to QA and PRD.

Involved in unit testing the Informatica workflows and worked with the testing team to help them write test cases for the different metrics.

Environment: Informatica Power Center 9.6.1, Informatica Power Exchange, SQL Server 2014/2012, T-SQL, Oracle 11g, SQL Developer, Management Studio, Microsoft Visual Studio, SAP BW.

Bank of America, Plano, TX May 2017 – Aug 2017

Role: Sr. ETL Developer

The primary objective of this project was to provide improved levels of service delivery to customers through effective analysis of data collected from different sources. The secondary objective was to improve the management reporting and analysis process by providing a multi-dimensional analysis capability to help monitor the key business parameters for the Customer Service Division. The Data Warehouse was populated on a daily basis from underlying OLTP applications using PowerCenter.

Responsibilities:

● Gathered requirements from business analysts for the design and development of the system, developed transformation logic, and designed various complex mappings in the Designer for data load and data cleansing.

● Involved in writing the Functional Design Specification (FDS) document, translating business requirement documents (BRD) into technical specifications, and creating/maintaining/modifying database design documents with detailed descriptions of logical entities and physical tables.

● Used Master Data Management (MDM) processes and tools to provide data transformation, data consolidation, and data governance in support of the business's decision support system.

● Categorized dimension and fact tables by interviewing Oracle functional experts and business analysts, and evaluated granularity, historical consistency, and attribute availability.

● Extensively used star schema and snowflake schema methodologies in building and designing the logical data model with SCD1, SCD2, and SCD3.

● Created an environment of trust and open communication, provided the team with a vision of the project objectives, and motivated and inspired team members.

● Categorized transactions by department (sales, customers, etc.) by looking at the transaction codes available in the code master table, in order to implement the correct logic and filter out unnecessary transactions.

● Responsible for communicating with the offshore team so that time-critical work was handled by the onsite team and work that was not time critical was assigned to the offshore team.

● Used Informatica PowerExchange with the CDC option to capture inserted, updated, deleted, and otherwise changed data, and exchanged data between different partners, each with their own format and database.

● Designed mappings using the Mapping Designer to load data from various sources (flat files, SAP, Teradata, VSAM) using different transformations such as Source Qualifier, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter.

●Extensively used the capabilities of PowerCenter such as File List, pmcmd, Target Load Order, Concurrent Lookup Caches etc. Created and Monitored Workflows using Workflow Manager and Workflow Monitor

●Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables, Session Parameters and making Parameter file to pass value through Shell Scripting.

● Designed Talend jobs using Big Data components like tHDFSInput, tHDFSOutput, tHDFSPut, tHiveLoad, tHiveInput, and tHiveCreateTable.

●Used shortcuts (Global/Local) to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.

● Used the Debugger to test the mappings and fix bugs, and found errors by fetching session logs from the Workflow Monitor.

●Extensively involved in identifying performance bottlenecks in targets, sources, mappings, sessions and successfully tuned them by using persistence cache, lookup transformation and parallel processing for maximum performance.

●Created Autosys jobs to schedule sessions and workflows on Linux and scheduled various reports using Schedule Management tool/Event Studio

● Developed unit tests monitoring the run time to ensure successful execution of the data loading processes, and was involved in preparing high-level documents about mapping details.

Used shell scripts for better handling of incoming source files, such as moving files from one directory to another and extracting information from the log files on the Linux server; a short sketch follows below.
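
A minimal sketch of such a housekeeping script, with assumed directory names and an assumed session-log format:

#!/bin/ksh
# Sketch: archive processed source files and summarize reject counts from session logs.
SRC_DIR=/data/incoming
ARC_DIR=/data/archive/$(date +%Y%m%d)
LOG_DIR=/opt/infa/sesslogs

mkdir -p "$ARC_DIR"

# Move every processed source file into today's archive folder
for f in "$SRC_DIR"/*.csv; do
    [ -f "$f" ] && mv "$f" "$ARC_DIR/"
done

# Pull the lines mentioning rejected rows out of the session logs into a daily summary
grep -i "rejected" "$LOG_DIR"/*.log |
    awk -F: '{print $1 " -> " $NF}' > "$LOG_DIR/reject_summary_$(date +%Y%m%d).txt"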

Involved in preparing SDD (System Design Document) based on PRD (Project Requirement Document), TRD (Technical Requirement Document) and Market analysis team inputs.

Environment: Informatica PowerCenter 8.6, Teradata, Teradata SQL Assistant, IBM AIX (awk, sed, Korn shell), Windows 2003 Server, Autosys 11.3

Banking Data Mart / SG Cowen & Co., NY Apr 2014 – May 2017

Role: ETL Developer

As a member of ETL Team, responsible for analyzing, designing and developing ETL strategies and processes, writing ETL specifications for developer, ETL and Informatica development, administration and mentoring. Participated in business analysis, ETL requirements gathering, physical and logical data modeling and documentation. Designed the ETL processes using Informatica to load data from Mainframe DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Oracle warehouse database.

Responsibilities:

Responsible for analyzing functional specifications and preparing technical specifications, and for development, deployment, and testing according to business requirements.

Involved in the data profiling process to systematically examine the quality and scope of the data sources in order to build a reliable ETL system with minimal transformation and human intervention before loading to the target tables.

Involved in the cleansing process to clean invalid values (such as zip code formats), ensure consistency across records, remove duplicates, and verify that complex business rules were enforced.

Worked on Informatica PowerCenter tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Experienced in using Talend's debug mode to debug jobs and fix errors.

• Responsible for development, support, and maintenance of the ETL (Extract, Transform, and Load) processes using Talend Integration Suite.

Worked on a custom API that interacts with the AWS S3 service, storing user files in a bucket in the cloud.

Extracted the data from the flat files and Relational databases into staging area and populated into data warehouse using SCD Type 2 logic to maintain the history.

Developed number of complex Informatica Mappings, Mapplets, and reusable Transformations to implement the business logic and to load the data incrementally.

Employed the Source Qualifier, Lookup, Expression, Aggregator, Rank, Sequence and Joiner Transformations in the Mappings to populate data into the target.

Tested behavior of the mappings using the Load Test option and unit test cases, and debugged using the Debugger to see the mapping flow through the various transformations.

Worked closely with the QA team during the testing phase and fixed the bugs that were reported.

Created & maintained tables, views, synonyms and indexes from Logical database design document and wrote stored procedures in PL/SQL for certain key business requirements.

Troubleshot connectivity problems and looked for errors by maintaining a separate log file.

Performance tuned the mappings by integrating the logic and reducing the number of transformations used.

Maintained high-level documentation including source name, target name, number of rows in both source and target, transformations used, and session information.

Environment: Oracle 10g, Informatica PowerCenter 7.1, PL/SQL, UNIX, Windows 2003 Server, Toad 8.6.1.

Client: Allied Bank, Pakistan June 2012 – Mar 2014

Role: ETL Developer

This project involved maintaining and supporting the Allied Bank website along with various product-related enhancements. The process was updated by integrating multiple applications such as web services to provide a better quality of service, along with improvements to the existing website to make it easier to use for both the business and the users. The database is built on DB2, and Informatica is used to perform the daily ETL load process. Data loads from various source systems are scheduled to run nightly, and ad hoc functionality is also available for data loads.

Responsibilities:

● Understood the business needs by gathering information from business analysts, data modelers, and business users.

● Involved in the process of preparing the physical design of inputs and outputs by documenting record layouts, source and target locations, and file/table sizing information.

● Extensively explored the physical model of the data warehouse, including all the dimensions and facts (household, branch, account, etc.), to better understand the requirements and meet all SLAs.

● Used Informatica Designer tools to design source definitions, target definitions, and transformations to build mappings, and documented how the data would be transformed.

● Monitored batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database using the cron scheduler.

● Supported technical teams involved with Oracle issues and operations; also developed UNIX shell scripts and PL/SQL procedures to extract and load data.

● Used Informatica command-line utilities such as pmcmd to communicate with the Integration Service and perform tasks such as startworkflow and abortworkflow.

● Performance-tuned the mappings using different techniques such as indexes, stored procedures, functions, and materialized views to provide maximum efficiency.

● Developed UNIX shell scripts to move source files to an archive directory, maintain logs, and automate processes using command utilities such as sed, awk, and cut.

● Involved in unit testing to validate mappings and sessions before the test cycle started, and reported all bugs.

Environment: Informatica Power Center 6.2, Oracle 9i, PL/SQL, Windows NT, UNIX.

Education:

Highest Qualification: BS Electronic Engineering

University: Sir Syed University of Engineering & Technology


