
Data Developer

Location:
New York, NY
Posted:
January 15, 2018


Siddiqur Rahman

ETL/Informatica Lead

SUMMARY:

Data Warehousing: 11+ years of experience in extraction, transformation, and loading (ETL) using Informatica PowerCenter 9.6.1/9.5/9.1/8.6/7.1.4/6.2, Informatica PowerExchange 9.5/9.0.1/8.6, and IDQ 9.6.1.

SDLC: 11+ years' exposure to the overall SDLC, including requirement gathering, development, testing, debugging, deployment, documentation, and production support.

Databases: 11+ years' experience working with various data sources such as Oracle 12c/11g/10g/9i, SQL Server 2012/2008/2000, Teradata 14/13/V2R6, and DB2.

Data Modeling: Extensive experience in dimensional data modeling, star schema/snowflake modeling, fact and dimension tables, and physical and logical data models using Erwin 4.5/3.1.

Business Intelligence: Extensive experience in business intelligence tools such as Cognos 7.1, QlikView 12, and Qlik Sense 3.

UNIX: Excellent working knowledge of UNIX shell scripting and job scheduling on multiple platforms.

Coordination: Maintained outstanding relationships with business analysts and business users to identify information needs per business requirements. Experienced working in an onsite-offshore model, coordinating with peer developers to resolve issues.

Others: Excellent problem-solving, analytical, and written and verbal communication skills, with the ability to work both in a team and independently.

COMPUTER SKILLS:

ETL TECHNOLOGY: Informatica PowerCenter 9.6/9.5/9.1/8.6/7.1, Informatica PowerExchange 9.5/9.0.1/8.6, Informatica Data Quality 9.6.1

DATA WAREHOUSE: Multidimensional model design, star schema development

DATA MODELING: MS Visio 2010/2007, Erwin 4.5/3.1

DATABASES: Oracle 12c/11g/10g/9i, MS SQL Server 2012/2008/2000, MS Access, Sybase, DB2, MySQL, Teradata 14/13/12, Netezza

PROGRAMMING: C, C++, SQL, PL/SQL, HTML, UNIX scripting, Java, HDFS, Hive, Pig

REPORTING TOOLS: Cognos 7, QlikView 12, Qlik Sense 3

OPERATING SYSTEMS: Windows 7/XP, Linux, Sun Solaris, AIX, MS-DOS

APPLICATIONS: MS Office, MS Project, FrontPage, Toad 9.2/8.6

MANAGEMENT: MS Office 2010/2007, MS Project

DEPLOYMENT: Perforce, TeamCity

DEFECT & TASK TRACKING: Quality Center, JIRA, Agile Central

PROJECT EXPERIENCE CHRONOLOGY:

BANK OF NEW YORK MELLON, NY AUG 2017 TO PRESENT

SR. ETL DEVELOPER

Responsibilities:

Interacted with business analysts to gather the business requirements by attending regular meetings with the business community.

Prepared ETL detail design and unit testing document to outline the flow of data, for testing source/targets counts and field-to-field mappings.

Developed a naming convention document for various components of Informatica.

Analyzed the systems, met with end users and business units in order to define the requirements.

Conducted business requirements and source system analysis to arrive at optimal ETL designs.

Developed new ETL applications of simple to moderately complex scope, working from application systems design specifications and utilizing company standards, procedures, and techniques.

Partnered with project managers and development resources to assist with design changes and design decisions.

Used SQL Developer to run queries to verify the data with the existing GUI functionality.

Involved in creation of test plan and test cases for testing the enhanced functionality of MBOX.

Used cron jobs to schedule UNIX scripts and PL/SQL programs.
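
A minimal sketch of such a cron entry (script and log paths are hypothetical):

    # min hr dom mon dow  command -- run the nightly ETL wrapper at 02:30
    30 2 * * * /opt/etl/bin/nightly_load.ksh >> /opt/etl/logs/nightly_load.log 2>&1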

Involved in PL/SQL code review and modification for the development of new requirements.

Developed Korn shell scripts to kick off back-end PL/SQL and SQL programs.

Built from scratch and maintained PL/SQL scripts, indexes, and complex queries for data analysis and extraction.

Involved in documentation to describe program development, logic, coding, testing, and any changes and corrections.

Created reports for decision support.

Involved in analysis of the data in the development and test environment.

Worked with the analog team to test the existing application.

Logged defects in JIRA and monitored progress until fixes reached the UAT environment.

Worked on production support for real-time projects.

Very strong in implementing data profiling, creating scorecards, creating reference tables, and documenting data quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency.

Good skills in analyzing trend charts from scorecards to determine the thresholds to be considered in further development.

Good skills in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.

Very strong knowledge of Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer.

Used Informatica user-defined functions to reduce code dependency.

Handled versioning and dependencies in Informatica.

Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
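For illustration, a minimal expire-and-insert sketch of the Type 2 pattern in a Korn shell wrapper; the production work was done in Informatica mappings, and the table, column, and sequence names here are hypothetical:

    #!/bin/ksh
    # Type 2 SCD sketch: expire changed current rows, then insert new versions.
    sqlplus -s "$DB_USER/$DB_PASS" <<'EOF'
    WHENEVER SQLERROR EXIT SQL.SQLCODE
    -- Step 1: close out current rows whose attributes changed in staging.
    UPDATE dim_customer d
       SET d.eff_end_dt = TRUNC(SYSDATE) - 1, d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.cust_id = d.cust_id
                      AND (s.cust_name <> d.cust_name OR s.cust_addr <> d.cust_addr));
    -- Step 2: insert a fresh current version for any customer without one.
    INSERT INTO dim_customer (cust_key, cust_id, cust_name, cust_addr,
                              eff_start_dt, eff_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_name, s.cust_addr,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                        WHERE d.cust_id = s.cust_id AND d.current_flag = 'Y');
    COMMIT;
    EXIT
    EOF

A Type 1 load differs only in that step 1 becomes an in-place UPDATE of the changed attributes, with no history rows kept.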

Performed ETL and database code migrations across environments.

Developed ETL code according to the business logic defined in the Functional Design Document.

Environment: Informatica PowerCenter 10.1.1, Oracle 12c, SQL Developer, TOAD, DB2 mainframe, Autosys, Visio, MS SQL Server 2012, Linux (Red Hat 6), UNIX (AIX 6.2), Windows 7, JIRA, Harvest.

BROADRIDGE FINANCIAL SOLUTIONS, LONG ISLAND, NY APR 2013 TO MAY 2017

ETL Lead

Responsibilities:

Interacted with our customers and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build Data Warehouse.

Participated in reviewing requirements during analysis, ensuring that sufficient detail was available going into the design phase.

Installed and configured PowerCenter 9.0.1, 9.1.0, and 9.6.1 on the UNIX platform.

Installed and configured Informatica PowerExchange CDC 9.1.0 and 9.0.1 for Oracle on the UNIX platform.

Performed Informatica upgrades from 8.6.1 to 9.0.1, 9.1.0, and 9.6.1.

Installed and configured Metadata Manager Service, Data Analyzer and Web Service Hub.

Installed and configured the following IDQ 9.0 components on AIX UNIX servers: DQ Content Basic, DQ Content Accelerator, and DQ Content IMO.

Configured OS Profiles and LDAP Authentication.

Installed and configured Informatica Support Console 1.5 to proactively monitor the environment.

Extensively worked on UNIX shell scripts for automation (auto-restart of servers, disk space utilization, Informatica server monitoring, and UNIX file system maintenance and cleanup), including scripts using Informatica command-line utilities.
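
For illustration, a minimal monitoring-style sketch using the pmcmd command-line utility (service, domain, folder, and workflow names are hypothetical):

    #!/bin/ksh
    # Start an Informatica workflow via pmcmd and alert on failure.
    pmcmd startworkflow -sv is_dev -d dom_dev -u "$INFA_USER" -p "$INFA_PASS" \
          -f DW_LOADS -wait wf_daily_load
    if [ $? -ne 0 ]; then
        echo "wf_daily_load failed on $(date)" |
            mailx -s "Informatica job failure" etl_support@example.com
        exit 1
    fi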

Creation and maintenance of Informatica users and privileges.

Migrated Informatica mappings, sessions, and workflows from Dev and QA to Prod environments.

Created Groups, roles, privileges and assigned them to each user group.

Wrote SQL queries against the repository DB to find deviations from the company's ETL standards in objects created by users, such as sources, targets, transformations, log files, mappings, sessions, and workflows.

Ensured that all support requests were properly approved, documented, and communicated using the MQC tool; documented common issues and resolution procedures.

Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Translated high-level design specifications into simple ETL coding and mapping standards.

Designed and customized data models for a data warehouse supporting real-time data from multiple sources.

Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.

Created mapping documents to outline data flow from sources to targets.

Involved in dimensional modeling (star schema) of the data warehouse and used Erwin to design the business processes, dimensions, and measured facts.

Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.

Maintained stored definitions, transformation rules, and target definitions using Informatica Repository Manager.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Developed mapping parameters and variables to support SQL override.

Created mapplets to use them in different mappings.

Developed mappings to load into staging tables and then to Dimensions and Facts.

Used existing ETL standards to develop these mappings.

Worked on different workflow tasks such as Session, Event-Raise, Event-Wait, Decision, Email, Command, Worklet, Assignment, and Timer, as well as workflow scheduling.

Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.

Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.

Extensively worked on reading XML data and creating XML files.

Worked on reading and generating delimited and fixed-width flat files.

Worked on the Informatica versioned repository with the object check-in and check-out feature.

Used Debugger extensively to validate the mappings and gain troubleshooting information about data and error conditions.

Worked on CDC data using mapping variables.

Provided guidance to less experienced personnel.

Conducted quality assurance activities such as peer reviews.

Participated in the business analysis process and the development of ETL requirements specifications.

Wrote complex SQL queries for validating the data against different kinds of reports.

Worked with Excel Pivot tables.

Performed data management projects and fulfilled ad-hoc requests according to user specifications, utilizing data management software and tools such as Excel and SQL.

Performed extensive data validation by writing complex SQL queries; involved in back-end testing and resolving data quality issues.

Used cron jobs to schedule UNIX scripts and PL/SQL programs.

Involved in PL/SQL code review and modification for the development of new requirements.

Developed Korn shell scripts to kick off back-end PL/SQL and SQL programs.

Built from scratch and maintained PL/SQL scripts, indexes, and complex queries for data analysis and extraction.

Worked on Data Validation Option (DVO) architecture and functionality.

Worked with production support systems that required immediate attention; developed, executed, and maintained appropriate ETL development best practices and procedures.

Assisted in the development of test plans for assigned projects.

Monitored and tuned ETL processes for performance improvements; identified, researched, and resolved data warehouse load issues.

Involved in unit testing of the mapping and SQL code.

Wrote BTEQ scripts to transform data.

Wrote FastExport scripts to export data.

Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML and DDL.

Constructed Korn shell driver routines (wrote, tested, and implemented UNIX scripts).

Wrote views based on user and/or reporting requirements.

Wrote Teradata Macros and used various Teradata analytic functions.
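
A minimal Korn shell driver sketch combining a BTEQ transform step with a macro call (the TDPID, credentials, and object names are hypothetical):

    #!/bin/ksh
    # Run a BTEQ transform step, then execute a summary-refresh macro.
    bteq <<'EOF' > bteq_load.log 2>&1
    .LOGON tdprod/etl_user,etl_pass;
    INSERT INTO dw_db.fact_orders
    SELECT order_id, cust_id, order_amt, order_dt
    FROM   stg_db.stg_orders
    WHERE  order_dt = CURRENT_DATE - 1;
    .IF ERRORCODE <> 0 THEN .QUIT 1;
    EXEC dw_db.refresh_order_summary;   /* macro wrapping the summary rebuild */
    .QUIT 0;
    EOF
    rc=$?
    [ $rc -ne 0 ] && { print "BTEQ step failed (rc=$rc)"; exit $rc; }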

Involved in migration projects moving data from Oracle/DB2 data warehouses to Teradata.

Worked with Informatica Cloud to create source/target connections and to monitor and synchronize data in SFDC.

Worked with Informatica Cloud to create source and target objects and develop source-to-target mappings.

Involved in analyzing, defining, and documenting data requirements by interacting with the client and Salesforce team for the Salesforce objects.

Involved in extracting, transforming, and loading Accounts, Contracts, Reservations, and Owner Interactions tables from various source systems to Salesforce.com, as well as a reverse data feed from Salesforce for CRM telesales.

Assigned development work to offshore developers, guiding them in implementing logic and troubleshooting the issues they encountered.

Developed mappings to load data in slowly changing dimensions.

Involved in performance tuning of source & target, mappings, sessions and workflows.

SAS metadata and ETL developer with extensive knowledge of building and implementing metadata repositories and metadata security.

Expertise in SAS Data Integration (DI) Studio, SAS Management Console, and the SAS BI suite.

Very proficient in Informatica Data Quality 9.5.1/9.6.1; extensively worked on the Informatica Analyst tool 9.5.1/9.6.1 in the initial phase.

Good knowledge of Informatica Data Quality admin tasks as well.

Very strong in implementing data profiling, creating scorecards, creating reference tables, and documenting data quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency.

Good skills in analyzing trend charts from scorecards to determine the thresholds to be considered in further development.

Good skills in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.

Very strong knowledge of Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer.

Automated the email process for shipments using JD Edwards 9.1.

Customized BPEL processes to add attributes that generate tasks based on conditions.

Used iPaaS for end-to-end data management.

Developed data replication/synchronization tasks in Informatica Cloud (cloud application integration).

Extensive experience handling “Proxy Materials” (loading data from mainframe systems to RDBMS databases; created XML files to feed cloud channels).

Extensive experience as a solution architect for business information systems, focusing on database architecture, data modeling, data analysis, software design, programming and application integration.

Used REST and SOAP web services.

Worked on BAPI and RFC functions exposed by SAP.

Used IDoc to exchange data between SAP systems.

Worked on ICS REST connector.

Worked on Informatica Cloud Real Time (ICRT) processes, where the service call step (Integration > Run Cloud Task) provides the functionality to call an Informatica Cloud Services (ICS) task.

Very good knowledge of data warehouse/data mart concepts; expertise in data modeling for OLAP and OLTP systems from design and analysis through implementation, including conceptual, logical, and physical data models.

Experience with relational and dimensional data modeling using Erwin; understanding of the Ralph Kimball and Bill Inmon methodologies for dimensional modeling and star and snowflake schemas. Thorough understanding of DW concepts such as facts, dimensions, surrogate keys, and the drill-down and drill-across approaches.

Working experience with Big Data and the Hadoop Distributed File System (HDFS). In-depth understanding of Hadoop architecture and components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce, with experience writing MapReduce programs in Apache Hadoop to analyze large data sets efficiently.

Hands-on experience with ecosystem tools such as Hive, Pig, Sqoop, MapReduce, Flume, and Oozie. Strong knowledge of Pig and Hive analytical functions and of extending Hive and Pig core functionality by writing custom UDFs. Created internal and external tables and partitioned and bucketed tables in Hive, as sketched below.
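
A short sketch of the partitioning, bucketing, and custom-UDF pattern, wrapped in a shell call to the Hive CLI (database, table, jar, and class names are hypothetical):

    #!/bin/ksh
    # Create a partitioned, bucketed Hive table and load it through a custom UDF.
    hive <<'EOF'
    SET hive.enforce.bucketing = true;

    CREATE TABLE IF NOT EXISTS dw.web_events (
        event_id   BIGINT,
        user_id    BIGINT,
        event_type STRING
    )
    PARTITIONED BY (event_dt STRING)
    CLUSTERED BY (user_id) INTO 32 BUCKETS
    STORED AS ORC;

    ADD JAR /opt/etl/udfs/custom_udfs.jar;
    CREATE TEMPORARY FUNCTION clean_str AS 'com.example.etl.CleanString';

    INSERT OVERWRITE TABLE dw.web_events PARTITION (event_dt = '2016-12-31')
    SELECT event_id, user_id, clean_str(event_type)
    FROM   staging.raw_events
    WHERE  event_dt = '2016-12-31';
    EOF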

Environment: Informatica PowerCenter 9.6.1, Informatica PowerExchange 9.5, Oracle 12c, SQL Developer, TOAD, DB2 mainframe, Autosys, Visio, MS SQL Server 2012, Linux (Red Hat 6), UNIX (AIX 6.2), Windows 7.

BLUE CROSS BLUE SHIELD OF MICHIGAN, DETROIT, MI JUL 2011 TO MAR 2013

SR. ETL DEVELOPER/SUPPORT ANALYST

Responsibilities:

Coordinated with various teams to gather requirements and created technical design documents.

Worked on data modeling; created HLD, LLD, and unit test cases.

Worked on complex mapping design, coding and testing.

Worked on Harvest SCM for version control, configuration, and code migration.

Created PowerExchange registrations for real-time workflows and data maps to read files from the mainframe system.

Created restart tokens, a hanging-workflow identification job, and automatic deadlock-restart jobs.

Worked on Teradata utilities like BTEQ, MLOAD, FLOAD and TPUMP.

Created and configured real-time mappings, sessions and workflows.

Worked on Informatica and SQL tuning and customization.

Provided 24x7 on-call support, which included monitoring morning and nightly jobs and making emergency production fixes.

Coordinated the Change Management process which involved driving the QA and production deployments.

Very proficient in Informatica Data Quality 9.1; extensively worked on the Informatica Analyst tool 9.5.1 in the initial phase.

Good knowledge of Informatica Data Quality admin tasks as well.

Very strong in implementing data profiling, creating scorecards, creating reference tables, and documenting data quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency.

Good skills in analyzing trend charts from scorecards to determine the thresholds to be considered in further development.

Good skills in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.

Very strong knowledge of Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer.

Involved in Informatica administration, including creating new users and groups, backing up the repository and domain, and handling various upgrades.

Involved in writing and modifying UNIX shell scripts.

Worked with the data quality team to fix production data issues and provide solutions to business users.

Involved in providing the solutions for production support job failures at Tier-2 level.

Coordinated with offshore team to support the existing ODS system jobs.

Coordinated with business users and across the development teams in implementing new business requests.

Used Toad for data quality verification and unit testing the data.

Used Teradata SQL Assistant to run SQL queries to check the data loaded to the target tables.

Environment: Informatica PowerCenter 9.1, Informatica PowerExchange 9.0.1, Oracle 11g, Teradata 14/13, DB2 mainframe, UNIX (Sun Solaris), UNIX (AIX 6.2), Harvest SCM, SQL*Loader, Autosys, Visio, Syncsort, TOAD for Oracle, SQL Developer, Windows XP.

T ROWE PRICE, BALTIMORE, MD AUG 2010 TO JUN 2011

SR. ETL DEVELOPER

Responsibilities:

Interacted with business analysts to gather the business requirements by attending regular meetings with the business community.

Prepared ETL detail design and unit testing document to outline the flow of data, for testing source/targets counts and field-to-field mappings.

Developed a naming convention document for various components of Informatica.

Analyzed the systems, met with end users and business units in order to define the requirements.

Conducted business requirements and source system analysis to arrive at optimal ETL designs.

Developed new ETL applications of simple to moderately complex scope, working from application systems design specifications and utilizing company standards, procedures, and techniques.

Partnered with project managers and development resources to assist with design changes and design decisions.

Used SQL Developer to run queries to verify the data with the existing GUI functionality.

Involved in creation of test plan and test cases for testing the enhanced functionality of MBOX.

Used cron jobs to schedule UNIX scripts and PL/SQL programs.

Involved in PL/SQL code review and modification for the development of new requirements.

Developed Korn shell scripts to kick off back-end PL/SQL and SQL programs.
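
A minimal sketch of such a wrapper (the connect string, package, and procedure names are hypothetical):

    #!/bin/ksh
    # Invoke a back-end PL/SQL procedure and propagate any Oracle error code.
    sqlplus -s "$ORA_CONN" <<'EOF'
    WHENEVER SQLERROR EXIT SQL.SQLCODE
    BEGIN
        etl_pkg.load_daily_positions(TRUNC(SYSDATE) - 1);  -- load yesterday's data
    END;
    /
    EXIT
    EOF
    [ $? -ne 0 ] && { print "load_daily_positions failed" >&2; exit 1; }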

Built from scratch and maintained PL/SQL scripts, indexes, and complex queries for data analysis and extraction.

Involved in documentation to describe program development, logic, coding, testing, and any changes and corrections.

Created reports for decision support.

Involved in analysis of the data in the development and test environment.

Worked with the analog team to test the existing application.

Logged defects in JIRA and monitored progress until fixes reached the UAT environment.

Worked on production support for real-time projects.

Very strong in implementing data profiling, creating scorecards, creating reference tables, and documenting data quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency.

Good skills in analyzing trend charts from scorecards to determine the thresholds to be considered in further development.

Good skills in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.

Very strong knowledge of Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer.

Used Informatica user-defined functions to reduce code dependency.

Handled versioning and dependencies in Informatica.

Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.

Performed ETL and database code migrations across environments.

Developed ETL code according to the business logic defined in the Functional Design Document.

Environment: Informatica PowerCenter 9.1/8.6, Informatica PowerExchange 9.0.1/8.6, Oracle 11g, Teradata 12, SQL Server 2000, Linux, ESP, Visio, Syncsort, SQL Developer, Windows XP, JIRA.

MEDIA BANK, HASBROUCK HEIGHTS, NJ JAN 2009 TO JUL 2010

ETL DEVELOPER

Responsibilities:

Provided production support by running the jobs and fixing the bugs.

Took repository backups at regular intervals depending on the amount of work done.

Aided in migrating the code across various environments.

Worked on test cases and test plans.

Documented the process for further maintenance and production support.

Involved in the design and documentation of Technical Design Documents for various interfaces such as Sales Organization, Field AOP, and Field Forecast.

Used Teradata SQL Assistant to run SQL queries to check the data loaded to the target tables.

Designed and developed mappings, mapplets and sessions from source to target database using Informatica Power Center.

Used simple to moderately complex debugging techniques to test ETL applications.

Developed Test Cases for testing the ETL mappings.

Conducted code reviews and led design reviews to ensure compliance.

Involved in migrating ETL/BTEQ code from the Development environment to QA and finally to Production.

Followed Informatica development standards and enforced best practices.

Performed QA work to verify the results provided by other teams.

Worked with various file formats, such as fixed-length and delimited.

Created BTEQ scripts for validating data quality, such as referential integrity and not-null checks.
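
A minimal sketch of such checks (table and column names are hypothetical); each count is expected to be zero:

    #!/bin/ksh
    # BTEQ data-quality checks: not-null keys and referential integrity.
    bteq <<'EOF' > dq_check.log 2>&1
    .LOGON tdprod/etl_user,etl_pass;
    SELECT COUNT(*) AS null_cust_keys
    FROM   dw_db.fact_sales
    WHERE  cust_key IS NULL;
    SELECT COUNT(*) AS orphan_fact_rows
    FROM   dw_db.fact_sales f
    LEFT JOIN dw_db.dim_customer c ON f.cust_key = c.cust_key
    WHERE  c.cust_key IS NULL;
    .QUIT 0;
    EOF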

Maintained proper communication with other teams.

Utilized existing UNIX shell scripts to run Informatica workflows through process control.

Scheduled sessions and batch processes (on demand, run on time, run only once) using Informatica Workflow Manager.

Identified BI reporting problems and suggested practical resolutions.

Participated in on-call support for the ODS and the data warehouse.

Involved in documentation to describe program development, logic, coding, testing, and any changes and corrections.

Environment: Informatica PowerCenter 8.1, Teradata V2R6, SQL, Windows XP, SQL Assistant, UNIX shell scripting.

SPECTRUM HEALTHCARE, PHOENIXVILLE, PA OCT 2008 TO DEC 2009

INFORMATICA DEVELOPER

Responsibilities:

Involved in creation of Logical Data Model for ETL mapping and the process flow diagrams.

Worked with SQL Developer to write SQL code for data manipulation.

Worked on the Informatica versioned repository with the object check-in and check-out feature.

Used Debugger extensively to validate the mappings and gain troubleshooting information about data and error conditions.

Provided guidance to less experienced personnel. Conducted quality assurance activities such as peer reviews.

Participated in the business analysis process and the development of ETL requirements specifications.

Worked with production support systems that required immediate attention.

Developed, executed, and maintained appropriate ETL development best practices and procedures.

Assisted in the development of test plans for assigned projects.

Monitored and tuned ETL processes for performance improvements; identified, researched, and resolved data warehouse load issues.

Involved in unit testing of the mapping and SQL code.

Developed mappings to load data in slowly changing dimensions.

Involved in performance tuning of source & target, mappings, sessions and workflows.

Worked with connected, unconnected lookups and reusable transformations and mapplets.

Utilized UNIX shell scripts to add headers to the flat-file targets.
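
A one-step sketch of the header trick (the file name and header layout are hypothetical):

    #!/bin/ksh
    # Prepend a header record to a flat-file target in place.
    target=/data/out/customers.dat
    { print "CUST_ID|CUST_NAME|CUST_ADDR"; cat "$target"; } > "$target.tmp" &&
        mv "$target.tmp" "$target"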

Involved in designing the star schema and populating the fact table and associated dimension tables.

Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.

Prepared estimates, tracked every task, and strictly adhered to the estimated deadlines.

Extensive knowledge of promoting packages (code) across development, test, preproduction, and production environments using tools such as the CA Harvest package.

Coordinated with the offshore team in India.

Environment: Oracle 10g, SQL Developer, SQL, Informatica PowerCenter 8.1, Sybase, Windows XP, Visio 2000, BusinessObjects XI R2, Linux, SQL Server 2000.

US SOFTWARE LTD, DHAKA, BANGLADESH JUL 2008 TO SEP 2008

ETL DEVELOPER/DATA ANALYST

Responsibilities:

Involved in requirements gathering and data gathering to support developers in handling the design specification.

Extracted data from various source systems such as Oracle, SQL Server, and flat files as per the requirements.

Created ETL design documents, test cases, and migration documents.

Developed complex mappings and unit test cases, and reviewed teammates' code.

Updated and remediated existing business intelligence scripts and Teradata applications to reflect changing business data requirements.

Extensive experience in writing and executing scripts for validating and testing sessions, checking data integrity between source and target databases, and generating reports.

Involved in loading data into Teradata from legacy systems and flat files using complex scripts.

Led the planning, analysis, design, implementation, maintenance, and control of the organization's server-class databases.

Created Teradata external loader connections such as MLoad Upsert, MLoad Update, and FastLoad while loading data into target tables in the Teradata database.
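
For illustration, a minimal FastLoad script sketch (the TDPID, credentials, and names are hypothetical; FastLoad requires an empty target table):

    #!/bin/ksh
    # Bulk-load a pipe-delimited file into an empty Teradata staging table.
    fastload <<'EOF' > fload.log 2>&1
    LOGON tdprod/etl_user,etl_pass;
    DATABASE stg_db;
    SET RECORD VARTEXT "|";
    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(60))
    FILE = /data/in/customer.dat;
    BEGIN LOADING stg_customer ERRORFILES stg_customer_e1, stg_customer_e2;
    INSERT INTO stg_customer (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);
    END LOADING;
    LOGOFF;
    EOF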

Involved in understanding business and data needs, analyzing multiple data sources, and documenting data mappings to meet those needs.

The mappings involved extensive use of transformations such as Aggregator, Filter, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator. Used the Debugger to test and fix mappings.

Tuned mappings using PowerCenter Designer, applying different logic to provide maximum efficiency and performance.

Environment: Informatica 8.1, Oracle 10g, Teradata V2R6, Teradata SQL Assistant, BTEQ, FLOAD, FEXP, MLOAD, FTP, Windows XP, Cognos, Visual Basic 6.

UNICORN SOFTWARE & SOLUTION, DHAKA, BANGLADESH DEC 2006 TO JUN 2008

DATABASE DEVELOPER/ETL DEVELOPER

Responsibilities:

As a team member, was involved in collecting all the requirements for building the database.

Involved in creating and designing the required tables, indexes, and constraints for all production activities.

Played a key role in designing the application and migrating existing data from relational sources to the corporate warehouse using Informatica PowerCenter.

Responsible for extracting, transforming, and loading data from Oracle and flat files and placing it into targets.

Developed various mappings using Mapping Designer and worked with Source Qualifier, Aggregator, connected and unconnected Lookup, Filter, and Sequence Generator transformations.

Involved in data modeling and the design of the data warehouse and data marts in a star schema methodology with conformed and granular dimensions and fact tables.

Used SQL and PL/SQL for database-related functionality.

Performed complex defect fixes in various environments such as UAT and SIT to ensure proper delivery of the developed jobs into the production environment.

Responsible for writing complex SQL queries and PL/SQL procedures to perform database operations according to business requirements.
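
A minimal sketch of such a procedure (the schema, names, and business rule are hypothetical):

    #!/bin/ksh
    # Create a PL/SQL procedure that applies a simple business-rule update.
    sqlplus -s "$ORA_CONN" <<'EOF'
    CREATE OR REPLACE PROCEDURE apply_credit_review (p_cust_id IN NUMBER) AS
    BEGIN
        UPDATE customers
           SET credit_limit = credit_limit * 1.10   -- hypothetical 10% uplift
         WHERE cust_id = p_cust_id
           AND risk_grade = 'A';
        COMMIT;
    END apply_credit_review;
    /
    EXIT
    EOF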

Worked closely with testers to identify medium- and high-severity defects that could affect downstream systems before the project's release, and fixed those defects before moving the jobs into production.

Documented all ETL-related work per the company's methodology.

Attended production implementation calls, coordinated with the build engineers during code migration, and was highly acknowledged for it.

Environment: Informatica PowerCenter 7.1, Oracle 10g, SQL Server 7.0, SQL, Sybase, PL/SQL, Erwin 3.5, UNIX, Windows XP, C.

INFOTECH DATA SOLUTIONS, DHAKA, BANGLADESH JUN 2006 TO NOV 2006

INFORMATICA DEVELOPER/UNIX ADMIN

Responsibilities:

Worked with different sources such as Oracle, SQL Server, and flat files.

Data obtained from the various sources was fed into the staging area in Teradata.

Enterprise-wide templates were created for handling SCD, Error


