ETL Informatica Developer

Location:
Richmond, Virginia, United States
Posted:
April 21, 2017

Nitish Reddy

aczw3w@r.postjobfree.com

732-***-****

Professional Summary

8+ years of total IT experience in the analysis, design, development, implementation, testing, and maintenance of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, and OLTP client/server applications.

Strong data warehousing and ETL experience using Informatica Power Center 9.5.1/9.1.x/9.0.x/8.6.x/8.1.x/7.1.x, Informatica MDM 9.6.1, IDQ Developer/Analyst 9.5.x/8.6.x, Warehouse Designer, Source Analyzer, Mapping Designer, Transformation Developer, and Mapplet Designer.

Excellent knowledge of RDBMS, OLAP, OLTP, data marts, and ODS systems.

Mapping experience using transformations such as Filter, Joiner, Router, Source Qualifier, Expression, Sequence Generator, Unconnected/Connected Lookup, Update Strategy, Aggregator, Sorter, Union, XML Generator, XML Parser, Transaction Control, Match, Key Generator, Association, Consolidation, Labeler, Parser, Address Validator, Classifier, Decision, Standardizer, Case Converter, and Merge.

Efficiently handled granularity, indexing, and partitioning in data warehouses.

Mapping experience with SCD types, Master and Transactional subject areas, and other dimensional models in ETL data warehouses.

Maintained warehouse metadata, naming standards, and warehouse standards for future application development.

Performed profiling and standardization, identified anomalies, created and applied rules, and built mappings/mapplets.

Proficient in integrating various data sources with multiple relational databases such as Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata, and flat files into the staging area, ODS, data warehouse, and data mart.

Good experience building Business Intelligence reports using Business Objects, Cognos, OBIEE, and Jaspersoft.

Extensive work experience as a DataStage Administrator and Developer, InfoSphere Administrator and Developer, Information Analyzer Administrator and Developer, Trillium Administrator and Developer, and Informatica Developer.

Performed end-to-end data lineage with IDQ while maintaining an excellent relationship with the end client.

Worked closely with the users to understand the current state of information availability in the enterprise and then identify future needs based on their analysis of business requirements, current state environments, gap analysis, and future state warehousing implementation.

Experience with Talend Enterprise Edition v5.5/6 for Big Data & Data Integration.

Experience in extracting, transforming, and loading data using Informatica Power Center from Oracle, DB2, Teradata, FTP, SQL Server 2008, flat files, and mainframes.

Expertise in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions.

Built data warehouses/data marts using Ascential DataStage 7.5.2/7.5.3 and IBM InfoSphere DataStage 8.1.0/8.1.2/8.5.

Expertise in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups and packages.

Expertise in creating Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties and Batch Group creation.

Performed gap analysis between the current state and future data warehouse environment identifying data gaps and quality issues and recommending potential solutions.

Worked on Talend integrated server packaging to move DataStage code into QA/Production.

Worked closely with the data warehouse development team to ensure user requirements and issues were addressed while controlling scope.

Data modeling experience using Star Schema/Snowflake modeling, FACT & Dimensions tables, Relational & Dimensional tables in perspective of Master and Transactional data.

Expertise in extended validation of report functionality developed in Cognos and Business Objects by writing complex SQL at the backend.

Good knowledge of databases including Teradata, Oracle 11g/10g/9.x/8.x/7.x, and MS SQL Server 2000/2005; SQL, PL/SQL, SQL*Plus, SQL*Loader, and Toad 7.3/8.x/9.1/10.3.3.0/11.5.

2+ years of Informatica Power Center administration on 8.x/9.x, including server setup, configuration, client installations, deployments, backups, repository management, server health and maintenance, performance monitoring and improvement, patching, connectivity to other databases, and setting up ODBC.

Experience in Performance tuning in Informatica Power Center.

Education:

Bachelor of Engineering, Jawaharlal Nehru Technological University

Technical Skills:

ETL Tools

Informatica Power Center 9.5.1/9.1.x/9.0.x/8.6.x/8.1.x/7.1.x (Source Analyzer, Designer, Server Manager, Workflow Monitor, Warehouse Designer, Mapplet Designer, Mapping Designer, Workflow Manager), Informatica MDM 9.6.1, OLAP, OLTP, IDQ/IDE, SQL*Plus, Informatica Data Quality, Informatica Data Profiler, Ab Initio.

Operating Systems

Windows XP/2000/NT/98/95, Windows Server 2003/07/10, UNIX, MS-DOS.

RDBMS

Oracle 11g/10g/9i/8i/8.0, MS SQL Server 2008/2005/2000/7.0/6.5

Data Modeling

Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, FACT, Dimensions), Entities, Attributes, ER Diagrams, ERWIN, ERStudio 7.1/6.1.

Data Base Tools

Oracle 11g/10g/9i/8i, Teradata 13.0, MS SQL Server 2008/2005/2000, Oracle SQL Developer 3.0.04, SQL*Loader, IBM Data Studio 2.2.1.0, EDW DB2.

Reporting Tools

Business Objects XI/6.x/5.x, OBIEE 10.x/9.x, PowerPlay and Transformer, Cognos 8.3 (Framework Manager, Query Studio, Report Studio, Analysis Studio).

Scripting

UNIX Shell Scripting.

Web Technology

HTML, JavaScript.

Languages

SQL, PL/SQL, C, C++, Java 1.2, XML.

Client: Brown Greer PLC, Richmond, VA Feb 2015 to Present

Role: Sr. Informatica ETL Developer

Responsibilities:

Understanding the requirements and data mapping documents.

Understanding the existing logic in PL/SQL stored procedures of different packages to implement ETL code in Informatica.

Created Informatica mappings for stage tables, ODS tables and core table loads.

Extensively worked with Informatica Power Center 9.5.

Used Tidal job scheduler to execute Informatica ETL jobs.

Created mappings by using Joiner, Expression, Aggregator, Router, Lookup, SQL transformation, Update Strategy and Sequence Generator.

Responsible for complete ETL Designs (Stage, ODS, EDW, Data Store, Data Mart) and involved in ETL Development using Informatica Power Center and Oracle.

Worked with various lookups: connected lookup, unconnected lookup, static cache, and dynamic cache lookups.

Extensively worked on IBM InfoSphere Information Server administration in both data integration and data warehousing projects.

Used the Automic tool as a scheduler to kick off jobs (shell scripts involved).

Extensively involved in ICD code testing.

Created DataStage jobs on demand for moving data from Sybase ASE (OLTP) to Greenplum and Sybase IQ for testing purposes.

Parameterized using mapping, session parameters, and workflow variables by defining them in parameter files.

Developed complex DataStage jobs according to the business requirements/mapping documents.

Knowledge of IBM InfoSphere FastTrack, which accelerates the transition of business requirements into data integration projects.

Involved in migrating DataStage projects and jobs from earlier versions to IBM InfoSphere 8.0.1.

Designed Talend ETL flows to load data into Hive tables.

Worked extensively on Erwin and ER Studio in several projects in both OLAP and OLTP applications.

Experience using data warehouse appliances such as Netezza for data warehousing, business intelligence, and error-free data integration.

Excellent knowledge, understanding, and hands-on experience with Business Intelligence tools such as Business Objects and Cognos.

Designed and developed approaches to acquire data from new sources like Mainframe (DB2), and AS400 (DB2).

Implemented SCD type 1 for ODS table loads.

Developed Talend jobs to load data into Hive tables and HDFS files.

Developed Talend jobs to integrate Hive tables with the Teradata system.

Analyzed sources, requirements, and the existing OLTP system, and identified required dimensions and facts from the database.

Analyzed session logs, session run properties, and workflow run properties.

Analyzed rejected rows using bad files.

Performed code reviews and unit testing.

Implemented performance tuning techniques.

Used the mapping debugger on data and error conditions to get troubleshooting information.

Created data visualization reports and dashboards using Tableau Desktop.

Environment: Informatica Power Center 9.5, Talend Enterprise Edition v6 for Big Data & Data Integration, Oracle 11g, Erwin, SQL Developer, Tidal, UNIX shell scripting, Mainframe, PuTTY, SQL, PL/SQL, Tableau Desktop, Windows 8.

Cisco Systems, San Francisco, California. Nov’ 2013 to Jan’ 2015

Role: Informatica ETL Developer

Responsibilities:

Understanding the requirements and data mapping documents.

Prepared high level mapping design documents from requirements.

Created mappings for Initial, Daily loads, Base table and functional table loads.

Created Project Expense Statements in all Data marts and other Financial Reports using SSRS.

Developed mappings for audit reconciliation of daily loads.

Worked with Informatica Power Center 9.5 to extract data from IBM mainframe DB2 sources into Teradata.

Used Director Client to validate, run, schedule and monitor the jobs that are run by IBM InfoSphere DataStage server.

Configured various third-party tools and technologies with IBM InfoSphere Information Server software.

Used ETL methodologies and best practices to create Talend ETL jobs.

Created mappings by using Joiner, Expression, Aggregator, Data masking, Router, Lookup, SQL transformation, Update Strategy and Sequence Generator.

Analyzed sources, requirements, and the existing OLTP system, and identified required hierarchies for dimensions and facts from the database.

Worked with various lookups: connected lookup, unconnected lookup, static cache, and dynamic cache lookups.

Parameterized using mapping, session parameters, and workflow variables by defining them in parameter files.

Used DataStage Designer to develop parallel jobs to extract, cleanse, transform, integrate and load data into Data Warehouse.

Designed the ETL processes using Informatica to load data from Mainframe DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Teradata warehouse database.

Used ILM as an approach between data and storage management.

Experience using Visio and Erwin design tools, for deployment processes and ad hoc design, to create Star and Snowflake schemas.

Extensive ETL testing experience using Informatica 9.1/8.6.1 (Power Center/Power Mart: Designer, Workflow Manager, Workflow Monitor, and Server Manager), Teradata, and Business Objects.

Analyzed session logs, session run properties, and workflow run properties.

Analyzed source data quality using Talend Data Quality.

Analyzed rejected rows using bad files.

Performed code reviews and unit testing.

Deployed Informatica objects via deployment groups, and non-Informatica objects using Eclipse and the UNIX deployment process.

Used UNIX shell scripting to run Informatica jobs from the ESP workload scheduler.

Implemented performance tuning techniques.

Used the mapping debugger on data and error conditions to get troubleshooting information.

Led a team of offshore ETL developers.

Environment: Informatica Power Center 9.5, Talend 5.5/5.0, Teradata, DB2, UNIX shell scripting, Erwin, Mainframes, Teradata SQL Assistant, DbVisualizer, PuTTY, ESP Workload Automation, Eclipse, WinSCP, FileZilla, Windows 8.

UnitedHealth Group, Trenton, NJ Aug 2012 to Oct 2013

Role: Informatica Developer

Responsibilities:

Designing the source to target mappings that contain the Business rules and data cleansing during the extract, transform and load process.

Responsible for converting Functional Requirements into Technical Specifications and production support.

Worked on Designer tools: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.

Created shortcuts for sources and targets, and created transformations according to business logic.

Developed and designed Data marts extensively using Star Schema.

Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.

Utilized IDQ for data profiling, standardization and structuring the data.

Tuned the Informatica mappings for optimal load performance.

Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.

Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.

Developed Oracle views to identify incremental changes for full extract data sources.

Developed the automated and scheduled load processes using Tidal scheduler.

Involved in migration of mappings and sessions from development repository to production repository.

Developed mappings for Type 1, Type 2 & Type 3 Slowly Changing Dimension (SCD) using Informatica Power Center.

Responsible for Unit testing and Integration testing of mappings and workflows.

Performed data integration from Informatica cloud into SAP & Salesforce cloud.

Proficient in developing PL/SQL Stored Procedures, Packages and triggers to implement Business logic.

Analyzed, documented, and maintained test results and test logs.

Exposure to partitioning for loading large volumes of data.

Scheduled Informatica workflows using Informatica Scheduler to run at regular intervals.

Developed Reports / Dashboards with different Analytics Views (Drill-Down, Pivot Table, Chart, Column Selector, and Tabular with global and local Filters) using OBIEE.

Wrote shell scripts to convert incoming Excel flat files from .xls to .csv for import into Informatica.

Worked closely with the reporting team and generated reports using Business Objects.

Wrote shell scripts to FTP files to the mainframe region.

Actively participated in Agile development, attending scrum (stand-up) meetings.

Accepted inbound transactions from multiple sources using FACETS.

Supported integrated EDI batch processing and real-time EDI using FACETS.

Environment: Informatica PowerCenter 9.5/9.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica IDQ, SQL Query Analyzer 8.0, Oracle 10g/11g, SQL Developer, Data Loader, OBIEE, BI Publisher, Erwin, UNIX shell scripting, PuTTY, FACETS, and Business Objects.

Bajaj Allianz Insurance, India Feb 2010 to July 2012

Role: Informatica Developer

Responsibilities:

Extracted data from various heterogeneous sources: DB2 UDB, Oracle, and flat files.

Wrote medium-to-complex queries related to the data model and tuned them for optimal performance.

Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Extensively used Informatica transformations such as Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, and Sorter, and all transformation properties.

Solid Expertise in using both connected and unconnected Lookup Transformations.

Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.

Used Debugger wizard to troubleshoot data and error conditions.

Responsible for Best Practices like naming conventions, and Performance Tuning.

Developed Reusable Transformations and Reusable Mapplets.

Extensively used Various Data Cleansing and Data Conversion Functions in various Transformations.

Involved in Migrating from Business Objects XI R2 to BO XI 3.0.

Involved in developing, designing, and maintaining new and existing universes by creating joins and cardinalities; responsible for creating the universe with classes, objects, and condition objects.

Involved in creating SAP Business Objects XI R3.0 Desktop Intelligence and Web Intelligence reports with functionality such as dynamic cascading prompts, drill-down, linked reports, charts and graphs, ranking, and alerts based on user feedback.

Used multi-dimensional analysis (Slice & Dice and Drill Functions) to organize the data along a combination of Dimensions and Drill-down Hierarchies giving the ability to the end-users to view the data.

Created graphical representations in reports such as bar charts, 3D charts, and pie charts.

Created Web Intelligence reports with multiple data providers and synchronized the data using merge dimension option in SAP Business Objects XI R3.0.

Developed Error Logging and Auditing strategies for the ETL Scripts.

Responsible for migrating code using deployment groups across various instances.

Optimized SQL queries for better performance.

Responsible for Unit Testing of Mappings and Workflows.

Developed Slowly Changing Dimension Mappings for Type 1 SCD, Type 2 SCD and fact implementation.

Environment: Informatica Power Center 8.5 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Mapplet Designer, Workflow Designer, Task Developer, Worklet Designer), SAP Business Objects XI R2, Toad 9.7, Erwin 4.2, DB2 UDB, DB2 LUW, Oracle 10g, ODI 11g Administration, Access, flat files, SQL/PL-SQL, UNIX shell scripting.

Bharat Heavy Electricals Limited (BHEL)

Role: ETL Developer June 2008 to Jan 2010

Responsibilities:

Analyzed the logical model of the databases and normalized it where necessary.

Involved in identification of the fact and dimension tables.

Extensively used Informatica-Power Center 7.1.3 for extracting, transforming and loading into different databases.

Wrote PL/SQL stored procedures and triggers for implementing business rules and transformations.

Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.

Created Source and Target Definitions in the repository using Informatica Designer – Source Analyzer and Warehouse Designer.

Worked extensively on different types of transformations like Source qualifier, Expression, Filter, Aggregator, Rank, Lookup, Stored procedure, Sequence generator.

Used Mapping Designer to create mappings.

Replicated operational tables into staging tables, to transform and load data into the enterprise data warehouse using Informatica.

Created and scheduled worklets and configured email notifications. Set up workflows to schedule loads at the required frequency using Power Center Workflow Manager, and generated completion messages and status reports using Workflow Manager.

Involved in Performance Tuning at various levels including Target, Source, Mapping, and Session for large data files.

Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.

Documented Data Mappings/ Transformations as per the business requirement.

Performed testing, knowledge transfer and mentored other team members.

Environment: Informatica Power Center 7.1, Oracle 9i, PL/SQL, Toad, SQL*Plus, Erwin, Tivoli.


