
Informatica developer

Edison, New Jersey, United States
October 20, 2016

Sandeep Golla

Informatica - Developer

PH: 510-***-**** Email:


Overall 3 years of experience in information technology and data warehousing: design, mapping, extraction, migration, data conversion, data validation and ETL development.

Extensively worked on the Extraction, Transformation and Loading (ETL) process using Informatica 9.x/8.x/7.x/6.x, Informatica Data Quality (IDQ) 9.x/8.x and Informatica Big Data Edition (BDE) 9.x, and on developing user-centric data models and queries.

Strong data warehousing ETL experience using the Informatica PowerCenter client tools – Designer, Source Analyzer, Target Designer, Transformation Developer, Mapping and Mapplet Designer – along with the Workflow Manager and Workflow Monitor tools.

Strong experience in dimensional modeling using Star and Snowflake schemas, and in identifying facts and dimensions.

Designed and developed business rules to generate consolidated data using Informatica data quality tool.

Utilized Informatica IDQ to complete initial data profiling and to match and remove duplicate data.

Worked on data profiling using IDQ (Informatica Data Quality) to examine different patterns of source data. Proficient in developing IDQ transformations such as Parser, Classifier, Standardizer and Decision.

Experience creating profiles, rules and scorecards for data profiling and quality using IDQ.

Involved in cleaning the data using Informatica Data Quality.

Provided high-quality detailed analysis, design and build processes to support specific projects with regard to Data Quality profiling and DQ rule coding.

Played a key role in creating and managing the Data Quality process, and in the development of DQ rules, profiles, profile models and scorecards for various business requirements.

Actively involved in designing, developing and testing Data Quality rules, profile models, and column and primary-key profiling for various data sources to determine root causes and ensure correction of data quality issues arising from technical or business processes.

Built reusable mapplets and coordinated development within the data quality tool.

Worked with complex IDQ transformations, using the Address Validator, Parser, and Match and Merge transformations to cleanse data.

Profiled source data using the IDQ tool to understand source-system data representation, formats and data gaps. Created an exception-handling process and worked on best practices and standards for exception-handling routines.

Worked with technical and business analysts on DQ requirements, business analysis and project coordination.

Expertise in working with relational databases such as Oracle, SQL Server and Netezza.

Excellent knowledge of the data warehouse development life cycle and the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, user acceptance testing and implementation of slowly changing dimensions.

Strong understanding of the principles of DW using fact tables, dimension tables and star schema modeling.

Expert in writing critical ETL logic that extracts data from different data feeds, then transforms and loads it into the data warehouse using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy and Joiner; the resulting data is consumed by business users in the form of reports.

Expert in designing and developing backend PL/SQL packages, functions and triggers in the database layer.

Extensive experience in Requirement Gathering, Customization, Maintenance and Implementation of various Business Applications.

Hands-on experience in resolving errors using the Debugger and session and workflow log files.

Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.

Efficient in troubleshooting, performance tuning and optimization of ETL mappings.

Good knowledge of the Teradata environment, which is capable of storing terabytes of data.

Excellent knowledge of SQL Assistant, Teradata DBQL, Teradata Manager and the Teradata MultiLoad utility; capable of using these Teradata utilities with Informatica to load huge volumes of data into tables without excessive time or system-resource consumption, and of analyzing Teradata system performance.

Able to develop loads using Teradata MultiLoad and FastLoad.

Expert in developing complex test cases and performing unit and integration testing.

Experienced in gathering specifications, design and development, and smooth integration of applications.

Performed activities including execution of test plans and design of the exception-handling strategy. Completed documentation for detailed work plans and mapping documents.

Analytical and technical aptitude with the ability to solve complex problems. Can work well in a team as well as independently under minimal supervision.

Technical Skills:

ETL Tools:

Informatica PowerCenter 9.x/8.x/7.x/6.x (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager, Workflow Monitor, Worklets),

Informatica Data Quality (IDQ) 9.x/8.x

Data Modeling:

Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling

Front-end tools:

TOAD 7.x, SQL Navigator 7.x, WinSQL


Databases:

Oracle 9i/8i, IBM DB2 UDB 8.0/7.0, Teradata, MS SQL Server 2008/2010/2012, MS Access, IBM Netezza.


Languages:

SQL, PL/SQL, Transact-SQL, UNIX shell scripting.


Operating Systems:

Windows 95/98/2000/XP, Windows NT 4.0, IBM AIX 4.3, Sun Solaris, UNIX


Education:

Master of Science in Computer Science (M.S.C.S.), U.S.A.

Professional Experience:

Coach, Inc., NY Jan ’16 – Till date

Sr. Informatica ETL / BDE Developer

Description: Coach, Inc. is an American luxury fashion company based in New York City. The company is known for handbags, accessories and gifts for women and men.

Worked as an ETL Informatica Developer extracting data from various source systems (Oracle, SQL Server, SAP, Salesforce and flat files) and loading it into the data warehouse, helping top executives make decisions, analyze various metrics and understand the performance of the organization.


Worked with Data Warehouse analysts for requirement gathering, business analysis, and translated the business requirements into Data Requirements specifications to build the Enterprise Data Warehouse and Data Model.

Extensively worked with Informatica PowerCenter 9.6 Designer client tools such as Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.

Worked on the source-to-target mapping document to make sure that all transformations were captured correctly and documented.

Extensively worked on Informatica Power Center 9.6 Designer.

Loaded data from different database sources, flat files and XML files into the Netezza database.

Utilized Informatica Big Data Edition (BDE).

Extracted data from Hadoop, modified it according to business requirements and loaded it back into Hadoop.

Imported mappings and rules into PowerCenter for scheduling using Tidal.

Debugged errors using Hadoop logs when mappings ran in Hadoop mode, and Informatica logs when mappings ran in Native mode.

Created mappings and mapplets according to business requirements using Informatica Big Data Edition, deployed them as applications and exported them to PowerCenter for scheduling.

Worked on the Netezza database, using the Netezza bulk reader and bulk writer to handle bulk data reading and loading.

Extensively used pushdown optimization for fast bulk data loading.

Troubleshot the ETL processes developed for conversions and implemented various techniques to enhance performance.

Prepared Key metrics and dimensions document. In this process, analyzed the System Requirement document and identified areas which were missed by Business and IT team.

Modifying the UNIX scripts as a part of Upgrade and making sure they point to the correct directories.

Modified existing and developed new ETL programs, transformations, indexes, data staging areas, summary tables, and data quality routine based upon redesign activities.

Performed weekly data loads to accommodate sudden data model changes under Agile methodology and to address bugs identified by the testing team.

Used Informatica debugger to test the data flow and fix the mappings.

Moved mappings, sessions, workflows and mapplets from one environment to another.

Formulated and implemented Historical load strategy from multiple data sources into the data warehouse. Data concurrency, Prioritization, comprehensiveness, completeness and minimal impact to existing users are taken as key attributes for Historical data load strategy.

Extensively used Tidal for scheduling the Informatica jobs.

Environment: Informatica 9.6 HotFix 3, Informatica Big Data Edition, PL/SQL, Linux shell scripting, Oracle, Netezza, WinSQL, Aginity, SQL Developer, Tidal.

First American Financial Corporation, Santa Ana, CA Jun ’15 – Dec ’15

Informatica Developer/IDQ Developer

Description: First American Financial Corporation is a United States financial services company and a leading provider of title insurance and settlement services to the real estate and mortgage industries. First American offers its products and services directly and through its agents and partners throughout the United States and in more than 60 countries.

The company's core business lines include title insurance and closing/settlement services; title plant management services; title and other real property records and images; valuation products and services; home warranty products; property and casualty insurance; and banking, trust and investment advisory services. The project mainly deals with the collection of data from different title companies at various locations. This data is loaded into the enterprise data warehouse and individual data marts, and reports are generated as per the requirements.


Involved in full project life cycle - from analysis to production implementation and support with emphasis on identifying the source and source data validation, developing particular logic and transformation as per the requirement and creating mappings and loading the data into databases.

Based on the requirements created Functional design documents and Technical design specification documents for ETL Process.

Designed, implemented and documented ETL processes and projects completely based on best data warehousing practices and standards.

Involved in performance tuning and fixed bottlenecks for processes already running in production, reducing load time by 3–4 hours for each process.

Designed and developed processes to handle loading high volumes of data within a given load window or load interval.

Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.

Worked with Informatica tools IDQ (Developer) with various data profiling techniques to cleanse, match/remove duplicate data.

Enabled business users to create and modify Data Quality rules against source data from the ingest layer of the source.

Utilized the Informatica Data quality (IDQ) to identify and merge customers and addresses.

Created complex mappings using Aggregator, Expression, Joiner transformations including complex lookups, Stored Procedures, Update Strategy and others.

Extensively used Power Exchange 8.1 for CDC.

Designed and developed table structures, stored procedures, and functions to implement business rules.

Extensively used XML transformation to generate target XML files.

Extracted data from various data sources such as Oracle, flat files, XML, SAP and Salesforce.

Extensively developed shell scripts for Informatica Pre-Session, Post-Session Scripts.

Extensively involved in migration of Informatica Objects, Database objects from one environment to another.

Developed UNIX shell scripts and scheduled them using scheduling tools.

Tested data integrity among various sources, targets and various performance related issues.

Provided the ability to perform data profiling on a specific data domain to identify data issues, and worked with source data to correct them.

Provided the business with the ability to exclude data items from data quality business rules for a period of time or for a specific study.

Environment: Informatica PowerCenter 9/8.6, Power Exchange 8.1, Oracle 9i, PL/SQL, SAP, Toad, SQL, UNIX and Linux servers, shell scripting.

Posidex Technologies LTD, India Jan’13 – Dec’13

Informatica Developer

Description: Posidex Technologies develops software products and solutions and offers services in the domain of Entity Resolution and Analytics. It helps enterprises in their operations, decision making and planning during customer data integration, data quality management and master data management, providing business solutions and high-end technology-based services to its customer base in the USA, Europe, the Nordics and Asia with on-site, off-site and off-shore development models. It has delivered many large-scale, enterprise-class solutions in the areas of E-Business, Knowledge Management, Business Intelligence, etc.


Worked with the business analysts and DBA to gather business requirements to be translated into design considerations.

Involved in dimensional modeling of the data warehouse to design the business process, grain, dimensions and facts.

Identified and tracked the slowly changing dimensions, heterogeneous sources and determined the hierarchies in dimensions.

Created cubes, dimensions, hierarchies, mappings and mapplets using Informatica PowerCenter Designer.

Resolved production issues as part of production support.

Used various transformations like lookup, update strategy, router, filter, sequence generator, source qualifier on data extracted according to the business rules and technical specifications.

Used task developer in the Workflow manager to define sessions.

Created reusable worklets and mapplets.

Involved in data cleansing and data profiling.

Developed schedules to automate the update process and Informatica Batches/Sessions.

Monitored sessions that were scheduled, running, completed or failed.

Debugged mappings for failed sessions.

Worked on database connections, SQL joins, aliases, views, aggregate conditions.

Implemented data integrity constraints like referential integrity using primary-key and foreign-keys relationships.

Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.

Wrote PL/SQL procedures for processing business logic in the database.

Involved in production support.

Tuned the matching parameters based on test results.

Implemented and supported the Business Intelligence environment and User Interface.

Environment: Informatica Power Center 6.1/7.1, Oracle 8i, SQL Server, SQL/PLSQL, UNIX.

Net Magic IT Services Pvt. Ltd, Mumbai Jun’12 – Dec ’12

SQL- Developer

Description: Netmagic, an NTT Communications company, is India’s leading managed hosting and cloud service provider, with 9 carrier-neutral, state-of-the-art data centers, serving more than 1,500 enterprises globally. The client provides a broad range of services like life, auto, home and business insurance. The project I was involved in consisted of developing a Claims Management System, which provides the technology that assists claims professionals in administering claims practices in a timely and effective manner. The data warehouse was designed for analyzing all claims across all lines of business.


Designed and developed database views and stored procedures.

Maintained user roles and privileges at the database level. It involved enrolling users, maintaining the system security, controlling and monitoring user access.

Used SQL to extract the data from the database.

Worked with Development Life Cycle teams.

Wrote database triggers to automatically update tables and views.

Designed and developed forms and reports.

Responsible for system analysis, design, testing and documentation.

Responsible for developing PL/SQL procedures, packages, triggers and other database objects.

Responsible for developing user interfaces using Visual Basic 6.0.

Implemented integrity constraints on database tables.

Responsible for performance-tuning activities such as optimizing SQL queries, analyzing explain plans and creating indexes.

Worked on database structure changes, table/index sizing, transaction monitoring, data conversion, and loading data into Oracle tables using SQL*Loader.

Environment: Oracle, SQL, PL/SQL, Windows, SQL Navigator, UNIX
