

Irving, Texas, United States
January 28, 2018



RAMESH B 510-***-****


9+ years of IT experience in Data Warehousing and Business Intelligence, with emphasis on business requirements analysis, application design, development, testing, implementation and maintenance of Data Warehouse and Data Mart systems.

Expertise in the development and design of ETL methodology for supporting data transformations and processing in a corporate-wide ETL solution using Informatica PowerCenter 9.6.1/9.5/9.1/8.6/8.5/8.1.1/7.1.3/7.1.1/7.0 (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer).

Experience in creation of reusable objects Mapplets & Transformations using Mapplet Designer and Transformation Developer

Knowledge in designing and developing Data marts, Data warehouse using multi-dimensional Models such as Snow Flake Schema and Star Schema while implementing Decision Support Systems. Experience in working with FACT & Dimensions tables.

Experience with Oracle 11g/10g/9i/8i and Teradata 14/13, querying data using SQL, T-SQL, DB2 and BTEQ.

Working knowledge of Business Objects XI R2; maintained various reports using Business Objects.

Worked on Multiple Data integration projects using Informatica Power center.

Working knowledge of Oracle and PL/SQL, writing stored procedures, functions and triggers.

Extensively worked with Data Extraction, Transformation, and Loading (ETL) tools to extract data from various sources including Oracle, Teradata and flat files.

Experience in performance tuning of sources, targets, mappings and sessions using session partitioning and pushdown optimization.

Worked with Oracle Stored Procedures and experienced in loading data into Data Warehouse/Data Marts using Informatica, SQL *Loader. Extensive Expertise with error handling and Batch processing

Good understanding of Data Models (Dimensional & Transactional), Conceptual/Logical & Physical Data Models, DWH Concepts, ER diagrams, Data Flow Diagrams/Process Diagrams.

Strong knowledge of Software Development Life Cycle (SDLC) with industry standard methodologies like Waterfall and Agile including Requirement analysis, Design, Development, Testing, Support and Implementation.

Experience with TOAD, SQL Developer to test, modify and analyze data, create indexes, and compare data from different schemas.

Expertise in Master Data Management (MDM) concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.

Experienced in several facets of MDM implementations, including Data Profiling, Data Extraction, Data Validation, Data Cleansing, Data Match, Data Load, Data Migration and Trust Score validation.

Experienced in UNIX Shell scripting and PL/SQL Functions & Stored Procedures.

Experienced working with job schedulers Autosys, WLM.

Working knowledge of DataStage; Business Intelligence experience using Cognos and working knowledge of Business Objects XI R2.

Worked in 24/7 production support of ETL and BI applications for large Insurance & Banking data warehouses for monitoring, troubleshooting, resolving issues.

Excellent soft skills and able to interact effectively with other members of Business Engineering, Quality Assurance, Users and other teams involved with the Software Development Life cycle.

Modularized complex systems by integrating vendor services independent of platform and technology.


ETL Tools

Informatica PowerCenter 9.6.1/9.1/8.6.1/8.1/7.1.2/6.2, SQL Server SSIS, WinSCP, PuTTY.

Data Modeling

Star Schema, Snowflake Schema, Fact Tables, Dimension Tables, Physical and Logical Modeling, Normalization and Denormalization.


Databases

Teradata V2R12/V2R6, Oracle 11g/10g/9i/8i/8.0/7.x, MS SQL Server 2008/2005, DB2 UDB, MS Access 2000, Sybase


Tools

Toad, SQL Navigator, Crystal Reports, TSA


Operating Systems

MS Windows 2008/2005, LINUX, UNIX

Job Scheduling

Autosys, Shell Scripting, Tidal

Client: Huntington Bank June 2016 – Present

Role: Informatica /Teradata Developer Columbus, Ohio

Project Description: The client is one of the top financial institutions in the world. The objective of the project was to create and enhance the data warehouse and accommodate the data to fetch reports per US norms. In the first phase, only North American data is being considered; subsequent phases will include South America and other regions.

Roles & Responsibilities:

Provided Guidance in Performance tuning of Informatica mappings and DB2 SQL

Prepared the Code Review Checklist, Standards document and Reusable components to be used across multiple projects.

Analyzed the data and provided resolutions by writing complex analytical SQL in cases of data discrepancies.

Created Informatica mappings to load data, using transformations such as Stored Procedure, Connected and Unconnected Lookup, Source Qualifier, Expression, Sorter, Aggregator, Joiner, Filter, Sequence Generator, Router, Update Strategy and XML transformations.

Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings, known as ‘Mapplets’ using Informatica Designer.

Extensively worked on UNIX Shell scripting.

Performed all kinds of MDM Hub jobs - Stage Jobs, Load Jobs, Match and Merge Jobs using the Batch Viewer and Automation Processes.

Expertise in Informatica MDM Hub Match and Merge rules, Batch Jobs and Batch Groups, Match Columns and Match Rule Sets, defining all suitable properties of Fuzzy and Exact Match concepts.

Experienced in several facets of MDM implementations, including Data Profiling, Data Extraction, Data Validation, Data Cleansing, Data Match, Data Load, Data Migration and Trust Score validation.

Involved in performance tuning (both database and Informatica), thereby decreasing the load time.

Implemented Session Partitioning and used Debugger to analyse the mappings and improve performance.

Worked with business SMEs on developing the business rules for cleansing. Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.

Led the system architecture effort for the Huntington and IC Enterprise Resource Planning (ERP) initiative. Developed, analyzed and provided recommendations on alternative strategies to determine the optimal path for architectural solutions.

Responsible for creating, designing and implementing a large-scale data warehouse to support the CIFA Intelligence Community.

Involved in different interface phases of the project; its dynamic, self-correcting scanning configuration improved flexibility and scalability, because multiple services could easily be developed from the integration of existing applications.

Experienced in Performance Tuning in SSIS packages by using Row Transformations, Block and Unblock Transformations.

Experienced in providing Logging, Error handling by using Event Handler, and Custom Logging for SSIS Packages.

Presented Data Cleansing Results and IDQ plans results to the OpCos SMEs.

Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.

Documented Cleansing Rules discovered from data cleansing and profiling.

Implemented optimization techniques such as Informatica pushdown optimization.

Involved in Design review, code review, Performance analysis

Updated the schedule of the jobs to maintain business functionality in Tidal.

Created SQL for data migrations and handled Re-Org of DB2 tables.

Worked with DB team and Informatica team for migrations across multiple environments.

Troubleshot defects encountered in SIT and UAT and provided resolutions.

Environment: Informatica PowerCenter 9.1/9.6, DB2, SQL Server, TOAD, Tidal, UNIX, XML, Teradata, Oracle

Client: Inova Health Systems Jan 2014–May 2016

Role: Informatica /Teradata Developer Sterling, VA


Responsible for Requirement Gathering from the client and Analysis of the same

Responsible for converting Functional Requirements into Technical Specifications

As a Sr ETL developer provided suggestions and improvements to adhere to the standards

Prepared the Code Review Checklist, Standards document and Reusable components to be used across multiple projects

Used Push Down Optimization and Partitioning to improve the performance on Informatica

Developed mappings using Filter, Router, Expression, Source Qualifier, Joiner, Connected & Unconnected Look up, Update Strategy, Stored Procedure, Sequence Generator and Aggregator Transformations.

Wrote SQL-Overrides and used filter conditions in source qualifier thereby improving the performance of the mapping.

Implemented complex business rules in Informatica Power Center by creating Re-Usable Transformations, and working with Mapplets.

Strong experience in MS SQL Server with Business Intelligence in SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS)

Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks.

Used session parameters and mapping variables/parameters, and created parameter files to enable flexible workflow runs based on changing variable values.
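As a sketch of that parameter-file practice (the folder, workflow and variable names here are hypothetical, not the project's actual objects), a small shell wrapper can regenerate the file before each run so the workflow's mapping variables pick up fresh values:

```shell
#!/bin/sh
# Hypothetical folder/workflow/variable names, for illustration only.
RUN_DATE=$(date +%Y-%m-%d)
PARAM_FILE=wf_daily_load.param

# Each [Folder.WF:workflow] section feeds one workflow's
# mapping variables; \$\$ writes a literal $$ into the file.
cat > "$PARAM_FILE" <<EOF
[FIN_DW.WF:wf_daily_load]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_SYSTEM=GL
EOF

echo "Wrote $PARAM_FILE for run date $RUN_DATE"
```

The scheduler then passes this file to the workflow run, so no mapping change is needed when the run date or source system varies.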

Used Session Logs, and Workflow Logs for Error handling and Troubleshooting in the development environment

Good understanding of various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache

Responsible for Unit Testing of mappings and workflows.

Extensively worked on UNIX Shell scripting and BTEQs to extract the data from the warehouse
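A minimal sketch of that shell-plus-BTEQ pattern (database, table and file names are hypothetical): the script writes a BTEQ export script through a here-document, which would then be fed to the Teradata client on a host that has the utilities installed:

```shell
#!/bin/sh
# Hypothetical object names; the bteq invocation itself is commented
# out so this sketch runs even without a Teradata client installed.
OUT=customer_extract.bteq
cat > "$OUT" <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
.EXPORT FILE = customer_extract.dat;
SELECT cust_id, cust_name, open_dt
FROM dw.customer
WHERE load_dt = CURRENT_DATE;
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF

# bteq < "$OUT"   # would run the extract where Teradata utilities exist
echo "Generated $OUT"
```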

Experienced in using Workload Manager (WLM) for scheduling and running on-demand and scheduled jobs.

Involved in performance tuning (both database and Informatica), thereby decreasing the load time.

Involved in Design review, code review, Performance analysis.

Effectively communicated with the Business Partners and team members, the problem and the expected time of resolution.

Developed advanced reports using complex multi query skills in Cognos Report Studio version C8.

Developed different reports and worked on Cognos Framework Manager and MS SQL environment.

Involved in Testing and improving Report Performance.

Responsible for providing timely feedback and necessary help/cooperation to meet client expectations

Environment: Informatica PowerCenter 9.1, DB2, Teradata 13.1, Oracle 11g, UNIX AIX V6, Cognos, Workload Manager (WLM)

Client: Ross Nov 2012–Dec 2013

Role: Informatica Developer Dublin, California


Worked on complex mappings and always guided the team when stuck and ensured timely delivery of the ETL components.

Created STM’s (Source to Target Mappings) for data files into the PTY model.

Worked on Subversion SVN for maintaining documents and code.

Involved in the Design and Implementation of the Data Quality framework for Error handling and file validations
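One common file validation in such a framework is a trailer-count check before the load; as a sketch (the `TRL|<count>` trailer layout is an assumed convention, not the project's actual spec), the detail record count is compared with the count the file declares about itself:

```shell
#!/bin/sh
# Sketch of a trailer-count file validation; the D|... detail and
# TRL|<count> trailer layout is an assumption for illustration.
FILE=sample_feed.dat

# Build a tiny sample feed: three detail rows plus a trailer.
printf 'D|1|alpha\nD|2|beta\nD|3|gamma\nTRL|3\n' > "$FILE"

declared=$(awk -F'|' '/^TRL/ {print $2}' "$FILE")
actual=$(grep -c '^D|' "$FILE")

if [ "$declared" -eq "$actual" ]; then
  echo "VALID: $actual detail records match trailer"
else
  echo "REJECT: trailer says $declared, file has $actual" >&2
  exit 1
fi
```

Files that fail the check are rejected up front, so the error-handling path fires before any partial load reaches the warehouse.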

Troubleshot and debugged Excel macros that ran SQL validation queries against Oracle to check for validations before the loads.

Cleansed and migrated HCO and HCP data; integrated Med Pro licensing data to enable sampling processes and compliance reporting; processed sales data from WKHLTH and IMS (prescriber master, customer master, payor master and DDD outlet master) through Informatica.

Built reusable mappings, transformations using Informatica Designer

Involved in performance tuning of Informatica, optimizing performance by identifying and eliminating target, source, mapping and session bottlenecks while loading into Salesforce.

Involved in Design review, code review, Performance analysis.

Involved in migration of the Informatica components for multiple releases

Involved in Testing of all the Interfaces with huge amount of data and fixed bugs accordingly within the Mappings/ Workflows

Used Redwood to schedule UNIX shell scripts and Informatica jobs

Environment: Informatica 8.6 ETL, Oracle 11g, Redwood scheduling, Subversion code control, Mercury Quality Center

Client: Involgix Sep 2010– Oct 2012

Role: SQL/ Informatica Developer Hyderabad, India

Project Description: The scope is to extract clinical data from sources such as SAS, Oracle tables and flat files into a staging EAV table, from where it is populated to the SDTM domains, the submission model for the client.

From the SDTM domains, data is populated to JANUS, the final submission model and an industry standard for submissions to the FDA.


Worked on design changes according to the new CRs and implemented them effectively

Scheduling the workflows via AutoSys and troubleshooting failures

Creating Scripts in Unix to assist Informatica loads as per the requirements

Migrating the code from INF 7.1 to INF 8.1

Developing mappings and optimizing them for better performance

Wrote the PL/SQL code required for loading targets according to the business requirements

Worked on PL/SQL objects such as packages, triggers, stored procedures and functions, and used Oracle parallelism for faster loads

Performing debugging and tuning of mappings

Implemented SCD Type 1 and Type 2 logic for the dimension loads
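The Type 2 pattern boils down to two statements per changed row: expire the current version, then insert the new one with an open-ended effective range. As a sketch (the `dim_customer`/`stg_customer` tables and the `eff_start_dt`/`eff_end_dt`/`curr_flag` columns are hypothetical), a shell step can emit the load SQL for the scheduler to run:

```shell
#!/bin/sh
# Sketch of SCD Type 2 load SQL; all table and column names are
# hypothetical, and step 2 is simplified (a real load would
# restrict the insert to new/changed keys only).
cat > scd2_dim_customer.sql <<'EOF'
-- Step 1: close out the current version of any changed row
UPDATE dim_customer d
SET    eff_end_dt = CURRENT_DATE - 1,
       curr_flag  = 'N'
WHERE  d.curr_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE s.cust_id  = d.cust_id
               AND   s.cust_addr <> d.cust_addr);

-- Step 2: insert the new version with an open-ended effective range
INSERT INTO dim_customer
       (cust_id, cust_addr, eff_start_dt, eff_end_dt, curr_flag)
SELECT s.cust_id, s.cust_addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s;
EOF
echo "Generated scd2_dim_customer.sql"
```

Type 1, by contrast, is a plain overwrite of the changed attributes with no history row.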

Involved in Unit Testing and System Testing

Upgraded the scheduling mechanism from Informatica scheduler to Control-M

Environment: PL/SQL, Informatica PowerCenter 7.1/8.1, Oracle 9i, TOAD, UNIX

Client: MOBIWARE Jun 2008 – Aug 2010

Role: SQL/Informatica Developer Hyderabad, India

Project Description: The Commercial Data Warehouse (CDW) is a centralized repository that contains business orientation and integration of data that has been re-shaped for analysis (i.e. the data models are organized for analytical requirements, not operational requirements). The CDW maintains historical, transformed transactional, aggregated and derived (calculated) data.


Used workflow Manager to create, execute and monitor sessions that perform source to target data loads.

Developed mappings to transform extracted data into Facts & Dimension tables.

Designing mapping as per the ETL Specifications.

Designed views for SAS Reporting.

Developing mappings and optimizing them for better performance.

Performing debugging and tuning of mappings.

Part of deployment team to monitor the code moving across multiple environments (Dev, QA, UAT and PROD)

Created SQL for assisting in data loads.

Troubleshot production defects and fixed Cognos reports.

Involved in Unit Testing and System Testing as well as preparing the low-level ETL design documents and the Unit testing documents and system testing documents.

Preparing ETL design documents as per the business requirements.

Environment: SQL, Informatica PowerCenter 7.1.1, Oracle 10g/9i, TOAD, Cognos and UNIX (HP).

Education Details

Bachelors from Jawaharlal Nehru University, India 2008
