Manager Data

Secunderabad, Telangana, India
$65/hr
May 31, 2017

Phone: 732-***-****


Certificate No. 041-002094

Professional Summary:

Software professional with 6.5 years of extensive experience in the design and development of enterprise solutions: business requirement analysis, application design, dimensional data modeling, coding, testing, and implementation of business applications involving RDBMS, enterprise-level data integration, data warehouses/data marts, ETL, and client/server environments.

Strong data warehousing ETL experience using Informatica PowerCenter 9.6.1/9.5.1/8.6/8.5/8.1/7.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).

Experienced in working with different relational systems such as Oracle 11g/10g/9i/8i, Teradata, SQL Server 2008/2005, DB2 8.0/7.0, UDB2, and Netezza as source/target databases for ETL components.

Strong experience in Teradata and with different file formats such as flat files, VSAM files (COBOL sources), and XML files as both source and target.

Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star schema and Snowflake schemas.

Extensive experience in developing stored procedures, functions, views, triggers, and complex queries using SQL Server, T-SQL, TOAD, Oracle PL/SQL, SSIS, and SSRS.

Experienced in development and maintenance of scripts using Unix shell scripting.

Experienced in working with multiple job scheduling tools like DAC, Control-M and Autosys.

Strong teamwork skills, ability to lead collaboration with other functional and technical staff on the project.

Worked on various kinds of Performance Tuning techniques such as compression, aggregates and rollups.

Prepared code migration documents and worked with the release team in migrating code from Development to UAT and Production servers. Involved in preparing ETL specifications and performed developer and unit testing for Informatica mappings.

Experience in preparing test plans and unit test cases, performing unit and integration testing, and supporting SIT/UAT/PROD.

Good programming skills with understanding at the conceptual level and possess excellent interpersonal skills with a strong desire to achieve specified goals.

Experience in code migration across all phases (DEV, QA, UAT, PRD) of the project.

Created WLM scheduled jobs to automate runs and designed the required recovery/ad-hoc components.

Experience in Agile, Waterfall, and TDD (Test-Driven Development) methodologies.

Experience with databases such as Oracle, MS SQL Server, and MySQL, and strong experience with persistence frameworks such as Hibernate and JPA.

Ability to handle multiple responsibilities and work within team as well as independently.

Exceptional problem-solving and sound decision-making capabilities; recognized by associates for data quality, alternative solutions, and confident, accurate decision-making.

Strong communication, organizational, planning, and documentation skills to interact with staff, users & management.

Excellent consulting skills and the ability to work effectively with end users and team members.

Experience in preparing High/Low Level Design document & Implementation Plan.

Experience in the Telecom, Logistics, Banking, and Healthcare domains.

Certification:

Informatica PowerCenter Developer Specialist 9.x

Education:

Bachelor of Engineering from JNTU, Hyderabad, India.

Technical Skills:

Programming Languages

C#, SQL, PL/SQL, UNIX Shell scripts.

ETL Tools

Informatica PowerCenter 9.6/9.5.1/9.1/8.6/8.5/8.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager, and Informatica Server), IDQ.


Data Modeling

Dimensional data modeling, Star schema and Snowflake schema design, fact and dimension tables, physical and logical data modeling

Databases

MS SQL Server 2000/2005/2008, Oracle 11g/10g/9i/8i, IBM DB2 v8.1, Teradata, Netezza, NoSQL databases, MS Access 97/2000, T-SQL, PL/SQL

OLAP Tools

Cognos 8.0/8.1/8.2/8.4/7.0, Business Objects XI R2/6.x/5.x, OBIEE


Methodologies

Agile, Waterfall.

Scheduling Tools

DAC, Control-M, AutoSys

Operating Systems

Microsoft Windows, Linux and Unix

Domain Experience

Banking, Telecom, Logistics, and Healthcare

Projects and Responsibilities:

Client: Valspar, Minneapolis, MN December 2016 – Till Date

Project: ETL Strategy for N11i ERP Data Sourcing

Role: Sr. Informatica Developer

Project Details: This project does not implement a “standard” data warehousing approach. The architecture is optimized to integrate with existing Valspar data systems, to support rapid assimilation of new data sources, to provide ongoing storage for data from non-11i sources, and to provide a transitional platform for non-11i sources migrating to Oracle Financials. The project covers the processing of procurement data from non-11i sources into Valspar corporate information structures.


Worked closely with the client to understand business requirements, perform data analysis, and deliver to expectations.

Used Informatica PowerCenter 9.6 for extraction, transformation, and loading (ETL) of data into the data warehouse.

Extracted data from different sources such as Oracle, flat files, XML, and SQL Server, and loaded it into the Enterprise Data Warehouse (DWH).

Worked closely with BAs and SAs to verify that all requirements were met, and with Project Managers on quality deliverables.

Involved in migrating all Informatica objects from Informatica 8.6 to 9.6, building an entirely new enterprise data warehouse (BIW) in the process.

Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
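For illustration, a PowerCenter parameter file of the kind described above is a plain text file whose bracketed headers scope values to a workflow or session; the folder, workflow, connection, and variable names here are hypothetical:

```
[Global]
$$LoadDate=2016-12-31

[FOLDER_EDW.WF:wf_load_customers]
$DBConnection_Source=ORA_SRC_CONN
$InputFile_Customers=/landing/customers.dat

[FOLDER_EDW.WF:wf_load_customers.ST:s_m_load_customers]
$$RegionFilter=NA
```

The Integration Service resolves `$$` mapping variables and `$` session variables from the narrowest matching section, so a session-level value overrides a workflow-level or global one.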

Extensively worked on the Data Warehouse Administration Console (DAC) to configure, monitor, and schedule ETL routines for full and incremental loads.

Extensive working knowledge of designing columnar databases (PADB) to support ETL and reporting.

Cross-functional team member with extensive background supporting QA testing, integration testing, user acceptance testing, and production go-live.

Performed Unit Testing and tuned the Informatica mappings for better performance.

Used SQL Developer to execute and tune SQL queries. Wrote UNIX shell wrapper scripts to call ETL jobs and automate FTP, file-watcher, and pmcmd tasks.
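The wrapper pattern mentioned above can be sketched as follows, assuming a trigger-file convention and `pmcmd` on the PATH; the environment variables, folder, and workflow names are illustrative, not the project's actual scripts:

```shell
#!/bin/sh
# Hypothetical ETL wrapper: wait for a trigger file, then start an
# Informatica workflow with pmcmd and propagate its exit status to
# the scheduler. All names below are placeholders.

# Poll up to $2 seconds for file $1; return 0 once it appears.
wait_for_file() {
    _file=$1; _timeout=$2; _waited=0
    while [ "$_waited" -lt "$_timeout" ]; do
        [ -f "$_file" ] && return 0
        sleep 1
        _waited=$((_waited + 1))
    done
    return 1
}

# Start a workflow and wait for it; pmcmd's exit code becomes ours.
run_workflow() {
    _folder=$1; _workflow=$2
    ${PMCMD:-pmcmd} startworkflow \
        -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f "$_folder" -wait "$_workflow"
}

# Typical scheduler job body:
#   wait_for_file /landing/orders.trg 3600 && run_workflow SALES wf_load_orders
```

Because `-wait` blocks until the workflow completes, the scheduler can key success or failure of the job directly off the script's exit status.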

Environment and Technologies: Informatica PowerCenter 9.6/8.6, Oracle EBS 11i, OBIEE, DAC, Putty, Oracle 11g, Windows Server 2003, UNIX shell scripting, SQL Developer.

Client: Wells Fargo, Minneapolis, MN November 2015 – November 2016

Project: Sales measurement, Active Data Warehouse and Enterprise Customer Profile Reporting (SAER) Database

Role: ETL Informatica Developer

Project details: The Sales Measurement, Active Data Warehouse and Enterprise Customer Profile Reporting (SAER) application allows Regional Banking Management to coach around activity and to raise awareness of the effectiveness of team members' activities. The SAER data mart is a strategic and operational intelligence data mart supporting management reports such as the Interaction Effectiveness Scorecard (IES), Make an Appointment (MAA), Offer Tracking (OT), Automated/Platform Referral Tracking (ART), Contact Events (CE), and service signals, which provide Regional Banking Management with data used for enterprise marketing decisions.


Followed the Agile process flow and methodologies of the SAER project.

Worked with the business team on requirement gathering, analysis, and impact analysis.

Prepared functional documentation for data integration and participated in reviews.

Coordinated with cross-commit teams to finalize data sourcing/mapping.

Participated in system design and in documenting mapping specs per business rules.

Involved in the Review of Requirements and development of unit test case documents.

Participated in ETL profiling process.

Analyzed the source data coming from different sources (Oracle and flat files) and worked on developing ETL mappings.

Created mappings with transformations such as Parser, Standardizer, and Address Validator in the Informatica Developer (IDQ) tool.

Involved in extensive performance tuning by identifying bottlenecks at various points (targets, sources, mappings, sessions, or the system), which led to better session performance.

Analyzed the source data and decided on the appropriate extraction, transformation, and load strategy.

Designed the ETL process and defined strategies for Type 1/Type 2/Type 3 dimension loads.

Developed a number of complex ETL mappings to load data into Dev/QA and data warehouse targets.

Worked extensively on SQL Query Tuning.

Oversaw the inbound and outbound interface development process, working closely with functional staff, developers, and testers.

Helped the PMO with project time and resource management.

Involved with the Business analysts in requirements gathering, analysis, testing and project coordination.

Maintained traceability from requirements through design to test results.

Created deployment lists and deployment groups, and migrated code to QA.

Fixed defects and followed the QA process.

Responsible for SIT and facilitating UAT with business team along with client's lead business analysts.

Performed code reviews and validated the mappings.

Designed Cronacle jobs and workflow schedules for daily and monthly loads.

Updated the S2T (source-to-target) document per Wells Fargo standards.

Completed unit and system testing with test cases and test plans.

Environment and Technologies: Informatica PowerCenter 9.6.0/9.5.1, Oracle 11g, MS SQL Server 2008, Sybase 12.5.3, UNIX Shell Scripting, Autosys, Perforce, Putty.

Client: Anthem Inc., Indianapolis, IN Aug 2014 to Oct 2015

Project: ETS Member Data Hub / Clinical Integration Program

Role: ETL Developer

Project details: Many pharmaceutical companies treat each clinical data integration effort as a unique project requiring expensive manual coding and processes that are neither standard nor repeatable. SAS Clinical Data Integration is an easy-to-use solution that streamlines data integration and transformation, reducing the delays and high costs associated with custom coding each project and decreasing time to submission. SAS provides both speed and efficiency by automating repeatable clinical data integration tasks.


Analyzed ETL design requirements to create copybooks and XML files, publish the XML files to message queues, and load them into a DB2 database.

Developed ETL mappings to create XML files from mainframe source files using Normalizer, Aggregator, and XML Generator transformations.

Created migration documents to migrate the Informatica and UNIX script components using Informatica Repository Manager and PowerExchange Navigator.

Developed shell scripts to automate balancing, archiving, and transaction metrics.
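A balance-and-archive step of the kind described could look like this sketch; the record-count convention and file paths are assumptions for illustration:

```shell
#!/bin/sh
# Hypothetical balance check: compare the record count the source
# system reported against the rows actually present in the extract,
# then archive the file on success. File layout is illustrative.

balance_and_archive() {
    _data=$1      # extract file, e.g. /landing/claims_20150801.dat
    _expected=$2  # record count from the source control file
    _archive=$3   # archive directory

    _actual=$(wc -l < "$_data")
    if [ "$_actual" -ne "$_expected" ]; then
        echo "BALANCE FAIL: expected $_expected, got $_actual" >&2
        return 1
    fi
    mkdir -p "$_archive" && mv "$_data" "$_archive/"
}
```

A scheduler job would call this after the load and fail the stream on a count mismatch, so metrics and reruns are driven off the exit status.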

Created job sets, jobs along with predecessors and scheduled jobs to load data daily using WLM scheduling tool.

Designed and Developed the ETL process using Informatica workflows to populate the 278 database.

Identified gaps between business requirements and Informatica workflows, debugged missing information in the XML files, and worked with the source system team to analyze it. Documented the workflows, ETL designs, and test cases, and mentored new team members.

Developed WLM automation process to run Informatica workflows and UNIX scripts to populate the Teradata Data Mart.

Environment and Technologies: Oracle 11g/10g, Teradata, Informatica 9.1, IBM DB2, UNIX Shell Scripting, SQL, WLM, XML files, and SOA (Service-Oriented Architecture).

Client: GE Healthcare, Sunnyvale, CA July 2013 to July 2014

Project: DI-RTS Engagement

Role: ETL Developer

Project Details: GEHC provides transformational medical technologies and services that are shaping a new age of patient care. GEHC's expertise is in medical imaging and information technologies, medical diagnostics, patient monitoring systems, performance improvement, drug discovery, and biopharmaceutical manufacturing technologies. The business interests of the company are widely dispersed, with clients across the globe. This makes it necessary for the company to maintain information systems that provide reliable service to its clients with optimal response time and efficiency.


Involved in analysis, design, development, and deployment of ETL enhancement requests.

Resolved ETL problem tickets within the stipulated time per priority using Informatica 8.1.

Involved in preparing migration documents and process documents required per the process set up by the client.

Involved in development and testing of the Informatica mappings and documented the test results.

Involved in Maintenance, troubleshooting and fine tuning of Informatica mappings for maximum Performance.

Involved in analyzing requirements for developing ETL code for new applications.

Involved in Development and Unit testing of Informatica mappings.

Documentation of all the process involved like Analysis Document, Design Document, Code Document, Test Cases and Delivery Document.

Environment and Technologies: Informatica 9.1, Cognos 8, Oracle 9i, Teradata, Unix.

Client: Franklin Templeton, Pleasanton, CA Sep 2012 to June 2013

Project: Sales and Asset Reporting (SARAT)

Role: Informatica Developer

Project details: The Sales and Asset Reporting (SARAT) project has two teams, the SARAT team and the CIA team, which are part of the US Advisory and Marketing (USAS) organization. These teams provide Sales and Marketing with the information needed to measure wholesaler performance and to guide and focus distribution efforts.


Worked for the particular Schema development team.

Contacted business analysts and users to develop, document, and evolve process and data models based on their input and feedback.

Responsible for translating business requirements into conceptual and logical process and data models.

Responsible for preparing logical and physical data models and documentation.

Created reusable transformations for converting varchar to uppercase and string to date.
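Such reusable expression logic might look like the following PowerCenter expression ports; the port names and date format are hypothetical:

```
-- Expression transformation output ports (illustrative)
OUT_NAME_UC  = UPPER(LTRIM(RTRIM(IN_NAME)))
OUT_EFF_DATE = IIF(IS_DATE(IN_EFF_DATE, 'MM/DD/YYYY'),
                   TO_DATE(IN_EFF_DATE, 'MM/DD/YYYY'),
                   ERROR('Invalid date: ' || IN_EFF_DATE))
```

Packaging ports like these as a reusable transformation keeps the trim/case and date-parsing rules consistent across mappings.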

Created mapplets for history-management dates and ETL process status.

Participated in Data modeling meetings, helped to identify the relationship between tables according to the requirements.

Prepared ETL design document which consists of the database structure, Change data capture, Error handling, restart and refresh strategies.

Created a mapping for generating a parameter file.

Developed functions and stored procedures to aid complex Mappings.

Worked with DBA's to Design and Build the Staging/Target databases.

Assisted business users for Data cleansing and user acceptance testing.

Worked in Informatica on extraction, transformation, and loading from various sources into the enterprise data warehouse and data marts.

Developed and tested extraction, transformation, and load (ETL) processes.

Involved in the change data capture (CDC) ETL process.

Environment and Technologies: Informatica PowerCenter 8.6 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Oracle 10g, blue plum, SQL Developer, TOAD, Control-M, Windows NT/XP.

Client: Servion Global Solutions, Bangalore, India Sep 2010 to Aug 2012

Project: Claims Data Conversion.

Role: Software Engineer

Project Details: The Claims Data Conversion project aimed to replace distributed claims data in legacy HP Allbase systems and other small DB2 systems with a single IBM DB2 system as the central repository for all claims data. Activities involved extracting claims data from the different source systems, converting it to a common data format defined by the key functional requirements for the conversion, and loading it into the IBM DB2 system to create a central repository of claims data.


Involved in Analysis of the Change Requests.

Involved in Design and Development of the code changes to accommodate Change Requests.

Involved in Testing the ETL code changes made.

Involved in Delivery of all the work products for the Iteration.

Documentation of all the process involved like Analysis Document, Design Document, Code Document, Test Cases and Delivery Document.

Full volume test of the code delivered per iteration.

Resolution of Issues faced while performing the Full Volume Testing.

Documentation of such Issues and the resolution made for future references of the same.

Report generation consisting of statistics that help in analyzing performance of the code.

Environment and Technologies: Informatica 8.6, IBM DB2, Oracle 9i, UNIX.
