
Omaha, Nebraska, United States
February 02, 2018





** ***** ** ********** ** Information Technology in the areas of analysis, design, modeling, and development in client-server environments, with a focus on data warehousing, Data Marts, and transactional applications.

Experience in the data warehouse development life cycle: requirements analysis, dimensional modeling, repository management, implementation of Star and Snowflake schemas, incremental loads/change data capture, and slowly changing dimensions, with good knowledge of both the Kimball and W. H. Inmon methodologies.

Responsible for all activities related to the development, implementation and support of ETL processes for large scale data warehouses/ Data Marts using Informatica Power Center.

Experience in process optimization with Informatica workflows and in addressing scalability problems.

Experience in creating mappings using various transformations, and developing strategies for Extraction, Transformation and Loading (ETL) mechanism by using Informatica.

Experience in Netezza, Oracle, SQL Server, and PL/SQL programming.

Extracted data from various heterogeneous sources like Oracle, DB2, Flat file, SQL Server, XML files and loaded into Oracle/SQL Server and Netezza databases.

Extensively worked with Stored Procedures, Triggers, Cursors, Indexes, Functions and Packages.

Experience in business requirements gathering; defining and capturing metadata for business rules; creating physical infrastructure; system analysis; data analysis and data modeling; and the design, development, and user training associated with ETL processes.

Domain knowledge in Healthcare, Insurance, and Financial services; comfortable with Waterfall, Agile, and Kanban methodologies.

Strong knowledge of the SDLC, from business requirements, functional requirements, and test documents through design decisions (including data/control flow analysis), development, unit testing, code review, and documentation.

Experience in developing Scorecards/reports using SQL Server integration services (SSIS), Business Objects tools.

Experience includes system support, performance tuning, backup and recovery, space management, maintenance, and troubleshooting. Prepared test scripts and test cases to validate data and maintain data quality.


TECHNICAL SKILLS

BI/ETL Tools

Informatica 10.x/9.x/8.x/7.x/6.x/5.x PowerCenter / PowerMart / PowerConnect, Workflow Manager, Workflow Monitor, Designer, SQL Server Reporting Services, Business Objects, SSRS

Data Modeling

ER Studio, Erwin


Databases

Oracle 10g/9i/8i, MS SQL Server 6.5/7.0/2000/2005/2008/2008R2/2012, MS Access, Netezza 9.0, Advanced Query Tool (AQT)

Testing tools

Quality Center, Test Director, QTP, Visual Studio


Operating Systems

Windows 2008R2/2008/2003/2000/XP/98/7/8, UNIX


Scheduling Tools

Tidal, Informatica Scheduler, SQL Server Agent, CRON, Autosys


Other Tools

Undraleu, Team Foundation Server, BMC Service Desk Express, APL, Mass Mutual Designs (MMD), Prism, Product Accelerator, Beyond Compare, Microsoft Office


Employer: Webilent Technology Inc., Windsor, CT

Client: Lincoln Financial Group, Omaha, NE June '15 – Present

Senior ETL/Informatica Developer


The Data Scorecards project is an organized set of performance measures, grouped according to various aspects of the business, that helps determine how well each area is performing. It also involves building a warehouse database that multiple departments can use for analysis of group protection data.


Building the Data Marts to load group protection data for underwriting, claims, and service information.

Working with scorecard stewards, Data Governance, and the Data Advisory group to understand requirements and loading the data based on the business requirements.

Working on creating the mapping documents, unit test documents, and design documents for each source system.

Experience modeling the Data Marts using Erwin and working with the Data Analyst and Data Architect to understand the company's standards and norms.

Used Informatica 9.5.1/10.1.0/10.1.1 to create slowly changing dimension Type 1, Type 2, and Type 6 (hybrid) loads to the data warehouse and staging databases, sourcing from Oracle/SQL Server databases and external source systems.
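The slowly changing dimension loads above follow standard patterns; purely as an illustration (this is a Python sketch, not the actual Informatica mappings, and the column names are hypothetical), a Type 2 change can be handled by expiring the current row and inserting a new version:

```python
from datetime import date

# Hypothetical SCD Type 2 merge: expire the current row and insert a new
# version when a tracked attribute changes. Column names are illustrative.
def scd_type2_merge(dimension, incoming, today=date(2016, 1, 1)):
    """dimension: list of dicts with keys key, attr, eff_date, end_date, current."""
    HIGH_DATE = date(9999, 12, 31)
    for row in incoming:
        current = next((d for d in dimension
                        if d["key"] == row["key"] and d["current"]), None)
        if current is None:
            # New member: insert as the current version.
            dimension.append({"key": row["key"], "attr": row["attr"],
                              "eff_date": today, "end_date": HIGH_DATE,
                              "current": True})
        elif current["attr"] != row["attr"]:
            # Changed member: close out the old version, then add a new one.
            current["end_date"] = today
            current["current"] = False
            dimension.append({"key": row["key"], "attr": row["attr"],
                              "eff_date": today, "end_date": HIGH_DATE,
                              "current": True})
    return dimension
```

A Type 1 load would simply overwrite `attr` in place; Type 6 combines both behaviors plus a "current value" column on the history rows.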

Enabled Flashback Data Archive (FDA) in Oracle on the source side to capture changes occurring in the source system, which helped extract end-of-day or intraday data changes.

Worked as a data analyst and data modeler, created a new Data Mart model, and created mapping sheets to provide requirements to other developers.

Attended weekly code review meetings, presented code to other developers, and reviewed other team members' code.

Created static parameter files for the initial load and dynamic parameter files for the incremental daily/monthly loads.
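Informatica parameter files of this kind are plain text keyed by folder and workflow; as a sketch only (the folder, workflow, and parameter names below are hypothetical, not from this project), a dynamic daily file could be generated along these lines:

```python
from datetime import date, timedelta

# Generate a daily parameter file for an incremental load. The folder,
# workflow, and $$ parameter names below are illustrative only.
def build_param_file(run_date):
    window_start = run_date - timedelta(days=1)
    lines = [
        "[GP_DataMart.WF:wf_daily_incremental]",
        f"$$LOAD_DATE={run_date.isoformat()}",
        f"$$EXTRACT_FROM={window_start.isoformat()}",
        f"$$EXTRACT_TO={run_date.isoformat()}",
    ]
    return "\n".join(lines) + "\n"

print(build_param_file(date(2016, 3, 15)))
```

A static initial-load file would simply hard-code the extract window instead of deriving it from the run date.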

Scheduled the workflows in Autosys based on scheduling details provided by business/data analysts.

Involved in enabling error logging for all sessions and created error logging tables in the reporting database.

Worked on performance tuning and production support for the Data Mart loads.

Documented unit test results in development and executed validation queries in QA/UAT before migrating the changes to production.

Attended stand-ups twice daily for two different Agile teams and used VersionOne for status updates on tasks.

Worked with the implementation team by submitting ITR and EPCR tickets, and provided the migration team with a document of the changes required for QA/UAT and PROD implementations.

Experience working as a Data Analyst/Data Modeler creating ETL mapping sheets and helping validate the Data Marts once the data is loaded.

Environment: Informatica Power Center 9.5.1/10.1.0/10.1.1, Hotfix 4, Putty, Agile, Microsoft Office 2010, Windows 7 Enterprise, PL/SQL, Oracle, Toad 11.00.116, SQL Server, UNIX, WinSCP, Autosys (CA Workload Automation 11.4.3), Erwin.

Employer: Webilent Technology Inc., Windsor, CT

Client: Wipro/Zurich North America, Schaumburg, IL Aug’14- May’ 15

Lead Informatica Developer/Tech Lead


Zurich is an insurance company that deals with different types of claims and insurance products. TNPS (Transactional Net Promoter Score program) is a new project Zurich was working on to store transactional and customer survey data in the warehouse, helping the company improve its business and generate Business Objects reports for upper management.


Converted business documents into technical and mapping documents for the given requirements.

Worked with the business to gather requirements and enhancements for the Predictive Modeling process on claims, predicting future claims to help improve the business.

Worked on the different software development life cycle phases, including analysis, development, unit testing, QA testing, production implementation, and validation.

Worked with the offshore team (Wipro), coordinating requirements, reviewing ETL code, and guiding the process; created handover documents for the production implementation team covering failures, SLAs, and process details.

Developed mappings, sessions, and workflows using transformations such as Source Qualifier, Expression, Aggregator, Sorter, Sequence Generator, XML, Router, Union, Filter, and Lookup, along with other required transformations, using Informatica Power Center 9.5.1.

Worked with the implementation team by submitting the change control process and documenting the implementation steps in the change control package.

Coordinated with the testing team regarding the testing process, reviewed test cases with them, and helped them understand the process flow.

Coordinated unit testing with other developers by writing queries in the Advanced Query Tool (AQT).

Used the Debugger wizard to remove bottlenecks and improve performance tuning at the source, transformation, and target levels for optimal usage of sources, transformations, and target loads.

Scheduled the workflows using Autosys 5.2 to run at the required times and used scripts to check workflow dependencies.

Created architecture design documents, design documents, source-to-target mapping documents, and system maintenance technical documents to support the base support team when a production issue arises.

Used Netezza as the database for loading dimensions and facts. Worked with different types of flat files and loaded them into the Netezza database after transforming the data based on business requirements.

Performed data analysis to check whether the source files from the external vendor were as expected; issues were forwarded to the business analysts.

Created STTM documents, technical design documents, architectural design documents, and implementation plans, and coordinated production implementation and validation.

Environment: Informatica Power Center 9.5.1/10.0.1/10.1.1, Hotfix 4, Putty, IBM Lotus Notes 8.5, Microsoft Office 2010, Autosys 5.2, Windows 7 Enterprise, PL/SQL, Netezza 9.0, SQL Server, Advanced Query Tool (AQT), UNIX.

Employer: Webilent Technology Inc., Windsor, CT

Client: Blue Cross Blue Shield (BCBSNE) – Omaha, NE Oct’11 – Aug’14

Role: Sr. Informatica Developer


BCBSNE has a long tradition of health care coverage in Nebraska. The Blue Cross and Blue Shield organization is not one single company, but rather a confederation of independent, community-based Plans.


Complete SDLC including architecture, analysis, design, development, testing, implementation and maintenance of application software in the Enterprise Data Warehouse with production support

Participation in project meetings with other Data Analysts in preparing Analysis reports and project development reports according to industry regulations and Informatica standards.

Code, implement and unit test the application based on the technical requirements.

Provide support for the system once it is deployed and running in the production environment.

Document all the phases of the application development and follow the coding standards to improve the maintenance of the application.

Create ETL mappings to extract data from multiple legacy systems. Transform and load the data into the Enterprise Data Warehouse using Informatica Power Center 9.1.0.

Design and develop new database objects like Tables, Procedures, Functions, Indexes and Views using SQL Server 2008/2010/2012.

Understand business challenges and translate them into process/technical solutions. Creation of documents related to the ETL process.

Used different Data Warehouse techniques like Star-Schema, Snowflake schema.

Participated in ETL code review meetings to build Informatica workflows, sessions, and mappings according to Informatica standards.

Created SSRS reports for the Business Users and scheduled them to run in Tidal as per the user requests.

Worked on analysis, development, and testing of International Classification of Diseases (ICD-10) codes. Actively participated in Agile ceremonies and Sprint Planning.

Used the Web Services transformation with multiple WSDLs to create unique Subscriber IDs.

Strong knowledge of data warehousing concepts and dimensional Star Schema and Snowflake Schema methodologies.

Worked with many third-party vendors whose requirements varied from multiple file formats to various destinations.

Experienced working in Sprints ranging from 2 days to 3 weeks.

Experience working with the Kanban method in Agile for the service teams.

Active participant in Sprint Planning, peer-to-peer reviews, retrospectives, and ceremonies.

Used Debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.

Designed Excel sheets of test scenarios for each mapping.

Worked with the team members to identify and resolve various issues relating to Informatica & other database Issues.

Designing mapping templates to specify high level approach.

Designed ETL mapping documents, various mappings with Transformation rules and complex mappings including Slowly Changing Dimensions.

Designed and developed Informatica Mappings, Mapplets, Workflows and Work lets for data loads.

Used TIDAL as a scheduler for all the enterprise procedures

At BCBSNE we had an additional step of code reviews (Informatica, SQL) prior to implementation; this was a forum for improvements, suggestions, and explanations.

At BCBSNE we had a Community of Practice for the Data Integration teams, a forum where ideas were shared, cross-team work was brainstormed, and new technologies were discussed.

Environment: Informatica Power Center 9.1.0, SSRS, Oracle, Tidal, Microsoft SQL Server Management Studio 2008, Windows Vista/7, Change Control, Microsoft Office, BCBSNE Config Tool Builder, BMC Service Desk Express, Agile, Microsoft Visual Studio Professional 2010.

Employer: Sagarsoft, Inc.–Glastonbury, CT

Client: Medtronic Surgical Technologies – Jacksonville, FL Jan '11 – Aug '11

Role: ETL Informatica Analyst


Medtronic Surgical Technologies is the market leader for ENT. The main purpose of this project was to migrate from old data sources to new data sources as part of an SAP implementation.


Designed and prepared documents according to the given requirements for the daily SAP remediation.

Used Oracle 10g with Toad and worked on stored procedures, views, and packages, making changes based on the requirements.

Updated the work done using the ETL Remediation Status Excel sheet and attended the daily Data Mart SAP remediation update meeting.

Created and modified UNIX scripts using X-shell and Putty.

Involved in Design, Development, Unit testing and System Testing in the process and documented all the necessary steps in the low level documentation.

Used ETL Informatica Client tools like Designer, Workflow Manager, Workflow Monitor and Repository.

Worked extensively with transformations like Source qualifier, Aggregator, Expression, Lookups, Filter, Router, Sequence Generator, and Rank.

Created Design Document and Test Case Document for the SAP Implementation.

Performed rigorous unit testing to ensure zero errors, and was involved in integration testing of mappings across different ETL environments.

Performance tuning of mappings, transformations and sessions to optimize session performance.

Used change data capture (CDC) to get details of new and updated records in a table.
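CDC extracts of this kind typically key off a last-modified timestamp high-water mark. The following is a minimal sketch of that pattern (the data shapes and names are illustrative, not the project's actual implementation):

```python
# Timestamp-based CDC sketch: pull only rows modified since the last run,
# then advance the stored high-water mark. Row shape is illustrative:
# (id, value, modified_ts).
def extract_changes(rows, last_extracted):
    """Return (changed rows, new watermark)."""
    changed = [r for r in rows if r[2] > last_extracted]
    new_watermark = max((r[2] for r in changed), default=last_extracted)
    return changed, new_watermark

rows = [(1, "a", 10), (2, "b", 25), (3, "c", 30)]
changed, wm = extract_changes(rows, 20)
# Only rows 2 and 3 were modified after watermark 20; wm advances to 30.
```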

Analysis of the requirements and translating it into Informatica mappings.

Migrated mappings between folders using the Repository Manager and created new folders as needed.

Created session and workflows using workflow manager and customized the parameter according to the requirement.

Developed Test Cases for Unit Testing, also involved in Integration, system, and performance testing levels.

Created mapplets in the Informatica Designer which are generalized and useful to any number of mappings for ETL jobs.

Migrated all the mappings from development to the production using Informatica Client tool.

Environment: Informatica Power Center 8.6.1, Hotfix 7 & 12, Oracle 10g, Toad 9.7, SQL Detective, Microsoft Office, Putty, UNIX, X-shell, and Windows XP Professional.

Employer: Sagarsoft, Inc.–Glastonbury, CT

Client: Amerigroup – Virginia Beach, Virginia Nov’10 – Dec’10

Role: Informatica Developer


The purpose of the Care Plus Mobile project is to provide a means for care managers to electronically capture member information via a portable device, thus reducing paper usage, eliminating duplicate data capture, and improving efficiency.


Understood the business requirements for the application process and created the data feeds accordingly.

Used Informatica 8.6.0 to develop the various mappings, sessions, worklets, and workflows.

Involved in the production support and used Tidal scheduling tool to run the workflows in production and QA.

Manually ran workflows in QA environment.

Used several transformations in creating mappings, including Lookup, Stored Procedure, Source Qualifier, Expression, and Update Strategy.

Attended daily status meetings and walkthrough meetings where the project status and its maintenance process were discussed.

Created a data feed for the application that captures daily updated data to sync the database to the iPad, and vice versa.

Daily MOD, MARS, and HIPAA data was updated in the system using Informatica 8.6.0.

Involved in refreshing the reporting web feeds.

Used SQL Server 2005 as source and target for the Informatica mapping process, and used SQL Server to unit test the SQL overrides.

Created parameter files for the mappings dynamically.

Performed bulk loads of large volumes of data by creating pre- and post-load scripts to drop indexes and key constraints before the session and rebuild them after the session completes.
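The drop-and-rebuild pattern above amounts to generating matched DROP INDEX statements before the bulk load and CREATE INDEX statements afterward. A small sketch, with hypothetical index and table names (not from the project):

```python
# Generate pre- and post-load SQL for a bulk load: drop indexes before the
# load and recreate them after. Index and table names are illustrative only.
INDEXES = {
    "ix_member_id": "CREATE INDEX ix_member_id ON member (member_id)",
    "ix_member_plan": "CREATE INDEX ix_member_plan ON member (plan_code)",
}

def pre_load_sql():
    """DROP statements to run before the bulk load session."""
    return [f"DROP INDEX {name}" for name in INDEXES]

def post_load_sql():
    """Matching CREATE statements to run after the session completes."""
    return list(INDEXES.values())
```

Keeping the index definitions in one place guarantees that every dropped index has a matching rebuild statement.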

Involved in performance tuning of mappings, sessions, worklets, and workflows.

Created data feeds for care plans and the mobile medication list for the iPad application.

Used SQL Server to create non-clustered indexes for the required tables.

Created post-SQL and pre-SQL attributes according to the requirements.

Involved in creating non-reusable sessions for the corresponding data feed mappings.

Environment: Informatica Power Center 8.6.0, Power Exchange 8.0, SQL Server 2005, Windows XP, Microsoft Office, Tidal

Employer: Sagarsoft, Inc.–Glastonbury, CT

Client: ADECCO – Jacksonville, Florida Aug’10 – Oct’10

Role: Informatica Developer


This project deals with business applications such as sales, recruiting, placement, pay rate, and bill rate, managing profiles, and creating pay stubs for individuals based on time sheet numbers. In this project, Informatica is used in the reporting division; with this ETL tool, time sheets are generated according to pay distribution and bill distribution, mainly to obtain the split percentages.


Developed mappings using various transformations such as Source Qualifier, Lookup, Aggregator, Expression, and Router, and created sessions for the various mappings.

Understood the requirements, made changes to the mappings, and documented the changed mappings and the reasons for the changes.

Used Informatica 8.1.1 to debug the mappings and used Edit Breakpoints to create breakpoint conditions in a mapping.

Used Repository Manager to create the folders and to check the connections for the Informatica users.

Scheduled the mappings using the Workflow Manager and monitored them using the Workflow Monitor.

Involved in system analysis, design, Build and Testing

Used Oracle SQL Developer to generate and create queries, extracted data from the Oracle 10g data source, and created ODBC connections accordingly.

Used PeopleSoft financial, custom apps, and human resources data as sources to calculate the pay and bill split percentages.

Performed analysis to deliver exactly what the client required, through daily Scrum meetings and walkthrough meetings.

Introduced persistent cache lookups, improving overall system performance.

Involved in the Testing of mapping using Informatica 8.1.1

Changed Universes in Business Objects according to the requirements and generated reports to check the data flow.

Exported and imported the universes after making changes in Business Objects Designer 6.5.

Involved in changing reports in Business Objects: created new data providers, changed report structures, and used section delimiters.

Worked with the Elite change management system and initiated the change process.

Environment: Informatica Power Center 8.1.1, Oracle 10g, Windows XP, Knowledge Spring, Decision Studio (Business Objects 6.5), Elite change management system, Oracle SQL Developer, SQL Server 2005, PeopleSoft 8.4.6.

Employer: Sagarsoft, Inc.–Glastonbury, CT

Client: Sagarsoft, Inc.–Glastonbury, CT Jan’ 09 – July’10

Role: ETL Developer


Involved in requirement analysis, ETL design, and development for extracting data from the source systems and loading it into the Data Mart.

Wrote Functional specifications for each of the ETL processes, working in close co-ordination with business users and development leads.

Implemented SDLC concept in Extraction, Transformation and Loading of data using Informatica.

Used Oracle 10g and Toad, working on stored procedures, views, and packages and making changes based on the requirements.

Converted functional specifications to technical specifications (design of mapping documents).

Involved in Performance Tuning of existing mappings as well as new ones.

Wrote complex SQL scripts to avoid Informatica joiners and Look-ups to improve the performance as the volume of the data was heavy.

Coordinated with DBA’s in resolving the database issues that lead to production job failures.

Migrated code/objects from the development environment to the QA/testing environment to facilitate the testing of all objects developed and check their consistency end to end on the new environment.

Involved extensively in Unit testing, integration testing, system testing and UAT.

Developed complex mappings to load data from the source system (Oracle), XML files, and flat files.

Participated in weekly end-user meetings to discuss data quality, performance issues, ways to improve data accuracy, and new requirements.

Environment: Informatica Power Center 8.6/7.1.4, Informatica Power Exchange 5.2.2, Oracle 10g/9i

Employer: Sagarsoft, Inc.–Glastonbury, CT

Client: MASS MUTUAL Financial Services – Enfield, CT Oct’ 08 – Dec’ 08

Role: Programmer Analyst


MassMutual Financial Group is a marketing name for Massachusetts Mutual Life Insurance Company (MassMutual) and its affiliated companies and sales representatives. MassMutual is a mutually owned financial protection, accumulation and income management company.

The Integrated Sales Illustration Platform (ISIP) provides the technical infrastructure for sales illustration product definitions, calculation rules, and functionality to be integrated with the ISIP solution; it also provides the functionality required to generate new business illustrations on the new ISIP solution for the new products LP10, LP65, HECV, and Legacy 100. MassMutual Design (MMD) is the strategic application for the USIG Sales Illustration Improvement Program (SIIP). It currently supports Life products and uses the Prism calc engine. SIIP objectives include integrating other lines of business into MMD and replacing the product calc engine.


Involved in system testing, integration testing, and regression testing.

Analyzed the data and changed the test cases according to changes that occurred in MMD.

Interacted with the business analyst and understood the requirements.

Compared Actual Result and Expected Result using MassMutual Tools like APL and MMD.

Attended Weekly meetings regarding the Enhancements of Mass Mutual Designs Web Application and Desktop Application.

Generated advanced reports to compare the Unit Test Values (UTVs) using Beyond Compare.

Responsible for creating Unconsolidated reports Using APL.

Involved in Testing the BAT cases and generating UTV’s.

Used Microsoft Excel macros to generate consolidated reports and create analysis reports for the passed/failed cases of the different sales illustration products.

Involved in hand-patching the 7-pay premium code in UTVs and generated the analysis reports.

Created Unit Test Values (UTVs) for all the products using Microsoft Excel.

Responsible for creating UTVs for the products according to the latest VPM releases.

Manually compared the APL UTVs with the UTVs generated from the macros.

Environment: APL, Mass Mutual Designs (MMD), Prism, Beyond Compare, Product Accelerator, Windows XP, Microsoft Excel and Access.

Employer: Sagarsoft, Inc. – Glastonbury, CT

Client: Sagarsoft, Inc. – Glastonbury, CT Feb '08 – Sep '08

Role: Software Developer


Working on creating the initial mappings for the given projects.

Getting the requirements and mapping documents from data analysts and team leads.

Attending daily meeting for the progress of the project and talking to off-shore teams on the updates of the projects.

Working on the writing and execution of test cases in QA and UAT.

Involved in writing SQL queries for SQL overrides in Informatica and checking the performance of the queries.

Involved in the software development life cycle: analysis, development, testing, production implementation, and validation.

Used the Informatica scheduler to schedule workflows daily, monthly, and quarterly based on business requirements.

Used Expression, Filter, Router, and Aggregator transformations while creating the initial load.

Created static parameters for the workflows and created reusable transformations.

Attended Advanced Informatica sessions organized by the company.


EDUCATION

Master's in Electrical Engineering, International Technological University, Sunnyvale, CA, USA.

Bachelor of Technology in Electronics and Communications Engineering, JNTU, India.



TEACHING ASSISTANT, Department of EE, ITU May '07 – Dec '07

Worked as a Teaching Assistant for VLSI under Prof. Wayne Shen.


Knowledge in VLSI technologies.

Assisted the professor during classes and with lecture notes.

Good Analytical Skills.

Effective Communication Skills and Leadership Qualities

Highly Adaptable to Working Environment.
