Post Job Free


Customer Service Manager

Location:
Brooklyn, NY
Posted:
February 28, 2014


Resume:

Suhas Patel - Informatica Developer

917-***-**** - accwfc@r.postjobfree.com

Qualification Summary

• Over seven years of IT industry experience in data modeling, database development and data warehousing, including the development and implementation of database and data warehouse applications for health care, telecommunications, retail and financial services. Strong analytical and problem-solving abilities; self-motivated and results-oriented, with six years of extensive experience in data warehousing tools, including Informatica (ETL).

• Good understanding of ETL, Dimensional Data Modeling, Slowly Changing Dimensions (SCD) and Data Warehouse concepts.

• Experience in the Data Warehouse/Data Mart development life cycle; performed ETL procedures to load data from different sources into the data warehouse using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager and Workflow Monitor).

• Expertise in ETL design, including process flow, data flow, data mapping, physical database design and data models.

• Designed and developed mappings using varied transformation logic such as Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator and SQL transformations.

• Experience in debugging mappings. Identified bugs in existing mappings by analyzing

the data flow and evaluating transformations.

• Worked with Stored Procedures, Triggers, Cursors, Indexes and Functions.

• Actively involved in Performance Tuning and Troubleshooting.

• Thorough experience in the integration of various data sources such as Oracle, SQL Server, Teradata and MS Access; worked on integrating data from flat files, COBOL files and XML files.

• Involved in the SDLC (system development life cycle) of a data warehouse; responsible for designing, coding, testing, implementing and supporting the ETL processes for data warehouse solutions/data marts and implementing the reporting requirements.

• Data modeling experience using dimensional data modeling, star schema/snowflake schema, fact and dimension tables, and physical and logical data modeling with the Erwin Data Modeler; extracted data from various sources such as Oracle, flat files, SQL Server and MS Access.

• Developed code in SQL and PL/SQL; developed UNIX shell scripts to run SQL scripts and Informatica workflows from the UNIX server.

• Analytical and technical aptitude, with the ability to work in a fast-paced, highly flexible environment where in-depth knowledge of technology, hard work and ingenuity are highly appreciated.

• Excellent interpersonal and communication skills; technically competent and results-oriented, with problem-solving skills and the ability to work effectively as a team member as well as independently.

• Developed effective working relationships with client teams to understand support requirements, developed tactical and strategic plans to implement technology solutions, and effectively managed client expectations.

Technical Skill Set

• Operating Systems: Windows (all flavors), UNIX, AIX 6.1

• ETL Tools: Informatica Power Center 9.x, 8.x, Power Mart 6.2/5.1, Informatica Power Connect, Informatica Power Exchange for SFDC

• Databases: Oracle 11g/10g/9i/8i, SQL Server 2005, MS Access 2005, Teradata (FastLoad, MultiLoad, FastExport), Netezza

• Languages: SQL, PL/SQL, Shell Scripting

• Utilities: SQL*Plus, PL/SQL Developer, Toad

• Data Modeling Tools: Visio, Erwin

• Reporting Tools: Business Objects 6.5/6.1, Crystal Reports 9.2

Professional Experience

• Company: Solar Turbines - San Diego, CA

Jan 2013 - Present

Role: Informatica Developer

Description: Solar Customer Service has over 1,200 field service personnel providing service and support to Solar's customers all over the world. Solar also has a very extensive installed base, or fleet, of Solar product packages globally that represents significant opportunities for service and revenue. The main purpose of the new WFM and CRM application is to enable Solar to more effectively forecast opportunities for actual sales of service.

Environment: Informatica PowerCenter 9.1, Data Quality 3.1, Data Explorer 5.0, Business

Objects 6.5, Sybase 7.4.0, Flat files, Teradata V2R6, UNIX (HP-UX), Korn shell and

Windows NT.

Responsibilities:

• Used Informatica 8.1.1 to extract, transform and load data from multiple input sources. Worked closely with the Business Analyst to gather business requirements.

• Created Mappings, Mapplets and Transformations using the Designer and developed Informatica sessions using the Workflow Manager as per the business requirements.

• Performed extensive bulk loading into the Sybase target.

• Used the Workflow Manager for session management, database connection management and job scheduling.

• Monitored scheduled, running, completed and failed sessions using the Workflow Monitor. Debugged mappings for failed sessions.

• Assisted the team in the development of design standards and codes for effective ETL

procedure development and implementation.

• Used Data Quality and Data Explorer to improve the quality of data on an ongoing basis and to make it easier for business users to access.

• Performed extensive performance tuning by determining bottlenecks at various points such as targets, sources, mappings and sessions.

• Used Korn shell scripts for scheduling the Informatica sessions.

• Worked with the Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Informatica Repository Manager, Workflow Manager and Workflow Monitor.

• Tested all the Informatica mappings individually as well as within the entire process.

• Documented the complete mappings and migrated data from Informatica 7.1 to Informatica 8.1.

• Tested the data and data integrity among various sources and targets. Worked with the production support team on various performance-related issues.

• Company: Independent Health – Williamsville, NY

Apr 2011 – Jan 2013

Role: Informatica Developer

Description: CTG provides innovative IT solutions that address the business needs and challenges of companies in several higher-growth industries, including healthcare, energy and technology services.

Independent Health (IH) provides case, disease and utilization management services for

approximately 350,000 members in New York and surrounding areas (including

Canada). Its line of business includes Commercial, Self-Funded, Medicaid/Medisource

and Medicare.

Currently, IH utilizes various systems including Medecisions, Power MHS, Siebel,

CorePlus etc. to perform many business-critical functions. IH has identified ZeOmega's

Jiva application as a solution to fulfil functionality gaps identified by the business units.

This project intends to implement Jiva for all groups that are eligible for Medical

Management Services.

Delta Dental offers national dental coverage, administering programs and reporting

systems that provide employees and individuals with quality, cost effective dental

benefits and superior customer service. Release 1 of the 2013 Medicare Benefits project is to create a Medicare enrollment extract file from the source system, POWER, and send it to Delta Dental weekly. Worked on HEDIS 2013 fixes to existing code.

Independent Health (IH) Actuarial and Underwriting (A&U) currently utilize the

Milliman Health Cost Guideline (HCG) Grouper through their MedInsight products and

services. MedInsight's offerings include a data warehouse that organizes the IH data sets

into one view. This allows IH to leverage Milliman's benchmarks and methodologies to

identify trends for product evaluation and pricing of Commercial, Medicare and

Medicaid products.

Environment: Windows XP, Unix AIX 6.1, Oracle 11g, Informatica 9.1, Toad 10.6, PL/SQL Developer 7.1, Kalido MDM, DB2 7.1, Tidal scheduling tool, Flat files, Tortoise SVN 1.6, Surveyor 3.2

Responsibilities:

• Design and Implement software applications and packages customized to meet

client requirements.

• Review and create software programs to ensure technical accuracy & reliability

of programs.

• Analyze the communications, informational, database and programming

requirements of clients, plan, develop, design, test and implement appropriate

information systems.

• Analyze the data in EDW and design the extracts accordingly.

• Develop and debug Informatica mappings to perform the extracts.

• Create crosswalks using the Kalido tool to map IH values to the corresponding Jiva data.

• Perform impact analysis on the project design for changes made in the data warehouse.

• Prepare documentation for UAT and system testing.

• Load the extracted data into the client database product, Jiva.

• Actively participated with QA team as well as Production support team.

• Scheduled jobs in Tidal with inter-dependency.

• Company: Level3 Communications - Broomfield, CO

Feb 2010 – Apr 2011

Role: Informatica Analyst & Developer

Description: Level3 Communications is an international communications company headquartered in Broomfield, CO. They are one of only six Tier 1 Internet providers in the world. This project was on the Order Mart, SMART and SMMART applications, where new changes or modifications to existing ETL mappings were required depending upon client requirements. The SMART and SMMART applications are used for customer order entry using the SIEBEL, PIPELINE, CLARIFY, EON and IFO order entry systems.

Environment: Informatica Power Center 9.1, Oracle 11g, SQL Server 2005, UNIX, Flat files,

XML files, Rally (Agile-Scrum process), SQL*Plus, PL/SQL.

Responsibilities:

• Test the changes and code fixes implemented as part of change requests.

• Monitor the deployment activity when the specific deliverables are being produced.

• Ensure the deployed code is working fine in production environment.

• Give approaches/solutions in case of any production failures/errors.

• Generate test cases using the test case generator for the Level3, Looking Glass and Wiltel releases.

• Use the code review tool to validate the mappings developed in Level3 and Wiltel

releases.

• Worked as a developer on data extraction and analysis, moving Looking Glass company source data into the target applications using the Analyzer tool, and validated the mappings using the code review tool.

• Worked within Continuous Integration frameworks with a major focus on developing automated unit tests and documenting code coverage.

• Actively participated in the integration of Global Crossing data (a newly acquired company) into Level 3 data, from EON to BPMS.

• Company: Capital One Bank - Richmond, VA

Jan 2009 – Feb 2010

Role: Informatica Developer

Description: Capital One Bank offers a variety of consumer lending and deposit products, including credit cards, auto loans, small business loans, home equity loans, installment loans and savings products. The aim of the project was to help customer service representatives deal and transact with different customers' data. The operational data of different financial departments was loaded into a central data warehouse and farmed out to different regional data marts.

Environment: Informatica PowerCenter 8.1, Oracle 9i, Erwin, UNIX scripts.

Responsibilities:

• Worked extensively on data analysis, cleansing and data integration, and worked to resolve inconsistencies in data.

• Created Mappings and Mapplets in Informatica using the Designer and created sessions using the Workflow Manager.

• Responsible for creating Informatica maps utilizing various transformations such as Expression, Source Qualifier, Aggregator, Connected/Unconnected Lookup, Filter and Joiner.

• Involved in the creation, editing and deletion of sessions and batches of sessions using the Workflow Manager in Informatica.

• Implemented Materialized Views using PL/SQL for loading extended tables.

• Implemented Slowly Changing Dimensions (Type 2: versions) to update the

dimensional schema.

• Delivered the new system in Agile methodology.

• Facilitated Agile development process in the company including requirements and

design processes. Developed build and release scripts and assisted with

configuration management process.

• Worked on B2B data transfer of customer payment info.

• Monitored workflows and collected performance data to maximize the session

performance.

• Used Informatica efficiently and tuned the scripts for better performance results

and for large data files by increasing data cache size and target based commit

interval.

• Prepared user and technical documentation. Documented information about source system data locations, UNIX hosts and user login details.
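The Type 2 versioning mentioned above (expire the current row, insert a new version) can also be expressed as plain SQL, which is how it is often prototyped or validated outside the Informatica mapping. This is only an illustrative sketch, not the project's actual code; the table, column and sequence names (customer_dim, customer_stg, customer_dim_seq) are hypothetical.

```sql
-- Hypothetical SCD Type 2 load: close out changed rows, then insert new versions.
-- All object names are illustrative.

-- Step 1: expire the current dimension row when source attributes have changed.
UPDATE customer_dim d
SET    d.end_date     = SYSDATE,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   customer_stg s
               WHERE  s.cust_id = d.cust_id
               AND    (s.cust_name <> d.cust_name OR s.cust_addr <> d.cust_addr));

-- Step 2: insert a new "current" version for changed and brand-new customers.
INSERT INTO customer_dim (cust_key, cust_id, cust_name, cust_addr,
                          eff_date, end_date, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.cust_id, s.cust_name, s.cust_addr,
       SYSDATE, NULL, 'Y'
FROM   customer_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim d
                   WHERE  d.cust_id = s.cust_id
                   AND    d.current_flag = 'Y');
```

Inside the mapping itself this logic would typically live in a Lookup plus Update Strategy transformation (DD_UPDATE/DD_INSERT), with the version history preserved by the effective/end dates.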

• Company: Thomson Reuters, New York City-NY

Nov 2007 – Jan 2009

Role: ETL Developer

Description: Thomson Reuters efficiently monitors the market, performing thorough fundamental and quantitative analysis, in-depth portfolio risk and performance analysis, economic forecasting and more. Informatica PowerCenter was used as the ETL tool to extract stock data and load it into target systems in the data warehouse. This data was then used to generate different kinds of reports. Thomson Reuters used rapidly changing dimensions in this project.

Environment: Informatica Power Center 8.6, Erwin 7.2, Oracle 10g, SQL, PL/SQL, Toad,

Windows 2000 Server

Responsibilities:

• Analyzed the source data coming from different sources and worked with

business users and developers to develop the model.

• Imported various heterogeneous source files using Source Analyzer in the

designer.

• Involved in analyzing logical model of source and target.

• Developed logical and physical data models that captured data flows using Erwin

7.2.

• Worked on B2B (Business to Business) requirements, extensively with DX (data exchange) and DT (data transform).

• Involved in identifying the sources for various dimensions and facts for different

data marts according to star schema and snowflake schema design patterns.

• Designed and developed complex Informatica mappings, mapplets, reusable

transformations, workflows and worklets using various tasks to facilitate daily,

weekly and monthly loading of data.

• Based on the logic, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator and Joiner in the mapping.

• Involved in migrating data from Staging to Data warehouse and Scheduling jobs.

• Created Connected, Unconnected and Dynamic lookup transformation for better

performance and increased the Cache file size based on the size of the lookup

data.

• Used the Workflow Manager for creating, validating, testing and running sequential and concurrent batches and sessions.

• Used the Debugger to test the mappings and fix bugs. Created a Unit Testing document for the Informatica ETL routines.
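The star schema design referred to above (identifying facts and dimensions for the data marts) can be illustrated with a minimal DDL sketch. The table and column names (sales_fact, date_dim, product_dim) are hypothetical and not taken from the project.

```sql
-- Minimal star schema sketch: one fact table keyed by two dimensions.
-- All names are illustrative.
CREATE TABLE date_dim (
    date_key     NUMBER PRIMARY KEY,    -- surrogate key, e.g. 20090115
    cal_date     DATE NOT NULL,
    fiscal_month VARCHAR2(7)
);

CREATE TABLE product_dim (
    product_key  NUMBER PRIMARY KEY,    -- surrogate key from a sequence
    product_id   VARCHAR2(20) NOT NULL, -- natural key from the source system
    product_name VARCHAR2(100)
);

CREATE TABLE sales_fact (
    date_key     NUMBER NOT NULL REFERENCES date_dim (date_key),
    product_key  NUMBER NOT NULL REFERENCES product_dim (product_key),
    qty_sold     NUMBER,
    sale_amount  NUMBER(12,2)
);
```

In a snowflake variant the dimensions themselves would be normalized into further lookup tables; the fact table stays the same.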

• Company: Tatva Soft

Feb 2007 – Oct 2007

Role: Data Analyst

Description: Tatvasoft is one of the well-known software development firms in India. This project was developed for Prasant Engineering Mfg Co. It maintains a database for the entire production process, from raw material to warehouse management, and maintains daily work hours, production and stock.

Responsibilities:

• As an Oracle Implementation Technical Expert, worked on writing custom Forms and tuning custom reports using hints.

• Extensively involved in writing SQL queries (subqueries and join conditions) and PL/SQL programming.

• Designed and Developed Complex Forms, LOVs, Record Groups, Object Groups, Visual

Attributes, Menus, Reports, Graphics, Editors, Parameters, System Variables, Master-Details Forms

and PL/SQL Libraries.

• Worked as technical support for a team, performing technical and quality reviews of program code and PL/SQL blocks for optimization and maintaining standards and guidelines.

• Developed Database Triggers in order to enforce complicated business logic and

integrity constraints, and to enhance data security at database level.

• Developed custom forms to view and edit the data in custom interface tables and event handling tables using template forms.
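A database trigger of the kind described above, enforcing business logic and integrity at the database level, might look like the following sketch. The table names and the stock rule are hypothetical, not the client's actual code.

```sql
-- Hypothetical trigger enforcing a business rule at the database level:
-- reject stock issues that would drive the on-hand quantity negative.
-- Table and column names are illustrative.
CREATE OR REPLACE TRIGGER trg_stock_no_negative
BEFORE INSERT OR UPDATE ON stock_txn
FOR EACH ROW
DECLARE
    v_on_hand NUMBER;
BEGIN
    SELECT on_hand_qty INTO v_on_hand
    FROM   item_stock
    WHERE  item_id = :NEW.item_id;

    IF v_on_hand - :NEW.issue_qty < 0 THEN
        RAISE_APPLICATION_ERROR(-20001,
            'Issue quantity exceeds on-hand stock for item ' || :NEW.item_id);
    END IF;
END;
/
```

Row-level triggers like this complement declarative constraints when the rule spans more than one table.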


