Sumeet Lalvani
Cell: +1-732-***-**** Email: **********@*****.***
Summary:
Strong credentials in designing, implementing and supporting technical solutions for product-based and service-based IT organizations. Received multiple organization-wide appreciations for delivering cost-effective solutions that meet business requirements, with a track record of on-time delivery. Dynamic and accomplished professional with 12 years of experience in the Information Technology and Services industry.
Undergone professional training in Oracle 9i Forms.
Worked extensively on Informatica PowerCenter and IDQ.
Worked on Informatica, IDQ, PL/SQL programming, SQL querying and stored procedures, D2K, Java, JSP, HTML and JavaScript.
Good self-learning capabilities; a quick learner of new technologies, proving productive within a short period of time.
Undergone professional training in data warehouse concepts, Actuate 8 and BIRT (Java reporting tool) at Triumph System & Solution Pvt Ltd.
Good communication skills and an excellent team player.
Managed a team of 10-15 resources.
Dimensional data modelling experience: Erwin 4.0, the Ralph Kimball approach, star schema modelling, data marts, OLAP, fact and dimension tables, and physical and logical data modelling.
Strong knowledge of relational database concepts, entity-relationship diagrams, and normalization and denormalization.
Experience preparing ETL design documents, including high-level and low-level designs. Involved in unit testing, integration testing, system testing and UAT using Quality Center.
Education and Certifications:
MCA (Computer Applications), ICFAI University, India (2007) First Class
BCA (Computer Science), Bangalore University, India (2005) First Class
TOGAF 9 Certified (The Open Group)
Technical Skill Set:
Languages: Java, PL/SQL, SQL, Shell Scripts, Python, R
App/Web Servers: Tomcat
Databases: Oracle 9i, Oracle 10g, Netezza, Teradata, Greenplum
Web Technologies: JavaScript, HTML, XML, SOAP
Tools: Informatica PowerCenter 8.1.1, 8.6.1, 9.0.1, 9.5.1 and 10.2.1, IDQ 9.0.1, FCI (reporting tool), Jasper, Tableau, Actuate 8, BIRT, SQL Developer, PL/SQL Developer, TOAD, Aginity
Operating Systems: UNIX, Linux, Windows
Scheduling tool: Autosys
Professional Experience
I) NGA Group Inc. – Programmer Analyst (March 2014 – Present)
US Bank – Senior Software Engineer (Nov 2018 – Present)
Key Technologies: Informatica, Greenplum, SQL, R, Python, SOAP, APIs, MuleSoft and Tableau
Veritas Technologies LLC – Programmer Analyst (Feb 2018 – Nov 2018)
Key Technologies: Informatica, Oracle, Unix, OBIEE, BICC and cloud applications
PayPal Inc – Senior Data Quality Developer (Aug 2017 – Feb 2018)
Key Technologies: Informatica Data Quality, EIC and Teradata
Apple Inc – Senior Developer (Feb 2017 – Aug 2017)
Key Technologies: Informatica, Tableau, Oracle and Unix
Blue Shield of California – Lead Data Quality Developer (April 2016 – Jan 2017)
Key Technologies: Informatica, Informatica Data Quality and Netezza
Hyundai Capital - Onsite Technical Lead (Aug 2015 – March 2016)
Key Technologies: Informatica, Oracle and DAC
Verizon Telematics – Sr. Developer (May 2015 – July 2015)
Key Technologies: Informatica, Oracle and Unix
Fannie Mae – Sr. Developer (March 2014 – April 2015)
Key Technologies: Informatica, Oracle, Unix and Autosys
II) Cognizant Technologies – Technical Lead (Nov 2012 – Feb 2014)
Key Profile:
Worked as technical lead for ETL / Informatica PowerCenter and Oracle.
III) Zensar Technologies – Support Lead (May 2012 – Nov 2012)
Key Profile:
Worked as technical support lead for ETL / Informatica PowerCenter and Oracle.
IV) Wipro Technologies – Technical Lead (June 2010 – May 2012)
Key Profile:
Worked as technical lead for ETL / Informatica PowerCenter, IDQ, Oracle and Teradata.
V) Triumph System & Solution Pvt Ltd – Programmer (Nov 2008 – June 2010)
Key Profile:
Worked as programmer for ETL / Informatica PowerCenter, IDQ and Oracle.
VI) 3i-Infotech Pvt. Ltd. – Software Engineer (June 2008 – Nov 2008)
Key Profile:
Worked as senior developer for ETL / Informatica PowerCenter and Oracle (PL/SQL and D2K).
VII) Alchemist HR Services Pvt Ltd. – Senior Trainer (March 2007 – June 2008)
Key Profile:
Worked as developer for Oracle (PL/SQL and D2K), Java, JSP and Servlets.
Project Experience:
US Bank, San Francisco, California Nov 2018 – Present
Senior Software Engineer
Customer Authentication and Fraud Reduction
The objective of this project is to strengthen customer authentication in support of the digital customer experience and revenue-generating business processes, while enhancing risk controls and security. The first phase covers software development, vendor software, infrastructure, licensing and implementation costs to implement fraud reduction controls that protect the front door and can be leveraged by mobile and online banking, with further build-out and integration planned. The goal of this work is to consolidate the technical solutions, business processes, fraud analysis routines and customer identification experiences that provide improved customer authentication and fraud reduction. This first phase integrates authentication controls for customer and device identification and transaction behavioral analytics. The foundational layer provides cross-platform enablement, device blocking and device reputation details, and is the basis for reducing runaway losses to the Bank through specific account disablement capabilities as well as one-time passwords for high-risk transactions in the branches.
Responsibilities:
Involved in the full SDLC, including gathering requirements from business stakeholders and preparing design and technical specification documents.
Worked with the US Bank architecture team in architecture and design review sessions and prepared a detailed system architecture/design document.
Performed data extraction, transformation and quality checks through Informatica.
Developed complex ETL processes through Informatica and complex SQL queries.
Developed Informatica 10.2.1 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and Greenplum database objects as sources and targets.
Applied a high level of technical skill to complex ETL mappings, transformations and problems in Informatica.
Responsible for resolving large, complex system and programming problems.
Designed and developed analytical tables using SQL scripts and REST APIs to load data from MuleSoft into the Greenplum database, a massively parallel distributed system operating on petabytes of data across nodes, with source systems including Tesla, Salesforce, Microsoft Dynamics, Iovation and the digital mobile app (see the sketch after this list).
Developed dashboards and reports in Tableau for business stakeholders.
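As an illustration of the distribution design such analytical tables depend on, a minimal Greenplum DDL sketch; the table and column names are hypothetical, not from the actual project:

    -- Hypothetical analytical table: DISTRIBUTED BY spreads rows across
    -- Greenplum segment nodes so large scans and joins run in parallel (MPP).
    CREATE TABLE auth_events (
        event_id     BIGINT,
        customer_id  BIGINT,
        device_id    VARCHAR(64),
        source_sys   VARCHAR(32),   -- e.g. 'Salesforce', 'Iovation'
        event_ts     TIMESTAMP
    )
    DISTRIBUTED BY (customer_id);

    -- Downstream analytics then aggregate in parallel across segments.
    SELECT source_sys, COUNT(*) AS events
    FROM auth_events
    GROUP BY source_sys;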
Environment: 3-tier architecture.
Tools Used: Informatica PowerCenter 10.2.1, MuleSoft, REST/SOAP APIs, pgAdmin, Tableau
Database: Greenplum
Languages: SQL, Python, R
Veritas Technologies LLC, Santa Clara, California Feb 2018 – Nov 2018
Programmer Analyst
BICC Cloud Connector Application
The objective of this project is to extract data from BICC (Business Intelligence Cloud Connector) sites and Oracle ERP and send the processed data to OSC. Processed data from BICC and ERP is verified, and an OBIEE report is generated for quality assurance and presented to business stakeholders.
Responsibilities:
Analyzed source data to build data models by defining facts and dimensions.
Extracted, transformed and loaded data from transactional sources such as Oracle Sales Cloud and RevPro into the data warehouse using Informatica for OBIEE reporting.
Developed Informatica 9.6.1 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Developed test cases, performed unit testing and conducted peer reviews of all my mappings before moving them to production.
Simplified the existing PL/SQL procedures for competency calculation to ease maintenance, and validated the new logic for Channel Partner Net reporting.
Altered tables and created new tables for Oracle Service Cloud reporting enhancements.
Tuned performance by creating indexes and partitioned tables in the database and by creating aggregate tables using Informatica and procedures (a sketch follows this list).
Effectively involved in test data preparation and code review.
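A minimal Oracle sketch of this tuning approach, with hypothetical table and column names: range partitioning lets reporting queries prune to only the partitions they need, and a local index keeps index maintenance per-partition.

    -- Hypothetical reporting table, range-partitioned by date so queries
    -- that filter on order_date scan only the relevant partitions.
    CREATE TABLE osc_orders (
        order_id    NUMBER,
        region      VARCHAR2(20),
        order_date  DATE
    )
    PARTITION BY RANGE (order_date) (
        PARTITION p2018q1 VALUES LESS THAN (DATE '2018-04-01'),
        PARTITION p2018q2 VALUES LESS THAN (DATE '2018-07-01')
    );

    -- LOCAL index: one index segment per partition, maintained independently.
    CREATE INDEX osc_orders_region_ix ON osc_orders (region) LOCAL;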
Environment: 3-tier architecture.
Tools Used: Informatica PowerCenter 9.6.1, OBIEE, BICC Cloud, WinSCP, PuTTY, SQL Developer
Database: Oracle
PayPal Inc., San Jose, California Aug 2017 – Feb 2018
Senior Data Quality Developer
EDW Data Quality Uplift
This project establishes unified adoption of data standards, policies and tools across the enterprise. It provides a measurable uplift in governance, control and transparency for critical data elements used by all business functions; formulates an actionable roadmap toward data remediation and leaner data processes by moving data management into a common set of tools and standards that is scalable and has enterprise-wide adoption; and provides assurance and facilitates attestation for key internal and external stakeholders such as audit, regulators and senior leadership.
Scope:
The scope of the project covers current data gaps and works toward a roadmap to address them, especially as the enterprise transitions to the data lake, driving efficiencies and reducing redundancies related to multiple levels of scrubbing and re-validation of the same data set.
Responsibilities:
Created profiles and scorecards of the current system tables and files.
Implemented different types of transformations, including Address Validator (AddressDoctor), Standardizer, Merge and Consolidation, in Informatica Data Quality.
Developed DQ profiles, data analysis, data cleansing, data matching and error handling.
Identified the critical data elements for the business from the current system.
Identified and analyzed existing KPIs and data quality rules.
Analyzed the BTEQ scripts and the FastLoad and MultiLoad processes to identify the data flow in Teradata (an illustrative BTEQ sketch follows this list).
Designed the Informatica Data Quality framework for the project.
Played an extensive role in business requirement gathering, requirement analysis and data rules.
Performed performance tuning of mappings and the database.
Designed framework processes for data cleansing and quality checks (de-duplication, format checks, valid-value checks, enrichment), a data conforming system, an error event handler (exception handling), an auditing system (operational runtime statistics) and data archiving.
Designed generic processes outside the standard framework for requirements not fitting the standards.
Created low-level designs for ETL processes that are generic in nature and completely metadata-driven using Informatica (applied rules for required checks).
Implemented advanced Informatica features such as concurrent workflow execution to keep one code base and handle multiple feeds through metadata.
Executed multiple domains in the gateway across all phases of the SDLC, from requirement analysis to development, SIT, UAT and production.
Developed mappings/workflows/deployed applications using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
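For context, a minimal sketch of the kind of BTEQ script analyzed here (connection values, database and table names are placeholders): BTEQ wraps Teradata SQL in control commands that log on, run the SQL and set the job's exit code.

    .LOGON tdpid/etl_user,password;        -- placeholder connection values

    -- Move rows staged by FastLoad/MultiLoad into the warehouse table.
    INSERT INTO edw.customer_dim
    SELECT * FROM stg.customer_stage;

    .IF ERRORCODE <> 0 THEN .QUIT 8;       -- fail the job on any SQL error
    .LOGOFF;
    .QUIT 0;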
Environment: 3-tier architecture.
Tools Used: Informatica Data Quality 10.0.1, EIC (Enterprise Information Catalog), Axon, Teradata SQL Assistant, Toad.
Database: Teradata, Oracle
Apple Inc., Sunnyvale, California Feb 2017 – Aug 2017
Senior Developer
Manufacturing Quality Management (MQM)
The objective of this project is to extract data from CM sites and send the processed data to PDCA. Processed data from PDCA is verified, and a Tableau report is generated for quality assurance and presented to business stakeholders.
Responsibilities:
Developed Informatica 9.6.x mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Developed test cases, performed unit testing and conducted peer reviews of all my mappings before moving them to production.
Used Tableau to generate reports for the business.
Designed the warehouse for data quality rules using Unix shell scripts.
Created data quality rules to identify data patterns (an illustrative check follows this list).
Effectively involved in test data preparation and code review.
Attended daily CM (contract manufacturer) calls for product discussion.
Actively involved in defect resolution in the UAT and production phases.
Handled the onshore-offshore (China and India) module.
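A minimal example of such a pattern rule expressed in SQL; the table, column and expected format are hypothetical, used only to show the shape of the check:

    -- Hypothetical data-pattern rule: serial numbers are expected to be
    -- two uppercase letters followed by ten digits; report violations.
    SELECT serial_no
    FROM   mqm_units
    WHERE  NOT REGEXP_LIKE(serial_no, '^[A-Z]{2}[0-9]{10}$');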
Environment: 3-tier architecture.
Tools Used: Informatica PowerCenter 9.6.x, Tableau 10.1
Database: Oracle
Blue Shield of California, San Francisco, CA April 2016 – Jan 2017
Data Quality Developer
Claim Book of Record
The objective of this project is to implement data quality rules to manage the data in the warehouse and to generate reports that help the business manage data smoothly.
Responsibilities:
Designed the warehouse for data quality rules.
Created data quality rules to identify data patterns.
Developed data profiling rules in the Informatica DQ tool.
Created scorecards from profiles and sent the reports to the business.
Created rules in the Informatica Data Quality tool.
Developed Informatica 9.6.0 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Developed test cases, performed unit testing and conducted peer reviews of all my mappings before moving them to production.
Effectively involved in test data preparation and code review.
Environment: 2-tier architecture.
Tools Used: Informatica PowerCenter 9.6.0, IDQ and Aginity.
Database: Netezza 7.0.x
Hyundai Capital, Irvine, CA Aug 2015 – March 2016
Onsite Technical Lead
Computer Telephony Integration
The objective of this project is to implement CTI technology that allows interaction between HCA's telephony and computer systems, merging the telephony call with the back-end data repository. Customer account information is presented in a screen view/dashboard to the call center agent.
Responsibilities:
Worked on Oracle Siebel with the call center application.
Worked with Interactive Voice Response (IVR), an automated telephony system that interacts with callers, gathers information and routes calls to the appropriate recipient.
Developed Informatica 9.5.1 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Developed test cases, performed unit testing and conducted peer reviews of all my mappings before moving them to production.
Effectively involved in test data preparation and code review.
Attended daily client calls for project discussion.
Actively involved in defect resolution in the UAT and production phases.
Handled the onshore-offshore module.
Scheduled jobs through DAC.
Environment: 3-tier architecture.
Tools Used: Informatica PowerCenter 9.5.1, OBIEE (reporting tool), SQL Developer, DAC.
Database: Oracle 10g and SQL Server
Verizon Telematics, Atlanta, GA May 2015 – July 2015
Senior Programmer Analyst
Verizon Telematics, a separate business wing of Verizon, is a leading telematics device provider in the market. As part of a strategic roadmap focused on enhancing operational excellence and improving the customer experience, VTI is undertaking various enterprise data management initiatives aimed at implementing a robust Customer Relationship Marketing solution. In line with this roadmap, the MDM project is one such initiative, sponsored by Sales and Marketing, that focuses on providing an in-depth understanding of the customer in support of the Sales and Marketing objective of enhancing customer acquisition and retention.
Responsibilities:
Designed and developed PowerCenter/IDQ solutions.
Played an extensive role in business requirement gathering, requirement analysis and data-rule creation in IDQ and PowerCenter development.
Configured relationships among tables to obtain lookups.
Performed performance tuning of mappings and the database.
Worked on reference tables.
Obtained golden records using the Informatica Master Data Management Hub Console.
Loaded the landing tables through ETL jobs and IDQ mappings and workflows.
Created mapping applications in IDQ to load landing data.
Developed Informatica 9.6.1 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Created stored procedures (a minimal sketch follows this list).
Developed test cases, performed unit testing and conducted peer reviews of all my mappings before moving them to production.
Effectively involved in test data preparation and code review.
Attended daily defect calls for project discussion.
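As an illustration of the stored procedures created in this kind of MDM load, a minimal PL/SQL sketch; the procedure and table names are hypothetical:

    -- Hypothetical post-load cleanup: remove a processed batch from the
    -- landing table once its records have been consolidated in the hub.
    CREATE OR REPLACE PROCEDURE purge_landing_batch (p_batch_id IN NUMBER) AS
    BEGIN
        DELETE FROM mdm_landing
        WHERE  batch_id = p_batch_id;
        COMMIT;
    END purge_landing_batch;
    /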
Environment: 3-tier architecture.
Tools Used: Informatica PowerCenter 9.6.1, MDM 9.6.1, IDQ, SQL Developer, IDD, FTP, SFTP, shell scripts, text editor.
Database: Oracle 11g
Fannie Mae, VA March 2014 – April 2015
Sr. Programmer Analyst
Fannie Mae security master data is an essential input for various business functions within Fannie Mae. Currently this data is sourced from various internal and external sources and stored in multiple data stores within Fannie Mae.
The objectives of Security Master are to:
Implement an enterprise solution, aligned to the future-state enterprise data architecture, to house the security master data required to support all business data within Fannie Mae.
Implement a service-oriented vending mechanism that provides the security master data required to support various business processes within Fannie Mae.
Consolidate various sources of security master data into a single version of truth using Master Data Management principles.
Consolidate the different data stores across Fannie Mae that store security master data, with the goal of having enough data to support retirement of the legacy systems over time.
Responsibilities:
Performed requirement analysis through business and functional models.
Developed Informatica 9.5.1 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Performed data profiling in the Analyst and Developer tools and improved the steps for analyzing, profiling, validating and cleansing data.
Created rules per business requirements (critical and warning) and used out-of-the-box rules.
Combined data quality rules with data transformation logic, with the ability to conduct profiling even midstream.
Created mapplets per business requirements and used them as rules.
Created the scorecards (an illustrative scorecard-style query follows this list).
Worked on reference tables.
Created stored procedures.
Developed test cases, performed unit testing and conducted peer reviews of all my mappings before moving them to production.
Wrote the high-level/low-level design documents, application control documents and interface control documents in DOORS.
Created run books and release notes for code migration and performed migration through the ClearCase tool.
Effectively involved in test data preparation and code review.
Attended daily defect calls for project discussion.
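For illustration, a scorecard-style completeness check expressed in SQL; the table and column names are hypothetical, not Fannie Mae's actual schema:

    -- Hypothetical completeness score for a critical data element:
    -- percentage of security records with a populated identifier.
    -- COUNT(security_id) counts only non-null values.
    SELECT ROUND(100 * COUNT(security_id) / COUNT(*), 2) AS completeness_pct
    FROM   security_master;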
Environment: 3-tier architecture.
Tools Used: Informatica PowerCenter 9.5.1, MDM, IDQ, Toad, FTP, SFTP, shell scripts, text editor.
Database: Oracle 11g, Netezza 7.0.x
GHH Managed Dev Services (Merck, USA) Nov 2012 – Feb 2014
Tech Lead
As part of this program, Merck has engaged ZS to implement the Javelin Affiliation Manager and Javelin Alignment Management solution modules of the ZS Javelin Platform (licensed separately to Merck), and VEEVA to implement the new front-end Sales Force Automation (SFA) tool. Integration between these two vendor solutions enables an overall solution to support Merck's future-state Flex Program capabilities. As part of the integration, Merck sends data to Javelin to initialize the system.
The objectives of the project are to:
Reduce operational risk through process change, improved data quality and automation.
Improve reporting and analysis capabilities, reliability and timeliness.
Improve the quality, accuracy and consistency of all data coming into the business by providing a unified capability for data acquisition and proactive management of upstream SLAs.
Reduce the time and cost of data cleansing activities by doing them once as the data model is created, and eliminate redundant and complex data sourcing logic in multiple applications.
Responsibilities:
Performed dimensional data modeling, including conceptual, physical and logical data models.
Made changes to the LLD (low-level design) and PDM (physical data model) and handled new CRs.
Performed requirement analysis through business and functional models.
Parsed unstructured data from source to target files.
Developed Informatica 9.0.1 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Performed performance tuning.
Created stored procedures.
Developed test cases, performed unit testing and conducted peer reviews of all my mappings before moving them to production.
Effectively involved in test data preparation and code review.
Attended daily client calls for project discussion.
Actively involved in defect resolution in the SIT and UAT testing phases.
Environment: 2-tier architecture.
Tools Used: Informatica PowerCenter 9.0.1, SQL Developer, FTP, SFTP, shell scripts, text editor.
Database: Oracle 10g
Cree Incorporated May 2012 – Nov 2012
Support Lead
Maintained daily transaction processing.
The objectives of the project are to:
Reduce operational risk through process change, improved data quality and automation.
Improve reporting and analysis capabilities, reliability and timeliness.
Improve the quality, accuracy and consistency of all data coming into manufacturing by providing a unified capability for data acquisition and proactive management of upstream SLAs.
Reduce the time and cost of data cleansing activities by doing them once as the data model is created, and eliminate redundant and complex data sourcing logic in multiple applications.
Responsibilities:
Worked with Oracle EBS/Oracle Apps base tables in modules such as PO, PA and AR (an example query follows this list).
Developed Informatica 8.1.1 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Developed test cases, performed unit testing and conducted peer reviews of all my mappings before moving them to production.
Altered stored procedures and packages.
Effectively involved in test data preparation and code review.
Attended daily client calls for project discussion.
Actively involved in defect resolution in the SIT and UAT testing phases.
Monitored the DAC dashboard.
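As an example of working against EBS purchasing base tables, a small query sketch; PO_HEADERS_ALL and PO_LINES_ALL are standard Oracle EBS tables, while the filter and selected columns are illustrative:

    -- Recent purchase orders with their line prices from EBS base tables.
    SELECT h.segment1 AS po_number,
           l.line_num,
           l.unit_price
    FROM   po_headers_all h
    JOIN   po_lines_all   l ON l.po_header_id = h.po_header_id
    WHERE  h.creation_date >= SYSDATE - 30;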
Environment: 3-tier architecture.
Tools Used: Informatica PowerCenter 8.6.1, OBIEE (reporting tool), SQL Developer, shell scripts.
Database: Oracle 10g
Wipro Technologies, Technical Lead Oct 2010 – May 2012
Client: DCM CCP (JP Morgan, USA)
Derivatives Collateral Management wants to reduce reporting costs.
The objectives of the project are to:
Reduce operational risk through process change, improved data quality and automation.
Improve reporting and analysis capabilities, reliability and timeliness.
Improve the quality, accuracy and consistency of all data coming into Finance by providing a unified capability for data acquisition and proactive management of upstream SLAs.
Reduce the time and cost of data cleansing activities by doing them once as the data model is created, and eliminate redundant and complex data sourcing logic in multiple applications.
Responsibilities:
Performed data profiling in IDQ 9.0.1.
Performed dimensional data modeling, including conceptual, physical and logical data models.
Prepared the HLD (high-level design) and LLD (low-level design).
Created the PDM (physical data model).
Performed requirement analysis through business and functional models.
Developed metadata per the data model for project work.
Performed performance tuning.
Developed Informatica 9.0.1 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Created Oracle functions and procedures.
Developed test cases, performed unit testing and conducted peer reviews of all my mappings before moving them to production.
Effectively involved in test data preparation and code review.
Attended daily client calls for project discussion.
Actively involved in defect resolution in the SIT and UAT testing phases.
Environment: 2-tier architecture.
Tools Used: Informatica PowerCenter 9.0.1, IDQ 9.0.1, BIRT, FCI, Jasper (reporting tools), SQL Developer, Netezza, shell scripts, Java, JavaScript, Autosys
Database: Oracle 10g, Teradata, DB2
Wipro Technologies, Senior Software Engineer June 2010 - Oct 2010
Client: FIN IT Gateway (Credit-suisse, USA)
Finance IT has a large volume of data coming from a variety of front- and back-office systems in multiple formats, with mixed data quality and sometimes overlapping datasets. The Financial Systems Initiative (FSI) is a transformation initiative, with intense investment, to provide a robust framework for the future.
The objectives of FSI are to:
Reduce operational risk through process change, improved data quality and automation.
Improve reporting and analysis capabilities, reliability and timeliness.
Improve the quality, accuracy and consistency of all data coming into Finance by providing a unified capability for data acquisition and proactive management of upstream SLAs.
Reduce the time and cost of data cleansing activities by doing them once as the data enters the gateway, and eliminate redundant and complex data sourcing logic in multiple applications.
Responsibilities:
Performed requirement analysis through business and functional models.
Developed metadata per the data model for project work.
Developed Informatica 8.6 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Java Transformation, HTTP Transformation, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Improved performance by making all dimension lookup caches persistent and reusing those caches for the fact loads.
Effectively involved in test data preparation and code review.
Attended weekly client calls and video conferences for project discussion.
Actively involved in defect resolution in the SIT and UAT testing phases.
Environment: 2-tier architecture.
Tools Used: Informatica PowerCenter 8.6.1, SQL Developer, Java, shell scripts, Autosys.
Database: Oracle 10g
Parent Company – Triumph System & Solution Pvt Ltd
Wipro Technologies – Senior Software Engineer Nov 2008 – June 2010
Client: J.P Morgan Chase (USA)
The project involved cleansing data and storing it in staging tables, and creating reports from the CCMS (AS400, Oracle) database via Microsoft Excel and BIRT, run through FCI (Foundation Components Intelligence), JPMorgan's report generation tool. I wrote the queries, generated the reports in Microsoft Excel and BIRT, and uploaded the templates we generated from offshore.
Responsibilities:
Performed data profiling in IDQ.
Performed requirement analysis through business and functional models.
Worked as a developer across all modules.
Responsible for timely development and release of modules in phases.
Involved in problem solving and maintained constant interaction with the onsite and configuration teams for speedy closure of development-related issues.
Developed metadata per the data model for project work.
Performed performance tuning of Informatica and the database.
Developed Informatica 8.6 mappings/workflows using transformations such as Expression, Router, Update Strategy, connected and unconnected Lookups, Aggregator, Sorter and Sequence Generator, with files and database objects as sources and targets.
Created Oracle functions and stored procedures.
Developed test cases, performed unit testing and conducted peer reviews of all my mappings before moving them to production.
Effectively involved in test data preparation and code review.
Attended weekly client calls for project discussion.
Actively involved in defect resolution in the SIT and UAT testing phases.
Environment: 2-tier architecture.
Tools Used: Informatica PowerCenter 8.6.1, IDQ, FCI, BIRT, Jasper (reporting tools), SQL Developer.
Database: Oracle 10g, Teradata, Netezza and DB2
3i-Infotech Pvt. Ltd., Senior Software Engineer June 2008 – Nov 2008
Client: American Physician