
Data SQL Server

Location:
Frisco, TX
Salary:
$100,000.00
Posted:
March 28, 2017

Resume:

Sagar Porika

aczi2g@r.postjobfree.com

C: 972-***-****

INFORMATICA DEVELOPER

PROFESSIONAL SUMMARY

Nine years of diversified experience in Information Technology and data warehousing using Informatica PowerCenter/PowerExchange/B2B DX/IDQ 10.1/9.6/8.6/8.1/7.1/6.2 against a variety of target databases, developing strategies for Extraction, Transformation and Loading (ETL) using Informatica.

Extensive experience in the Banking, Finance, Pharmaceutical and Manufacturing industries.

Extensive experience in the complete Data warehouse project development life cycle and Software Development Life Cycle (SDLC), including System Requirements Analysis, Design, Development, Testing, Maintenance and Enhancement in a variety of technological platforms, with special emphasis on Data Warehouse and Business Intelligence applications.

Complete understanding of software development methodologies such as Waterfall and Agile.

Data Processing experience in designing and implementing Data Mart applications, mainly transformation processes using Informatica PowerCenter 8.6.

Extensively worked on Informatica PowerCenter Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter and Sequence Generator.

Good knowledge of OLAP and OLTP processes.

Understanding and working knowledge of Informatica CDC (Change Data Capture).

Proficiency in using the Informatica PowerCenter tool to design data conversions from a wide variety of sources.

Strong in data warehousing concepts and dimensional Star Schema and Snowflake Schema methodologies.

Extensively provided end-to-end 24/7 production support, monitoring data loads in a complex, high-volume environment using the QuickBase ticket-logging system.

Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star Schema and Snowflake Schema used in relational, dimensional and multi-dimensional modeling.

Proficient in using Informatica Workflow Manager and Workflow Monitor to create, schedule and control workflows, tasks, and sessions.

Experience in performance tuning of Informatica and Oracle.

Expertise in implementing complex Business rules by creating complex mappings / mapplets, shortcuts, reusable transformations and Partitioning Sessions.

Experience in Performance Tuning of sources, targets, mappings, transformations and sessions.

Extensively worked on L1 and L2 production support issues, analyzing issue priority, escalating to the L3 team with details, and following up on open items.

Providing Level 2 support for a Tier 0 application with a four-hour SLA. Liaised with L3 and DevOps teams when production issues occurred and triaged resolutions within the SLA window.

Responsible for setting up conference calls for groups to discuss production-related and schedule-related incidents.

Coordinated with development and support teams, and with the CAB team, to move fixes to production.

Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer and Teradata SQL Assistant.

Experience on utilities like BTEQ, FastLoad, MultiLoad, TPump, FastExport, TPT.

Experience in working with Informatica Admin team on installing Informatica server, power center client, B2B DX client in Development, QA and Prod machines.

Experienced working in an onshore/offshore support model with shift rotations.

Experience in working with production support, change management, incident management, and deployment management teams.

Experience in using Informatica Data Quality (IDQ) for Data Profiling, Standardization, Enrichment, Matching, and Consolidation transformations.

Experience in using the Informatica command-line utilities like pmcmd to execute workflows in non-Windows environments.

Extensively worked on various databases including Oracle 10g, SQL Server, DB2, and Teradata.

Experience in Integration of various data sources like Oracle, SQL Server, and Flat Files.

Extensively worked with SQL*Loader for bulk loading into Oracle databases (a sketch follows at the end of this summary).

Experience in MS SQL Server 2008/2005/2000, Business Intelligence in MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), and MS SQL Server Analysis Services (SSAS).

Experienced in designing high-level ETL architecture and framework; created, configured, and fine-tuned ETL workflows designed in DTS for overall data transfer from OLTP to OLAP with the help of SSIS.

Expert in designing and scheduling complex SSIS packages for transferring data from multiple data sources such as SQL Server 2000, Oracle, Excel, and flat files to SQL Server 2005/2008 using SSIS and the DTS utility.

Experience in UNIX shell scripting, CRON, FTP and file management in various UNIX environments.

Proficient in understanding business processes / requirements and translating them into technical requirements.

Highly adept at creation, execution, testing and debugging of Informatica mappings, mapplets, sessions, tasks, worklets and workflows in UNIX and Windows environment.

Ability to work independently, in a team environment, or in a fast-paced environment.

Possess strong interpersonal skills, communication and presentation skills with the ability to interact with people at all levels.
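
For illustration, a minimal sketch of the SQL*Loader bulk-loading pattern mentioned above; the table, file, and connection names are hypothetical placeholders, not taken from any specific project.

    # load_customers.sh -- minimal SQL*Loader bulk-load sketch (hypothetical names).
    # Writes a control file, then loads a comma-delimited flat file into Oracle.
    cat > customers.ctl <<'EOF'
    LOAD DATA
    INFILE 'customers.dat'
    APPEND
    INTO TABLE stg_customers
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (cust_id, cust_name, city)
    EOF

    # Direct-path invocation; bad file captures rejected records.
    sqlldr userid=etl_user/"$ETL_PWD"@ORCL control=customers.ctl \
           log=customers.log bad=customers.bad errors=50 direct=true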

TECHNICAL SKILLS

Data Warehousing ETL

Informatica PowerCenter 10.x/9.x/8.x/7.x/6.x/5.x (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor and Informatica Server), ETL, Informatica Data Quality (IDQ), IDE, Informatica B2B DX, Repository, Metadata, Data Mart, OLAP, OLTP, SAP BODS, DW, ECC, SLT.

SQL Server BI

SQL Server Management Studio, SQL Server Business Intelligence Development Studio, MS SQL Enterprise Manager, SQL Query Analyzer, SQL Server Profiler, Database Engine Tuning Advisor, DTS, SSIS, SSRS, SSAS, MS Visual Studio 2008, ODBC, FTP, IIS, MS Office 2000.

Data Modeling

Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Dimensions), Entities, Attributes, Cardinality, ER Diagrams, ERWIN 4.0/3.5.2/2.x.

Databases

Oracle 11g/10g/9i/8.0, Sybase, MS SQL Server 7.0/2000/2005/2008, DB2 8.1, OLTP, Teradata, Netezza, SAP.

Programming

SQL, T-SQL, PL/SQL, SQL*Loader, UNIX, Shell Scripting, SQL Tuning/Optimization, C, COBOL, HTML, Perl

Tools

TOAD, Toad for Data Analyst, SQL*Plus, SQL Developer, Excel, Word, Autosys Scheduler, DAC, BTEQ, SQL Assistant, TWS, BMC Control-M, Rapid SQL, Aginity for Netezza, PuTTY, WinSCP, Notepad++

Reporting Tools

Oracle Business Intelligence, Cognos, SAP Business Objects.

Environment

UNIX, Windows XP/Vista

PROFESSIONAL WORK EXPERIENCE

BOFA, Dallas, TX Nov ’15 – Present

Informatica Developer & Production Support Analyst

Description: Bank of America is an American multinational banking and financial services company that provides its products and services through ATMs, call centers, and online and mobile banking platforms. File Request Process (FRP) and Data Movement Services (DMS) are part of GWBT (Global Wholesale Banking), where FRP and DMS are critical middleware components for file-based processing of large corporate and commercial treasury volumes and for moving data.

Responsibilities:

Deep understanding of all technical components of the Informatica product suite (PowerCenter, PowerExchange, IDQ, B2B Data Exchange, B2B Data Transformation).

Coordinate with the offshore team to improve the applications so they run efficiently within the nightly schedule window and meet the business requirements.

Provide development support, including walkthrough approvals, to other ETL application resources to ensure that standards are followed and optimized workflows are implemented.

Coordinate and work with multiple development teams, outside vendors and other support areas to make improvements in the nightly schedules or correct bugs.

Provide on-call support as needed and work with internal and external resources to resolve production issues in a timely manner; ensure production environment stability and performance.

Analyze problems or issues with a range of complexity from limited to moderate, escalating the most complex. Collaborate with other IS technical groups as required on problem determination, solution implementation, and thorough documentation of progress and results.

Apply expert judgment and professional discretion to track ticket patterns, understand business system impacts, and determine priority and course of action for escalating tickets.

Manage and track all incidents and metrics for EIM queue, monitor and facilitate escalation of requests using ITSM software.

During weekend primary on-call shifts, coordinated implementation with upstream and downstream applications, including application updates, DB upgrades, CRQs, UNIX upgrades, and scheduled maintenance downtimes.

As a production support analyst, apart from working on-call issues, also worked on manually assigned ITSM incident tickets, such as moving Prod data to lower-level UAT environments as requested by clients.

Experienced in designing logical and physical models, and in walkthroughs of data flows and data mappings with business users.

Manage and own the application bug and incident life cycle, drive issues to permanent resolution, and implement preventive measures to avoid repetitive issues.

Experienced in taking complete responsibility of driving incidents/ major problems to the resolution and documenting the resolution in knowledge base.

Provided L1 and L2 support on multiple applications across the Data Management horizontal, meeting customer expectations on SLAs and process.

Communicate with L3 support teams in different regions and facilitate resolution of issues outside the primary application support area.

Investigate the underlying causes of day-to-day production issues and drive them to resolution independently.

Experienced in creating and maintaining Technical Design and Load Script documents, with workflow and mapping diagrams, covering modifications to existing code, defect fixes, and code written from scratch.

Worked with Teradata utilities like Teradata Parallel Transporter (TPT), BTEQ, Fast Load, Multi Load, Tpump and Worked with work tables and Teradata Stored procedures.

Development of scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata (see the FastLoad sketch at the end of this section).

Daily spot check of system health, monitoring the main jobs and File Request Process.

Responsible for 24 x 7 production system support, including off hours and weekend 'on call' production support responsibilities.

As a production support analyst, strengthened and drove Production Services service delivery capability, reinforcing SME knowledge in the team and across all regions.

Maintain and own the bug management and incident resolution process; communicate with stakeholders in case of delays or SLA breaches; resolve bugs within defined service levels.

Experience with Remedy ticketing system like Change Management and Incident Management.

Demonstrated experience with Microsoft Office Suite – Excel, Outlook, SharePoint etc.

Ability to perform root-cause analysis and issue resolution from an L1 and L2 production support perspective.

Developed DX workflows in the B2B DX environment, passing profile parameters and integrating with PowerCenter workflows.

Extensively worked in Informatica B2B, processing structured and unstructured data.

Worked as part of production support on a rotational basis, resolving root causes.

Carried out ETL monitoring, fixed issues, and tracked tickets.

Scheduled and monitored jobs, rerunning them if they aborted.

Responsible for Production Support to resolve the ongoing issues and troubleshoot the problems.

Informed stakeholders of any observed outage, assured them of a speedy resolution, and provided status updates on restoration progress.

Perform custom, ad-hoc data analysis and reporting based on stakeholder requests within SLA.

Excellent problem solving skills with strong technical background and good inter-personal skills, Quick learner and excellent team player.
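
A minimal sketch of the kind of Teradata FastLoad script referenced in the EDW load bullet above; the TDPID, credentials, database, table, and file names are hypothetical placeholders, not from the actual project.

    #!/bin/sh
    # fastload_stg_account.sh -- FastLoad driven from a shell heredoc.
    # All object and file names below are illustrative.
    fastload <<EOF
    LOGON tdprod/etl_user,$TD_PWD;
    BEGIN LOADING edw_stage.stg_account
          ERRORFILES edw_stage.stg_account_e1, edw_stage.stg_account_e2
          CHECKPOINT 100000;
    SET RECORD VARTEXT "|";
    DEFINE acct_id   (VARCHAR(18)),
           acct_name (VARCHAR(60)),
           open_dt   (VARCHAR(10))
    FILE = /data/in/account.dat;
    INSERT INTO edw_stage.stg_account (acct_id, acct_name, open_dt)
    VALUES (:acct_id, :acct_name, :open_dt);
    END LOADING;
    LOGOFF;
    EOF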

Environment: Informatica PowerCenter 9.6, Informatica B2B DX Console, Autosys Scheduler, DAC, Teradata 13.0, SQL Assistant, MS SQL Server 2008, SSIS, SSRS, T-SQL, UNIX, Perl Script, BMC Remedy, ITSM, SVN (Subversion), WinSCP, PuTTY, Netezza, Aginity, Toad for Data Analyst, Toad 11.6, Oracle 11g, SQL Developer, MS Excel, Flat files.

Honeywell, Tempe, AZ March ’14 – Oct ’15

Informatica Developer/Production Support

Description: Honeywell International, Inc. is an American multinational conglomerate that produces a variety of commercial and consumer products, engineering services, and aerospace systems. Designed and developed code using Informatica to move Aerospace data from SAP source systems to the Servigistics data warehouse system.

Responsibilities:

Worked closely with the business analysts and attended meetings to gather and discuss requirements for enhancements and defect fixes for various projects.

Designed and developed mappings and workflows to extract data from the existing SAP R/3 source system and load it into the Servigistics database.

Analyzed and Identified the Tables/Views, Procedures and ETL objects that need to be migrated and prepared necessary Documentation.

Worked on complete life cycle from Extraction, Transformation and Loading of data using Informatica.

Developed reusable ETL components like Transformations and Mapplets.

Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for the ETL jobs running in production in order to meet the SLAs.

Provided support and quality validation through test cases for all stages of Unit and Integration testing.

Worked with several facets of the Informatica PowerCenter tool - Source Analyzer, Data Warehousing Designer, Mapping & Mapplet Designer and Transformation Designer. Development of Informatica mappings for better performance.

Developed mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.

Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.

Configured the sessions using workflow manager to have multiple partitions on Source data and to improve performance. Understand the business needs and implement the same into a functional database design.

Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.

Involved in migrating objects to different environments using Deployment groups and Labels.

Prepared various documents like ETL Specification doc, Technical Design doc, Run Book etc.

Worked with Informatica Partitioning when dealing with huge volumes of data and worked on parallelism in loads by partitioning workflows using Pipeline, Round-Robin, Hash, Key Range and Pass-through partitions.

Scheduled and configured various daily and monthly ETL loads using the BMC Control-M Scheduler.

Worked with Teradata utilities like Teradata Parallel Transporter (TPT), BTEQ, Fast Load, Multi Load, Tpump and Worked with work tables, log tables, error tables in Teradata.

Worked on Teradata SQL, BTEQ, MLoad, FastLoad, and FastExport for ad-hoc queries, and built UNIX shell scripts to drive ETL interfaces using BTEQ, FastLoad or FastExport.

Wrote numerous BTEQ scripts to run complex queries on the Teradata database, using volatile tables and derived queries to break complex queries into simpler ones (see the sketch at the end of this section).

Experienced in configuring and monitoring Informatica jobs in the BMC Control-M scheduler.

Involved in all phases of SDLC from requirement, design, development, testing and support for production environment.

Resolved Production issues and helped in promoting the code to different environments for testing needs and participated in the deployment group.

Participated in project status meetings, peer reviews and other events with developers to track the status of deliverables.

Created and modified existing Technical Design Specification documents and other client deliverables for different releases.

Coordinated and interacted with other teams to maintain integrity in performing the tasks and execution of the loads appropriately in a timely manner.
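
To illustrate the volatile-table technique mentioned in the BTEQ bullet above, a minimal sketch follows; the database, table, and column names are hypothetical stand-ins.

    #!/bin/sh
    # bteq_orders_summary.sh -- breaks a complex query into steps via a volatile table.
    # edw.orders, edw.customer, and the columns below are illustrative names.
    bteq <<EOF
    .LOGON tdprod/etl_user,$TD_PWD

    /* Stage the expensive intermediate result once... */
    CREATE VOLATILE TABLE vt_recent_orders AS (
        SELECT order_id, cust_id, order_amt
        FROM   edw.orders
        WHERE  order_dt >= CURRENT_DATE - 30
    ) WITH DATA PRIMARY INDEX (cust_id) ON COMMIT PRESERVE ROWS;

    /* ...then reuse it in the final, simpler aggregate. */
    SELECT c.cust_name, SUM(v.order_amt) AS amt_last_30d
    FROM   vt_recent_orders v
    JOIN   edw.customer c ON c.cust_id = v.cust_id
    GROUP  BY 1;

    .LOGOFF
    .QUIT
    EOF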

Environment: Informatica Power Center 9.5, Oracle 11g, Teradata 12.0, SQL Assistant, MS SQL Server 2008, Toad 10.6, SQL Developer, BMC Control-M Scheduler, SAP R3 Connector, ECC and BW, UNIX, Notepad++, WinSCP, Putty, Flat files.

Volkswagen Group, Detroit, MI Aug ’13 – Feb ’14

Informatica Developer

Description: Volkswagen is a German automobile manufacturer and the original, top-selling marque of the Volkswagen Group, the biggest German automaker and the third largest automaker in the world. Developed ETL code to extract identified vehicle (New, CPO and Used), customer, and dealer data and load it into an MDM table. As part of the daily incremental load (CDC), all vehicle sale, customer, and dealer records added since the last run are selected. AOT, customer, and dealer extracts from MDM are sent to vendors. Data extraction, transformation, and loading are handled by the Informatica PowerCenter tool.

Responsibilities:

Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build Data Warehouse.

Participated in reviewing requirements during analysis, ensuring that sufficient detail was available going into the design phase.

Used Informatica Power Center to create mappings, sessions and workflows for populating the data into dimension, fact, and lookup tables simultaneously from different source systems (SQL server, Oracle, Flat files).

Worked with the Informatica PowerCenter Server, Repository Manager, Designer, Server Manager, and Informatica Server.

Performed customer address validation, standardization, enrichment, matching, and consolidation using Informatica Data Quality (IDQ), integrated with Informatica PowerCenter.

Performed daily incremental loads (CDC) in which all vehicle sale, customer update, and dealer records added since the last run are selected.

Implemented Change Data Capture (CDC), selecting rows whose last-update date is greater than or equal to the last successful workflow run time and less than the current workflow run time, as recorded in an audit table (see the sketch at the end of this section).

Scheduled various daily and monthly ETL loads using Tivoli Workload Scheduler (TWS).

Used transformations like Joiner, Expression, Connected and Unconnected lookups, Filter, Aggregator, Update Strategy, Rank, Sorter, Router and Sequence generator.

Worked with PMCMD to interact with Informatica Server from command mode and execute the shell scripts.

Tested the Informatica mappings and workflows against the specifications in the detailed design.

Participated in project status meetings, peer reviews and other events with developers to track the status of deliverables.

Created and modified existing Technical Design Specification documents and other client deliverables for different releases.

Proficient in importing/exporting large amounts of data from files to the EDW; developed the DW ETL scripts using BTEQ, stored procedures, and macros in Teradata.

Created numerous scripts with Teradata utilities BTEQ, MLOAD and FLOAD.

Extensively worked in the Performance Tuning of the programs, ETL Procedures and processes.

Developed Pre-Session and Post-Session UNIX scripts to automate the data load processes to target Data warehouse.

Involved in writing UNIX shell scripts to run and schedule batch jobs.

Developed mappings to extract data from SQL Server, Oracle, Flat files and load into Data warehouse using the Mapping Designer.

Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.

Identified sources, targets, mappings and sessions and tuned them to improve performance.
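
The audit-table CDC window described above can be sketched as the following source query; the source table, audit table, column, and workflow names are hypothetical stand-ins for illustration.

    #!/bin/sh
    # cdc_window.sh -- selects rows in [last successful run, current run).
    # vehicle_sale, etl_audit, and wf_mdm_daily are illustrative names;
    # SYSTIMESTAMP stands in for the current workflow run time.
    sqlplus -s etl_user/"$ETL_PWD"@SRCDB <<'EOF'
    SELECT v.*
    FROM   vehicle_sale v
    WHERE  v.last_updt_ts >= (SELECT last_success_run_ts
                              FROM   etl_audit
                              WHERE  wf_name = 'wf_mdm_daily')
      AND  v.last_updt_ts <  SYSTIMESTAMP;
    EOF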

Environment: Informatica PowerCenter 9.1/8.6, IDQ, UNIX, Perl Script, Tivoli Workload Scheduler, MS Access, WinSCP, PuTTY, DB2, Toad 10.6, Oracle 11g, Teradata 12.0, SQL Assistant, SQL Server, MS Excel, Q, Flat files, Ultra Edit, SAP.

Spirit Aerosystems, Tulsa, OK Aug ’12 – July ’13

ETL Developer

Description: Spirit Aerosystems' data conversion strategy is an ETL process to extract data from the legacy/mainframe systems, transform it, and load it into the Commercial Enterprise Resource Planning (CERP) system: conversion of historical data, master data, in-process transactional data, and point-in-time balances into SAP. Informatica is the ETL tool used to achieve data migration of the Tulsa, OK business unit's Material Master/Tool Master, BOMs, Routings, and Purchase Orders/Inventory as part of the ETL from legacy to CERP.

Responsibilities:

Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build Data Warehouse.

Participated in reviewing requirements during analysis, ensuring that sufficient detail was available going into the design phase.

Developed ETL mappings using Informatica PowerCenter 9.1/8.6, involving extraction and loading of data from flat files and relational databases such as MS SQL Server 2008 and Oracle 9i.

Worked with the Informatica PowerCenter Server, Repository Manager, Designer, Server Manager, and Informatica Server.

Used transformations like Joiner, Expression, Connected and Unconnected lookups, Filter, Aggregator, Update Strategy, Rank, Sorter, Router and Sequence generator.

Developed and scheduled workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager, and monitored the results in Workflow Monitor.

Tested the Informatica mappings and workflows against the specifications in the detailed design.

Participated in project status meetings, peer reviews and other events with developers to track the status of deliverables.

Created and modified existing Technical Design Specification documents and other client deliverables for different releases.

Worked with business analysts to identify appropriate sources for Data Migration and to document business needs for decision support data.

Worked with the DBA group, SAP Functional Consultants to extract the historical data into Flat Files to build data marts in MS SQL Server database.

Extensively worked in the Performance Tuning of the programs, ETL Procedures and processes.

Developed Pre-Session and Post-Session UNIX scripts to automate the data load processes to target Data warehouse.

Extensively used parameter files to avoid hard-coding in mappings, facilitating future changes and saving migration time (see the parameter-file sketch at the end of this section).

Developed mappings to extract data from SQL Server, Oracle, Flat files and load into Data warehouse using the Mapping Designer.

Extracted the data from MS SQL Server, applied data cleansing and transformation rules, and converted it into LSMW format to load into the SAP system.

Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.

Defects were tracked, reviewed and analyzed.

Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.

Scheduled jobs to call the packages and stored procedures as per business requirements.

Worked on SQL Server Integration Services (SSIS) to integrate and analyze data from multiple heterogeneous information sources (Oracle).

Created packages in SSIS with error handling and worked with different methods of logging in SSIS.

Created SSIS packages to move data from tab-delimited flat files into the SQL database.

Built dynamic SSIS packages that performed data scrubbing, including data validation.

Monitored the ETL production jobs and resolved issues/bugs.

Identified sources, targets, mappings and sessions and tuned them to improve performance.
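
The parameter-file usage called out above can be sketched as below; the folder, workflow, session, connection names, and paths are hypothetical. Externalizing connections and file paths this way is what avoids hard-coding and eases migration between environments.

    # make_param.sh -- writes a PowerCenter parameter file (illustrative names).
    cat > /opt/etl/param/wf_load_matmaster.param <<'EOF'
    [CERP_CONV.WF:wf_load_matmaster.ST:s_m_load_matmaster]
    $DBConnection_Src=Ora_Legacy
    $DBConnection_Tgt=SQLSrv_Mart
    $InputFile1=/data/in/matmaster.dat
    $$LoadDate=2013-06-30
    EOF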

Environment: Informatica PowerCenter 9.1, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer, MS SQL Server 2008, SSIS, SSRS, SSAS, DB2, Teradata, SQL Assistant, Oracle 11g, SQL Developer, MS Excel, Quality Center 10, SharePoint, Flat files, UNIX Shell Scripts, ManageSoft, VMWare Client 7, PuTTY, SAP, LSMW.

Freddie Mac, McLean, VA Feb ’12 - July '12

Informatica Developer

Description: The objective of the Multi Family Reporting and Data Management project is to establish the MF reporting infrastructure and to implement a pilot set of MF reports, using the future-state data architecture as a guide. Data marts are built to load Customer, Loan Plan, and Transaction data from different sources through the Informatica server, generating reports on a quarterly basis to support the business analysts.

Responsibilities:

Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build Data Warehouse.

Assisted in designing Logical/Physical Data Models, forward/reverse engineering using Erwin 4.0.

Developed mappings to extract data from SQL Server, Oracle, Flat files, XML files, and load into Data warehouse using the Mapping Designer.

Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.

Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.

Developed Slowly Changing Dimensions (SCD Type 1 and Type 2).

Used DTS packages to import and export data to and from the database.

Wrote queries and procedures, created indexes and primary keys, and performed database testing.

Defects were tracked, reviewed and analyzed.

Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.

Identified sources, targets, mappings and sessions and tuned them to improve performance.

Actively participated in database testing, such as checking constraints, correctness of the data, stored procedures, field size validation, etc.

Created UNIX shell scripts and automated ETL processes using crontab.

Monitored the ETL production jobs and resolved issues/bugs.

Attended the production on call support during weekends and during production deployments.

Used command-line mode to embed pmcmd commands in shell scripts, and used pmcmd and pmrep commands in interactive mode to access the repository and workflows, as sketched below.
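
A minimal sketch combining the pmcmd and crontab usage from the bullets above; the service, domain, folder, workflow names, and paths are hypothetical placeholders.

    #!/bin/sh
    # run_wf_mf_quarterly.sh -- cron-driven wrapper around pmcmd.
    # -wait blocks until the workflow finishes, so the exit code reflects its outcome.
    # INT_SVC_DEV, Domain_Dev, MF_REPORTING, and wf_mf_quarterly_load are illustrative.
    pmcmd startworkflow -sv INT_SVC_DEV -d Domain_Dev -u etl_user -p "$PM_PWD" \
          -f MF_REPORTING -wait wf_mf_quarterly_load
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "wf_mf_quarterly_load failed (rc=$rc)" | mailx -s "ETL FAILURE" oncall@example.com
    fi
    exit $rc

    # Example crontab entry: 02:00 on the 1st of each quarter-opening month.
    # 0 2 1 1,4,7,10 * /opt/etl/scripts/run_wf_mf_quarterly.sh >> /var/log/etl/mf.log 2>&1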

Environment: Informatica PowerCenter 8.6, Oracle 11g, SQL Developer, Sybase, Embarcadero Rapid SQL 7.7.2, PL/SQL, Flat files, UNIX Shell Scripts, Quality Center 9.5, PuTTY, MicroStrategy 8, Open Text Exceed 14, Lotus Notes 7.

First National Banks of Brewton, AL. Mar ’10 – Dec ’11

ETL Developer

Description: The FNBB Data Warehouse is a strategic initiative intended to re-invent the way First National Bank of Brewton designs, implements, and tracks its operations and to distribute up-to-date analysis and information to management for quick and effective decision-making. The Bank offers a breadth of services such as checking accounts, savings accounts, mortgage loans, personal loans, credit cards, safe deposit boxes, and ATMs. The Bank's initial goal is to better analyze and capture user actions at its various branches and ATM counters, and to analyze the transactions that take place there.

Responsibilities:

Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build Data Warehouse.

Designed and developed data maps using PowerExchange to extract the data from the existing mainframe source system.

Developed mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.

Extensively used Informatica Client tools -- Informatica Repository Manager, Informatica Designer, Informatica Workflow Manager and Informatica Workflow Monitor.

Involved in all phases of SDLC from requirement, design, development, testing and support for production environment.

Responsible for scheduling workflows, error checking, production support, maintenance and testing of ETL procedures using Informatica session logs.

Worked on developing workflows and sessions and monitoring them to ensure data is properly loaded on to the target tables.

Involved in unit testing, Integration testing and User acceptance testing of the mappings.

Used various performance enhancement techniques.


