
NAGARATNA

Informatica Developer

adfd2o@r.postjobfree.com

Cell: 972-***-****

Location: Frisco, TX

Professional Summary:

Over 8 years of IT experience in all phases of the Software Development Life Cycle (SDLC), including user interaction, business analysis/modeling, design, development, integration, planning, testing, and documentation for data warehouse applications, ETL processing, and distributed applications.

Excellent domain knowledge of banking and financial services, manufacturing, and insurance.

Strong expertise in using the ETL tool Informatica PowerCenter 9.6.1/9.0.1/8.x (Designer, Workflow Manager, Repository Manager) and ETL concepts.

Extensive experience with data extraction, transformation, and loading (ETL) from disparate data sources: multiple relational databases (Oracle 12c/11g/10g, Teradata 14/13, Netezza, UDB DB2, mainframes, SQL Server 2012, SAP, Sybase, Informix, MySQL), flat files, MQ Series, and XML.

Created custom models in Informatica Metadata Manager for data governance and impact analysis.

Experience in Informatica PowerCenter with WebService Sources and Targets.

Strong experience with Informatica tools using real-time Change Data Capture.

Worked with various transformations such as Normalizer, Expression, Rank, Filter, Aggregator, Lookup, Joiner, Sequence Generator, Sorter, SQL, Stored Procedure, Update Strategy, Source Qualifier, Transaction Control, Java, and Union, as well as CDC.

Worked with Teradata utilities such as FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter; highly experienced in Teradata SQL programming.

Experienced in Teradata Parallel Transporter (TPT); used full pushdown optimization (PDO) on Teradata and worked with different TPT load operators.
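
For illustration, below is a minimal FastLoad sketch of the bulk-load pattern these utilities support, invoked from a shell wrapper. The TDPID, credentials, and table/file names are hypothetical placeholders, not values from the projects listed here.

    #!/bin/sh
    # Minimal FastLoad sketch: bulk-load a pipe-delimited flat file into an
    # empty Teradata staging table. All names below are hypothetical.
    fastload <<'EOF'
    LOGON tdpid/etl_user,etl_pass;
    DATABASE stg_db;
    BEGIN LOADING stg_db.customer_stg
        ERRORFILES stg_db.customer_err1, stg_db.customer_err2;
    SET RECORD VARTEXT "|";
    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(100))
        FILE = /data/in/customer.dat;
    INSERT INTO stg_db.customer_stg (cust_id, cust_name)
        VALUES (:cust_id, :cust_name);
    END LOADING;
    LOGOFF;
    EOF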

Designed and developed Informatica mappings, including Slowly Changing Dimensions (SCD) Types 1 and 2.
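
The Type 2 pattern above reduces to "expire, then insert." A minimal SQL sketch of that logic follows, wrapped in a SQL*Plus call the way a batch job might run it; the dim_customer/stg_customer tables and tracked column are hypothetical, and an actual Informatica mapping would express the same logic with Lookup, Expression, and Update Strategy transformations.

    #!/bin/sh
    # Minimal SCD Type 2 sketch (Oracle SQL). All object names hypothetical.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    -- Step 1: close out current rows whose tracked attribute changed.
    UPDATE dim_customer d
       SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.cust_id = d.cust_id
                      AND s.address <> d.address);

    -- Step 2: insert a fresh current row for new and changed customers alike
    -- (after step 1, neither group has a current row).
    INSERT INTO dim_customer
           (cust_key, cust_id, address, eff_start_dt, eff_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.address,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                        WHERE d.cust_id = s.cust_id
                          AND d.current_flag = 'Y');

    COMMIT;
    EXIT;
    EOF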

Experienced in using advanced Informatica concepts such as pushdown optimization (PDO).

Validated data files against their control files and performed technical data quality checks to certify source files for use.
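
A control-file check of this kind usually reduces to comparing the record count declared by the source against the count actually received. A minimal shell sketch, assuming a hypothetical one-line control file of the form <filename>|<record_count>:

    #!/bin/sh
    # Certify a data file only when its record count matches the count
    # declared in its control file. The control-file layout is a
    # hypothetical convention; real feeds vary.
    DATA_FILE=/data/in/customer.dat
    CTL_FILE=/data/in/customer.ctl

    expected=$(cut -d'|' -f2 "$CTL_FILE")
    actual=$(wc -l < "$DATA_FILE")

    if [ "$actual" -eq "$expected" ]; then
        echo "OK: $DATA_FILE certified ($actual records)"
    else
        echo "FAIL: $DATA_FILE expected $expected records, found $actual" >&2
        exit 1
    fi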

Very good data modeling knowledge: dimensional data modeling, star schema, snowflake schema, and fact and dimension tables.

Experienced in writing SQL and PL/SQL: stored procedures, packages, functions, triggers, views, and materialized views.
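
As a small illustration of the PL/SQL side, here is a minimal data-movement stored procedure created through SQL*Plus; all object names are hypothetical.

    #!/bin/sh
    # Create a simple procedure that moves staged rows into a history table.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    CREATE OR REPLACE PROCEDURE load_orders_hist AS
    BEGIN
      -- Move the day's staged rows into the history table.
      INSERT INTO orders_hist (order_id, order_dt, amount)
      SELECT order_id, order_dt, amount
        FROM orders_stg;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;   -- surface the error to the calling job
    END load_orders_hist;
    /
    SHOW ERRORS
    EXIT;
    EOF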

Experience in Performance Tuning and Debugging of existing ETL processes.

Performed Change Data Capture using PowerExchange.

Hands-on experience in writing UNIX shell scripts to process data warehouse jobs.
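
A typical script of this kind wraps Informatica's pmcmd command-line client and propagates the workflow's exit status to the scheduler. A minimal sketch, with hypothetical service, domain, folder, and workflow names:

    #!/bin/sh
    # Start an Informatica workflow with pmcmd, wait for it, and pass the
    # result back to the scheduler. Credentials would normally come from a
    # secured parameter file rather than the script itself.
    pmcmd startworkflow \
        -sv IntSvc_DEV -d Domain_DEV \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f DW_FOLDER -wait wf_load_sales
    rc=$?

    if [ $rc -ne 0 ]; then
        echo "wf_load_sales failed (pmcmd rc=$rc)" >&2
        exit $rc
    fi
    echo "wf_load_sales completed successfully"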

Coordinated with business users, the functional design team, and the testing team during the different phases of project development and resolved issues.

Good knowledge of SAS.

Good skills in defining standards, methodologies and performing technical design reviews.

Excellent communication skills, interpersonal skills, self-motivated, quick learner, team player.

Education:

Bachelor of Computer Applications, IGNOU, India, 2008.

Technical Skills:

ETL Tools

Informatica PowerCenter (9.6.1/9.0.1/8.x/7.x), PowerExchange 9.6.1/9.0.1, Informatica Data Quality 9.6.1, PowerConnect for SAP BW, PowerConnect for JMS, PowerConnect for IBM MQ Series, PowerConnect for Mainframes, DTS

Languages

C, C++, SQL, PL/SQL, HTML, XML, UNIX Shell Scripting, Java

Methodology

Ralph Kimball’s Star Schema and Snowflake Schema

Databases

Oracle 11g/10g, SQL Server 2012/2008/2005/2000, DB2, Teradata 14/13, UDB DB2, Sybase

Operating Systems

Windows NT, 2003, 2007, UNIX, Linux

IDEs

PL/SQL Developer, TOAD, Teradata SQL Assistant

Modelling Tool

ERwin 9.5.2/7.3/4.1, MS Visio 2013, UML

Scheduling Tools

Control-M, Autosys

Reporting

Tableau 9.2, Cognos 9, MicroStrategy 9

Others Tool

JIRA, Notepad++, Teradata SQL Assistant, Teradata Viewpoint, MS Office, T-SQL, TOAD, SQL Developer, XML files, IceScrum, Control-M, Autosys, GitHub, Oracle ERP, PuTTY, SharePoint, SVN, SOAP, WSDL

Professional Experience

Toyota Financial Services Oct 2019 to Present

Informatica Developer

Description: Toyota Financial Services auto loans are available to those who plan to buy or lease a Toyota from a participating dealership.

The objective of this project is to enable an improvement in overall financial performance by providing the reporting and analysis capabilities necessary to support Toyota's growth and the resulting business process changes.

Responsibilities:

●Gather requirements and implement them in source-to-data-lake mappings.

●Integrate relational sources such as SQL Server and MS Access and non-relational sources such as flat files into an AWS S3 bucket.

●Move data from the data lake to the reporting team.

●Design custom reports via SQL Server Reporting Services to align with requests from internal account teams and external clients.

●Work on Mapping Designer, Workflow Manager and Workflow Monitor.

●Use shortcuts for sources, targets, transformations, mapplets, and sessions to reuse objects without creating multiple objects in the repository and to inherit changes made to the source automatically.

●Apply Slowly Changing Dimensions Type I and Type II per business requirements.

●Work on performance tuning and on isolating header and footer records in a single file (see the shell sketch at the end of this list).

●Design and execute test scripts and test scenarios, reconciling data between multiple data sources and systems.

●Participate in requirements gathering, design, testing, project coordination, and migration.

●Handle project planning and scoping, facilitate meetings for project phases, deliverables, escalations, and approvals, and ensure adherence to the SDLC and project plan.

●Interpret session error logs and use the Debugger to test mappings and fix bugs in DEV.

●Document mappings precisely in the ETL technical specification document at all stages for future reference.

●Create requirement specification documents, user interface guides, functional specification documents, ETL technical specification documents, and test cases.

●Extensively use Router, Lookup, Aggregator, Expression, and Update Strategy transformations.

●Fine-tune ETL processes by considering mapping and session performance issues.
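
Regarding the header/footer isolation mentioned above: the usual approach is to split a single incoming file into header, trailer, and detail pieces before the detail records are processed. A minimal shell sketch with hypothetical file names:

    #!/bin/sh
    # Split one incoming file into header, trailer, and detail pieces so the
    # detail records can be processed alone.
    IN=/data/in/sales_feed.dat

    head -1 "$IN"     > /data/work/sales_header.dat    # first record only
    tail -1 "$IN"     > /data/work/sales_trailer.dat   # last record only
    sed '1d;$d' "$IN" > /data/work/sales_detail.dat    # everything in between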

Verizon, Irving, TX Jul 2018 to Aug 2019

Informatica / Teradata Developer

Description: Verizon is a telecommunications company which offers wireless and wireline products and services.

This project involves building a data warehouse by consolidating data from a variety of systems into a central data warehouse and storing historical information. This helped management quickly identify trends and respond to changing market and economic conditions while achieving its continued goals of excellent customer service and solid performance. The scope of this project is to build a new analytical layer, a data mart to serve Business Sales reporting.

Responsibilities:

Involved in design and development of data warehouse environment, interacted with business users and technical teams for gathering requirements.

Designed mappings in Informatica Designer that populated data into the target system on Teradata.

Used Teradata SQL Assistant to query Teradata tables.

Optimized Query Performance, Session Performance and Reliability.

Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and Lookup (connected and unconnected), along with mapping parameters, session parameters, mapping variables, and session variables.

Responsible for creating workflows and worklets; created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.

Tuned mappings for optimum performance, dependencies, and batch design.

Conducted code walkthroughs with team members.

Customized UNIX wrapper scripts for ETL job runs.

Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.

Scheduled the batches to be run using the Workflow Manager.

Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the issues so that they conform to the business needs.

Performed Unit testing and Integration testing on the mappings in various schemas.

Monitored sessions that were scheduled, running, completed or failed. Debugged mappings for failed sessions.

Raised change requests; analyzed and coordinated resolution of program flaws and fixed them in the DEV and pre-production environments during subsequent runs, and in PROD.

Wrote BTEQ scripts for the Informatica ETL tool to run in sessions.
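
For context, a BTEQ script run from a session's command task typically logs on, executes set-based SQL, and signals failure through its return code. A minimal sketch; the TDPID, credentials, and tables are hypothetical:

    #!/bin/sh
    # Rebuild a daily aggregate in Teradata, quitting with a nonzero return
    # code if either statement fails so the calling job can react.
    bteq <<'EOF'
    .LOGON tdpid/etl_user,etl_pass;

    DELETE FROM dw_db.sales_daily WHERE load_dt = CURRENT_DATE;
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    INSERT INTO dw_db.sales_daily (store_id, load_dt, amount)
    SELECT store_id, CURRENT_DATE, SUM(amount)
    FROM   stg_db.sales_stg
    GROUP BY store_id;
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;
    EOF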

Excellent technical and analytical skills, with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.

Loaded data into dimension as well as fact tables.

Assist with ETL processes, dimensional modeling and OLAP cube development.

Fixed the issue in DEV and Production environments, during the subsequent runs.

Working knowledge of the Cognos BI reporting tool.

Proficient in Active Reports.

Constantly monitored application attributes to ensure conformance to functional specifications.

Mentored the development members on ETL logic and performed code and document reviews.

Environment: Informatica PowerCenter 9.6.1, Teradata 15, BTEQ scripts, Teradata SQL Assistant, UNIX, SQL, HP Quality Center, SQL Server, PuTTY

Nationwide Insurance, OH Jan 2017 to Jul 2018

Informatica Developer/ Production support

Description: Nationwide Insurance Company is a group of large U.S. insurance and financial services companies; it is ranked 91st in the most recent Fortune 500.

Responsibilities:

Analyzed business requirements and module-specific functionalities to identify requirements and formulate an effective master plan.

Responsible for error handling, bug fixing, session monitoring, and log analysis; implemented various data transformations using Slowly Changing Dimensions in an Agile environment.

Worked on client tools, viz. Source Analyzer, Mapping Designer, Mapplet Designer, and Workflow Monitor.

Analyzed ETL mappings by validating whether each mapping adheres to development standards and naming conventions, does what the technical design says it should do, and works correctly in relation to other processes in the data logistical flow.

Experienced in analyzing issues by checking log files.

Performed unit testing of ETL mappings that extract data from different sources such as Oracle, SQL Server, and flat files and load it into Oracle.

Prepared RTTs for different kinds of tickets (analysis, operational) to describe program development, logic, coding, testing, changes, and corrections.

Used Quality Center for test planning, test design, test analysis, test execution, defect tracking, and test results.

Extensively used TOAD to run SQL queries and monitor the quality and integrity of data.

Escalated issues and remediation steps effectively and in a timely manner.

Supported the client reporting production environment as a first priority.

Experienced in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning mappings and sessions.

Performed Unit and functional testing for all the Mappings and Sessions.

Wrote SQL scripts to extract data from Netezza databases.
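
An extract script of this kind commonly spools query results to a delimited flat file with the nzsql client (whose flags follow the psql style). A minimal sketch; the host, database, and query are hypothetical:

    #!/bin/sh
    # Spool a Netezza query to a pipe-delimited flat file.
    nzsql -host nzhost -d dw_db -u etl_user -pw "$NZ_PASS" \
          -A -t -F '|' \
          -c "SELECT claim_id, claim_dt, paid_amt
                FROM claims
               WHERE load_dt = CURRENT_DATE" \
        > /data/out/claims_extract.dat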

Environment: Informatica Power Center 9.6.1, Power Exchange, Oracle 11g, Netezza, Control-M, SalesForce, SQL, UNIX.

Travelocity.com, Dallas, TX Dec 2014 to Dec 2016

Sr. Informatica Developer

Description: Travelocity.com is a travel fare aggregator website and travel metasearch engine owned by a subsidiary of Expedia Inc. Travelocity.com allows consumers to reserve, book, and purchase tickets without the help of a travel agent or broker. In addition to airfares, the site also permits consumers to book hotel rooms, rental cars, cruises, and packaged vacations.

The scope of the CDM project is to develop a new customer data mart that captures all offers sent to customers via any channel (mobile, email, social media, third party, etc.); improve performance across all data elements feeding reports; provide real-time data on all offers across all channels and business units (BUs); provide better reporting tools; and cover financial reporting with faster turnaround for changes (changes flowing to all reports, new fields, etc.). Through this, stakeholders gain a dynamic, data-driven system that covers all offers, improves the overall member experience, improves data flow into all aspects of the business, and provides timely, actionable reports to increase revenue.

Responsibilities:

●Analyze business requirements, technical specifications, source repositories, and physical data models for ETL mapping and process flow.

●Worked extensively with mappings using Expression, Aggregator, Filter, Lookup, Joiner, Update Strategy, and Stored Procedure transformations.

●Extensively used pre-SQL and post-SQL scripts for loading the data into targets according to requirements (see the sketch at the end of this list).

●Developed mappings to load fact and dimension tables, for Type 1 and Type 2 dimensions and incremental loading, and unit tested the mappings.

●Extracted data from a web service source, transformed it using a web service, and loaded it into a web service target.

●Experience with real-time web services that perform a lookup operation using a key column as input and respond with multiple rows of data belonging to that key.

●Used Web Service Provider Writer to send flat file targets as attachments and to send email from within a mapping.

●Imported Hive tables using PowerExchange.

●Used PowerExchange for loading/retrieving data from mainframe systems.

●Coordinated and developed all documents related to ETL design and development.

●Involved in designing the Data Mart models with ERwin using Star schema methodology.

●Good understanding of and exposure to data governance and MDM concepts.

●Excellent experience and knowledge of star schema and OLTP data modeling design and techniques; good with the Erwin data modeling tool.

●Worked with the data architecture and data governance teams to apply data modeling best practices, modeling standards, and naming conventions to entities, attributes, and relationships.

●Used Repository Manager to create repositories and user groups and managed users by setting up privileges and profiles.

●Used the Debugger to debug mappings and correct them.

●Used Informatica Workflow Manager to create, run, and schedule batches and sessions at specified times.

●Executed sessions and sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.

●Prepared functional specifications, system architecture/design, implementation strategy, test plans, and test cases.

●Improved ETL performance through indexing and caching.

●Created workflows, tasks, database connections, and FTP connections using Workflow Manager.

●Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations and fixing bugs.

●Developed stored procedures using PL/SQL and driving scripts using Unix Shell Scripts.

●Created PL/SQL stored procedures, functions, and packages for moving the data.

●Created interactive reports and dashboards using the visualization tool Tableau.

●Created UNIX shell scripts to automate ETL processes.

●Used UNIX for check-ins and check-outs of workflows and config files into ClearCase.

●Automated ETL workflows using Control-M Scheduler.

●Involved in production deployment and later moved into warranty support until transition to production support team.

●Experienced in monitoring and reporting issues for the daily, weekly, and monthly processes; also worked on resolving issues on a priority basis and reporting them to management.
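
The pre-SQL/post-SQL pattern referenced earlier in this list commonly truncates the staging target before the session loads it and refreshes optimizer statistics afterward. A minimal sketch of the equivalent standalone steps, with hypothetical names (inside Informatica, these statements would sit in the session's pre-/post-SQL properties):

    #!/bin/sh
    # Pre-SQL: empty the staging target before the session loads it.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    TRUNCATE TABLE stg_offers;
    EXIT;
    EOF

    # ... the Informatica session loads stg_offers here ...

    # Post-SQL: refresh optimizer statistics on the freshly loaded table.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'STG_OFFERS');
    EXIT;
    EOF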

Environment: Informatica PowerCenter 9.6.1, IDQ 9.6.1, Tableau 10, Oracle 11g, Teradata 14.1.0, Web Services, Hadoop, Hive, Teradata SQL Assistant, MS SQL Server 2012, DB2, Erwin 9.2, DAC Scheduler, PuTTY, Shell Scripting, WinSCP, Notepad++, JIRA, Control-M V8, Hyperion Server, OBIEE Reporting.

Connecture, Inc., Brookfield, WI Aug 2013 to Nov 2014

Sr. Informatica Developer

Description: Connecture, Inc. (CNXR) is a leading web-based consumer shopping, enrollment and retention platform for health insurance distribution. Its solutions offer a personalized health insurance shopping experience that recommends the best fit insurance plan based on an individual's preferences, health status, preferred providers, medications and expected out-of-pocket costs. Its customers are payers, brokers, government agencies, and web-based insurance marketplace operators, who distribute health and ancillary insurance.

The project (CMS Medicare) is aimed at developing an integrated data mart for long-term decision-making, strategic plans, support, and solutions, giving mainstream users the ability to access and analyze data and helping stakeholders automate key functions in the insurance distribution process, allowing customers to price and present plan options accurately to consumers and to efficiently enroll, renew, and manage plan members.

Responsibilities:

●Coordinated with various business users, stakeholders, and SMEs for functional expertise, design and business test scenario reviews, UAT participation, and validation of data from multiple sources.

●Created Mappings using Mapping Designer to load the data from various sources, using different transformations like Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter transformations.

●Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.

●Worked on Power Center Designer tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.

●Worked closely with governance group, SMEs, System Solution Architects and DBAs.

●Used Debugger within the Mapping Designer to test the data flow between source and target and to troubleshoot the invalid mappings.

●Worked on SQL tools like TOAD and SQL Developer to run SQL Queries and validate the data.

●Scheduled Informatica Jobs through Autosys scheduling tool.

●Extensively used advanced chart visualizations in Tableau, such as dual axes, box plots, bullet graphs, tree maps, bubble charts, waterfall charts, and funnel charts, to assist business users in solving complex problems.

●Developed key indicators and the appropriate tracking reports with graphical and written summations, using summaries and annotations, to assist in quality improvement initiatives.

●Created quick filters, customized calculations, and conditional formatting for various analytical reports and dashboards.

●Studied the existing system and conducted reviews to provide a unified view of the program.

●Involved in creating Informatica mappings, mapplets, worklets, and workflows to populate data from different sources into the warehouse.

●Created Global Prompts, Narrative Views, Charts, Pivot Tables, Compound layouts to design the dashboards.

●Responsible for facilitating load testing and benchmarking the developed product against set performance standards.

●Used MS Excel, Word, Access, and Power Point to process data, create reports, analyze metrics, implement verification procedures, and fulfill client requests for information.

●Created interactive reports and dashboards using the visualization tool Tableau.

●Created PL/SQL scripts to extract data from the operational database into simple flat text files.

●Involved in testing the database using complex SQL scripts and handled the performance issues effectively.

●Involved in Onsite & Offshore coordination to ensure the completeness of Deliverables.

Environment: Informatica PowerCenter 8.6.1/8.1.1, Tableau 9.2, Cognos 9, SQL Server 2008, IDQ 8.6.1, Oracle 11g, PL/SQL, TOAD, PuTTY, Autosys Scheduler, UNIX, Teradata 13, Erwin 7.5, ESP, WinSCP

Caterpillar Inc., Peoria, IL Mar 2012 to Aug 2013

Informatica Developer

Description: Caterpillar Inc. is an American corporation which designs, manufactures, markets, and sells machinery, engines, financial products, and insurance to customers via a worldwide dealer network. Caterpillar is the world's leading manufacturer of construction and mining equipment, diesel and natural gas engines, industrial gas turbines, and diesel-electric locomotives.

The objective of this project is to enable an improvement in overall manufacturing performance by providing the reporting and analysis capabilities necessary to support Caterpillar's growth and the resulting business process changes.

Responsibilities:

●Gathered requirements and implemented them in source-to-target mappings.

●Integrated relational sources such as SQL Server and MS Access and non-relational sources such as flat files into the staging area.

●Designed custom reports via SQL Server Reporting Services to align with requests from internal account teams and external clients.

●Designed and developed technical and business data quality rules in IDQ (Informatica Developer) and created scorecards to present to business users for trending analysis (Informatica Analyst).

●Effectively worked on Mapping Designer, Workflow Manager, and Workflow Monitor.

●Extensively used Sequence Generator in all mappings; fixed bugs/tickets raised in production for existing mappings in the common folder and for new files, using versioning (check-in and check-out) on an urgent basis, and supported QA in component unit testing and validation.

●Used shortcuts for sources, targets, transformations, mapplets, and sessions to reuse objects without creating multiple objects in the repository and to inherit changes made to the source automatically.

●Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter: created POC data quality mappings in the Informatica Data Quality tool and imported them into Informatica PowerCenter as mappings and mapplets.

●Applied Slowly Changing Dimensions Type I and Type II per business requirements.

●Extensively worked on performance tuning and on isolating header and footer records in a single file.

●Worked with huge volumes of data, independently executing data analysis utilizing appropriate tools and techniques, interpreting results, and presenting them to both internal and external clients.

●Wrote SQL queries to create end-user reports; developed SQL queries and stored procedures in support of ongoing work and application support.

●Designed and executed test scripts and test scenarios, reconciling data between multiple data sources and systems.

●Involved in requirements gathering, design, testing, project coordination, and migration.

●Handled project planning and scoping and facilitated meetings for project phases, deliverables, escalations, and approvals; ensured adherence to the SDLC and project plan.

●Worked on multidimensional models and created reports in Report Studio using cubes as data sources.

●Effectively understood session error logs and used the Debugger to test mappings and fix bugs in DEV, following change procedures and validation.

●Raised change requests; analyzed and coordinated resolution of program flaws and fixed them in the DEV and pre-production environments during subsequent runs, and in PROD.

●Performed profiling analysis on existing data, identified root causes of data inaccuracies, performed impact analysis, and made data quality recommendations.

●Precisely documented mappings in the ETL technical specification document at all stages for future reference.

●Scheduled jobs to run daily, weekly, and monthly loads through Control-M for each workflow, in a sequence with command and event tasks.

●Created requirement specification documents, user interface guides, functional specification documents, ETL technical specification documents, and test cases.

●Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and Lookup (connected and unconnected), along with mapping parameters, session parameters, mapping variables, and session variables.

●Responsible for studying the existing data warehouse and working on migrating existing PL/SQL packages, stored procedures, triggers and functions to Informatica Power Center.

●Fine-tuned ETL processes by considering mapping and session performance issues.

●Responsible for creating workflows and worklets; created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.

●Customized UNIX wrapper scripts for ETL job runs.

●Used sed/awk commands extensively for data/file operations (see the sketch at the end of this list).

●Wrote code to integrate ETL jobs called from the job scheduling tool (Control-M).

●Maintained proper communication between other teams and the client.
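
As a sample of the sed/awk file operations referenced in this list, the sketch below strips a header record, filters rows, and reorders fields; the file layout (pipe-delimited part_id|plant|qty) is hypothetical:

    #!/bin/sh
    IN=/data/in/parts_feed.dat

    # Drop the header record and keep rows for a single plant.
    sed '1d' "$IN" | awk -F'|' '$2 == "PEORIA"' > /data/work/peoria_parts.dat

    # Reorder columns and write a record count to stderr for the audit log.
    awk -F'|' 'BEGIN { OFS = "|" } { print $2, $1, $3; n++ }
               END   { print n " records" > "/dev/stderr" }' \
        /data/work/peoria_parts.dat > /data/work/peoria_parts_fmt.dat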

Environment: Informatica Power Center 8.1, SQL, PL/SQL, UNIX, Shell Scripting, SQL Server 2008, Sybase, Oracle 11g, DB2, Control-M, MicroStrategy 9.


