Informatica ETL Developer

Location:
Chirala, Andhra Pradesh, India
Posted:
January 24, 2022

Contact this candidate

Resume:

Murali

adp0p3@r.postjobfree.com

224-***-****

Summary:

Around 8 years of Information Technology experience in the analysis, design, development, testing, implementation, and support of Data Warehousing projects.

Extensive experience interacting with users and functional teams to gather business requirements and functional specifications and to estimate effort.

Extensive experience in production support processes, incident management, change management and release management.

Experience in various stages of the System Development Life Cycle (SDLC) and its approaches, including the Waterfall and Agile models.

Experience in Business Intelligence solutions using Data Warehousing/Data mart design, ETL and reporting tools.

Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions and Workflows using Informatica Power Center.

Strong expertise in designing and developing Business Intelligence solutions in staging, populating Operational Data Store (ODS), Enterprise Data Warehouse (EDW), Data Marts / Decision Support Systems using Informatica Power Center ETL tool.

Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, and Stored Procedure transformations to implement complex business logic.

Experience in working with relational databases such as Oracle, SQL Server, DB2, MS Access and Teradata.

Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman), Teradata parallel support, and UNIX shell scripting.

Proficient in coding optimized Teradata batch processing scripts for data transformation, aggregation, and loading using BTEQ.
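For illustration, a minimal sketch of such a BTEQ batch script follows. All names here (the `tdprod` TDPID, `etl_user`, `stage_db.daily_sales`, `dw_db.sales_agg`) are hypothetical placeholders, not taken from any actual engagement; the script only generates the BTEQ file, it does not connect to a database.

```shell
#!/bin/sh
# Sketch: generate a BTEQ batch script that aggregates staged rows
# into a warehouse table. Object names and logon are placeholders.
cat > load_sales_agg.bteq <<'EOF'
.LOGON tdprod/etl_user,password_placeholder
.SET ERROROUT STDERR
.SET MAXERROR 1

INSERT INTO dw_db.sales_agg (sale_dt, region_id, total_amt)
SELECT sale_dt, region_id, SUM(amount)
FROM stage_db.daily_sales
GROUP BY sale_dt, region_id;

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF
echo "generated load_sales_agg.bteq"
```

The `.QUIT 8` on error lets a calling shell script or scheduler detect the failure from the exit code.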

Experienced in data modeling and reverse engineering using ERwin, Microsoft Visio, and Oracle Designer.

Experienced and comfortable working on various databases like SQL Server, Oracle, DB2 UDB, Teradata.

Well versed in test case design, test case execution, test data preparation, SDLC concepts, and the defect life cycle.

Experience in integrating various data source definitions such as SQL Server, Oracle, Teradata, MySQL, flat files, XML, and XSDs.

Strong understanding of UNIX shell scripting and SQL scripting for development, automation of ETL processes, error handling, and auditing.

Expert in design and management tools such as ERwin, Toad, and SQL Developer.

Good exposure to object-oriented programming and Agile methodologies.

Proficient in Data Governance, Data Lifecycle, Data Quality Improvement, Master Data Management, and Metadata Management.

Participated in various integration, reporting, migration, and enhancement engagements.

Worked with the data governance team to evaluate test results for fulfillment of all data requirements.

Data warehousing application experience in the financial, banking, and retail domains.

Performed unit and integration testing of ETL code with documented test cases.

Proficient in Oracle, Teradata, SQL Server, PL/SQL on UNIX and Windows platforms. Extensive experience in Database activities like Data Modeling, Design, development, maintenance, performance monitoring and tuning, troubleshooting, data migration etc.

Technical Skills:

ETL Tools : Informatica 10.4.1/10.2.0 HotFix 2/9.1/8.6/8.5/8.1/7.1, ODI, SSIS

Languages : UNIX Scripting, Perl, SQL, PL/SQL, XML, Java

BI Tools : Hyperion Essbase, SSAS, SSRS, Cognos, Business Objects, BIRT

Databases : Oracle 19c/11g/10g/9i/8i, MS SQL Server 2008/2014, DB2 v8.1, Teradata 16.20/14.0

Operating Systems : UNIX, Linux, Windows XP/Server, Sun Solaris

Other tools : JIRA, SVN, GitHub, ClearCase, PuTTY, Erwin 7.1, Autosys, ServiceNow, Pac2k

Testing Tools : Quality Center, Test Director, WinRunner, QuickTest Pro, LoadRunner

PROFESSIONAL EXPERIENCE:

Client: Wells Fargo, Chandler, AZ Feb 2020 – Present

Role: Informatica Developer

Responsibilities:

Performed data analysis and gathered source system column metadata to assess requirement feasibility.

Used Informatica PowerCenter 10.4.1 to extract, transform, and load data into the Teradata database from various sources such as SQL Server, Oracle, SharePoint, and flat files.

Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Parsed high-level design specification to simple ETL coding and mapping standards.

Used various transformations such as Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression, and Update Strategy.

Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.

Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.

Designed and customized data models for a Data Warehouse supporting data from multiple sources in real time.

Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.

Created mapping documents to outline data flow from sources to targets.

Extensively worked on data extraction, transformation, and loading from XML files, large-volume data, and Adobe PDF files to the EDW using B2B Data Transformation and B2B Data Exchange.

Developed FastLoad jobs to load data from various data sources and legacy systems into Teradata staging tables.

Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.
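A minimal sketch of the kind of FastLoad control script these jobs use, shown as a shell heredoc that only emits the script file. The table, error tables, input file, and logon (`stage_db.customer_stg`, `customer_extract.dat`, `tdprod/etl_user`) are hypothetical placeholders; FastLoad also requires the target table to be empty, which this sketch assumes.

```shell
#!/bin/sh
# Sketch: generate a FastLoad control script that loads a
# pipe-delimited extract into an empty staging table.
# All object and file names are placeholders.
cat > stage_customer.fl <<'EOF'
LOGON tdprod/etl_user,password_placeholder;
DATABASE stage_db;
BEGIN LOADING stage_db.customer_stg
    ERRORFILES stage_db.cust_err1, stage_db.cust_err2;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60)),
       open_dt   (VARCHAR(10))
FILE = customer_extract.dat;
INSERT INTO stage_db.customer_stg
VALUES (:cust_id, :cust_name, :open_dt);
END LOADING;
LOGOFF;
EOF
echo "generated stage_customer.fl"
```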

Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.

Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.

Worked on data encryption and data masking: created Teradata user-defined functions (UDFs), used Voltage SecureData to encrypt and decrypt confidential/PII data in tables, and created secured roles to apply data masking on confidential/PII columns.

Worked on Teradata stored procedures and functions to validate the data and load it into the tables.

Wrote Teradata macros and used various Teradata analytic functions. Used SQL Assistant to query Teradata tables.

Created TPT (Teradata Parallel Transporter) scripts to transfer data from Oracle to Teradata.

Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.

Extracted data from different sources such as MVS data sets, flat files (pipe-delimited or fixed-length), Excel spreadsheets, and databases.

Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.

Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN. Expert-level knowledge of complex SQL using Teradata functions, macros, and stored procedures.

Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.

Analyzed business requirements and designs, and wrote technical specifications to design and redesign solutions.

Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.

Involved in Performance tuning at source, target, mappings, sessions, and system levels.

Used Informatica Designer to design ETL mappings and coded them using reusable mapplets.

Provided support during system test, product integration testing, and UAT. Worked on change requests (CRs) while integration testing and UAT were in progress.

Created UNIX scripts to automate activities such as starting, stopping, and aborting Informatica workflows using the pmcmd command.
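The shape of such a wrapper can be sketched as below. The integration service, domain, folder, and workflow names (`intsvc_prod`, `Domain_prod`, `FOLDER_EDW`, `wf_load_edw`) are hypothetical, and connection credentials (pmcmd's `-u`/`-p` or environment-variable options) are deliberately omitted; when pmcmd is not installed the script only prints the command it would run.

```shell
#!/bin/sh
# Sketch: wrapper to start/stop/abort an Informatica workflow via pmcmd.
# Service/domain/folder/workflow names are placeholders; credentials omitted.
ACTION=${1:-start}          # start | stop | abort
WORKFLOW=${2:-wf_load_edw}
INFA_OPTS="-sv intsvc_prod -d Domain_prod -f FOLDER_EDW"

case "$ACTION" in
  start) CMD="pmcmd startworkflow $INFA_OPTS -wait $WORKFLOW" ;;
  stop)  CMD="pmcmd stopworkflow $INFA_OPTS $WORKFLOW" ;;
  abort) CMD="pmcmd abortworkflow $INFA_OPTS $WORKFLOW" ;;
  *)     echo "usage: $0 {start|stop|abort} <workflow>" >&2; exit 2 ;;
esac

if command -v pmcmd >/dev/null 2>&1; then
  $CMD                      # run for real when pmcmd is on the PATH
else
  echo "DRY RUN: $CMD"      # otherwise just show the command
fi
```

The `-wait` flag on start makes the wrapper block until the workflow finishes, so a scheduler such as Autosys can act on its exit code.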

Prepared UNIX shell scripts and scheduled them in Autosys for automatic execution at specific times.

Environment: Informatica 10.2.0 HotFix 2, Informatica 10.4.1, MDM, Oracle, SQL Server, XML, Teradata, GitHub, Agile, UNIX, Autosys, Teradata SQL Assistant, PuTTY.

Client: Department of Health and Human Services, Durham, NC Jul 2018 – Jan 2020

Role: Informatica Developer (PowerCenter)

Responsibilities:

Analyze and identify reporting requirements for NC FAST Case Management System reports

Develop recommendations to effectively report data of mid-to-high complexity related to varied business metrics, and for designing, coding, testing, debugging, and documenting reporting solutions.

Enhance existing reports to ensure that solutions continue to meet business needs.

Possess understanding of underlying data sources (databases) and interpret requirements provided by the business to analyze operational issues.

Gathering and documenting requirements for new reports and changes to existing reports.

Submitting Database Change Requests when modifications to the database are needed.

Working on Jira tickets for Oracle Warehouse Builder (OWB) ETL failures in the data mart, CDW, and staging layers; applying code fixes, data fixes, and performance tuning, and deploying to testing and production.

Involved in designing numerous reports by interacting directly with End Users and Business Analysts.

Developed ad hoc reports (SQL or PL/SQL scripts run on UNIX) to extract data from the database and deliver it to users in Excel format.

Using the ETL tool Informatica PowerCenter 8.x/9.x/10.2 (Mapping Designer, Workflow Manager, Repository Manager) and ETL concepts.

Integrated Informatica Data Quality mappings with Informatica PowerCenter.

Developed Oracle stored procedures, functions, packages, and triggers that pull data for reports, and published those reports to CSDW & FASTHELP, where users access the reports by connecting to appropriate groups.

Developed Informatica mappings to load the data into oracle database.

Designed and developed ETL solutions using Informatica PowerCenter, Oracle database, UNIX scripting, SQL, and PL/SQL.

Developed complex reports using BIRT, enabling the client to view consolidated data in one integrated report; created new BIRT reports with scripted charts, graphs, and tabular layouts.

Primarily responsible for extraction, transformation, and loading of data using Informatica PowerCenter tools such as Designer, Workflow Manager, and Workflow Monitor.

Created/updated tables, indexes, materialized views, synonyms, and sequences per requirements.

Used the DBMS_SQLTUNE.REPORT_SQL_MONITOR package to generate SQL monitoring reports and tune the queries.

Validating developed software solutions using technologies such as SAP, Retalix, Informatica-MDM, BPM.

Performance tuning of complex SQL queries using Explain Plan to improve the performance of the application.

Upgraded/migrated the production and development databases from 11g to 12c.

Partitioned tables using range partitioning and list partitioning, and created local indexes to increase performance and make the database objects more manageable.

Identified and deployed critical updates to ETL mappings with minimal turnaround time.

Tuned OWB mappings and PL/SQL procedures by defining hints and degree of parallelism, tweaking loading parameters, and executing them in parallel threads.

Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager.

Designed and developed various custom packages triggers and forms.

Extensively worked on BULK COLLECTs, BULK INSERTs, BULK UPDATEs, and BULK DELETEs for loading, updating, and deleting huge data volumes.
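The bulk-processing pattern referred to above can be sketched as follows; the shell step only emits a hypothetical PL/SQL block, and the table and column names (`src_txn`, `tgt_txn`, `txn_id`, `status`) are illustrative placeholders, not objects from the project.

```shell
#!/bin/sh
# Sketch: emit a PL/SQL block showing the BULK COLLECT / FORALL
# pattern for batched high-volume updates. Names are placeholders.
cat > bulk_update_demo.sql <<'EOF'
DECLARE
  TYPE t_ids IS TABLE OF src_txn.txn_id%TYPE;
  l_ids t_ids;
  CURSOR c_src IS SELECT txn_id FROM src_txn WHERE status = 'READY';
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_ids LIMIT 10000;  -- batched fetch
    EXIT WHEN l_ids.COUNT = 0;
    FORALL i IN 1 .. l_ids.COUNT                      -- batched DML
      UPDATE tgt_txn SET status = 'LOADED'
       WHERE txn_id = l_ids(i);
    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/
EOF
echo "generated bulk_update_demo.sql"
```

The `LIMIT` clause caps PGA memory per batch, which is the usual reason this pattern outperforms row-by-row cursor loops.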

Overrode various methods to execute queries dynamically at run time using the beforeFactory, onPrepare, open, initialize, onCreate, onRender, and onFetch methods in BIRT.

Used Oracle analytic functions such as RANK, DENSE_RANK, LEAD, LAG, LISTAGG, and ROW_NUMBER for ranking and ordering data.
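As an illustration of those analytic functions, the step below emits a small query sketch over a hypothetical `sales_fact` table (the table and columns are placeholders, not project objects):

```shell
#!/bin/sh
# Sketch: emit a query demonstrating RANK, LAG, and ROW_NUMBER
# over a hypothetical sales_fact table.
cat > analytics_demo.sql <<'EOF'
SELECT region_id,
       sale_dt,
       amount,
       RANK()       OVER (PARTITION BY region_id ORDER BY amount DESC) AS amt_rank,
       LAG(amount)  OVER (PARTITION BY region_id ORDER BY sale_dt)     AS prev_amount,
       ROW_NUMBER() OVER (PARTITION BY region_id ORDER BY sale_dt)     AS rn
FROM   sales_fact;
EOF
echo "generated analytics_demo.sql"
```

Unlike GROUP BY aggregates, these functions keep every input row and add a computed column per window, which is what makes them useful for report-style output.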

Handled the stability/improvement project: analyzed Oracle AWR/ADDM reports and suggested performance improvements for slow-running reports or jobs, reviewed database tablespaces with the DBA and added space where needed, and investigated major job failures to come up with solutions.

Extensively involved in performance tuning using Explain Plan and DBMS_PROFILER; optimized SQL queries and created materialized views for better performance.

Worked extensively with REF CURSORs and ALTER TABLE EXCHANGE PARTITION, and implemented them in many procedures.

Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.

Used Oracle Data Integrator to schedule OWB mappings and process flows.

Created an error-trapping mechanism for all the PL/SQL code and OWB mappings in case of failure.

Monitored nightly ETL production mappings through audit and runtime log tables.

Environment: Informatica PowerCenter, MDM, Toad for Oracle 12.1, PL/SQL Developer, BIRT, PuTTY, ServiceNow, Toad Data Modeler, SVN (Subversion).

Client: Ally Financials, Charlotte, NC Sep 2017 – Jun 2018

Role: Informatica Developer

Responsibilities:

Involved in all phases of SDLC from requirement, design, development, testing and support for production environment.

Extracted, Transformed and Loaded data into Oracle database using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Filter, Router and Expression).

Designed and developed the Informatica workflows/sessions to extract, transform and load the data into Target.

Implemented row-level security and embedded Tableau dashboards in the company web portal using URL parameters in iframe tags.

Involved in the Agile Scrum Process.

Worked with non-transactional data entities for multiple feeds of reference data using Master Data Management (MDM).

Worked on modules such as PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Worked with Salesforce Process Builder and consolidated existing workflow rules with process builder.

Performed Teradata SQL Queries, creating Tables, and Views by following Teradata Best Practices.

Worked with repository and session variables and initialization blocks.

Worked with PowerCenter Designer tools in developing mappings and mapplets to extract and load the data from flat files and the Oracle database.

Experience in working with Business objects, Libraries, Real time Integration Solution and mediation modules.

Created Views in Oracle and SQL Server to analyze the Data Assurance.

Wrote SQL and PL/SQL scripts to perform database development and verify data integrity.

Optimized the performance of designed workflow processes in Informatica and identified bottlenecks in different areas after full-volume system runs.

Used debugger to test the mapping and fixed the bugs.

Developed the Test Scripts and performed the Unit Tests on the ETL mappings.

Worked with various file formats such as CSV and XML using Informatica PowerCenter.

Help define and track Data Quality and Data Governance metrics for MDM.

Designed and developed Cognos reports for various campaign metrics stored in Oracle and Teradata.

Scheduled sessions to extract data into the warehouse database per business requirements.

Created and maintained the shell scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.

Experience in obtaining data from external systems using Salesforce Connect.

Defined best practices for Tableau report development.

Queried Teradata Database and validated the data using SQL Assistant.

Improved performance of the jobs using various performance tuning techniques.

Prepared the Functional and Technical Documentation.

Environment: Informatica PowerCenter, MDM, Oracle, Salesforce, Tableau, XML, Teradata, MVS, Agile, SQL Server, TOAD, UNIX, Autosys, MS Visio, JDE.

Client: Bravo Health, Baltimore, MD Aug 2016 -Aug 2017

Role: Informatica Developer

Responsibilities:

Involved in gathering and analyzing the requirements and preparing business rules.

Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, and Stored Procedure transformations to implement complex business logic.

Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Created project plans for development of MicroStrategy projects.

Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, SQL server and Flat files and loaded into Oracle.

Worked on building Salesforce standard/custom report types, Reports and Dashboards across various objects for different business groups.

Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.

Involved in creating new table structures and modifying existing tables to fit the existing data model.

Expertise in working in Agile (Scrum), Waterfall, Spiral methodologies.

Managed Teradata warehouse access of end user/roles and assigned data objects to roles, responsible for maintaining the group master file.

Extracted data from different databases like Oracle and external source systems like flat files using ETL tool.

Worked extensively on various Salesforce standard objects like Accounts, Contacts, Opportunities, Products.

Created User requirement document for realizing MicroStrategy objects to be integrated in the report.

Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.

Good experience working on the Agile environment and PMO standards.

Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica and experience in PMCMD Informatica commands.

Designed various prototypes of dashboards using MicroStrategy, which would give ability to the users to drill down to claim (lowest) level.

Created Teradata SQL scripts for Cleanup.

Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.

Environment: Informatica PowerCenter, Oracle, SQL Server, Salesforce, XML, MicroStrategy, Teradata, Agile, MVS, SQL, PL/SQL, Toad, SQL*Loader, UNIX, flat files.

Client: Sonata Software, India Jan 2013 – Dec 2015

Role: ETL Developer

Responsibilities:

Analyzed the ERP system to pull data per the GSDW structure and requirements.

Performed gap analysis for all the mandatory columns for GSDW.

Designed and documented all the ETL Plans.

Designed and customized data models for a Data Warehouse supporting data from multiple sources in real time.

Worked on UNIX shell scripts and data profiling.

Developed various T-SQL code, including tables, views, and SQL joins for applications.

Coordinated with the ERP team during the analysis and design phases.

Responsible for tuning the ETL mappings in order to gain high performance.

Identified performance issues in the existing sources, targets and mappings by analyzing the data flow evaluating transformations and tuned accordingly for better performance.

Enhanced reports and added data elements using the BO Universe.

Developed ETL mappings and reports, along with unit and integration testing.

Prepared documentation per the dPMM methodology.

Environment: Informatica, Oracle, Toad, BO, Shell scripts


