
Sr. Informatica Developer

Location:
Wisconsin
Posted:
November 25, 2016


Resume:

SRAVANTHI

acxm6e@r.postjobfree.com 339-***-****

SUMMARY:

9+ years of IT experience with special emphasis on Analysis, Design, Development, and Testing of ETL methodologies across all phases of Data Warehousing.

Strong ETL experience using Informatica Power Center 9.6.1/9.5.1/8.6.1/8.1 and IDQ 9/8.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), including Mappings, Mapplets, Transformations, and Worklets.

Extensive working experience in the Retail, Healthcare, Insurance, and Banking domains, and in Agile environments.

Experience in design and development of integrations for the SFDC platform.

Extensive experience in Extraction, Transformation and Loading (ETL) of data from multiple sources into Data Warehouses and Data Marts. Well versed in Star and Snowflake schemas for designing Data Marts.

Experience in database design, entity-relationship modeling and dimensional modeling.

Responsibilities have included requirement analysis, defining scope, designing solutions, development and project execution, and go-live and release activities; handled production support projects.

Extensive work in ETL processes consisting of data sourcing, transformation, mapping, conversion, and loading using Informatica.

Extensive experience in implementing transformations, shell scripts, and stored procedures, and in executing test plans to load data successfully into targets.

Experience in working with Quality Center to document test cases and track defects.

Designed, developed, and deployed UNIX and Linux shell scripts.

Experience in all phases of SDLC including Requirements Gathering, Design, Development, Unit Testing, On-going support and Maintenance.

Hands-on experience in the design of logical and physical data models using Erwin.

Strong understanding of Data Modeling (relational, dimensional, Star and Snowflake schemas) and data analysis, with implementations of data warehousing on Windows and UNIX. Exposure to Informatica server configuration.

Experience in migrating repositories and connections between environments, and in transforming data from complex flat files with mapping logic.

Experience in basic Informatica administration activities.

Expertise in creating Mappings, Trust and Validation rules, Match Paths, Match Columns, Match Rules, Merge properties, and Batch Groups.

Experience in tuning mappings and in identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.

Implementation experience with CDC (Change Data Capture).

Exposure to Informatica MDM concepts.

Knowledge of Teradata utility scripts such as FastLoad and MultiLoad for loading data from various source systems into Teradata.

Exposure to the BO reporting tool; worked with the TIDAL and AutoSys scheduling tools.

Extensive experience with Oracle 11g/10g/9i, Cognos, SQL Server 2005/2008, and mainframe DB2.

Proficient with PL/SQL stored procedures, packages, database triggers, and materialized views.

Expertise in implementing complex Business rules by creating mappings, mapplets, reusable transformations, and Partitioning.

Experience in source systems analysis and data extraction from various sources like Flat files, ORACLE 11g/10g, SQL Server and XML files.

Proficient in using Informatica Workflow Manager, Workflow Monitor, and Server Manager.

Involved in performance and query tuning: generating and interpreting explain plans, and tuning and optimizing SQL to improve performance.

Knowledge of writing, testing, and implementing stored procedures, functions, and triggers using Oracle PL/SQL and T-SQL, and of Teradata data warehousing using BTEQ, compression techniques, FastExport, MultiLoad, TPump, and FastLoad scripts.

Experience in creating unit test cases and developing testing methods and test logs, which the QA team uses to validate loaded data and ensure data quality.

Excellent communication skills, good organizational skills, self-motivated, positive attitude, and the ability to work independently or cooperatively in a team.

TECHNICAL SKILLS:

ETL Tools

Informatica Power Center 9.6.1/9.5.1/9.1/8.6.1/8.6/8.5/8.1/8.0 (Repository Manager, Source Analyzer, Designer, Workflow Monitor, Mapplet Designer, Mapping Designer, Workflow Manager), IDQ.

Data Modeling

Star Schema Modelling, Snowflake Modelling

Databases

Oracle 11g/10g/9i/8i, Mainframe DB2, MS SQL Server 2005/2000, SQL Server 12.0/11.x, Teradata (V2R12, V2R6)

Languages

SQL, PL/SQL, TOAD, UNIX Shell Scripting, Linux, Perl, MLOAD, TPT, TPUMP, FastLoad, FastExport, XML.

Tools

Microsoft Office - Word, Excel, Access, Visio, PowerPoint, SQL*Plus, AutoSys, SQL Developer, TOAD, ISQL, AppWorx

Operating Systems

Windows 2003/2000/NT/XP, AIX, UNIX, Linux

Reporting Tools

OBIEE, BO XI R3, Cognos

EXPERIENCE:

Mattel American Girl/Cognizant, WI Sept 2015-Present

Sr. ETL Developer / Informatica Developer

Project: One Store Migration to Salesforce Exact Target and ForCAST B2B

Mattel, known worldwide as the home of Barbie® and Hot Wheels® and countless other successful toy franchises, together with the Mattel family of companies, is the worldwide leader in the design, manufacture, and marketing of toys and family products. The objective of this project is to capture customers' abandoned items that failed during the purchase process from all the franchises around the world, assign each customer email address a single promo code, and load the data into the data warehouse for analysis. The data flows from different sources (flat files, Oracle).

Responsibilities:

Interacted with users and business analysts to gather requirements on incidents and service requests, and worked with the offshore team to get issues and incidents resolved.

Analyzed the source data and the business rules related to it so that new data sources could be seamlessly integrated into the existing data warehouse.

Worked on data analysis and gap analysis between Silverpop and Salesforce Exact Target, and provided inputs for the migration from Silverpop to Salesforce Exact Target.

Gathered requirements and participated in solution discussions with stakeholders.

Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.

Created high-complexity detailed technical design specifications, and developed detailed analysis, design, construction, and testing specifications ensuring technical compatibility and integration for Exact Target.

Designed an incremental loading process to load data into staging tables.
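
For illustration only (not the project's actual code), an incremental staging load of this kind typically pulls only rows changed since the last successful run, using a watermark kept in a control table. The table names (orders, stg_orders, etl_control) and the connection string below are assumptions:

    #!/bin/ksh
    # Hypothetical incremental staging load driven by a last-load timestamp.
    sqlplus -s etl_user/"${ORA_PWD}"@DWPROD <<'EOF'
    INSERT INTO stg_orders (order_id, customer_id, amount, update_ts)
    SELECT o.order_id, o.customer_id, o.amount, o.update_ts
    FROM   orders o
    WHERE  o.update_ts > (SELECT last_load_ts
                          FROM   etl_control
                          WHERE  job_name = 'ORDERS_STG');

    -- Advance the watermark only after the insert succeeds.
    UPDATE etl_control
    SET    last_load_ts = SYSTIMESTAMP
    WHERE  job_name = 'ORDERS_STG';

    COMMIT;
    EXIT;
    EOF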

Used dynamic filters in the mappings, and used the Debugger to test the mappings and fix bugs.

Worked on XML sources to load data into .CSV files, and wrote a UNIX script to delete special characters from the XML sources.
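
A minimal sketch of such a cleanup script, assuming hypothetical file locations (the actual script is not part of this resume):

    #!/bin/ksh
    # Hypothetical cleanup: drop non-printable control characters that break
    # the XML parser, keeping tab, LF, and CR. Paths are illustrative.
    SRC_DIR=/data/inbound/xml
    for f in "$SRC_DIR"/*.xml; do
      tr -d '\000-\010\013\014\016-\037' < "$f" > "$f.clean" && mv "$f.clean" "$f"
    done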

Developed a mapping to generate email notifications and parameterized the email address.
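
One common way to parameterize such an address is a PowerCenter parameter file written per environment by a shell script; the folder, workflow, session, and parameter names below are hypothetical, not taken from the project:

    #!/bin/ksh
    # Hypothetical per-environment parameter file for the notification address.
    PARM_FILE=/infa/parms/wf_email_notify.parm
    cat > "$PARM_FILE" <<EOF
    [ONESTORE.WF:wf_email_notify.ST:s_m_email_notify]
    \$\$EmailAddress=dw-support@example.com
    EOF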

Worked on different tasks in Workflows such as sessions, event-raise, event-wait, decision, email, command, worklets, assignment, timer, and scheduling of the workflow.

Developed complex mappings with different transformations like Router, Lookup, Sorter, Stored Procedure, Normalizer, Filter, Update Strategy, Aggregator, etc.

Wrote shell scripts for accessing flat files from the FTP location and archiving the files.

Worked with pre-session and post-session UNIX scripts for automation of ETL jobs and to perform operations like gunzip, remove, and archive files.
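
A hedged sketch of the fetch-unzip-archive pattern described in the two bullets above; host, user, and paths are illustrative, and the password is expected in $FTP_PWD:

    #!/bin/ksh
    # Hypothetical FTP pull plus pre/post-session file handling.
    IN_DIR=/data/inbound
    ARC_DIR=/data/archive/$(date +%Y%m%d)

    ftp -inv ftp.example.com <<EOF
    user etluser $FTP_PWD
    lcd $IN_DIR
    cd /outgoing/sales
    mget *.dat.gz
    bye
    EOF

    gunzip -f "$IN_DIR"/*.dat.gz          # pre-session: unzip for the load
    mkdir -p "$ARC_DIR"                   # post-session: archive originals
    mv "$IN_DIR"/*.dat "$ARC_DIR"/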

Extensively used an Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.

Investigated and fixed bugs that occurred in the production environment and provided on-call support.

Create & review unit, integration test plans/scripts. Create & review project deliverables according to the software development life cycle methodologies.

Wrote documentation describing program development, logic, coding, testing, changes, and corrections.

Wrote SQL logic to extract data from the Oracle DB.
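
For illustration, an extract of this sort is often spooled to CSV from a shell script; the connection string, table, and columns below are assumptions:

    #!/bin/ksh
    # Hypothetical Oracle extract spooled to a CSV for downstream import.
    sqlplus -s etl_user/"${ORA_PWD}"@DWPROD <<EOF > /data/outbound/promo_emails.csv
    SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
    SELECT email_address || ',' || promo_code
    FROM   cust_promo
    WHERE  load_dt = TRUNC(SYSDATE);
    EXIT;
    EOF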

Developed a mapping where sales are received as input from different businesses and the forecast is populated as output.

Created Data Extensions in Exact Target and imported the files for testing.

Participated in client calls and prepared clarification lists to seek clarifications.

Prepared the list of requirements and sought review inputs from the key stakeholders.

Provided excellent support during QA/UAT testing by working with multiple groups.

Monitored batch jobs using the UC4 and CA job scheduling systems.

Documented and maintained code using change control software such as PVCS.

Provided off-hours/weekend support during production deployment and UAT.

Used Informatica Data Quality (IDQ 8.6.1) for data quality measurement.

Performed data quality analysis, gathered information to determine data sources, data targets, data definitions, data relationships, and documented business rules.

Designed and performed all activities related to the migration of ETL components between environments during development and deployment (e.g., Dev to SIT, SIT to UAT, and UAT to PROD repositories).
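
Such migrations are often scripted with pmrep; the sketch below is a hedged example, since exact option names vary by PowerCenter version, and the repository, domain, folder, and object names are hypothetical:

    #!/bin/ksh
    # Hypothetical promotion of one workflow from Dev to SIT with pmrep.
    pmrep connect -r DEV_REPO -d Domain_Dev -n "$INFA_USER" -x "$INFA_PWD"
    pmrep objectexport -n wf_abandoned_cart -o workflow -f ONESTORE \
          -u /infa/deploy/wf_abandoned_cart.xml

    pmrep connect -r SIT_REPO -d Domain_SIT -n "$INFA_USER" -x "$INFA_PWD"
    pmrep objectimport -i /infa/deploy/wf_abandoned_cart.xml \
          -c /infa/deploy/import_control.xml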

Environment: Informatica Power Center 9.6, UNIX, Windows Server 2003, Windows Server 2008 R2, UNIX shell programming, Cognos, Oracle 11g/10g, SQL, PL/SQL, SQL Developer, Silverpop, Salesforce Exact Target, CA, UC4, IDQ 8.6.1

State of Georgia/Deloitte, GA Nov 2014-Aug 2015

ETL Developer

Project: State of Georgia Integrated Eligibility System (GA IES)

The GA IES will replace the existing GA legacy systems with a "NextGen" Integrated Eligibility System (IES) that complies with ACA (Affordable Care Act) requirements, provides real-time eligibility determinations, and creates a common portal for clients and users to process and manage eligibility for multiple HHS programs.

Responsibilities:

Actively participated in gathering requirement documents, analysis, design, and development, extensively using Informatica 9.6.1 and SQL queries for the conversion of data from mainframe systems to an Oracle database.

Developed IT solutions by estimating and translating business needs into system applications and processes.

Developed mappings for converting data from legacy systems (DB2) to the Oracle DB.

Extensively worked on Informatica Mappings/Session/Workflows to load data from Oracle to TD and vice versa.

Extensively used ETL processes to load data from flat files into the target database by applying business logic on transformation mapping for inserting and updating records when loaded.

Suggested improvements for the architecture and ETL processes through proper identification and analysis.

Extensively worked on Informatica Performance tuning to find potential bottlenecks in the Source, Informatica and Target systems.

Extensively worked on UNIX shell scripts for automation (auto-restart of servers, disk space utilization, Informatica server monitoring, UNIX file system maintenance and cleanup, and infacmd and pmrep operations).
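
A hedged sketch of this monitoring pattern, with hypothetical domain/service names and alert address; infacmd option names differ across versions:

    #!/bin/ksh
    # Hypothetical checks: file-system usage and Integration Service health.
    USED=$(df -k /infa | awk 'NR==2 {gsub("%","",$5); print $5}')
    if [ "$USED" -gt 90 ]; then
      echo "/infa at ${USED}% on $(hostname)" |
        mailx -s "Disk space alert" dw-support@example.com
    fi

    if ! infacmd.sh ping -dn Domain_Prod -sn IS_PROD; then
      echo "IS_PROD not responding" |
        mailx -s "Informatica service alert" dw-support@example.com
    fi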

Worked on reusable mappings and transformations.

Developed Stage and Base Loads in Informatica MDM.

Tested Informatica objects in QA before moving them into production.

Designed and documented ETL standards document.

Worked on SQL queries against the repository DB to find deviations from the company's ETL standards in objects created by users, such as sources, targets, transformations, log files, mappings, sessions, and workflows.
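
For illustration, such a standards check can be run against the repository's MX views; the view and column names below are assumptions and should be verified against the installed PowerCenter version:

    #!/bin/ksh
    # Hypothetical naming-standards check: list mappings whose names do not
    # start with the 'm_' prefix the standard requires.
    sqlplus -s rep_read/"${REP_PWD}"@REPDB <<'EOF'
    SELECT subject_area, mapping_name
    FROM   rep_all_mappings
    WHERE  mapping_name NOT LIKE 'm\_%' ESCAPE '\';
    EXIT;
    EOF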

Migrated changes, enhancements, upgrades, etc. through the various environments.

Migration of Mappings, Sessions and Workflows from Dev to UAT and PROD environment.

Good knowledge of troubleshooting and providing speedy solutions in case of application failures.

Created documents such as Technical Systems Design, Source-to-Target Data Mapping, Unit Test Cases, and Migration Documents for production support.

Environment: Informatica Power Center 9.6.1, Informatica Data Quality 9.1, Informatica MDM 9.7, UNIX, Windows Server 2003, Windows Server 2008 R2, VSAM files, Teradata, Mainframe DB2, Oracle 11g/10g, OBIEE, SQL, PL/SQL, SQL Developer, Notepad++, Agile methodology.

Dunkin Brands, MA Aug 2012-Nov 2014

Sr. ETL Developer / Informatica Developer

Project: "Shop and Reported Sales Dunkin Donuts & Baskin Robins"

Dunkin' Brands Group, Inc. (DNKN) is one of the world's leading franchisors of quick service restaurants. The objective of this project is to collect sales data and customer feedback data from all the franchises around the world and load it into the data warehouse for analysis. The data flows from different sources (flat files, Oracle).

Responsibilities:

Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.

Developed ETL mappings and transformations, defined workflows and tasks, and monitored sessions using Informatica Power Center.

Analyzed and communicated ETL development issues, including problems with data integrity, data design, and functional and technical issues.

Developed various transformations like Source Qualifier, Update Strategy, Lookup, Expression, and Sequence Generator for loading data into the target tables.

Assisted with the monitoring, maintenance, and sustainability of the Enterprise Data Warehouse (EDW) implementation.

Provided technical support services to ETL and Data warehouse team.

Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.

Involved in coding, testing, implementing, debugging, and documenting complex programs.

Wrote shell scripts for accessing flat files from the FTP location and archiving the files.

Responsible for upgrading Informatica from 9.1 to 9.5.1 and performed extensive testing of the upgrade.

Experienced with loading into and reading from XML files in Informatica Power Center.

Extensively used an Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.

Involved in testing the ETL modules, and in planning, deploying, and testing the ETL mappings, to ensure that clients remained satisfied.

Identified and resolved performance bottlenecks at various levels such as sources, targets, mappings, and sessions.

Created unit test plans and ETL code documentation.

Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide information on how to proceed with the ETL processes.

Worked in the production area using Informatica tools to move mappings from development to system test environments.

Provided excellent support during QA/UAT testing by working with multiple groups.

Managed and performed data loads during the UAT process; participated and contributed in enhancements and design discussions (resulting from UAT feedback); planned and completed the initial production data load after the Prod migration.

Worked with Session Logs and Workflow Logs for Error handling and troubleshooting in DEV environment.

Wrote SQL logic to extract data from the Oracle DB.

Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.

Worked as offshore team lead, coordinating with the team to ensure support tickets were resolved in a timely manner through the Remedy system and that quality code was delivered to the client on time.

Modified existing mappings for enhancements of new business requirements.

Worked with the FLOAD, MLOAD, and TPUMP utilities to load data into Teradata.

Loaded data into the Teradata data warehouse using utilities like BTEQ, FastExport, and MultiLoad, along with compression.
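
A hedged FastLoad sketch of this loading pattern, wrapped in a shell script; the TDPID, staging table, error tables, and file layout are hypothetical, and the password is expected in $TD_PWD:

    #!/bin/ksh
    # Hypothetical FastLoad run into an empty staging table.
    fastload <<EOF
    LOGON tdprod/etl_user,${TD_PWD};
    BEGIN LOADING stg.sales_daily
          ERRORFILES stg.sales_err1, stg.sales_err2;
    SET RECORD VARTEXT "|";
    DEFINE store_id (VARCHAR(10)),
           sale_dt  (VARCHAR(10)),
           amount   (VARCHAR(18))
    FILE = /data/inbound/sales_daily.dat;
    INSERT INTO stg.sales_daily (store_id, sale_dt, amount)
    VALUES (:store_id, :sale_dt, :amount);
    END LOADING;
    LOGOFF;
    EOF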

Involved in Query optimization for ETL interfaces with large volumes.

Worked with the Reporting team to build tables to optimize reporting performance.

Assist with project management and provide technical leadership to incorporate new aspects of the EDW implementation as the needs arise.

Interacted with client to make necessary modifications to the existing reports.

Provided weekly status reports on support tasks, monthly reports and metrics for monitoring and support initiatives, and quarterly reviews of existing processes, and recommended improvements to Dunkin' Brands management.

Environment: Informatica Power Center 9.5.1/9.1, UNIX, Windows Server 2003, Windows Server 2008 R2, Cygwin, UNIX shell programming, AppWorx, Agile environment, SQL, PL/SQL, TOAD, IDQ.

Applabs, Hyderabad, India Oct 2011-July 2012

Sr. ETL Developer / Informatica Developer

Project: "Insurance consolidation Reports"

The Premium module has coverage policies for Fire, Motor, Marine, and Engineering products. It can be used to report the number of new, renewed, cancelled, or expired policies, and it calculates gross, reinsurance, and net outward and inward premium. The Claims module specifies the number of claims opened, reopened, and closed, along with claim payments and claim reserves. External files hold policy expenses, which are distributed among policies. The Marketing module specifies new, renewed, cancelled, and expired policy details by product and agent.

Responsibilities:

Worked with the Data Warehouse Architect to perform source system analysis and identify key data issues via analysis of the data profiling results and queries; assisted in the creation of ETL design specifications.

Involved in onsite/offshore coordination.

Involved in direct discussion with the Functional/End-Users to study the system requirement and develop Functional and Technical specifications to meet their requirements.

Implemented PL/SQL scripts in accordance with the necessary Business rules and procedures.

Understood the existing business model and customer requirements. Used the Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map sources to targets.

Upgraded Informatica Power Center from 8.6 to 9.0.1 on Linux servers across all environments and applied patches/hotfixes.

Worked with Informatica Power Center client tools like Repository Manager, Designer, Workflow Manager, and Workflow Monitor.

Development of Informatica mappings, mapplets, workflows and worklets.

Worked with various lookup caches like Dynamic Cache, Static Cache, Persistent Cache, re-cache from database, and Shared Cache.

Designed and developed error handling processes.

Worked on data profiling, cleansing, matching, reporting, and monitoring of data quality.

Developed complex mappings such as Slowly Changing Dimension Type II with timestamping in the Mapping Designer.
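
The mapping itself lives in the Designer, but the timestamped Type II logic it implements is equivalent to SQL along these lines (table, column, and sequence names are hypothetical):

    #!/bin/ksh
    # Hypothetical SCD Type II close-out/insert, expressed as plain SQL.
    sqlplus -s etl_user/"${ORA_PWD}"@DWPROD <<'EOF'
    -- Expire the current row when a tracked attribute changed.
    UPDATE dim_customer d
    SET    d.eff_end_ts = SYSTIMESTAMP, d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address <> d.address);

    -- Insert the new version as the current row.
    INSERT INTO dim_customer
          (customer_key, customer_id, address, eff_start_ts, eff_end_ts, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
           SYSTIMESTAMP, NULL, 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                       WHERE d.customer_id = s.customer_id
                       AND   d.current_flag = 'Y'
                       AND   d.address = s.address);
    COMMIT;
    EXIT;
    EOF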

Scheduled ETL jobs in Informatica and via Linux shell scripts, and resolved issues in case of failures.

Extensively used expression, lookup, sequence generator, update and router transformations.

Created and monitored sessions using Workflow Manager and Workflow Monitor.

Configured session properties and target options for better performance.

Configured sessions with email on failure, command tasks, and decision tasks.

Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.

Performed troubleshooting and provided resolutions to ETL issues.

Performed debugging and tuning of mappings.

Interacted with client to make necessary modifications to the existing reports.

Tested all applications and transported data to target Warehouse tables, schedule and run extraction and load process and monitor sessions and batches by using Informatica Workflow Manager.

Worked with business partners to review prototypes and develop iterative revisions.

Analyzed/modeled current information flows and recommended solutions.

Participated in testing and in the preparation of test plans and test cases to ensure the requirements were testable.

Environment: Informatica Power Center 8.6, AutoSys, Agile, HP-UX, Windows NT, Oracle 9i/10g, Mainframe, SQL, PL/SQL, HP Linux.


HealthCare Street (IBM), Hyderabad, India Sept 2007-Sept 2009

Data Warehouse Developer

Project: "Management Reports"

This warehouse generates data to analyze the sales of various products. The product data is categorized by product group and product family. The warehouse is also used to analyze the usage of products at different times of the year. The data is stored in a legacy system on an Oracle 9i database.

Responsibilities:

Involved in design & development of operational data source and data marts in Oracle

Reviewed source data and recommended data acquisition and transformation strategies

Involved in conceptual, logical and physical data modeling and used star schema in designing the data warehouse

Designed ETL process using Informatica Designer to load the data from various source databases and flat files to target data warehouse in Oracle

Used PowerMart Workflow Manager to design sessions with event-wait/event-raise, assignment, email, and command tasks to execute mappings

Created parameter based mappings, Router and lookup transformations

Created mapplets to reuse the transformation in several mappings

Used PowerMart Workflow Monitor to monitor the workflows

Optimized mappings using transformation features like Aggregator, filter, Joiner, Expression, Lookups

Worked extensively on Oracle database 9i and flat files as data sources.

Performed integrated testing for various mappings. Tested the data and data integrity among various sources and targets.

Created daily and weekly workflows and scheduled them to run based on business needs

Created Daily/weekly ETL process which maintained 100GB of data in target database

Created complex stored procedures/functions/triggers/query tuning in MS Access 97

Responsible for requirement analysis of the application.

Environment: Oracle 8i, WebLogic Server, XML, JSP, JavaScript, EJB, Windows NT, Informatica Power Center 5.1, Erwin 3.5, Cognos 6.5 (Impromptu).


