
Data Manager

Location:
Herndon, VA, 20170
Posted:
May 03, 2017


Resume:

Devarasetty Syam Sundar

Sr. Informatica Developer

703-***-**** acz4mr@r.postjobfree.com

Summary:

Certified Informatica developer with extensive IT experience designing, developing, installing, implementing, and supporting data warehouses, data marts, and data integration projects.

Followed ETL best practices and standards in implementing ETL mappings using transformations such as Source Qualifier, Expression, Connected/Unconnected Lookup, Aggregator, Sorter, Router, Filter, and Update Strategy.

Experience in full-lifecycle design and implementation of Star and Snowflake schemas. Hands-on experience with performance tuning; implemented advanced PowerCenter features including partitioning, concurrent execution, index creation, and workflow recovery.

Experience integrating Informatica PowerCenter/Cloud 8.x/9.x with legacy systems.

Extensively worked on Informatica PowerExchange 9.6.0 end-to-end installation and upgrades, user and group creation, migration activities, repository and integration service creation, and repository backup and restore.

Sourced data from systems such as Oracle, SQL Server, DB2, flat files, and Access databases using ODBC and native connections.

Experience with the complete software development life cycle (SDLC), including requirements analysis, estimation, design, construction, unit and system testing, and implementation.

Expert in Slowly Changing Dimensions Type 1, Type 2, and Type 3 for inserting into and updating target tables while maintaining history.
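The Type 2 pattern mentioned above is usually implemented by expiring the current dimension row and inserting a new version. A minimal sketch against SQLite, with hypothetical table and column names (not from any specific project):

```python
import sqlite3

# SCD Type 2 sketch: expire the current row and insert a new version
# when a tracked attribute changes. Table/column names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Reston', '2016-01-01', '9999-12-31', 1)")

def scd2_update(customer_id, new_city, load_date):
    # Expire the current version only if the attribute actually changed
    cur.execute("""UPDATE dim_customer
                   SET end_date = ?, is_current = 0
                   WHERE customer_id = ? AND is_current = 1 AND city <> ?""",
                (load_date, customer_id, new_city))
    if cur.rowcount:
        # Insert the new current version with an open-ended end date
        cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
                    (customer_id, new_city, load_date))

scd2_update(1, 'Herndon', '2017-05-03')
rows = cur.execute("SELECT city, is_current FROM dim_customer ORDER BY eff_date").fetchall()
# rows -> [('Reston', 0), ('Herndon', 1)]
```

In PowerCenter this same logic is typically expressed with a Lookup plus an Update Strategy transformation; the SQL here only illustrates the row-versioning idea.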

Integrated Salesforce CRM with Informatica Cloud to extract customer-related data.

Performed data synchronization and data replication using Informatica Cloud.

Worked extensively on preparing unit test case (UTC) documents and design documents for data validation and migration.

Experience with pre- and post-session SQL scripts for performing DML operations on a table before or after a session run, depending on the requirement.

Created rule-based linking (RBL) files for custom linking in Informatica Metadata Manager to generate data lineage.

Created custom models in Informatica Metadata Manager for data governance and impact analysis.

Experience with the Data Warehouse Administration Console (DAC) and Control-M scheduling tools for monitoring ETL jobs.

Good knowledge of OBIA 7.9.6.3 seeded ETL vanilla mappings and of customizing vanilla mappings per customer requirements. Good understanding of reporting tools such as OBIEE and Tableau.

Strong experience in production support at all levels, including releases and change requests, with solid ticket-management skills for resolving tickets within SLA timelines.

Excellent problem-solving skills with a strong technical background; an excellent team player with good interpersonal skills.

Education:

Master’s in Computer Science Engineering, USQ, Australia – 2007

Bachelor's in Electronics and Communication Engineering, Anna University, India – 2005

Technical Skills:

ETL Tools

Informatica PowerCenter/PowerExchange 7.x/8.x/9.6.1, Informatica Data Quality (IDQ), Informatica Metadata Manager 10.x, Informatica Cloud

Reporting Tools

OBIEE, Tableau

Version control

Microsoft Visual Studio, Peregrine, Tortoise SVN

Databases

Oracle 11g/10g, SQL Server 2005/2008 R2, ANSI SQL, Netezza

Languages

T-SQL, PL/SQL, Unix shell scripting, C#, Perl

DB Tools

Toad, SQL*Loader, SQL Server Management Studio

Scheduling Tools

DAC, Control-M, Autosys

ERP Systems

Siebel, Ebase, PeopleSoft

Operating Systems

Windows 2003/2008, UNIX, Linux (Ubuntu)

Professional Experience:

Fannie Mae, Reston, VA Mar 2016 – Present

Sr. Informatica ETL Developer

Project: EDI Party master

Description: Fannie Mae serves the people who house America. It is a leading source of financing for mortgage lenders, providing access to affordable mortgage financing in all markets at all times; this financing makes sustainable homeownership and workforce rental housing a reality for millions of Americans. I worked as an ETL Informatica developer loading all party-related information into the party dimensional model. The main sources for the party master are Siebel CRM, RiskNet, and ABN files; we load party and counterparty data into the party model.

Responsibilities:

Analyzed requirements to load party and counterparty details into the party model and built hierarchy relationships, for example Business Unit (BU) to Sub-Business Unit (SBU) to legal entities.

Created multiple ETL mappings to source data from Siebel into the party model for various business requirements, based on the source data and the warehouse design.

Created MDM landing tables, loaded them using Informatica PowerCenter, and scheduled the loads with the Autosys scheduler.

Created custom models in Informatica Metadata Manager for data governance and impact analysis.

Loaded the business glossary into Informatica Metadata Manager as a business dictionary and created rule-based linking between resources to generate data lineage.

Worked on data profiling and data analysis using the Informatica Data Quality (IDQ) Analyst tool to analyze data patterns.

Worked as an in-sprint developer, developing code within the sprint timeline and migrating it to the out-of-sprint environment for system integration testing (SIT).

Integrated Informatica Cloud with salesforce.com to pull responsible-party contact information.

Worked with Jenkins for continuous integration/continuous deployment (CI/CD).

Scheduled workflows in Autosys using JIL files, per the requirements.
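An Autosys job of this kind is defined in a JIL file. The sketch below is a hypothetical example of a command job that starts an Informatica workflow via pmcmd; the job name, machine, service names, and paths are all placeholders, not taken from the project:

```
/* Hypothetical JIL for a command job that runs an Informatica
   workflow; every name and path here is a placeholder. */
insert_job: wf_party_load_job   job_type: cmd
machine: etl-host
owner: etluser
command: pmcmd startworkflow -sv INT_SVC -d DOMAIN -f PARTY -wait wf_party_load
start_times: "02:00"
days_of_week: mo, tu, we, th, fr
alarm_if_fail: 1
std_out_file: /var/log/autosys/wf_party_load.out
std_err_file: /var/log/autosys/wf_party_load.err
```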

Created connections in Informatica Cloud to retrieve data from salesforce.com via the Salesforce API.

Performed data profiling using custom SQL to analyze data patterns and redundant data.
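Profiling SQL of this sort typically reports row counts, distinct values, and null counts per column. A minimal sketch against an in-memory SQLite table (the table, columns, and data are illustrative only):

```python
import sqlite3

# Data-profiling sketch: row count, distinct count, and null count per
# column of a small illustrative table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE party (party_id INTEGER, name TEXT, country TEXT)")
cur.executemany("INSERT INTO party VALUES (?, ?, ?)",
                [(1, 'Acme', 'US'), (2, 'Acme', None), (3, 'Beta', 'US')])

profile = {}
for col in ('party_id', 'name', 'country'):
    total, distinct, nulls = cur.execute(
        f"SELECT COUNT(*), COUNT(DISTINCT {col}), "
        f"SUM(CASE WHEN {col} IS NULL THEN 1 ELSE 0 END) FROM party").fetchone()
    profile[col] = {'rows': total, 'distinct': distinct, 'nulls': nulls}

# profile['country'] -> {'rows': 3, 'distinct': 1, 'nulls': 1}
```

The same queries run against Oracle or Netezza; in practice they are combined with per-column pattern checks to flag redundant or malformed data.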

Prepared a sprint-completion deck and presented it to business leads at the end of every sprint.

Worked on Informatica Metadata Manager, creating data lineage using custom models and RBLs.

Created a business glossary in Informatica Metadata Manager, linking physical objects to the business dictionary.

Environment: Informatica PowerCenter 9.x, Informatica Metadata Manager 9.x, Oracle, UNIX, Autosys, Jenkins, Informatica Analyst, SVN, DB2, Informatica Master Data Management 9.x, TIBCO, BOXI reports, Netezza, Informatica Cloud

Nabors Industries, Houston, TX Apr 2015 – Mar 2016

Sr. Informatica ETL Developer

Project: Hyperion ARM

Description: Nabors is a global drilling company that integrates drilling capabilities, market intelligence, and rigorous analytics to drive decisions with lasting performance impact. A balance of employer and employee advocacy makes Nabors unusually effective over time. Organizations of every size choose Nabors for drilling and program execution tailored to enterprise goals and local resources.

Responsibilities:

Created open interface and EPMA dimension tables based on the source data and the warehouse design. Created various mappings using transformations such as Filter, Router, Lookup, Sorter, Joiner, Update Strategy, Expression, and Aggregator.

Used FDMEE and Hyperion Essbase applications for planning and ARM reconciliations.

Performed performance tuning at both the database and Informatica levels.

Extracted data from multiple operational sources to load the staging area, data warehouse, and data marts.

Worked closely with cross-functional business teams and stakeholders to gather requirements and convert them into design documents and ETL code.

Created deployment groups for package deployment activities.

Wrote SQL queries for data validation and data promotion.

Prepared unit test cases (UTC), design documents, deployment plans, and deployment checklists.

Environment: Informatica Power Center 9.x, Essbase, Hyperion ARM, Oracle, Unix, visual studio

PWC, Tampa, FL / New York, NY Apr 2013 – Feb 2015

Sr. Informatica ETL Developer/Lead

Project: Analytics Data Factory

Description: PWC is one of the Big Four auditing firms, diversified into sectors such as auditing, taxation, consulting, advisory, and forensic services. PWC uses Informatica as its primary ETL tool to load the warehouse and Tableau to build reports on top of the dimension and fact tables. We worked closely with the advisory teams to build and implement analytics in the survey, manufacturing, healthcare, insurance, and finance domains, covering demographics and quantitative metrics. The main goal of the project was to build a warehouse for a particular industry, analyze its business trends, and load the data using Informatica 9.6.1.

Responsibilities:

Gathered public and private data for analysis and built applications/interfaces such as Global Growth Radar, Demand Estimator, Company Performance, and Cost Quality apps using Informatica and salesforce.com.

Created a data catalogue for the available datasets, analyzed the source data, and created ETL interface documents.

Created low-level design documents, review logs, unit test cases, issue logs, and impact analyses, and validated the data against the source.

Integrated Informatica Cloud with Amazon Web Services (AWS) Redshift to load data.

Installed Informatica PowerExchange 9.5.1 and successfully upgraded it to 9.6.1, including creating users and assigning privileges for repository access.

Pulled data from AWS Redshift and Amazon S3 for analysis and integration with the enterprise data model using Informatica Cloud (iPaaS).

Created a Workday connection using the Workday adapter in Informatica Cloud to move Workday data into the data warehouse.

Created connections in Informatica Cloud to pull data into Oracle, SQL Server, and flat files.

Created indexes to improve performance and implemented partitioning techniques at the Informatica level, such as pass-through and key range.

Extracted data from T-MSIS (Transformed Medicaid Statistical Information System) into the data warehouse.

Used GET and PUT operations to read and load Workday data.

Good knowledge of performance tuning at both the database and Informatica levels.

Created Standardizer, Match, Key Generator, and Parser transformations in Informatica Data Quality (IDQ) to standardize data.

Created a deduplication process in Informatica Data Quality (IDQ) to eliminate duplicate records.

Worked on claims data coming from Medicaid and Medicare (CMS).

Worked extensively on loading and formatting data by writing SQL scripts, involving substantial conversion, cleansing, and slicing and dicing of data using both Excel and SQL.

Worked as offshore coordinator, interacting frequently with the onsite team for requirement clarifications.

Created data lineage using Metadata Manager for impact analysis and data governance.

Created custom resources to load external metadata into the Informatica Metadata Manager repository for end-to-end application lineage.

Performed repository backups and object migration/data replication to other environments.

Prepared weekly decks and time sheets, and presented Senior Management Review (SMR) presentations to senior leadership.

Environment: Informatica PowerCenter 9.x, Informatica Metadata Manager 9.x, Informatica Data Quality 9.x, Oracle, SQL Server, XML files, Tableau, Informatica Cloud

Coloplast, Minneapolis, MN Jan 2012 – Feb 2013

Sr. Informatica ETL Developer/Lead

Project: Coloplast Informatica & Siebel Analytics upgrade

Description: Coloplast develops products and services that make life easier for people with very personal and private medical conditions. Working closely with the people who use its products, Coloplast creates solutions sensitive to their special needs, an approach it calls intimate healthcare. The Coloplast business includes ostomy care, urology and continence care, and wound and skin care. This was an upgrade and data-conversion project in which we migrated the data warehouse from OBIA 7.8.4 to OBIA 7.9.6.3 and Informatica from 7.x to 9.0.1.

Responsibilities:

Developed complex mappings in Informatica to load the data from source tables using different transformations like Source Qualifier, Look up (connected and unconnected), Expression, Aggregate, Update Strategy, stored procedure, Joiner, Xml, Filter, Sorter and Router.

Installed Informatica 9.0.1 Hotfix 2 environments and backed up the 7.8.4 contents.

Created a sandbox environment for regression/load testing.

Backed up and restored the Informatica 7.8.4 repository.

Created unit test cases along with test data for testing the code.

Migrated folders from the old repository to the upgraded repository.

Created OBIEE reports and user prompts.

Monitored the DAC scheduler.

Performed the end-to-end warehouse upgrade from Siebel Analytics 7.8.4 to OBIA 7.9.6.3.

Assigned work to the offshore team and monitored status regularly.

Migrated data from obsolete tables to the upgraded tables using the vanilla UPG Informatica repository.

Configured the Infa_sequence_generator.bat file during data migration.

Prepared unit test cases and performed data validation.

Retrofitted Informatica mappings to replace obsolete tables.

Ran full data warehouse loads using the DAC scheduler, monitored the loads, and fixed any issues.

Created SCD-type mappings using various transformations.

Worked on data masking, hiding original data through encryption and object-level alias names.

Analyzed data from the Siebel source through the warehouse to the BI front end.

Performed unit, surface, integration, and sandbox testing.

Performed dry-run activities before go-live and provided support activities afterward.

Developed mapplets to implement business rules involving complex logic.

Tuned mappings and sessions for better performance by eliminating various performance bottlenecks.

Environment: Informatica Power Center 9.0.1, SQL, PL/SQL, Toad, MS SQL 2008, DAC 10.1.3.4.1, OBIA apps 7.9.6.3

ON Semiconductors, Phoenix, AZ May 2011 to Dec 2011

Sr. ETL Informatica Developer

Project: ONSEMI DataMart

Description: The client is a preferred supplier of efficient semiconductor technologies to customers in the computing, communications, consumer, automotive, medical, industrial, and military/aerospace markets. The company's broad portfolio includes power management, signal, logic, discrete, and custom devices. In this project we created three executive dashboards for revenue reporting, using Informatica as the ETL tool to extract and load data from DB2/Oracle into an Oracle warehouse, with OBIA/OBIEE on the reporting side. Three schemas were created, with 3 fact and 20 dimension tables.

Responsibilities:

Created staging, dimension, and fact tables based on the source data and the warehouse design.

Worked with the Informatica Designer and created various mappings using different transformations like filter, Router, lookups, Sorter, joiner, update strategy, expressions and aggregator transformations.

Participated in build of the Data Warehouse, which includes the Design of Data mart using Star schema.

Created repository using Informatica Power Center – Repository Manager.

Extracted data from DB2 and Oracle tables and applied business logic to load it into the central Oracle database.

Created and ran sessions/workflows using Workflow Manager to load the data into the Target Database.

Optimized/Tuned mappings for better performance, example - the SQL override in source qualifier, lookup overrides in lookups, and other fixes based on the bottlenecks identified.

Created reusable transformations and Mapplets and used them in mappings.

Extensively used Shell scripts to automate the Pre-session and Post-sessions processes.

Performed data manipulation using basic functions and Informatica transformations.

Used session partitions, dynamic cache memory and index caches for improving performance of Informatica services/ server.

Worked extensively on SQL tuning to increase source qualifier throughput by analyzing queries with explain plans and creating new indexes, partitions, and materialized views.
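The index part of that tuning process can be illustrated with query plans. The sketch below uses SQLite's EXPLAIN QUERY PLAN as a stand-in for Oracle's EXPLAIN PLAN; syntax and plan wording differ across databases, and the table and index names are hypothetical:

```python
import sqlite3

# Index-driven tuning sketch: compare the query plan before and after
# creating an index on the filtered column.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")

plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
# Without an index the plan is a full table scan ('SCAN ...';
# exact wording varies by SQLite version).

cur.execute("CREATE INDEX ix_orders_cust ON orders(customer_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
# With the index the plan becomes an index search mentioning
# ix_orders_cust instead of a scan.
```

The before/after difference — full scan versus index search — is exactly the throughput gain the tuning bullet above refers to; on Oracle the equivalent check reads the plan table after EXPLAIN PLAN FOR.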

Created TPump, FastExport, and MultiLoad utility jobs for loading and unloading Teradata tables.

Worked with static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, and Aggregator transformations.

Created various tasks such as Event Wait, Event Raise, Email, and Command.

Troubleshot problems by checking session and error logs; also used the Debugger for complex troubleshooting.

Environment: Informatica 8.6, UNIX, CSV Files, DB2, SQL, PL/SQL, Unix Shell scripting, Oracle 11g, Toad, Teradata.

EMC, USA Oct 2010 to May 2011

Sr. Informatica ETL Developer/Support

Project: EMC EDW

Description: EMC Corporation is a manufacturer of high-end storage hardware and software, headquartered in Hopkinton, Massachusetts. EMC produces a range of enterprise storage products, including hardware disk arrays and storage management software, and is one of the world's leading software companies. In this project we maintained customer data, service data, and install-base data, using Informatica and Kalido as the ETL tools and Hyperion for reporting; the project architecture includes all three tools at different points. The batch runs four times a week and takes about 40 hours to complete.

Responsibilities:

Created different types of transformations and mapplets for various business logic.

Migrated code into production systems; performed batch monitoring and production issue resolution.

Handled Hyperion-report-to-Informatica-mapping conversion work requests and development.

Resolved data issues and emailed DBA teams with database change requests.

Created various transformations for loading data into the Oracle database, e.g. Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator, and Sequence Generator.

Enhanced Informatica mappings and monitored log information during batch runs.

Performed break-fix work on already implemented mappings.

Resolved Kalido BEID errors and Hyperion out-of-memory issues.

Designed Informatica mappings to load data from the source schema into the extract schema.

Performed performance tuning on Informatica mappings.

Managed and followed up on production tickets using the Peregrine tool.

Environment: Informatica 8.6, PL/SQL, Oracle 11g, Toad, Hyperion, Peregrine, Teradata, Control-M

Bayer, Germany Apr 2008 – Oct 2010

Informatica/ETL Developer

Project: ESM DataMart

Description: Bayer is a highly diversified healthcare company that discovers, develops, manufactures, and markets products and services spanning the continuum of care from prevention and diagnosis to treatment and cure. Bayer's products fall under four principal business arenas: pharmaceutical products, hospital products, diagnostic products, and nutritional products. There are several divisions, such as General, Finance and Government Systems, Pharmaceutical Products, and Business Intelligence and Data Warehousing; I worked in the Business Intelligence and Data Warehousing division. The main aim of the project was to load business-related data into the required data marts per the business logic.

Responsibilities:

Prepared mapping documents such as design documents, UTCs, impact analyses, and issue logs.

Imported DB2 tables into Informatica PowerCenter.

Prepared and reviewed mappings.

Created different types of transformations and applied business logic to load data into the warehouse.

Extensively used Source Qualifier, Joiner, Update Strategy, Lookup, Expression, Aggregator, XML, and Sequence Generator transformations.

Created sessions and workflows.

Worked on production issues and database change requests.

Sent communication emails to the production coordinator and team with data-load status.

Communicated with the Control-M team about any job failures.

Resolved production issues, which could include development activities in Informatica.

Identified redundancy and took steps to control it.

Environment: Informatica 8.x, Oracle 9i, PL/SQL, T-SQL, DB2, Toad, UNIX, CSV files, Java

Awards & Certifications:

Received an instant recognition award for quick problem resolution at PWC.

Received the Best Team Performance award and achieved a zero-defect upgrade at Coloplast.

Oracle Certified Associate (OCA) – issued by Oracle Corporation.

Informatica Certified Professional 9.x – issued by Informatica Corporation.

Experience working with Informatica Cloud.

Experience with the Informatica Data Quality (IDQ) Analyst tool.

Knowledge of Tableau – attended internal training sessions on Tableau.


