
Location:
Cumming, GA
Posted:
March 20, 2018


Manonmani S Senior Informatica/ Teradata Consultant

+1-470-***-**** / ac4vkd@r.postjobfree.com

Skype ID: manonmanis4 / Johns Creek, GA, USA

SUMMARY

Rich IT experience installing, configuring, analyzing, designing, integrating, re-engineering, and developing highly sophisticated systems. Broad knowledge of software analysis, development, and implementation of business applications across the logistics, transportation, finance, insurance, and telecom verticals.

7+ years of experience in data warehousing, developing ETL mappings and scripts in Informatica PowerCenter 9.6 and PowerMart 9.6 using the Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Task Developer), Repository Manager, Workflow Manager, and Workflow Monitor.

35 months of experience using Teradata (currently version 13.10), Oracle 10g/9i/8i/7.x, MS SQL Server 2002/7.0/6.5, SQL Server 12.0/11.x, MS Access 7.0/2000, SQL, PL/SQL, SQL*Plus, Sun Solaris 2.x, and MQ Explorer.

Expertise working on Teradata V2R13.x systems, using utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, and Teradata SQL.

Good experience developing and designing Informatica transformation objects, session tasks, worklets, workflows, workflow configurations, and sessions (reusable and non-reusable).

Knowledge of Informatica advanced techniques – dynamic caching, memory management, and parallel processing – to increase performance throughput.

Excellent knowledge and experience in creating source to target mapping, edit rules and validation, transformations, and business rules.

Involved in development of Informatica mappings and workflows. Created mapplets and used them in different mappings.

Implemented a Change Data Capture (CDC) mechanism using Informatica mapping variables, which can be used to extract only the changed data.
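A minimal sketch of this pattern (all table, column, and variable names here are hypothetical, not the actual project schema): the Source Qualifier SQL override filters on the mapping variable, and an Expression transformation advances the variable so the repository persists the new high-water mark after a successful run.

```sql
-- Source Qualifier SQL override: read only rows changed since the last run.
-- $$LAST_EXTRACT_TS is a mapping variable persisted in the repository.
SELECT ord_no, ord_status, updt_ts
FROM   src_orders
WHERE  updt_ts > TO_TIMESTAMP('$$LAST_EXTRACT_TS', 'YYYY-MM-DD HH24:MI:SS');

-- In an Expression transformation (Informatica expression language, not SQL):
--   SETMAXVARIABLE($$LAST_EXTRACT_TS, UPDT_TS)
-- After a successful session the repository stores the maximum UPDT_TS seen,
-- so the next run extracts only newer rows.
```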

Worked on analysis and documentation of OLAP reports requirements. Solid understanding of OLTP & OLAP concepts and challenges, especially with large data sets.

Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.

Proficient in data warehouse development practices and processes. Excellent working experience in implementing large Teradata DBMSs

Participated in requirements Analysis reviews, Business reviews and working sessions to understand the requirements and system design.

Involved in functional designs, writing functional specifications and design/code review.

Interacted with users to gather requirements and provided end-user training and documentation.

Knowledge of Java/J2EE technologies and of IDEs such as MyEclipse and Eclipse (EE) for development.

Knowledge of Hadoop, Pig, Hive, and MapReduce.

Used Hadoop, which provides massive storage for different types of data, enormous processing power, and the ability to handle a virtually limitless number of concurrent tasks.

Monitoring end-to-end performance requires tracking metrics from brokers, consumers, and producers, in addition to monitoring ZooKeeper, which Kafka uses for coordination among consumers.

Experience in GUI development with HTML, DHTML and JavaScript. Comprehensive knowledge in frameworks like Struts 1.1/1.2, Hibernate 3.0 and Spring 2.

Experience in full SDLC (Design, Development, Testing, Deployment and Support) & understanding of ITIL Process.

Strong Analytical & Communication skills and ability to Work independently with Excellent Interpersonal Skills.

SKILLS SUMMARY

Operating System: Windows 2000/98/95/NT/7, UNIX

Languages: Java 2.0/J2EE, SQL, PL/SQL, XML, HTML, JavaScript and UML, C, C++

Databases: Teradata 13.10, Oracle 10g/11g, MS SQL Server, MySQL, SAP HANA 9

ETL / BI Tools: Informatica 9.6/9.1, Cognos 7.x/8.x, Business Objects 5.x/6.x

Tools: SQL*Plus, TOAD, Power Designer 9.6, Rational Rose 2000, IBM Rational ClearCase, HP QC, Hadoop, Pig, Hive, Kafka, ZooKeeper, Docker

Framework: Struts, Hibernate, Spring

Others: Data Modeling – Dimensional Data Modeling, Star-Join Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling

EXPERIENCE HIGHLIGHTS

UST Global

1. Informatica – Teradata Senior Developer Oct 2016 – Present

System Analyst – B1 Band

Maersk Line, the global containerized division of the A.P. Moller – Maersk Group, is dedicated to delivering the highest level of customer-focused and reliable ocean transportation services. The scope of the projects below is to receive data from various sources such as MQ messages, files, and tables, and encompasses the transfer of that data through the data layers: loading into SA (Staging Area), then SIM to SEDW, and finally to the SSL area of EBIE. History data in the existing application's SIM layer will be migrated to the SEDW layer of the new solution.

Responsibilities:

Responsible for Business Analysis and Requirements Collection.

Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor. Designed and customized data models for a data warehouse supporting data from multiple sources in real time.

Involved in building ETL architecture & Source-Target mapping to load data into Data warehouse.

Created mapping documents to outline data flow from sources to targets. Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.

Developed Teradata utility scripts (FastLoad, MultiLoad) to load data from various source systems into Teradata.

Wrote Teradata SQL queries for joins and table modifications; created customized MultiLoad scripts on the UNIX platform for Teradata loads.
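A trimmed MultiLoad sketch of the kind generated on the UNIX host (the logon, table, layout, and file names are all hypothetical placeholders, not the project's real objects):

```sql
.LOGTABLE sa.lt_stg_booking;
.LOGON tdpid/etl_user,password;

.BEGIN IMPORT MLOAD TABLES sa.stg_booking
       WORKTABLES sa.wt_stg_booking
       ERRORTABLES sa.et_stg_booking sa.uv_stg_booking;

.LAYOUT booking_layout;
.FIELD bk_no * VARCHAR(20);
.FIELD bk_dt * VARCHAR(10);

.DML LABEL ins_booking;
INSERT INTO sa.stg_booking (bk_no, bk_dt)
VALUES (:bk_no, :bk_dt);

.IMPORT INFILE booking_daily.dat
        FORMAT VARTEXT '|'
        LAYOUT booking_layout
        APPLY ins_booking;

.END MLOAD;
.LOGOFF;
```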

Created BTEQ (Basic Teradata Query) scripts to generate keys.
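One common way to generate surrogate keys in BTEQ is to offset ROW_NUMBER() from the current maximum key; a sketch under hypothetical table and column names:

```sql
.LOGON tdpid/etl_user,password;

INSERT INTO sedw.dim_customer (cust_key, cust_id, cust_nm)
SELECT COALESCE(mx.max_key, 0)
       + ROW_NUMBER() OVER (ORDER BY s.cust_id),  -- new surrogate key
       s.cust_id,
       s.cust_nm
FROM sa.stg_customer s
CROSS JOIN (SELECT MAX(cust_key) AS max_key
            FROM sedw.dim_customer) mx;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
```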

Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.

Wrote and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts to transform and load data.

Extensively Used Change Data Capture (CDC) in data warehouse applications.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Created mapplets to use them in different mappings. Developed mappings to load into staging tables and then to Dimensions and Facts.

Worked on different workflow tasks such as Session, Event-Raise, Event-Wait, Decision, Email, Command, Worklet, Assignment, and Timer, as well as scheduling of the workflows.

Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.

Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
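A Type 2 load can be sketched as two statements: close the current version of each changed row, then insert the new version with an open-ended end date (table and column names are illustrative assumptions, not the actual schema):

```sql
-- Expire the current version of every customer arriving in the staging delta.
UPDATE sedw.dim_customer
SET    eff_end_dt = CURRENT_DATE - 1,
       curr_flag  = 'N'
WHERE  cust_id IN (SELECT cust_id FROM sa.stg_customer_chg)
AND    curr_flag = 'Y';

-- Insert the new version as the open row.
INSERT INTO sedw.dim_customer
       (cust_id, cust_nm, eff_start_dt, eff_end_dt, curr_flag)
SELECT cust_id, cust_nm, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   sa.stg_customer_chg;
```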

Extensively used SQL*Loader to load data from flat files into Oracle database tables.
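A minimal SQL*Loader control file for a pipe-delimited flat file (the file, table, and column names are illustrative only):

```sql
-- Invoked as: sqlldr userid=etl/pwd control=load_sales.ctl log=load_sales.log
LOAD DATA
INFILE 'sales_daily.dat'
APPEND INTO TABLE stg_sales
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( sale_id,
  sale_dt   DATE 'YYYY-MM-DD',
  amount    "TO_NUMBER(:amount, '999999.99')"
)
```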

Modified existing mappings for enhancements of new business requirements.

Used Debugger to test the mappings and fixed the bugs.

Wrote UNIX shell scripts and pmcmd commands to FTP files from remote servers and to back up the repository and folders.
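The backup half of such a script can be sketched as below (paths, service, and workflow names are hypothetical; the pmcmd call is shown as a comment because it requires a live Integration Service):

```shell
#!/bin/sh
# Archive an exported repository folder with a date stamp, then kick off
# the load workflow. backup_folder echoes the path of the archive it wrote.

backup_folder() {
    src="$1"; dest="$2"
    stamp=$(date +%Y%m%d)
    tar -czf "$dest/repo_backup_$stamp.tar.gz" -C "$src" . \
        && echo "$dest/repo_backup_$stamp.tar.gz"
}

# As it would appear in the nightly script (hypothetical names, not run here):
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -u "$PMUSER" -p "$PMPASS" \
#     -f SEDW_LOADS wf_stg_to_sedw
```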

Involved in Performance tuning at source, target, mappings, sessions, and system levels.

Prepared migration document to move the mappings from development to testing and then to production repositories.

Migration of code between the Environments and maintaining the code backups.

Expertise in performance tuning: identified bottlenecks at sources, targets, PowerCenter transformations, and sessions using techniques such as explain plans and re-designing mappings.

Collected performance data for sessions and performance tuned by adjusting Informatica session parameters.

Created checklists for coding, testing, and release to ensure a smooth, error-free project flow.

Created release documents for better readability of code and reports by end users.

Handled User Acceptance Testing and System Integration Testing in addition to unit testing, using Quality Center as the bug-logging tool. Created and documented the Unit Test Plan (UTP) for the code.

2. Informatica-Teradata Developer / System Analyst Jun 2013 – Dec 2015

System Analyst – B1 Band

Description: (Maersk Line)

Responsibilities:

Parsed high-level design specs into simple ETL coding and mapping standards.

Worked on Informatica Power Center tool - Source Analyzer, Data warehousing designer, Mapping & Mapplet Designer and Transformation Designer.

Worked with the data architecture team on the Entity-Relationship model diagram and worked toward implementing it in SEDW.

Responsibilities included designing and developing Informatica mappings to load data from source systems to SA, then to SEDW, and finally to SSL. Also worked on Type II slowly changing dimensions.

Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.

Extensively used Power Center to design multiple mappings with embedded business logic.

Creation of Transformations like Joiner, Rank and Source Qualifier Transformations in the Informatica Designer.

Created mapplets and used them in different mappings.

Using Informatica Repository Manager, maintained the repositories of various applications and created users, user groups, and security access controls.

Good knowledge of physical and logical data models.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Maintained development, test, and production mapping migration using Repository Manager; also used Repository Manager to maintain metadata, security, and reporting.

Understood the business needs and implemented them in a functional database design.

Data Quality Analysis to determine cleansing requirements. Designed and developed Informatica mappings for data loads.

Tuning Informatica Mappings and Sessions for optimum performance. Created and maintained several custom reports for the client using Business Objects.

Environment:

Informatica PowerCenter 9.6 (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager, Workflow Monitor), Teradata 13.10, MQ Series, Kafka, Hadoop, ZooKeeper, Docker, Erwin 3.5, PL/SQL, Windows 7/2000.

3. Informatica Developer, Chase Bank, USA Apr 2012 – Jun 2013

Software Engineer

Description:

Chase Bank is one of America's leading financial companies, offering its customers a broad range of first-class banking services. I assisted in the design and development of a data warehousing project to improve the Account Management System. The data warehouse was developed to provide reports that help portfolio managers identify potential customers who can afford a loan, including graduate students who can afford a car loan.

Responsibilities:

End-to-end ETL development of the Premium Module Data Mart. Maintained warehouse metadata, naming standards and warehouse standards for future application development.

Developed design specs into simple ETL coding and mapping standards. Worked on Informatica PowerCenter tools: Source Analyzer, Data Warehouse Designer, and Mapping & Mapplet Designer.

Designed and developed Informatica Mappings to load data from Source systems to ODS and then to Data Mart.

Extensively used Power Center/Mart to design multiple mappings with embedded business logic.

Creation of Transformations like Lookup, Joiner, Rank and Source Qualifier Transformations in the Informatica Designer.

Created mappings using Sorter, Aggregator, newly changed dynamic Lookup and Router transformations.

Created mapplets and used them in different mappings.

Worked with mapping variables, mapping parameters, and variable functions such as SetVariable, SetCountVariable, SetMinVariable, and SetMaxVariable.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Maintained development, test, and production mapping migration using Repository Manager; also used Repository Manager to maintain metadata, security, and reporting.

Understood the business needs and implemented them in a functional database design.

Migrated ETL processes from legacy scripts to Informatica PowerCenter 5.2.

Data Quality Analysis to determine cleansing requirements. Designed and developed Informatica mappings for data loads.

Environment:

Informatica PowerCenter 9.0.1 (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager, Workflow Monitor), Erwin 3.5, PL/SQL, Oracle 9i, DB2, Windows 2000.

4. ETL Developer, AT&T, USA Oct 2010 – Mar 2012

Software Engineer

Description:

Enterprise data warehouse for AT&T. This data warehouse was developed to help top executives make decisions about products (network bandwidth) and to analyze various metrics to understand the performance of the organization. Data in the EDW is consumed by systems such as Business Intelligence reporting using Business Objects, decision support systems, executive information systems, demand planning, and automated underwriting systems.

Responsibilities:

Interacted with business analysts and translated business requirements into technical specifications.

Using Informatica Designer, developed mappings, which populated the data into the target.

Used Source Analyzer and Warehouse Designer to import the source and target database schemas and the mapping designer to map the sources to the targets.

Worked extensively on Workflow Manager, Workflow Monitor and Worklet Designer to create, edit and run workflows, tasks.

Enhanced performance of Informatica sessions with large data files by using partitions and increasing the block size, data cache size, and target-based commit interval.

Extensively used aggregators, lookup, update strategy, router and joiner transformations.

Developed the control files to load various sales data into the system via SQL*Loader.

Extensively used TOAD to analyze data and fix errors, and for development.

Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.

Created different types of reports like List, Cross-tab, Chart, Drill-Thru and Master-Detail Reports. Created multiple Dashboard reports for multiple packages.

Migrated reports from Cognos 8.3 to 8.4 and initiated training for users to do ad-hoc reporting using Query Studio.

Environment:

Informatica PowerCenter 8.x (Source Analyzer, Data warehouse designer, Mapping Designer, Mapplet, Transformations, Workflow Manager, Workflow Monitor), Erwin 3.5, PL/SQL, Oracle 10g/9i, SQL Server 2005, ROLAP, Cognos 8.2/3/4 (Framework Manager, Report Studio, Query Studio, Analysis Studio, Cognos Connection), Windows 2000.

5. Oracle Developer, Confidential, San Francisco Jan 2010 – Oct 2010

Software Engineer

The project was with IT Asset Management Group and major responsibilities included helping Business users to keep track of all elements of software and hardware in the company.

Responsibilities:

Wrote UNIX shell scripts to process files on a daily basis: renaming files, extracting dates from file names, unzipping files, and removing junk characters before loading them into the base tables.
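The daily file-preparation steps above can be sketched as small shell functions (the feed name and the `name_YYYYMMDD.dat` convention are assumptions, not the actual file layout):

```shell
#!/bin/sh
# Prepare an incoming flat file before the database load:
# extract the date token from the name, strip non-printable junk
# characters, and mark the file ready with a .ready suffix.

extract_date() {
    # sales_20180320.dat -> 20180320
    basename "$1" .dat | awk -F_ '{print $NF}'
}

clean_file() {
    # Delete everything except printable characters, tabs, and newlines.
    tr -cd '[:print:]\n\t' < "$1" > "$1.tmp" && mv "$1.tmp" "$1"
}

stage_file() {
    clean_file "$1"
    mv "$1" "$1.ready"
}
```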

Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.

Handled errors using exception handling. Daily operations: job monitoring and notifying about or fixing data load failures.

Provided production support for the existing system: fixed database problems, processed errored-out records, resolved bugs in the EIB system code, and handled calls and on-call support.

Extensively used advanced PL/SQL features such as records, tables, object types, and dynamic SQL.
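These features combine naturally in a staging-to-mart load; a sketch with hypothetical table and procedure names, using BULK COLLECT with an exception handler that logs failures instead of aborting the nightly job:

```sql
CREATE OR REPLACE PROCEDURE load_asset_mart IS
  CURSOR c_stg IS
    SELECT asset_id, asset_nm, cost FROM stg_assets;
  TYPE t_stg IS TABLE OF c_stg%ROWTYPE;
  v_rows t_stg;
BEGIN
  OPEN c_stg;
  LOOP
    FETCH c_stg BULK COLLECT INTO v_rows LIMIT 1000;
    EXIT WHEN v_rows.COUNT = 0;
    FORALL i IN 1 .. v_rows.COUNT
      INSERT INTO dm_assets VALUES v_rows(i);
    COMMIT;
  END LOOP;
  CLOSE c_stg;
EXCEPTION
  WHEN OTHERS THEN
    INSERT INTO etl_error_log (proc_nm, err_msg, err_ts)
    VALUES ('LOAD_ASSET_MART', SQLERRM, SYSTIMESTAMP);
    COMMIT;
    RAISE;
END load_asset_mart;
/
```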

Root cause analysis, Enhancements, Requirement collection and Estimation.

Environment:

Oracle 10g, SQL*Plus, TOAD, SQL*Loader, PL/SQL Developer, shell scripts, UNIX, Windows XP

EDUCATION, TRAINING & CERTIFICATIONS

Master of Computer Applications (MCA), Mother Teresa Women's University, Kodaikanal, India

Jobs Before 2010:

1. EDP In-Charge / Computer Engineer, Krishnaveni Textiles, Aug 2008 – Dec 2009


