Manager Data

Location:
Monmouth Junction, NJ
Posted:
July 30, 2020

Manonmani Senior ETL Consultant

+1-848-***-**** / *********@*****.***

SUMMARY

Rich IT experience installing, configuring, analyzing, designing, integrating, re-engineering, and developing highly sophisticated systems. Broad knowledge of software analysis, development, and implementation of business applications for the Logistics, Transportation, Finance, Insurance, and Telecom verticals.

9 years of experience in data warehousing, developing ETL mappings and scripts in Informatica PowerCenter 9.6 and PowerMart 9.6 using Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Task Developer), Repository Manager, Workflow Manager, and Workflow Monitor.

35+ months of experience using Teradata (currently version 13.10), Oracle 10g/9i/8i/7.x, MS SQL Server 2000/7.0/6.5, SQL Server 12.0/11.x, MS Access 7.0/2000, SQL, PL/SQL, SQL*Plus, Sun Solaris 2.x, and MQ Explorer.

Expertise working on Teradata 13.x systems, using utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, and Teradata SQL.

Good experience designing and developing Informatica transformation objects, session tasks, worklets, workflows, workflow configurations, and sessions (reusable and non-reusable).

Knowledge of Informatica advanced techniques – dynamic caching, memory management, and parallel processing – to increase performance throughput.

Excellent knowledge and experience creating source-to-target mappings, edit rules and validations, transformations, and business rules.

Involved in developing Informatica mappings and workflows. Created mapplets and used them in different mappings.

Implemented a Change Data Capture (CDC) mechanism using an Informatica mapping variable: the variable persists the last extraction point between runs, so each run extracts only the newly changed data.
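
For illustration, a minimal sketch of the Source Qualifier SQL override this technique typically relies on; the mapping variable $$LAST_EXTRACT_TS and the table and column names are hypothetical, and an Expression transformation would call SETMAXVARIABLE to advance the variable each run:

-- Oracle-style override: pull only rows changed since the last successful run.
-- Informatica substitutes the $$LAST_EXTRACT_TS mapping variable before execution.
SELECT cust_id,
       cust_name,
       updt_ts
FROM   src_customer
WHERE  updt_ts > TO_TIMESTAMP('$$LAST_EXTRACT_TS', 'YYYY-MM-DD HH24:MI:SS')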

Strong knowledge of AWS S3, Aurora, and EC2.

Implemented data warehouses on AWS using Redshift.

Migrated existing on-premises data warehouses to AWS, using Redshift as the target data warehouse.
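
As a sketch of the usual bulk-load step in such a migration (the bucket, table, and IAM role names below are hypothetical), extracted files staged in S3 are loaded with the Redshift COPY command:

-- Bulk-load a staged extract from S3 into a Redshift staging table.
COPY stg_orders
FROM 's3://example-dw-bucket/extracts/orders/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
CSV
IGNOREHEADER 1;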

Worked on ETL design and implementation in the cloud.

Hands-on experience developing ETL in AWS Glue, including crawlers and the Glue Data Catalog.
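
Glue jobs themselves are usually written in Python or Scala; keeping to SQL, one adjacent pattern is exposing crawler-populated Glue Data Catalog tables to Redshift through a Redshift Spectrum external schema (all names below are hypothetical):

-- Make Glue Data Catalog tables (populated by a crawler) queryable from Redshift.
CREATE EXTERNAL SCHEMA glue_landing
FROM DATA CATALOG
DATABASE 'example_landing_db'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-spectrum'
CREATE EXTERNAL DATABASE IF NOT EXISTS;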

Created different types of attributes to handle data in SAP BW.

Worked on analysis and documentation of OLAP reporting requirements. Solid understanding of OLTP and OLAP concepts and challenges, especially with large data sets.

Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.

Proficient in data warehouse development practices and processes. Excellent working experience implementing large Teradata DBMSs.

Participated in requirements analysis reviews, business reviews, and working sessions to understand the requirements and system design.

Involved in functional designs, writing functional specifications and design/code review.

Knowledge of mainframe DB2, COBOL, and MQ.

Interacted with users to gather requirements and provided end-user training and documentation.

Knowledge of Java/J2EE technologies and of IDEs such as MyEclipse and Eclipse (EE) for development.

Have experience in SAP HANA and SAP BW.

Knowledge of Hadoop, Pig, Hive, MapReduce, Scala, and Spark.

Used Hadoop, which provides massive storage for different types of data, enormous processing power, and the ability to handle virtually limitless concurrent tasks.

Monitoring end-to-end performance requires tracking metrics from brokers, consumers, and producers, in addition to monitoring ZooKeeper, which Kafka uses for coordination among consumers.

Experience in GUI development with HTML, DHTML, and JavaScript. Comprehensive knowledge of frameworks such as Struts 1.1/1.2, Hibernate 3.0, and Spring 2.

Experience in the full SDLC (Design, Development, Testing, Deployment, and Support) and an understanding of the ITIL process.

Strong analytical and communication skills; able to work independently, with excellent interpersonal skills.

Technical Skills

ETL Tools

Informatica 9.6/9.1

BI Tools

Tableau, Cognos 7.x/8.x, BusinessObjects 5.x/6.x

RDBMS

Teradata 13.10, Oracle 10g/11g, MS-SQL Server 2016, MySQL and SAP-HANA 9

OS

Windows, Linux / Unix

Framework

Struts, Hibernate, Spring

Languages

Java 2.0/J2EE, SQL, PL/SQL, XML, HTML, JavaScript, UML, C, C++

Cloud

AWS (S3, EC2, Aurora, Redshift, Glue)

Data Modeling

Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling

Other Tools

SQL*Plus, Toad, PowerDesigner 9.6, Rational Rose 2000, IBM Rational ClearCase, HP-QC, Hadoop, Hive, WinSCP, PuTTY, SSH, Autosys, Teradata SQL Assistant, Teradata utilities (FastLoad, MultiLoad, FastExport, BTEQ), GitHub, Jira

EXPERIENCE HIGHLIGHTS

IMG Systems

1. RCM2 – Verizon Business Markets Oct 2019 – Present

Role: Senior ETL/Teradata/Tableau Developer, NJ, USA

This application aims to provide a single source of truth for tracking Verizon's customer data for FP&A reporting and analytics. It reduces the time needed to generate data and increases the time available to analyze it, automates manual processes to ensure accurate data from the different sources, and provides the flexibility to adapt quickly to changing business requirements. It also builds strong controls and consistency while enabling quick, self-serve access to data. This transforms the business to generate faster insight into company-initiated customer activities such as price-ups, offers, bundles, and products, and simplifies the business requirements for one sales channel category by identifying the applicable Fios sales channels and calculating costs for different sales channels based on a reference file.

Responsibilities:

Working with various business stakeholders on requirements gathering for application development.

Working with various teams on ongoing activities to complete the project requirements per the BRD and timelines.

Handling multiple meetings with business users, Reporting teams and other stakeholders to clarify the requirements.

Designing and developing ETL processes that transform data from different source systems per business requirements, for use by reporting and other business teams in further analysis to improve customer value services.

Generating the required data for direct marketing teams on a daily basis for the implementation of their promotional plans for consumer and business customers.

Working with architects and functional teams/users, along with other client teams, to define requirements/specs for enhancements and new deliverables.

Expertise in importing/exporting large amounts of data between files and Teradata.

Analyzed the existing source tables and created fact, dimension, and star-schema representations for the data marts.

Delivering BTEQ scripts to extract, transform, and load data into the target fact tables.
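
For illustration, a minimal BTEQ sketch of this kind of fact-table load; the logon, database, table, and column names are all hypothetical:

.LOGON tdprod/etl_user,<password>;

-- Insert-select from the staging table into the target fact table.
INSERT INTO dw_db.f_sales (sale_id, cust_id, sale_dt, amt)
SELECT s.sale_id, s.cust_id, s.sale_dt, s.amt
FROM   stg_db.stg_sales s;

-- Abort with a non-zero return code if the load failed.
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;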

Delivering MLOAD scripts to load the stage/target tables from source feeds.
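
For illustration, a minimal MultiLoad sketch for loading a pipe-delimited source feed into a staging table; the names and file path are hypothetical (with VARTEXT, the input fields are declared as VARCHAR):

.LOGTABLE stg_db.ml_sales_log;
.LOGON tdprod/etl_user,<password>;
.BEGIN IMPORT MLOAD TABLES stg_db.stg_sales;
.LAYOUT sales_layout;
.FIELD in_sale_id * VARCHAR(18);
.FIELD in_cust_id * VARCHAR(18);
.FIELD in_amt * VARCHAR(18);
.DML LABEL ins_sales;
INSERT INTO stg_db.stg_sales (sale_id, cust_id, amt)
VALUES (:in_sale_id, :in_cust_id, :in_amt);
.IMPORT INFILE /data/feeds/sales.dat
FORMAT VARTEXT '|'
LAYOUT sales_layout
APPLY ins_sales;
.END MLOAD;
.LOGOFF;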

Transferring data from different databases to target client/server systems using the FastExport (FEXP) utility.
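
A minimal FastExport sketch of this kind of extract, again with hypothetical names:

.LOGTABLE dw_db.fx_sales_log;
.LOGON tdprod/etl_user,<password>;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/out/sales_extract.dat;

-- The SELECT defines the extract shipped to the downstream system.
SELECT sale_id, cust_id, sale_dt, amt
FROM dw_db.f_sales
WHERE sale_dt >= DATE '2020-01-01';

.END EXPORT;
.LOGOFF;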

Performing code reviews of the scripts delivered by the offshore and onsite teams to ensure correctness of functionality and performance.

Handling performance tuning by collecting statistics, analyzing EXPLAIN plans, and choosing appropriate indexes.
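
For example, a typical tuning pass refreshes statistics on the join/filter columns and then re-checks the optimizer's plan (table and column names are hypothetical):

COLLECT STATISTICS ON dw_db.f_sales COLUMN (cust_id);
COLLECT STATISTICS ON dw_db.f_sales COLUMN (sale_dt);

-- Re-check the plan after refreshing statistics.
EXPLAIN
SELECT cust_id, SUM(amt)
FROM dw_db.f_sales
WHERE sale_dt >= DATE '2020-01-01'
GROUP BY cust_id;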

Involved in system integration testing of the developed modules, logging defects, deploying code to production using GitHub, and providing support.

Hands-on experience developing dashboards in Tableau Desktop 2018.3; prepared user stories to create compelling dashboards that deliver actionable insights.

Experience in gathering requirements, analysis, design, development, deployment, testing, and support of rich interactive dashboards and workbooks using Tableau Desktop.

Proficient in the design and development of various dashboards and reports using Tableau visualizations such as dual axis, bar graphs, scatter plots, pie charts, heat maps, bubble charts, tree maps, box plots, geographic visualizations, side-by-side bars, stacked bars, and filled maps.

Hands-on development assisting users in creating and modifying worksheets and visualization dashboards.

Responsible for developing dashboards and ad-hoc reporting, and for defining SQL best practices.

Provided production support to Tableau users and wrote custom SQL to support business requirements.

UST Global

2. Informatica – Senior Developer, NC, USA Oct 2016 – Sept 2019

Maersk Line, the global containerized shipping division of the A.P. Moller – Maersk Group, is dedicated to delivering the highest level of customer-focused and reliable ocean transportation services. The scope of the projects below is to receive data from various sources (MQ messages, files, tables) and encompasses the transfer of that data through the data layers: loading to SA (Staging Area), then SIM to SEDW, and finally to the SSL area of EBIE. History data in the existing application's SIM layer will be migrated to the SEDW layer of the new solution.

Responsibilities:

Responsible for Business Analysis and Requirements Collection.

Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor. Designed and customized data models for a data warehouse supporting data from multiple sources in real time.

Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.

Created mapping documents to outline data flow from sources to targets. Involved in dimensional modeling (star schema) of the data warehouse and used Erwin to design the business process, dimensions, and measured facts.

Developed Teradata utility scripts (FastLoad, MultiLoad) to load data from various source systems into Teradata.

Wrote Teradata SQL queries for joins and table modifications; created customized MultiLoad scripts on the UNIX platform for Teradata loads.

Created BTEQ (Basic Teradata Query) scripts to generate keys.
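
One common pattern for key generation in a BTEQ script, sketched here with hypothetical names, is to offset ROW_NUMBER() by the current maximum key in the dimension:

-- Assign surrogate keys to customers not yet present in the dimension.
INSERT INTO dw_db.d_customer (cust_sk, cust_id, cust_name)
SELECT mx.max_sk + ROW_NUMBER() OVER (ORDER BY s.cust_id),
       s.cust_id,
       s.cust_name
FROM stg_db.stg_customer s
CROSS JOIN (SELECT COALESCE(MAX(cust_sk), 0) AS max_sk
            FROM dw_db.d_customer) mx
WHERE s.cust_id NOT IN (SELECT cust_id FROM dw_db.d_customer);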

Maintained stored definitions, transformation rules, and target definitions using Informatica Repository Manager.

Wrote BTEQ scripts to transform data.

Wrote and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts.

Extensively used Change Data Capture (CDC) in data warehouse applications.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Created mapplets for use in different mappings. Developed mappings to load staging tables and then the dimensions and facts.

Worked on different workflow tasks such as sessions, event-raise, event-wait, decision, e-mail, command, worklets, assignment, timer, and workflow scheduling.

Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.

Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.
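
In Informatica this is configured through the mapping's lookup and update strategy, but the effect of a Type 2 load corresponds to SQL along these lines (all names hypothetical): close out the current row, then insert the new version.

-- Expire the current version of each changed customer.
UPDATE dw_db.d_customer
SET eff_end_dt = CURRENT_DATE - 1,
    curr_flag = 'N'
WHERE cust_id IN (SELECT cust_id FROM stg_db.stg_customer_chg)
  AND curr_flag = 'Y';

-- Insert the new version as the current row.
INSERT INTO dw_db.d_customer
  (cust_sk, cust_id, cust_name, eff_start_dt, eff_end_dt, curr_flag)
SELECT s.cust_sk, s.cust_id, s.cust_name,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_db.stg_customer_chg s;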

Extensively used SQL*Loader to load data from flat files into Oracle database tables.

Modified existing mappings to support new business requirements.

Used Debugger to test the mappings and fixed the bugs.

Wrote UNIX shell scripts and pmcmd commands to FTP files from remote servers and to back up the repository and folders.

Involved in performance tuning at the source, target, mapping, session, and system levels.

Prepared migration document to move the mappings from development to testing and then to production repositories.

Migrated code between environments and maintained code backups.

Expertise in performance tuning by identifying bottlenecks at sources, targets, PowerCenter transformations, and sessions, using techniques such as EXPLAIN plans and mapping redesign.

Collected performance data for sessions and performance tuned by adjusting Informatica session parameters.

Created checklists for coding, testing, and release for a smooth, error-free project flow.

Created release documents for better readability of code/reports by end users.

Handled User Acceptance Testing and System Integration Testing, in addition to unit testing, using Quality Center as the bug-logging tool. Created and documented a Unit Test Plan (UTP) for the code.

Supported the production team in successfully deploying and running the code in their environment.

Environment:

Informatica PowerCenter 9.6 (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet, Transformations, Workflow Manager, Workflow Monitor), MDM, Teradata 13.10, MQ Series, Control-M, Kafka, Hadoop, ZooKeeper, Docker, Erwin 3.5, PL/SQL, Oracle 10g, Windows 7/2000.

3. Informatica Developer – UST, Trivandrum, India Jun 2013 – Dec 2015

Role: System Analyst – B1 Band

Description: (Maersk Line)

Responsibilities:

Parsed high-level design specs into simple ETL coding and mapping standards.

Worked on Informatica PowerCenter tools: Source Analyzer, Data Warehousing Designer, Mapping & Mapplet Designer, and Transformation Designer.

Worked with the data architecture team on the entity-relationship model diagram and worked towards implementing it in SEDW.

Responsibilities included designing and developing Informatica mappings to load data from source systems to SA, then to SEDW, and finally to SSL. Also worked on Type 2 slowly changing dimensions.

Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.

Extensively used Power Center to design multiple mappings with embedded business logic.

Created transformations such as Joiner, Rank, and Source Qualifier in the Informatica Designer.

Created mapplets and used them in different mappings.

Using Informatica Repository Manager, maintained the repositories of various applications and created users, user groups, and security access controls.

Good knowledge of physical and logical data models.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Maintained development, test, and production mapping migration using Repository Manager; also used Repository Manager to maintain metadata, security, and reporting.

Understood business needs and implemented them in a functional database design.

Performed data quality analysis to determine cleansing requirements. Designed and developed Informatica mappings for data loads.

Tuned Informatica mappings and sessions for optimum performance. Created and maintained several custom reports for the client using BusinessObjects.

Environment:

Informatica PowerCenter 9.6 (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet, Transformations, Workflow Manager, Workflow Monitor), MDM, Teradata 13.10, MQ Series, Kafka, Hadoop, ZooKeeper, Docker, Erwin 3.5, PL/SQL, Oracle 10g, Windows 7/2000.

4. Informatica Developer, Wisdom Soft Solutions (Chase Bank), Chennai, India Apr 2012 – Jun 2013

Role: Software Engineer

Description:

Chase Bank is one of America's leading financial companies, serving its customers with a broad range of first-class banking services. I assisted in the design and development of a data warehousing project to improve the Account Management System. The data warehouse provides reports that help portfolio managers identify potential customers: which customers can afford a loan and which graduate students can afford a car loan.

Responsibilities:

End-to-end ETL development of the Premium Module Data Mart. Maintained warehouse metadata, naming standards and warehouse standards for future application development.

Developed design specs into simple ETL coding and mapping standards. Worked on Informatica PowerCenter tools: Source Analyzer, Data Warehousing Designer, and Mapping & Mapplet Designer.

Designed and developed Informatica mappings to load data from source systems to the ODS and then to the data mart.

Extensively used PowerCenter/PowerMart to design multiple mappings with embedded business logic.

Created transformations such as Lookup, Joiner, Rank, and Source Qualifier in the Informatica Designer.

Created mappings using Sorter, Aggregator, dynamic Lookup, and Router transformations.

Created mapplets and used them in different mappings.

Worked with mapping variables, mapping parameters, and variable functions such as SetVariable, SetCountVariable, SetMinVariable, and SetMaxVariable.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Maintained development, test, and production mapping migration using Repository Manager; also used Repository Manager to maintain metadata, security, and reporting.

Understood business needs and implemented them in a functional database design.

Migrated ETL processes from legacy scripts to Informatica PowerCenter 5.2.

Performed data quality analysis to determine cleansing requirements. Designed and developed Informatica mappings for data loads.

Environment:

Informatica PowerCenter 9.0.1 (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet, Transformations, Workflow Manager, Workflow Monitor), Erwin 3.5, PL/SQL, Oracle 9i, DB2, Windows 2000.

5. ETL Developer, AT&T – Wisdom Soft Solutions, Chennai, India Oct 2010 – Mar 2012

Role: Software Engineer

Description:

Enterprise data warehouse for AT&T. This data warehouse was developed to help top executives make decisions about products (network bandwidth) and to analyze various metrics to understand the performance of the organization. Data in the EDW is consumed by different systems, such as Business Intelligence reporting using BusinessObjects, decision support systems, executive information, demand planning, and automated underwriting systems.

Responsibilities:

Interacted with business analysts and translated business requirements into technical specifications.

Using Informatica Designer, developed mappings that populated the data into the target.

Used Source Analyzer and Warehouse Designer to import the source and target database schemas and the mapping designer to map the sources to the targets.

Worked extensively on Workflow Manager, Workflow Monitor and Worklet Designer to create, edit and run workflows, tasks.

Enhanced Informatica session performance for large data files by using partitions and by increasing the block size, data cache size, and target-based commit interval.

Extensively used Aggregator, Lookup, Update Strategy, Router, and Joiner transformations.

Developed the control files to load various sales data into the system via SQL*Loader.
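
For illustration, a minimal SQL*Loader control-file sketch for this kind of sales load; the file path, table, and column names are hypothetical, and it would be run with the sqlldr utility:

-- sales.ctl: load pipe-delimited sales records into a staging table.
LOAD DATA
INFILE '/data/feeds/sales.dat'
APPEND
INTO TABLE stg_sales
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(sale_id,
 cust_id,
 sale_dt DATE 'YYYY-MM-DD',
 amt)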

Extensively used TOAD for data analysis, error fixing, and development.

Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.

Created different types of reports, such as list, cross-tab, chart, drill-through, and master-detail reports. Created multiple dashboard reports for multiple packages.

Migrated reports from Cognos 8.3 to 8.4 and initiated training for users to do ad-hoc reporting using Query Studio.

Environment:

Informatica PowerCenter 8.x (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet, Transformations, Workflow Manager, Workflow Monitor), Erwin 3.5, PL/SQL, Oracle 10g/9i, SQL Server 2005, ROLAP, Cognos 8.2/8.3/8.4 (Framework Manager, Report Studio, Query Studio, Analysis Studio, Cognos Connection), Windows 2000.

6. Oracle Developer, Credit Suisse, Chennai, India Jan 2010 – Oct 2010

Role: Software Developer

The project was with the IT Asset Management Group; major responsibilities included helping business users keep track of all software and hardware elements in the company.

Responsibilities:

Wrote UNIX shell scripts to process files on a daily basis: renaming files, extracting dates from the files, unzipping files, and removing junk characters before loading them into the base tables.

Created PL/SQL stored procedures, functions, and packages to move data from the staging area to the data mart.
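
A minimal sketch of such a procedure, with hypothetical table and column names, including the kind of exception handling mentioned in the next item:

CREATE OR REPLACE PROCEDURE load_sales_mart AS
BEGIN
  -- Move one table's worth of data from staging to the mart.
  INSERT INTO mart_sales (sale_id, cust_id, sale_dt, amt)
  SELECT sale_id, cust_id, sale_dt, amt
  FROM stg_sales;
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    -- Surface the failure so job monitoring can pick it up.
    RAISE_APPLICATION_ERROR(-20001, 'load_sales_mart failed: ' || SQLERRM);
END load_sales_mart;
/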

Handled errors using exception handling. Daily operations: job monitoring and notifying about/fixing data load failures.

Provided production support for the existing system: fixing database problems, processing errored-out records, resolving bugs in the EIB system code, resolving calls, and supporting on-call.

Extensively used advanced PL/SQL features such as records, tables, object types, and dynamic SQL.

Root cause analysis, Enhancements, Requirement collection and Estimation.

Environment:

Oracle 10g, SQL*Plus, TOAD, SQL*Loader, PL/SQL Developer, Shell Scripts, UNIX, Windows XP

EDUCATION, TRAINING & CERTIFICATIONS

Master of Computer Applications (MCA), Mother Teresa Women's University, Kodaikanal, India


