
SAKTHIVEL JEDARPALAYAM PALANIVEL

972-***-**** | actmxq@r.postjobfree.com | 2863 N Shiloh Rd, Garland, TX 75044

ETL / DataStage / Big Data / Informatica Consultant

SUMMARY

Over 10 years of hands-on data warehousing experience with the ETL tools DataStage and Informatica and with Oracle 11g/10g and DB2. Involved in every phase of the systems development life cycle, including feasibility studies, gathering user and business requirements, analysis, design, development, testing, and management for medium and large data warehouse projects in domains such as Sales, Insurance, Retail, and Manufacturing.

Application operations support for corporate data warehouse projects

Extensive experience setting up Hadoop clusters

Big Data Hadoop Administrator, certified at HTC Global Services Ltd

ITIL V3 Foundation certified at ATOS (Siemens)

TECHNOLOGIES

Big Data Tools

Storm (stream processing), Logstash (shipper, indexer), Redis/RabbitMQ server, Elasticsearch, Cassandra, HBase, Hive, Pig, Sqoop, Kibana (visualization)

ETL Tools

DataStage 7.5.2 & 7.1, Informatica 9.1 & 8.6.1

Database

DB2, Oracle 10g/9.2, PL/SQL

GUI & Tools

Visual Basic, Crystal Reports, TOAD, SQL Developer

Operating System

Windows XP/NT/2000/98, UNIX, Linux

EDUCATION

Master of Business Administration, Annamalai University, Tamil Nadu, India

Master of Computer Applications, Bharathidasan University, Tamil Nadu, India

B.Sc., Computer Science, Madras University, Tamil Nadu, India

PROFESSIONAL EXPERIENCE

ELEVENTH HIRE, Oct 2015 – Present

EVOLVINGERP (Oct 15 – Present)

Consultant

Working on a project for EvolvingERP to develop various applications per requirements, with an emphasis on accounting, automating financial processes, improving analytics, and streamlining the production of financial reports. The business calls for much more than good accounting practices and standard financial reporting: the current systems have evolved to place a larger emphasis on business intelligence and transparency across all departments, including accounting.

Responsibilities in this project included:

Involved in end-user requirements gathering.

Analyze requirements; design new applications or modify current application designs.

Plan, monitor, and track development.

Discover application bugs, identify root causes, and fix them.

Work on performance tuning and enhancements.

Develop mappings and workflows per new customer requests.

Monitor daily jobs through the OPC/TWS scheduler tool.

Handle aborted jobs while loading data into the data warehouse.

Monitor monthly jobs to avoid delays in loading data into the data warehouse.

Test developed mappings and workflows in the development environment.

Assist new team members in coming up to speed and in building their skills in Informatica and DB2.

Ensure on-time, defect-free delivery and a high level of client satisfaction.

Environment: Informatica 9.1, DB2, QMF tool, OPC/TWS Job Scheduler, UNIX and Linux.

ATOS (Formerly Siemens Information Systems), Nov 2004 – Sep 2015

Investment Management Data Mart (Mar 14 – Sep 15)

Consultant

This project was for T. Rowe Price, one of the largest retirement solutions and investment management firms in the USA. The project involved the design and development of an investment data mart to support the retail investments of T. Rowe Price. The transformation involved collecting data from various sources, cleansing the data, and processing it through an ETL process. The Investment Data Mart (IDM) covers subject areas mostly concerning retirement fund investments, end-user perspectives, sales promotion effectiveness, channel effectiveness, contact method effectiveness, literature requirements, and investment position profitability.

Responsibilities in this project included:

Identified the significant attributes to extract through an understanding of the client database.

Created the repository in Informatica 8.6.1.

Designed Informatica mappings for loading data from the source systems into the data warehouse.

Extensively involved in extraction from different flat files and an Oracle OLTP system.

Created complex mappings/mapplets using expression, aggregator, joiner, rank, filter, and lookup transformations in Informatica PowerMart.

Developed and deployed Informatica ETL components.

Developed and tested ETL processes and components using Informatica.

Performed maintenance, troubleshooting, and fine-tuning of existing ETL processes.

Designed, developed, and tested mappings that migrated data from the legacy sources to the ADS (warehouse).

Involved in unit testing, performance tuning, and functional testing activities.

Environment: Informatica 8.6.1, DB2, QMF tool, OPC/TWS Job Scheduler, UNIX and Linux.

ATOS LOG ANALYSIS (Mar 14 – Sep 15)

Consultant

Worked on a centralized log analysis tool. Virtually every process running on a system generates logs in some form, usually written to files on local disks. When a system grows to multiple hosts, managing and accessing the logs gets complicated: searching for a particular error across hundreds of log files on hundreds of servers is difficult without good tools. A common approach to this problem is to set up a centralized logging solution so that multiple logs can be aggregated in a central location. The tool's main purposes include:

Capturing syslog and Apache logs from multiple servers into centralized storage.

Indexing the centralized logs for efficient search.

Searching and running analytics on log metrics captured in centralized storage.

Replicating the centralized logs to handle failover of the central store.

Responsibilities in this project included:

Designed the system architecture spanning multiple internal ATOS log systems.

Gathered the complete business requirements for all log sources (syslog, Apache logs, Log4j, etc.).

Imported and exported data to and from HDFS using Sqoop.

Used HBase for real-time access to data stored in Hadoop.

Performed data enrichment such as filtering, pivoting, format modeling, sorting, and aggregation using Hive and Pig (a brief HiveQL sketch follows this section).

Used Pig for long series of data operations in ETL pipelines, exploration of raw data, and iterative data processing.

Used Hive as a SQL-based data warehouse interface for querying data in Hadoop.

Managed and coordinated the team's development work against the business requirements.

Prepared documentation per the requirements process.

Executed and advised on the optimal solution implementation.

Environment: Hadoop, Linux, Logstash (shipper), Redis/RabbitMQ server, Storm (real-time processing), Sqoop, Pig, Hive, HBase, Cassandra/Elasticsearch, Kibana visualization.
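As an illustration of the Hive-based search and aggregation described above, here is a minimal HiveQL sketch. The table name, columns, and HDFS path are hypothetical stand-ins, not details from the ATOS system, and the log timestamp is assumed to already be in yyyy-MM-dd HH:mm:ss form.

-- Hypothetical external Hive table over centralized Apache logs in HDFS.
CREATE EXTERNAL TABLE IF NOT EXISTS access_logs (
  host        STRING,
  log_time    STRING,   -- assumed 'yyyy-MM-dd HH:mm:ss'
  request     STRING,
  status      INT,
  bytes_sent  BIGINT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/logs/apache';

-- Example enrichment query: server-error counts per host per day,
-- the kind of filtering and aggregation run on the captured log metrics.
SELECT host,
       to_date(log_time) AS log_day,
       COUNT(*)          AS error_count
FROM   access_logs
WHERE  status >= 500
GROUP  BY host, to_date(log_time)
ORDER  BY error_count DESC;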

PROLOG / PROcesses LOGistic (Nov 12 – Feb 14)

Associate Consultant

This project was for Karstadt Warenhaus, one of the most recognized retail brands in Germany, with multiple brick-and-mortar stores and online retail sites. PROLOG is one of the most critical applications in Karstadt retail warehousing, managing the logistics process for moving and tracking inventory, analyzing costs, and generating data for accounting purposes. PROLOG gathers comprehensive logistics booking and billing data from various vendors out of legacy systems such as KARLOS, LBASE, L-EDV, and QUISS.

Responsibilities in this project included:

Developed mappings and workflows per new customer requests.

Monitored daily jobs through the OPC/TWS scheduler tool.

Handled aborted jobs while loading data into the data warehouse.

Monitored monthly jobs to avoid delays in loading data into the data warehouse.

Tested developed mappings and workflows in the development environment.

Handled the monthly LBASE task to validate customer DHL data against our table data.

Assisted new team members in coming up to speed and in building their skills in Informatica and DB2.

Ensured on-time, defect-free delivery and achieved a high level of client satisfaction.

Environment: Informatica 8.6.1, Oracle 9i, SQL Navigator, HTML, Windows NT, MS-Excel.

SIYAYA / Department of Labour – RSA (Jul 10 – Oct 12)

Associate Consultant

The SIYAYA project manages the Unemployment Insurance Fund (UIF) for the Department of Labour of the Republic of South Africa (RSA). The system maintains the information, declarations, and registrations of all commercial and domestic employers, and approximately 4,000 UIF users access it from all over the country. The most critical part of SIYAYA is the Claims module, which provides different types of benefits to 5 million users. Employers can register themselves and insure their employees with the Department of Labour, and the system can analyze and process claims per South African UIF policies. The project consolidated UIF reporting and analysis requirements into a single application across the enterprise, using an Oracle database.

Responsibilities in this project included:

Prepared daily data loading status reports for the customer.

Used FTP to transfer files between the different UNIX boxes.

Manually spooled data files out of the MQ Series queue and transferred source data files from the SAP system to the UNIX system.

Resolved data source issues with CSV and flat files, manually correcting the source data so it could be loaded into the DWH system.

Provided weekly status reports covering all subprojects, infrastructure issues, and data file delivery on UNIX.

Created the Correction Characteristics report for new/change requests.

Assisted new team members in building their skills in DataStage and Oracle.

Prepared documents for new requirements developed with the offshore team.

Provided data extracts per user requests for weekly and monthly report analysis.

Environment: DataStage 7.5.2, Oracle 10g, UC4 job scheduler, and Windows XP.

Qimonda Corporate Data Warehouse / Application Operations Support (Aug 07 – Jun 10)

Software Engineer

QIMONDA is a leading innovator in the international DRAM industry that designs, develops, manufactures, and markets a broad range of DRAM and complete system solutions targeted at select industries. This project centers on a central info hub where all data is loaded and made available for data mart access. In the info hub the data is stored at a granular level, whereas the data marts implement aggregations to optimize user access and define the different kinds of reports generated from the data warehouse system.

Responsibilities in this project included:

Prepared daily data loading status reports for the onsite team and the customer.

Manually extracted data from the SAP source system in order to load it into the DWH.

Maintained the weekly logbook covering all issues.

Involved in the Trafo business logic and the flow of data loading into the data warehouse.

Solved and debugged blocked jobs in UC4 and DataStage, thereafter reporting Error Characteristics (analysis results) to the customer.

Involved in production support issues.

Involved in manually loading data into the Integration environment during environment (I/P switch) changes before backup days.

Involved in manually spooling data files out of the MQ Series queue and transferring source data files from the SAP system to the UNIX system.

On the database side, created and replaced views and rebuilt indexes and partitions (a brief SQL sketch follows this section).

Environment: DataStage 7.5.2, Oracle 9.2/10g, UC4 job scheduler, and Windows XP.
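As context for the database-side maintenance described above, here is a minimal Oracle SQL sketch. The object names (sales_summary_v, fact_sales, the index names, and the partition name) are hypothetical placeholders, not objects from the Qimonda warehouse.

-- Recreate a reporting view in place (hypothetical names).
CREATE OR REPLACE VIEW sales_summary_v AS
  SELECT region_id,
         SUM(amount) AS total_amount
  FROM   fact_sales
  GROUP  BY region_id;

-- Rebuild a whole index, and a single partition of a partitioned index.
ALTER INDEX fact_sales_ix REBUILD;
ALTER INDEX fact_sales_part_ix REBUILD PARTITION p_2009_q4;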

Infineon Corporate Data Warehouse (Nov 04 – Jul 07)

Software Engineer

INFINEON is a leading innovator in the international semiconductor industry that designs, develops, manufactures, and markets a broad range of semiconductors and complete system solutions targeted at select industries. The data warehouse contains a central info hub where all data is loaded and made available for data mart access. In the info hub the data is stored at a granular level, whereas the data marts implement aggregations to optimize user access.

Responsibilities in this project included:

Monitored production servers for daily data loading status and for jobs in a blocked state; analyzed and debugged blocked jobs in the scheduler and sent the analysis results to the customer.

Used DataStage Director and its runtime engine to schedule solution runs, test and debug components, and monitor the resulting executable versions.

Resolved data source issues with CSV and flat files, manually correcting the source data so it could be loaded into the DWH system.

Gathered customer requirements, prepared the documents needed to complete each request, and sent Correction Characteristics to the customer.

During environment (I/P switch) changes, manually loaded data into the Integration (test) environment so that data loading in both environments stayed equal and the switch could cut over to a single environment.

On the database side, created and replaced views and rebuilt indexes and partitions.

Validated data across both P/I environments; on any discrepancy, deleted the bad data and reloaded it into the DWH system.

Environment: DataStage 7.1, Oracle 9i, UC4 job scheduler, and Windows NT.

Digital Systems & Software, May 2002 – Oct 2004

Annual Maintenance Contract Monitoring System (Jan 04 – Oct 04)

Training Programmer

The Annual Maintenance Contract Monitoring System maintains computer spare parts and the day-to-day tracking of the annual maintenance business. The entire customer AMC is managed through an online system, and customers receive SMS notifications per their AMC dates. Sales return and purchase return details are also computerized. Reports are generated daily, weekly, and monthly, covering spare parts, goods returns, and the daily stock list.

Responsibilities in this project included:

Designed the logical and physical database model using CASE tools.

Wrote programs in Visual Basic 6.0 against a SQL Server database.

Generated reports that helped management tighten control over purchases.

Developed triggers and stored procedures in PL/SQL (a brief sketch follows this section).

Performed system and integration testing, and supported QA testing.

Performed requirements analysis and prepared the specification document.

Involved in requirements gathering for GUI screens.

Environment: Visual Basic 6.0, Oracle, ADO, SQL Server 6.5, Windows 98.
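As a sketch of the trigger and stored procedure development noted above, here is a minimal PL/SQL example, assuming the Oracle side of the stack. The table, trigger, and procedure names are hypothetical, not taken from the AMC system.

-- Hypothetical audit trigger: records each insert/update on AMC purchases.
CREATE OR REPLACE TRIGGER trg_purchase_audit
AFTER INSERT OR UPDATE ON purchases
FOR EACH ROW
BEGIN
  INSERT INTO purchase_audit (purchase_id, changed_on, changed_by)
  VALUES (:NEW.purchase_id, SYSDATE, USER);
END;
/

-- Hypothetical stored procedure: snapshots quantities on hand into the daily stock report.
CREATE OR REPLACE PROCEDURE daily_stock_report (p_date IN DATE) AS
BEGIN
  INSERT INTO stock_report (report_date, part_no, qty_on_hand)
  SELECT p_date, part_no, SUM(qty)
  FROM   spare_parts
  GROUP  BY part_no;
END;
/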

Hospital Management System (May 02 – Dec 03)

Training Programmer

The Hospital Management System maintains the inpatient and outpatient details of SKS Hospital, the purchase and sale of medicines, automated employee wages, employee attendance tracking, and so forth. We generated multiple reports per the management's requirements.

Responsibilities in this project included:

Involved in requirements gathering for GUI screens.

Designed the logical and physical database model using CASE tools.

Wrote programs in Visual Basic 6.0 against an Oracle database.

Generated reports that helped management tighten control over purchases.

Environment: Visual Basic 6.0, SQL, ADO, SQL Server 6.5, Windows 98.


