Mr. RAJASEKHAR POLA Mob#: +1-469-***-****
**********.****@*****.***
I.Professional Summary
9 years of experience in ETL development with Informatica and Talend, and in Tableau development.
Competent in understanding business applications, business data flows, and data relationships.
Expert in preparing documentation per business requirements.
Experience in diversified fields of data warehousing; worked on various projects using Informatica PowerCenter 8.x, 9.x, and 10.x.
Expertise in designing and developing mappings with varied transformation logic.
Expertise in UNIX shell scripting, including validating source-file data within the ETL process flow per business needs.
Excellent problem-solving skills with a strong technical background; result-oriented team player with excellent communication and interpersonal skills.
Worked on Tableau 9.x and 10.x visualization development.
Scheduled report extracts and shared them with users.
Experienced in developing Tableau dashboards based on client requirements, using all kinds of visualizations.
Experienced in developing Spotfire reports with visualizations based on client requirements.
Experience in Agile and Waterfall environments.
Experience in full life-cycle development, from front end to back end.
Work well with customers to determine requirements and application scope; experienced in leading developer teams on larger modules.
Validated Spotfire, OBIEE, and BO reports daily after data processing.
Employment profile:
Currently working as a Tech Lead in Houston for the BHGE client.
Achievements:
Received appreciation from clients and the reporting manager many times.
Received Kudos multiple times on the PIT STBT, EP6-MFG, and GEMP projects.
Received special appreciation from the PIT STBT and Oil & Gas clients for individual performance in completing tasks on time.
Received appreciation from the AT&T client for data analysis and fixes delivered at a critical time.
Received appreciation from the NNA client for timely data analysis and delivery.
II.Expertise in Software Technologies
Primary Skill : ETL Informatica
Secondary Skill : Oracle & Shell Scripting
Primary Domain : Energy, Treasury and Oil & Gas
Secondary Domain : Manufacturing
Solution : Use of Standard Databases and BI Tools
Trained Skill : Oracle 9i, 10g, 11g and Teradata.
Projects Details
1.BHGE OIL & GAS – KPI
Project Domain : Oil & Gas
Solution : Use of Standard Databases and BI Tools
Project Name : SLT KPI
Client : BHGE Oil & Gas
Role : Onshore Tech Lead
Duration Onshore : Jun 18 – Till Date
Environment Languages : Unix Shell scripting
Database : Oracle
Tools : ETL Informatica 9.5/10, Talend, Tableau 9.x & 10.x
O/S : Windows
1.1 Project Description
BHGE Oil & Gas – Currently, engineering dashboards are generated by gathering information from different sources and manipulating the data manually in Excel, which involves a lot of manual effort. The project aims at reducing the manual effort involved in generating the dashboards and at building a platform that is modular and extensible for future expansion to support product-company needs.
The Tableau dashboard should be generated by fetching information from the Data Lake (Argo) wherever possible (i.e., wherever the source data is replicated to Argo). The current data processing logic, which covers only GE, should be consolidated with Baker Hughes data in the KPI Dashboard.
KPI Dashboard – Contains high-level visualizations of BHGE Oil & Gas growth, quality, delivery, feedback, revenue, cost, new-model demand, and customer satisfaction.
1.2 Role and Responsibilities:
Analyzed the business specifications provided by the client.
Performed business analysis on requirements from the integration end.
Designed mappings, sessions, and workflows using the 9.5 Designer.
Worked with various sources such as flat files and relational tables (Oracle) and created complex mappings.
Created sessions for each mapping in the Workflow Manager to run the workflows.
Designed Tableau visualizations per client requirements.
Prepared navigation notes and definitions for each dashboard for layman understanding.
Conducted sessions on each visualization for better understanding.
Provided functionality in Tableau to send mails or open a hyperlink for an individual.
Scheduled the reports to avoid performance issues.
Validated the dashboards together with users.
Conducted discussions with the client on any requirement changes or analysis.
Performed unit, integration, and user testing for every task.
Prepared overall development documents to share with the client and operations team.
Prepared mail alerts using shell scripts for issues on the source side.
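The source-side mail alerts mentioned above can be sketched as a small shell check; the file path, alert text, and recipient address below are illustrative assumptions, not project values.

```shell
#!/bin/sh
# Sketch of a source-file check that feeds a mail alert.
# Prints an OK/ALERT line and returns non-zero when something is wrong.
check_source_file() {
    file="$1"
    if [ ! -s "$file" ]; then
        # Missing or zero-byte source file: raise an alert
        echo "ALERT: source file $file is missing or empty"
        return 1
    fi
    echo "OK: source file $file is present and non-empty"
}

# In the real job the alert text would be mailed, e.g. (recipient assumed):
#   check_source_file /data/inbound/kpi_feed.csv ||
#       echo "KPI feed missing" | mailx -s "KPI feed alert" etl-support@example.com
```

The check itself is kept separate from the mail step so it can also run as a pre-load validation inside the ETL flow.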
2.GE Treasury
Project Domain : Treasury
Solution : Use of Standard Databases and BI Tools
Project Name : GE Treasury Support
Client : GE Treasury
Role : Offshore Lead
Duration Onshore : Sep 17 – Till Date
Environment Languages : Unix Shell & R scripting
Database : Oracle & Greenplum
Tools : ETL Informatica 9.5, Denodo & Appworx
O/S : Windows
2.1 Project Description
GE Treasury – GE Capital is the financial services unit of the American multinational conglomerate General Electric. It provides commercial lending and leasing, as well as a range of financial services for commercial aviation and energy, and support for GE's industrial business units. We built the EDW data warehouse and the integration layers among the different source systems and downstream applications.
The Spotfire dashboards should be generated by fetching information from the Data Lake wherever possible, displaying the regular data loads and the load statuses required for generating the dashboard.
2.2 Role and Responsibilities:
Gathered and analyzed requirements and created the technical design documents.
Developed new custom mappings to load the data into target tables.
Coordinated, planned, and executed the project; interacted with different users, clients, and other stakeholders.
Prepared a comprehensive project plan/schedule; scheduled meetings and prepared meeting agendas.
Performed source-system data analysis and business analysis of feeds from the source systems to the data warehouse.
Analyzed the different subject areas involved in the treasury data warehouse, such as Trade, Trading Party, Cashflow, and Mark to Market.
Wrote complex, generic UNIX shell scripts for creating list files and archiving data and log files.
Implemented Greenplum procedures/functions and called them from Informatica.
Improved processes and automated manual work through ETL; enhanced existing ETL code.
Unit tested the code; analyzed and resolved any bugs raised by the QA team.
Performed root-cause analysis of PROD issues and provided solutions; tracked and mitigated risks.
Provided post-deployment support and maintenance for ongoing requirements and issues.
Implemented performance-tuning techniques at various levels.
Monitored batches, including troubleshooting jobs, fixing issues encountered in job processing, and cleaning up data.
Strong skills in data analysis, data-requirement analysis, and data mapping for ETL processes.
Implemented performance tuning to increase throughput at both the mapping and session levels, as well as SQL query optimization.
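The generic list-file and archiving scripts described above might look like the following sketch; the directory names and date-stamp convention are assumptions for illustration.

```shell
#!/bin/sh
# Sketch: record the files processed in a run to a list file,
# then move them into a date-stamped archive area.
archive_processed_files() {
    src_dir="$1"
    arc_dir="$2"
    stamp=$(date +%Y%m%d)
    mkdir -p "$arc_dir"
    # List file naming everything processed in this run
    ls "$src_dir" > "$arc_dir/filelist_$stamp.lst"
    # Archive each data file with the run date appended
    for f in "$src_dir"/*; do
        [ -f "$f" ] && mv "$f" "$arc_dir/$(basename "$f").$stamp"
    done
    return 0
}
```

A scheduler (Appworx in this project) would typically call such a step after the Informatica load completes.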
3.GE OIL & GAS – GEAR_DW
Project Domain : Manufacturing
Solution : Use of Standard Databases and BI Tools
Project Name : GE Oil & Gas – GEAR DW
Client : GE Oil & Gas
Role : Onshore Tech Lead
Duration Onshore : Apr 17 – Aug 17
Environment Languages : Unix Shell scripting
Database : Oracle
Tools : ETL Informatica 9.5, Tableau 9.x & 10.x
O/S : Windows
3.1 Project Description
GE Oil & Gas – Currently, engineering dashboards are generated by gathering information from different sources and manipulating the data manually in Excel, which involves a lot of manual effort. The project aims at reducing the manual effort involved in generating the dashboards and at building a platform that is modular and extensible for future expansion to support product-company needs.
The Tableau dashboard should be generated by fetching information from the Data Lake (Argo) wherever possible (i.e., wherever the source data is replicated to Argo). The current data processing logic in Excel should be moved to ETL so that no manual effort is required to generate the dashboard.
3.2 Role and Responsibilities:
Analyzed the business specifications provided by the client.
Performed business analysis on requirements from the integration end.
Designed mappings, sessions, and workflows using the 9.5 Designer.
Worked with various sources such as flat files and relational tables (Oracle) and created complex mappings using transformations such as Lookup, Router, Joiner, Union, Expression, Sorter, Filter, Update Strategy, Source Qualifier, and Stored Procedure.
Created sessions for each mapping in the Workflow Manager to run the workflows.
Designed Tableau visualizations per client requirements.
Prepared navigation notes and definitions for each dashboard for layman understanding.
Conducted sessions on each visualization for better understanding.
Provided functionality in Tableau to send mails or open a hyperlink for an individual.
Scheduled the reports to avoid performance issues.
Validated the dashboards together with users.
Conducted discussions with the client on any requirement changes or analysis.
Performed unit, integration, and user testing for every task.
Prepared overall development documents to share with the client and operations team.
Prepared mail alerts using shell scripts for issues on the source side.
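Workflows built this way are usually launched from shell with Informatica's pmcmd utility; a sketch of such a wrapper follows. The service, domain, and folder names are placeholders, and the PMCMD variable is an assumption added so the command can be overridden.

```shell
#!/bin/sh
# Sketch of launching an Informatica workflow via pmcmd and flagging failure.
# INT_SVC, DOM_DEV, and GEAR_DW are placeholder names, not project values.
PMCMD="${PMCMD:-pmcmd}"

run_workflow() {
    wf="$1"
    # -wait blocks until the workflow completes; non-zero exit means failure
    if "$PMCMD" startworkflow -sv INT_SVC -d DOM_DEV \
         -u "$INFA_USER" -p "$INFA_PWD" -f GEAR_DW -wait "$wf"; then
        echo "workflow $wf completed"
    else
        echo "workflow $wf FAILED"
        return 1
    fi
}
```

A scheduler or wrapper script can then chain loads and stop on the first failed workflow.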
4.GEH Nuclear
Project Domain : Manufacturing
Solution : Use of Standard Databases and BI Tools
Project Name : GEH Nuclear
Client : GE Nuclear
Role : Onshore Tech Lead
Duration Onshore : Sep 2016 – Mar 2017
Environment Languages : Unix Shell scripting
Database : Oracle
Tools : ETL Informatica 9.5, Tableau 9.x & 10.x
O/S : Windows
4.1 Project Description
GE Nuclear – Maintained details of all parts: collected all parts details and calculated the cost of quality of parts, and identified slow-moving inventory across all parts. Designed visual dashboards to find the part orders and delivery items that fall under late delivery, the reasons for it, and the percentage of late items each quarter; also calculated defects in delivered parts. Designed a forecast of the cost of quality of items, and visualized part-order feedback and promoter counts for business analysis.
4.2 Role and Responsibilities:
Analyzed the business specifications provided by the client.
Performed business analysis on requirements from the integration end.
Understood the project specifications provided by the client.
Designed mappings using the Informatica 9.1 and 9.5 Mapping Designer.
Worked with various sources such as flat files and relational tables (Oracle) and created complex mappings using transformations such as Lookup, Router, Joiner, Union, Expression, Sorter, Filter, Update Strategy, Source Qualifier, and Stored Procedure.
Created sessions for each mapping in the Workflow Manager to run the workflows.
Designed Tableau visualizations per client requirements.
Validated the dashboards together with users.
Conducted discussions with the client on any requirement changes or analysis.
Performed unit, integration, and user testing for every task.
Prepared overall development documents to share with the client and operations team.
Prepared mail alerts using shell scripts for issues on the source side.
5.NNA & NMEX
Project Domain : Manufacturing
Solution : Use of Standard Databases and BI Tools
Project Name : NMEX Informatica Conversion
Client : Nissan
Role : Onshore ETL Developer
Duration Onshore : Jan’2016-Jul’2016
Environment Languages : Unix Shell scripting
Database : Oracle
Tools : ETL Informatica 9.1 and 9.5
O/S : Windows
5.1 Project Description
NISSAN – Maintained all the budget and forecast values needed to process data into Hyperion cubes, covering monthly sales and budgets for vehicles and manufactured body parts across all Nissan vehicles. I developed Informatica code to process the data automatically and place the consolidated result-set files on the Hyperion server, where the Hyperion team analyzes the data to view business demand and supply.
5.2 Role and Responsibilities:
Analyzed the business specifications provided by the client.
Performed business analysis on requirements from the integration end.
Extracted source definitions from databases such as Oracle, and from flat files, into the PowerCenter repository.
Developed the DB design.
Understood the project specifications provided by the client.
Designed mappings using the Informatica 9.1 and 9.5 Mapping Designer.
Worked with various sources such as flat files and relational tables (Oracle) and created complex mappings using transformations such as Lookup, Router, Joiner, Union, Expression, Sorter, Filter, Update Strategy, Source Qualifier, and Stored Procedure.
Created sessions for each mapping in the Workflow Manager to run the workflows.
Prepared mail alerts using shell scripts for issues on the source side.
6.Barracuda- DW
Project Domain : Telecommunication & services
Solution : Use of Standard Databases and BI Tools
Project Name : Amdocs Billing & AT&T Corp Sys
Client : AT&T
Role : ETL Developer
Duration Onshore : Mar’2015-Till Date
Environment Languages : Unix Shell scripting
Database : Oracle
Tools : ETL Informatica 9.1 and 9.5
6.1 Project Description
AT&T – Maintained the overall data and developed reports on internet-usage customers, plans, and customer issues. When any failure occurs, the support team takes ownership of the issue, fixes it within the timeline, and updates the customer. The reports help the client track daily customer issues, feedback on those issues, and the response time taken to fix them.
6.2 Role and Responsibilities:
Monitored all system jobs and application performance, including environment issues, response times, and batch-load/SLA delays.
Analyzed any issue and found the root cause for an immediate fix.
Served as the applicable point of contact for upstream and downstream applications, network, and outage management.
Communicated and remained engaged with upstream and downstream application teams when potential or actual service interruptions occurred.
Reviewed application logs for conditions indicating failures, errors, and performance issues, and analyzed those conditions against experience and documentation.
Created, modified, and maintained job streams as defined in the design.
Followed Change Management procedures.
Reviewed OBIEE and BO reports daily and provided updates on any report issues.
Updated clients and teams with daily activities and status.
Backtracked issues to find the cause and reprocessed data to resolve them.
Prepared the monthly analysis report.
7.GEMP
Project Domain : Oil & Gas
Solution : Use of Standard Databases and BI Tools
Project Name : Global Engineering Metrics Portal
Client : GE
Role : Team Lead
Duration Offshore : May’2013-Feb’2015
Environment Languages : Unix Shell scripting
Database : Oracle
Tools : ETL Informatica 9.1 and 9.5
O/S : Windows, UNIX
7.1 Project Description
GE Oil & Gas – The Global Engineering Operations team has the goal of implementing a metrics digital cockpit that addresses key business goals: digitization across the engineering imperatives (Safety, Quality, Finance, Fulfillment, Productivity), automating the metrics process to eliminate the manual workarounds used to compile metrics for leadership review, and enhancing data accuracy across the board.
7.2 Role and Responsibilities:
Analyzed the business specifications provided by the client.
Performed business analysis on requirements from the integration end.
Extracted source definitions from databases such as Oracle, and from flat files, into the PowerCenter repository.
Participated in the DB design.
Understood the project specifications provided by the client.
Designed mappings using the Informatica 9.1 and 9.5 Mapping Designer.
Worked with various sources such as flat files and relational tables (Oracle) and created complex mappings using transformations such as Lookup, Router, Joiner, Union, Expression, Sorter, Filter, Update Strategy, Source Qualifier, and Stored Procedure.
Developed many shell scripts for screen-door validations.
Worked on the setup of SSH keys.
Worked on FTP and SFTP file transfers using shell scripting.
Created sessions for each mapping in the Workflow Manager to run the workflows.
Prepared TIBCO Spotfire reports based on client requirements.
Prepared map visualizations showing headcount and opportunity counts grouped by country.
Prepared mail alerts using shell scripts for issues on the source side.
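Non-interactive SFTP pushes like those above are typically driven by a batch file; in this sketch the host, user, and paths are placeholders, and the SSH keys set up for the project are assumed for authentication.

```shell
#!/bin/sh
# Sketch: write an sftp batch file, then run the transfer non-interactively.
make_sftp_batch() {
    batch="$1"
    remote_dir="$2"
    local_file="$3"
    {
        echo "cd $remote_dir"
        echo "put $local_file"
        echo "bye"
    } > "$batch"
}

# Real transfer (key-based auth assumed; host and paths are placeholders):
#   make_sftp_batch /tmp/push.sftp /inbound/gemp metrics.csv
#   sftp -b /tmp/push.sftp etluser@target-host
```

Batch mode makes the transfer scriptable and lets the wrapper check sftp's exit status before marking the job complete.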
8.EP6
Project Domain : Aviation
Solution : Use of Standard Databases and BI Tools
Project Name : EP6 Manufactures
Client : GE Aviation
Role : Team Member
Duration Offshore : May 2012 – Mar 2013
Environment Languages : UNIX Shell Scripting
Database : Oracle
Tools : ETL Informatica 9.x
O/S : Windows, UNIX, Linux
8.1 Project Description
The Engineering Parts 6 (EP6) project builds a solution covering the engineering parts of all turbines and their respective sub-parts, collecting all turbines and their parts across the business.
EP6 has three modules: Manufactures, Repairs, and Outage. The main goal is to collect the manufactured parts and their current locations, to find the number of repairs for each part, and to identify unused parts. Grouped data is displayed for each turbine and sent to the downstream application, with everything consolidated in one data warehouse.
8.2 Role and Responsibilities:
Analyzed the business specifications provided by the client.
Performed business analysis on requirements from the integration end.
Extracted source definitions from databases such as Oracle, Sybase, and SQL Server, and from flat files, into the PowerCenter repository.
Designed mappings using the Informatica 9.x Mapping Designer.
Designed the TDD, FMEA, and ETL requirement documents.
Designed and developed Informatica mappings and procedures.
Worked with various sources such as flat files and relational tables (Oracle and Sybase) and created complex mappings using transformations such as Lookup, Router, Joiner, Expression, Sorter, Filter, Update Strategy, Source Qualifier, Rank, and Aggregator.
Performed unit and integration testing.
Tested Informatica mappings in the Development, QA, and Production environments, involving all the integration teams.
Created sessions for each mapping in the Workflow Manager.
Created workflows, and ran and scheduled them as required.
Performed performance tuning for loading the parts and sub-parts of turbines.
9.PITSTBT
Project Domain : Energy
Solution : Use of Standard Databases and BI Tools
Project Name : PIT STBT INFA
Client : GE ENERGY
Role : Team Member
Duration Offshore : Jun’2010-May’2012
Environment Languages : Unix Shell scripting
Database : Oracle
Tools : ETL Informatica
O/S : Windows, UNIX
9.1 Project Description
This project builds a solution for Personal Income Tax compliance in 100+ countries to assist GE Energy Services in determining whether and how much tax is owed by STBT. The overall scope involves the design, build, and deployment of the solution in two phases. The first phase involves a combined approach to P&L file delivery: P&L owners can manually place encrypted files in the Informatica utility UNIX directory using SFTP, while P&L owners able to automate their data delivery have Informatica access their sources directly.
A report is displayed in Java showing each employee's tax for every month, based on their working days in other locations.
9.2 Role and Responsibilities:
Understood the project specifications provided by the client.
Prepared the required understanding documents and technical design documents before coding and after development.
Designed mappings using the Informatica 8.1 Mapping Designer.
Worked with various sources such as flat files and relational tables (Oracle) and created complex mappings using transformations such as Lookup, Router, Joiner, Union, Expression, Sorter, Filter, Update Strategy, Source Qualifier, Stored Procedure, HTTP, and Aggregator.
Developed many shell scripts for screen-door validations.
Worked on the setup of SSH and GPG keys.
Worked on the encryption and decryption of files in UNIX and Linux.
Worked on FTP and SFTP file transfers using shell scripting.
Created sessions for each mapping in the Workflow Manager to run the workflows.
Upgraded the code base from 8.1 to 9.1 per client requirements.
Involved in all kinds of testing: unit, integration, SIT, and UAT.
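A screen-door validation like those above can be sketched as a trailer-count check; the feed layout assumed here (one header line, pipe-delimited trailer `TRL|<count>`) is illustrative, not the project's actual format.

```shell
#!/bin/sh
# Sketch: verify that the detail-row count in a feed matches its trailer
# before the ETL load. Assumed layout: header line, detail rows, then a
# trailer of the form TRL|<count>.
validate_trailer_count() {
    file="$1"
    expected=$(tail -1 "$file" | cut -d'|' -f2)
    actual=$(( $(wc -l < "$file") - 2 ))   # exclude header and trailer lines
    if [ "$expected" -eq "$actual" ]; then
        echo "PASS: $actual detail rows match trailer"
    else
        echo "FAIL: trailer says $expected but file has $actual detail rows"
        return 1
    fi
}
```

A failing check would stop the load and trigger the mail-alert path rather than letting short files through.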
Trainings:
Trained on Talend.
Trained on DIH.
Trained on IDQ.
Trained on Tableau.
Sessions:
Conducted ETL and UNIX training sessions for freshers.
Educational Background:
Master of Computer Applications (MCA) from Andhra University, Visakhapatnam.
Bachelor of Science (B.Sc.) from Andhra University, Visakhapatnam.
Visa Details:
Hold a valid H1-B visa, valid until Sep 2019.