
Data Analyst Informatica

Location:
Chennai, Tamil Nadu, India
Salary:
1,200,000 p.a.
Posted:
March 21, 2021


Resume:

Ganesh Ram – Data Analyst
E-mail Id: ******.*.***@*****.*** Phone: +91-994/046-7873

LinkedIn Profile: https://www.linkedin.com/in/ganesh-ram-b-34615523

PROFESSIONAL SUMMARY:

• Senior Business Data Analyst/Analytics Consultant with extensive experience in the Financial Services Industry.

• Strong programming background with proficiency in Python, R, and SQL.

• Rich experience performing data analysis and creating visualizations using Tableau.

• Extensive experience working in data warehouse environments and using SQL for data analysis, including joins and analytical functions.

• Proficient in Data Analytics with a track record of delivering results.

• Dedicated team player with excellent communication and presentation skills.

• Possess solid organization, negotiation, and analysis skills, and a demonstrated ability to deliver on multiple assignments, meet tight deadlines, and be effective and decisive under pressure.

TECHNICAL SKILLS:

Data Warehousing & Business Intelligence

Informatica ETL and Administration (up to v10.x)

Big data ecosystem of Hadoop with the Hive database

AWS and Azure Skills

Teradata database

OLAP Reporting modules of Business Objects XI and Crystal Reports XI

EDUCATION:

MBA in Analytics and Finance, Anna University, Chennai, completed in 2014

Bachelor of Engineering in Electronics and Communication Engineering, MIT Campus, Anna University, completed in 2007

Relevant Projects

Comcast – RA Operations 2020

Monitored and process-handled the Revenue Assurance (RA) operations. Prepared the daily operational reports required for the business to run. Whenever the dashboard showed red, took sufficient recovery steps to ensure the jobs ran correctly and turned green. Handled the AAA files on a daily basis.

Monitored the AAA dashboard to ensure the tables were loaded properly with the RA data. Managed the hierarchies in the ETL framework in Informatica v10.x, using the Administrator and Data Steward roles available in the Informatica MDM Hub Console.

Handled the daily Amdocs loading process to the RA tables; when any failures occurred in the load, took sufficient steps at my end to re-run the jobs and complete the loading process to the Amdocs tables successfully.

Analyzed and retrieved the data reports that needed to be shared for the business to run.

Used AWS and Azure to extract data from the web console.

Horizon BCBSNJ – Data Services Support 2019

Took care of the Horizon Informatica batch jobs in the Dev environment; created the new connection strings required in Informatica to execute the jobs in the Dev and higher environments.

Worked as an Informatica MDM Administrator to ensure the jobs pick up the required files from the backend.

Migrated the confirmed batch jobs to the next higher environment (SIT/UAT) as needed, on a case-by-case basis.

Took care of the backend files in UNIX required to run the Horizon BCBSNJ batch jobs. Some jobs failed because the UNIX file permissions were not sufficient to run them; in such cases, the file permissions were altered appropriately from the admin side to give the developer access to the required files and allow the job to run.

Worked in the big data ecosystem of Hadoop with the Hive database to extract records for the requirements handled; integrated this data with the Informatica MDM Hub data to retrieve the data sets for the requesting consumers. Analyzed the records extracted from the Hadoop ecosystem to retrieve the data the end users sought.
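
As an illustration only (the database, table, and column names below are hypothetical, not taken from the engagement), a record extraction of this kind in Hive is typically a filtered SELECT over a partitioned table:

    -- Hypothetical HiveQL sketch: pull active member records for one load date
    -- so they can be matched against the MDM Hub data set.
    SELECT m.member_id,
           m.first_name,
           m.last_name,
           m.source_system
    FROM   member_db.member_records m
    WHERE  m.load_date = '2019-06-30'
      AND  m.record_status = 'ACTIVE';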

The non-PROD migrations were done by manually moving the files from the required location to the relevant non-PROD boxes.

All the PROD migrations were done using the Jenkins automated tool to push the code to the relevant higher environments.

Worked with Informatica Data Quality (IDQ) to retrieve all the records with high quality and rejected the records that did not meet the Informatica MDM Hub standard template.

Duty Free Shopping 2018

DFS has PIM warehouses at these locations: Hawaii, Hong Kong, North America, and Singapore.

As part of the regular export for the PIM feed, the CSKUs and styles of the items and products are included in the assortment. When the export job runs, the DFS job picks the items/styles from the assortment and loads the relevant PIM backend tables. The PIM backend tables were loaded by such an export process in Phase 1.

Worked on the Phase 2 upgrade of the PIM system, running on Informatica Product Information Management.

Performed the business analysis to retrieve the data sets for the requesting users.

Took care of any suspensions occurring in the PIM database to resolve end users' queries about their missing data.

Analyzed the databases using Amazon Web Services and Azure to get the required data for the clientele.

Symantec (3S development) 2017

Involved in the data model creation for the final third normal form (3NF) load.

Handled all the development of Informatica ETL maps to load the stage and the final 3NF target tables of Teradata.
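
As a hedged sketch (all object names here are hypothetical, not from this project), a stage-to-3NF load of this kind typically de-duplicates the staging rows on the business key and inserts them into the normalized target:

    -- Hypothetical Teradata SQL sketch of a stage-to-3NF target load.
    INSERT INTO edw_3nf.customer (customer_id, customer_name, created_dt)
    SELECT s.customer_id,
           s.customer_name,
           CURRENT_DATE
    FROM   edw_stg.customer_stg s
    QUALIFY ROW_NUMBER() OVER (PARTITION BY s.customer_id
                               ORDER BY s.update_ts DESC) = 1;  -- keep the latest row per key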

Worked in Informatica Big Data Management with the Hadoop ecosystem and its Hive database.

Analyzed the ETL flow to schedule the Batch jobs in the Appworx module to load the TD tables’ data in the final 3NF layer.


Cloud Computing Support 2016

Supported the cloud applications; analyzed the failures and reached out to the respective teams for quick issue resolution; prepared the various reports that needed to be shared with the clients using the Cognizant cloud-computing tools.

Performed the archival process, which involves archiving data older than three months in the client database so that the client's entire data set is meticulously preserved; this serves the data requirements of the FICO clients.
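
As an illustrative sketch only (table and column names are assumed, not from the client system), an archival step of this kind in a SQL-based store usually copies the aged rows into an archive table before removing them from the active table:

    -- Hypothetical archival sketch: move rows older than three months out of the active table.
    -- ADD_MONTHS is Teradata/Oracle-style date arithmetic; other databases use different functions.
    INSERT INTO po_archive.purchase_orders
    SELECT *
    FROM   po_active.purchase_orders
    WHERE  created_dt < ADD_MONTHS(CURRENT_DATE, -3);

    DELETE FROM po_active.purchase_orders
    WHERE  created_dt < ADD_MONTHS(CURRENT_DATE, -3);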

Handled the various client requirements of the PepsiCo and FICO processes and retrieved the archived purchase order copies for the requesting clients.

Monitored the client-side mailbox volumes of the NAM mailboxes and reported overuse when a mailbox got full; monitored the NotificationQueue table in the PEG database to retrieve the mails that were pending to be sent from the cloud back end.

Mass Mutual Retirement Services 2015

Created the Audit Control tables for the Business Information Factory in the Teradata database.

Developed the ETL maps to load the Prestage database with the data from the input MorningStar XML and Silverpop CSV files.

Contributed to the creation of unit test cases for the aforementioned ETL maps.

PayPal – IMD Data Integration 2014

Built the ETL flow for the maps to load the payment domain tables, in Teradata.

Analyzed the target tables in Teradata to retrieve the target data sets for the requesting users and ensured they complied with the applicable GDPR regulations.

Worked on their big data requirements with the Hadoop ecosystem.

Built and deployed the various DB load scripts in Teradata, including FastLoad and MultiLoad, and created the Teradata BTEQ scripts to load the relevant target tables.
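
For illustration only (the database and table names are hypothetical, not from this project), a BTEQ load script of this kind typically wraps a SQL insert-select between the logon and logoff commands:

    -- Hypothetical Teradata SQL sketch of the kind of insert-select a BTEQ script runs;
    -- in practice it would sit between .LOGON and .LOGOFF in the BTEQ file.
    INSERT INTO payment_tgt.payment_txn (txn_id, account_id, txn_amt, txn_dt, load_ts)
    SELECT s.txn_id,
           s.account_id,
           CAST(s.txn_amt AS DECIMAL(18,2)),
           CAST(s.txn_dt  AS DATE),
           CURRENT_TIMESTAMP
    FROM   payment_stg.payment_txn_stg s
    WHERE  CAST(s.txn_dt AS DATE) = CURRENT_DATE - 1;  -- load only the previous day's records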

Built the schedule plan for the Teradata table loads in the UC4 UNIX scheduler.

AT&T – u-DAS (Data Access Services) 2013

Built the u-Verse Data Access Services framework of the AT&T system; involved in the development of Informatica maps to load data into the AT&T tables.

Loaded all the relevant target tables of AT&T u-DAS - Infrastructure.

Implemented several ETL logic design changes and built new transformation logic for the required maps using Informatica v9.x.

Analyzed the COMM domain tables and ensured they were loaded properly with the required data sets.

Pfizer – Microsoft Business Intelligence 2012

Involved in writing the relevant design documents and unit testing the developed Performance Point Server (PPS) model proof of concept.

Developed the Key Performance Indicators (KPIs) for the different pharmaceutical units in MSBI.

Analyzed the data, and manually loaded the data entered through the Java Web UI panel (corresponding to the KPIs of the pharmaceutical units) into the SQL Server database. The completed PPS '07 model was unit tested and defect-logged.

Dun & Bradstreet – Global Hygiene & Match 2011

Provided full production support for the application, resolved the pertinent issues, and provided maintenance and enhancement support across the stages.


Prepared the time-tracking Excel sheet giving the time statistics of the process flow of each batch in the two different production environments, including the throughput and record count of each ETL job, and shared it with IBM on a daily basis.

Updated the quality documents and made sure that assignments were delivered on time.

Branch Banking and Trust 2010

Developed reports such as the IRA_List and the Banking Regulations form in the Crystal Reports framework, based on the business requirements of BB&T's banking customers.

Performed the needs analysis and formatted the developed reports based on the clients' feedback and business goals.

Migrated the developed reports from the Legacy environment to the BI4 – QA and PROD environments.

Retrieved the backend data from the SSIS database and verified it against the report data generated in the Crystal Reports UI.

Northern Trust Retirement Services – Strategic Transfer Agency 2009

Created reports in BO XI and Crystal XI, based on the ad-hoc needs of the partners of NT.

Involved in preparing the monthly and quarterly reports delivered to the NT leads.

Performed the business analysis and developed the packages for the reports of various management companies under Northern Trust, such as Ashmore, Del Rey, Shay, RCM, and UHG.

J P Morgan Chase – BO XI Reports Development 2008

Analyzed the business requirements and prepared the business reports delivered to the business clients on a weekly basis.

Developed the business-critical reports using the BO XI and Crystal Reports XI frameworks.

CERTIFICATION DETAILS:

Cognizant-certified professional with proven work experience in the following specializations:

(i) Data Warehousing

(ii) SQL Scripting

(iii) Informatica PowerCenter v9.x

(iv) Informatica v10.1

(v) Cloud Computing Fundamentals

(vi) Python Basics


