
Data Manager

Location:
Middlesex, NJ, 08846
Posted:
October 04, 2017


VENKATA CHALLA

SENIOR INFORMATICA DEVELOPER

E-mail: ac2lnq@r.postjobfree.com

Ph.: 862-***-****

SUMMARY:

Over 11 years of IT experience in ETL architecture design, system analysis, application design, development, testing, implementation, maintenance, and support of enterprise-level Data Integration, Enterprise Data Warehouse (EDW), and Business Intelligence (BI) solutions, spanning Operational Data Stores (ODS), Data Warehouses (DW), and Data Marts (DM), using the Informatica PowerCenter ETL tool.

7+ years of experience in the finance industry, with domain knowledge of securities, fixed income, trades, customer, employee, and other financial/compliance data.

11 years of extensive experience with the Informatica PowerCenter ETL tool, including PowerExchange, IDQ, and MDM.

Good knowledge of big data technologies using Informatica ETL with the HDFS and Hive adapters.

Experience in data modeling techniques such as dimensional data modeling, star/snowflake/hybrid schema modeling, and conceptual/logical/physical data modeling using the Erwin/Visio tools. Built data quality assessments and measurements of critical data used by strategic systems and processes.

Experience in integration of various data sources like Oracle, Teradata, Exadata, Netezza, Sybase, flat files, XML, Share Networks, Web services, CDC (Change Data Capture) and Informatica Data Replication (IDR).

Experience in PowerCenter IDQ, developing plans for analysis, standardization, match and merge, and Address Doctor, and consolidating data from different components.

Analyzed data quality issues and supported the Global Functions, ensuring the data quality scorecards measure fit-for-purpose data and serve audit purposes.

Applied rules and profiled source and target table data using IDQ.

Developed mappings and applied rules and transformation logic per the source and target system requirements.

Experience in creating Perl, Python and UNIX shell scripts

Expertise in creating Complex Informatica Mappings and reusable components like Reusable transformations, Mapplets, Worklets and reusable control tasks to work with reusable business logic.

Experienced in configuring Nodes and Repositories in Informatica Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager.

Experience in implementing SCD (Slowly Changing Dimension) Type 1, Type 2, and Type 3.
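The Type 2 pattern named above (preserving history by expiring the current row and inserting a new one when a tracked attribute changes) can be sketched in Python; the dimension layout and the `city` attribute are illustrative only, not from any specific project:

```python
from datetime import date

def scd_type2_merge(dimension, incoming, today=None):
    """Apply SCD Type 2 logic: expire the current row when a tracked
    attribute changes, then insert a fresh current row.
    `dimension` rows are dicts with: key, city, eff_date, end_date, current.
    """
    today = today or date.today()
    for rec in incoming:
        current = next((r for r in dimension
                        if r["key"] == rec["key"] and r["current"]), None)
        if current is None:
            # New business key: insert as the current row
            dimension.append({**rec, "eff_date": today,
                              "end_date": None, "current": True})
        elif current["city"] != rec["city"]:
            # Tracked attribute changed: close the old row, open a new one
            current["end_date"] = today
            current["current"] = False
            dimension.append({**rec, "eff_date": today,
                              "end_date": None, "current": True})
    return dimension
```

In a real PowerCenter mapping the same comparison is typically done with a Lookup plus an Update Strategy transformation; the sketch shows only the row-versioning logic.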

Good experience in performance techniques such as partitioning and pushdown optimization, SQL query tuning, tuning of Informatica mappings, and other database-related tuning.

Conducted System, UAT and Functionality testing and investigated software bugs.

Extensive experience in conducting Requirement-gathering sessions, writing Business Requirement Document, Functional Requirement Document.

SQL tuning and creation of indexes for faster database access; improved query performance in Informatica through partitioning, explain plans, and SQL hints for queries and indexing.

Scheduled the batch jobs; deployment of objects between environments.

Knowledge in design and development of Business Intelligence reports using BI tools Hyperion Essbase, Business Objects and Cognos.

Implemented match and merge rules in Informatica MDM 10.1 to find duplicates and build the golden record.

Knowledge of Informatica MDM concepts and implementation of the de-duplication process and IDD.

Hands-on experience in maintaining version control.

EDUCATION/CERTIFICATION:

Bachelor of Computer Science & Information Technology (CSIT), Hyderabad, India (Apr '02 - Apr '06)

Organizational level Certified Professional in Performance Re-Engineering and SIX SIGMA GREEN BELT Certified

TECHNICAL SKILLS

ETL Tools

Informatica PowerCenter 9.6/8.6/8.5/8.1/8.0/7.x/6.x/5.x, IDQ 9.5/10, PowerExchange, Informatica Data Replication (IDR)

Business Intelligence

Tableau, Business Objects, Cognos, and Hyperion Essbase

RDBMS

Oracle 12C/11G/10G/9i, SQL*Loader, Teradata, Netezza, Exadata, Sybase

Data Modelling

Erwin 9.X, Microsoft Visio

Tools

Toad, SQL DBx, PL/SQL Developer, SQL Assistant, Rapid SQL 8.x,

Putty, WinSCP, Autosys, Remedy, Quality Centre, Control-M, JIRA

Languages

SQL, PL/SQL, Perl, Python, UNIX Shell Scripting

Operating Systems

Windows, UNIX/LINUX, Sun OS

PROFESSIONAL EXPERIENCE

CREDIT SUISSE Mar '11 - Till date

Technical Architect

INSIGHT (Credit Risk Monitoring system)

INSIGHT is Credit Suisse's global Credit Risk Monitoring system. INSIGHT functionality includes collection and collation of all open trades and loans (over 3,000,000 per day received from over 300 feeds), collection and collation of all positions, and exposure calculation and aggregation (including potential exposure calculation, netting, and collateral/CDV offset). Insight Report Manager allows all reports to be run from a single screen. Reports include Counterparty and Risk Reporting, Business Area and Inventory Limit reporting, and Control reporting.

Project:

DQM (Data Quality Management)

Senior ETL Developer /Technical Architect

Provided Architectural Road Map, direction and work packets for ETL needs.

Created detail ETL Standards documents for Design, Development, Release Management and Production Support.

Designed detailed ETL specifications for development and ensured quality ETL deliverables.

Created detailed ETL migration processes for the Informatica, database, scheduling, O/S, and H/W teams.

Designed and developed reusable common objects shared across various repositories.

Automated, redesigned, and tuned several ETL processes for optimal use of time and resources.

Identified and investigated sensitive fields in each source file to determine the fields to be anonymized.

Identified the best target solution for the entire anonymization process, to be incorporated into the current development process.

Re-engineered Informatica mappings affected by anonymization.

Tested the changes and prepared for production implementation.

Overall technical responsibilities at Credit Suisse:

Designing Extraction, Transformation and Load strategies for Trades, Sensitivities, Collaterals, Positions, Securities, Issuers, Ratings, Counterparty and Reference data.

Involved in creating UI prototypes and preparing layout for the system

Responsible for coordinating JAD sessions with business users to analyze, gather, and define key performance indicators, and presented the requirement specifications.

Managed scope creep (Change control), and Business User expectations and various levels of approval.

Interacted with stakeholders and business operation team and performed Gap Analysis of end user requirements

Involved in End to end data analysis and data Quality.

Experience in estimation and proposal writing, scoping, requirements study, client interfacing, and creating and providing solutions to customers.

Applying the Data Validation rules and raising exception handling mechanisms on Trades, Sensitivities, Collaterals, Positions, Securities, Issuers, Ratings, Counterparty and Reference data.

Validated data accuracy and integrity for partner's sales and delivery metrics

Applied reject reprocessing.

Involved in building and maintaining the Data Dictionary for Trades, Sensitivities, Collaterals, Positions, Securities, Issuers, Ratings, Counterparty and Reference data.

Involved in impact analysis and estimating development efforts for CRs

Tuning the huge data loads for optimal performance.

Worked in design, development and unit testing of the jobs

Responsible for overseeing the quality procedures related to project

Tuned Informatica Mappings and Sessions for optimum performance

Involved in Code review, bug fix while testing is going on.

Scheduling the jobs, running and monitoring the jobs through Dollar U scheduler

Mapplets and Reusable Transformations were used to prevent redundancy of transformation usage and maintainability.

Preparing SQL Queries for testing the mappings and balancing the data

Created test case scenarios for data verification and provided the necessary input to the QA team for validating the data

Created deployment groups, migrated the code into different environments

Created functional specifications for the reporting module and was appreciated for a quick turnaround.

Involved in user training for reporting related to the business process, and was also responsible for implementing it.

Analyzed failed jobs and fixed issues for smoother production runs.

Resolved feed issues in a timely manner to avoid affecting processes and the business; monitored feeds regularly afterwards and updated RMM on each issue.

Tracked every action in Remedy and assigned it to the appropriate team/person for timely resolution; Remedy tickets were updated at each step of the action taken to resolve the issue.

Developed a script to notify on the failure of a particular feed, including component sizes and record counts; the script runs recursively in the background and emails me the details.

Developed a script to check whether crontab entries for critical processes (EUR COB, NA COB, PAC COB, etc.) are commented out ahead of the scheduled time, to avoid any delay in starting the process.

Developed a script to check the FeedGen source entries of a particular feed and the arrival of the feed's components.

Developed a script to report a feed's current status and load times, and to encrypt passwords.
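A minimal Python sketch of the feed-monitoring idea in the bullets above (checking that expected feed components have arrived and mailing the details on failure); the directory layout, component names, and mail addresses are hypothetical:

```python
import os
import smtplib
from email.message import EmailMessage

def check_feed(feed_dir, expected_components, min_bytes=1):
    """Return a list of problems for a feed: components that never
    arrived, or that arrived smaller than a minimal size (a crude
    proxy for an empty/truncated file)."""
    problems = []
    for name in expected_components:
        path = os.path.join(feed_dir, name)
        if not os.path.exists(path):
            problems.append(f"missing component: {name}")
        elif os.path.getsize(path) < min_bytes:
            problems.append(f"empty component: {name}")
    return problems

def notify(problems, feed_name, mailhost="localhost",
           rcpt="me@example.com"):
    """Mail the failure details, mirroring the mail-based alerting
    described above (addresses here are placeholders)."""
    msg = EmailMessage()
    msg["Subject"] = f"Feed {feed_name}: {len(problems)} problem(s)"
    msg["From"] = "feed-monitor@example.com"
    msg["To"] = rcpt
    msg.set_content("\n".join(problems))
    with smtplib.SMTP(mailhost) as s:
        s.send_message(msg)
```

A production version would typically be driven from cron and include the record counts mentioned above; this sketch covers only arrival and size checks.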

Identified and escalated performance issues with particular feeds (component arrivals, process times, PowerMart processing and load times) to higher levels for resolution.

Updated users with detailed information and solutions within the agreed time frame.

Customer Relationship

Provided timely feedback and maintained professional communication with all stakeholders of the bank.

Worked with business users to clarify requirements and translate the requirement into technical specifications. Involved in business analysis and technical design sessions with business and technical staff to develop requirements document, and ETL specifications.

Environment: Informatica PowerCenter 9.6, CDC, Oracle 11G, Exadata, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Center, Remedy, SharePoint

UBS Sep '10 - Mar '11

Informatica System Analyst

FXCC (Foreign Exchange Currencies and Commodities)

The foreign exchange is an exchange where different currencies are traded. FXCC is essential in commodity trade, due to the international nature of the commodity market; it helps traders purchase commodities and derivatives in a currency that is not their own. The team liaises with the Front Office or Middle Office on any booking errors and monitors the repair queue in a timely manner to ensure all trades booked by the Front Office flow into the Back Office's TP system.

Overall responsibilities at UBS:

Provide Application Support for users in FXMM/FXCC.

Interacted directly with end users such as traders, business users, and other support teams to coordinate issue resolution.

Proactively monitored applications to ensure the system was up and to identify incidents before users reported them.

Responsible for diagnosing and remediation of the issue and reporting to the end user.

To make sure all the critical reports are sent to the Business users and down streams as per SLA.

Involved in resolving batch issues by performing initial investigation and coordinating with development teams accordingly.

Liaised with development teams on bugs, fixes, and new features to be implemented in the applications.

Suggesting improvements to enhance supportability of the application and enhancing the documentation around common support tasks

Performed data investigation, such as wrong feed delivery, missing information in data files, and corrupted data flowing to downstream systems, and communicated delays to business users.

Ensured issues were resolved within the SLA period and KPIs were met without exception.

Investigated the root cause of each incident and fixed it with the appropriate remedial solution within the agreed period of time.

Ensured prompt communication of each resolution to the user and obtained his/her confirmation of the resolution.

Support also included health checks of the trading applications and a proactive approach to resolving incidents before the business complained; this ensured that most incidents within the system were resolved, resulting in smooth business for the day.

Applied the proper workarounds for known issues to improve incident resolution times, and kept the business informed of incident status from time to time.

Maintenance, support and enhancement of new and existing scripts as part of the change/release management process.

Extensive experience using the Autosys and HP OpenView tools.

Involved in enhancements of the reports based on user request.

Environment: Informatica PowerCenter 9.6, Oracle 11G, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Center, Remedy, SharePoint

Accenture / Client: CISCO Inc. Mar '08 - Sep '10

Sr. System Engineer

Project: Profit & Loss Cube (P&L)

Project Description:

Profit and Loss (P&L) is a finance program that enables multi-dimensional reporting by expenses and actuals.

The goal of the program is to enable timely, accurate and consistent reporting relying on data from a single source of truth for all of Gross Margin.

Responsibilities:

Designer and developer for the Expense, Workforce, and Underwriting (U/W) modules.

Created business rules and calculation scripts.

Created web forms and standard reports using Smart View.

Created Rules file to load data into Essbase

Created tables in Oracle SQL developer to facilitate data load into Essbase Staging

Created required Hierarchies in the respective cubes using Outline Load Utility.

Migrated artifacts such as web forms and Smart Lists from one environment to another using Shared Services.

Created MaxL scripts to schedule calc scripts in order, for data loading into Essbase.

Assigned access to users/groups.

Environment: Informatica PowerCenter 7, Teradata, Hyperion Essbase, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Center, Remedy, SharePoint

Project: FETCH (Finance Expense Tracking including Capital and Headcount)

Project Description:

The FETCH (Finance Expense Tracking including Capital and Headcount) Program is an integrated approach to tracking Cisco's expense, capital, and headcount data via the FETCH Enterprise Reporting Suite. The Suite includes the FETCH Essbase Cube, the FETCH Business Objects Universe, and the FETCH Dashboard. Together, these tools deliver accurate, multi-dimensional financial data including summary level information and drill to detail capabilities.

The FETCH Business Objects Universe pulls from the same source as the FETCH Cube, but provides more customized reporting capabilities. Through web based report creation, users directly query the database by dragging and dropping objects into the report user interface.

Responsibilities:

Created new dimension build, data load rules for new ASO and BSO cube.

Resolving DTD (Drill to Detail) issues in Excel Spreadsheet Add-In.

Provided timely support whenever a data load / dimension build batch failed; investigated the root cause and eliminated it so it was not encountered again.

Creating user accounts and extending access to users.

Creation of filters and assigning filters to groups and users.

Wrote and ran PL/SQL procedures in Oracle to load data into data sources.

Ran and enhanced the UNIX scripts used in the cube-loading process.

Resolved P1-priority data integrity issues, as our application cube carried P1 priority.

Generated reports using the Essbase Excel add-in and Business Objects 6.5, and provided support for any issues.

Handling the Remedy Tickets raised by the EA Business Fraternity and resolving them within the SLA(s) as defined.

Environment: Informatica PowerCenter 7, Teradata, Hyperion Essbase, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Center, Remedy, SharePoint

Satyam / Client: CISCO Systems Nov '06 - Mar '08

Role: Software Engineer

Project: Travel Reporting

Project Description:

The Travel Reporting application reports amounts incurred by individuals at Cisco on company-related travel. Cisco employees can travel at their own expense and later claim reimbursement through tools like iExpense at Cisco; individuals can also book travel through Cisco's travel partner Amex.

The application consolidates travel expense details from different sources, such as ERPs, GL systems, and Amex XML files, and loads the data into a centralized warehouse. BO reports were built on this warehouse to report expense data at different levels: individual, manager, department, and node.

Responsibilities:

Created new dimension build, data load rules for new ASO and BSO cube.

Involved in analyzing the data that is loaded in the data warehouse on a daily basis through the files provided from 16 data providers.

Built Informatica mappings to extract data from XML sources and load data into staging tables.

Analyzed the star schema models to understand data in the data warehouse. Developed mappings to implement slowly changing dimensions and built code with reliable error/exception handling and rollback framework.

Migrated workflows, sessions, mappings and database scripts from Development environment to production at the time of elevation.

Used version control to check-in and check-out the code that is being migrated.

Extensively involved in production support, testing the data and day to day aspects of the data warehouse including deploying new code and loading history data into the data warehouse.

Involved in supporting and monitoring the oracle databases and coordinating with Oracle DBA for any database related issues.

Extensively used PL/SQL code while loading data and updating data in the warehouse.

Developed pre-source shell scripts in UNIX to format data files coming from the mainframe and convert them into CSV files for processing through Informatica.
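The mainframe-to-CSV conversion above was done in shell; the same fixed-width-to-CSV transformation can be sketched in Python, with an entirely hypothetical record layout (field names and column offsets are illustrative, not from the actual feeds):

```python
import csv
import io

# Hypothetical fixed-width layout: (field name, start column, end column)
FIELDS = [("emp_id", 0, 6), ("name", 6, 26), ("amount", 26, 36)]

def fixed_width_to_csv(lines, fields=FIELDS):
    """Slice each fixed-width record into fields by column offset,
    strip the padding, and emit CSV with a header row."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow([name for name, _, _ in fields])
    for line in lines:
        writer.writerow([line[start:end].strip()
                         for _, start, end in fields])
    return buf.getvalue()
```

The csv module handles quoting automatically, which is the main reason to prefer it over hand-rolled `cut`/`awk` slicing when field values may contain commas.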

Developed shell scripts to automate parameter file updates on a daily basis.
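A daily parameter-file refresh of the kind described above might look like this in Python; the file layout follows the common Informatica parameter-file convention ($$NAME=value lines under a [folder.WF:workflow] header), and the parameter names here are hypothetical:

```python
import re
from datetime import date

def update_param(text, name, value):
    """Replace the value of a $$NAME=... line in parameter-file text,
    leaving every other line untouched."""
    pattern = re.compile(rf"^(\$\${re.escape(name)}=).*$", re.MULTILINE)
    return pattern.sub(rf"\g<1>{value}", text)

def refresh_run_date(text, today=None):
    """Daily refresh: stamp today's date into $$RUN_DATE."""
    today = today or date.today()
    return update_param(text, "RUN_DATE", today.strftime("%Y-%m-%d"))
```

In practice the script would read the file, apply the substitution, and write it back before the scheduled workflow picks it up; only the in-memory edit is shown.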

As a production support person coordinated file transfer process with Amex.

Responded to help desk calls from customers and coordinated solutions with the rest of the development team. Took care of any process failures, quickly identifying the cause of the problem and resolving the issue.

Environment: Informatica PowerCenter 6, Teradata, Hyperion Essbase, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Center, Remedy, SharePoint.

Project: Enterprise Reporting Systems for Worldwide Bookings

Project Description:

Cisco Systems, Inc. is a multinational corporation headquartered in San Jose, California; it designs and sells networking and communications technology and services under five brands, namely Cisco, Linksys, WebEx, IronPort, and Scientific Atlanta. Initially, Cisco manufactured only enterprise multi-protocol routers but gradually diversified its product offering to move into the home user market with the purchase of Linksys, while also expanding its offering for corporate customers.

Responsibilities:

Created new dimension build, data load rules for new ASO and BSO cube.

Maintaining, monitoring and supporting dimension build, data loads and Reporting.

Monitoring daily runs of jobs from $Universe (scheduling tool) on UNIX environment.

Involved in solving Priority 1 business cases, and maintaining and supporting production on a full-cycle basis.

Involved in critical and high priority time bound issues providing detailed root cause and resolution to business users, documenting the issues and solutions for future reference.

Batch job support (24x7) including Month End, Quarter End and Year End Support to make data available for reporting at the earliest.

Providing Production and Non-Production Support and also coordinating with DBAs.

Publishing the reports to the users (monthly/quarterly/yearly) or as and when demanded by the Business users.

Responsible for maintenance of proper quality standards in deliverables, version control, defect tracker, error log etc., and Major Business and Operations Support.

Ensured that the whole team is in sync with the Customer Advocacy Finance Business policy.

Maintaining robust security access to the applications as defined by the Business users.

Maintaining and supporting production on a full-cycle basis.

Environment: Informatica PowerCenter 6, Teradata, Hyperion Essbase, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Center, Remedy, SharePoint


