
Sudheer Talasila

ac38c2@r.postjobfree.com

404-***-****

Lead Informatica Developer and Administrator

PROFESSIONAL SUMMARY

9+ years of experience in ETL methodology supporting data transformation and processing in corporate-wide ETL solutions using Informatica PowerCenter 9.x, IDQ, and PowerExchange CDC 9.x.

Hands-on tools and databases:

Proficient in Informatica administration work, including installing and configuring Informatica PowerCenter and repository servers on Windows and UNIX platforms, backup and recovery, and folder and user account maintenance.

Experience with Informatica Cloud to extract data from Salesforce.

Created pmcmd commands to monitor the production workflows (see the pmcmd sketch at the end of this summary).

Good knowledge of upgrading Informatica to the latest version.

Informatica PowerExchange for loading and retrieving data from Oracle and MS SQL Server systems.

Installed and configured the Informatica PowerExchange servers on Windows and UNIX for Oracle and MS SQL Server.

Hands-on experience in several key areas of enterprise data warehousing, such as Change Data Capture (CDC) and Slowly Changing Dimensions (SCD Type I & II).

Informatica Developer client 9.6; SQL Server 2008/2005; Oracle 9i/10g; Netezza; MySQL; and SQL and PL/SQL programming.

Informatica PowerCenter 9.6.1/9.5; SQL Server 2008/2005; Oracle 9i/10g; and SQL and PL/SQL programming.

Oracle, SQL Server, MS Access, and Teradata database development using SQL, PL/SQL, SQL*Plus, TOAD, and SQL*Loader.

Knowledge of UNIX.

Domain Knowledge – Proven ability to implement data warehousing solutions for business problems in

Financial/Investment Banking/Credit Cards,

Food and Beverage/Retail,

Insurance/Pharmaceutical/Healthcare, and

Pharmaceutical Manufacturing.

Highly proficient in

Data modeling using RDBMS concepts,

Logical and Physical Data Modeling,

Multidimensional data modeling (star schema, snowflake modeling, facts and dimensions).

Designing and developing complex mappings to extract data from diverse sources, including flat files, RDBMS tables, legacy system files, XML files, applications, and Teradata.

Integration of various data sources such as Oracle, SQL Server, and flat files into the data warehouse; also experienced in data cleansing and data analysis.

Business Objects 5.x/6.x to build user-defined queries and reports enabling drill-down and slice-and-dice analysis on multiple databases.

Performance tuning of targets, mappings, and sessions; task automation using UNIX shell scripts; job scheduling; and communicating with the server using pmcmd.

Provided data matching, data cleansing, exception handling, reporting, and monitoring features to the data warehouse.

Project Management Expertise

Knowledge of Software Development Life Cycle including Planning, Analysis, Design, Implementation, and Maintenance.

Project and change request estimations.

ETL production support role; excellent communication and interpersonal skills.

Key player in handling end-to-end data warehouse engagements that include:

Data Architecture and Design

Requirement Analysis

Feasibility Study and Impact Analysis

Designing Solution Architecture

Maintaining excellent client relationships and communicating effectively

Onsite Design Lead

Resource Management and Leading the Development team

Project Management

Periodic status reporting at regular intervals

Onsite-Offshore Coordination

Implementation and UAT support
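
The pmcmd monitoring mentioned above can be illustrated with a minimal sketch. The service, domain, user, folder, and workflow names below are placeholders, not actual production values:

    # Check the run details of a production workflow
    pmcmd getworkflowdetails -sv INT_SVC_PROD -d Domain_Prod \
        -u admin_user -p '********' -f PROD_FOLDER wf_daily_load

    # Start a workflow and wait, so the shell exit code can drive alerting
    pmcmd startworkflow -sv INT_SVC_PROD -d Domain_Prod \
        -u admin_user -p '********' -f PROD_FOLDER -wait wf_daily_load

Wrapping such calls in a UNIX shell script and checking the return code is a common way to feed workflow status into job monitoring.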

EXPERIENCE SNAPSHOT

Company Name                       Date

SAAMA Technologies                 August 2014 to Present

Informatica Corporation            March 2014 to July 2014

Cognizant Technology Solutions     December 2008 to February 2014

UST Global                         May 2006 to November 2008

EDUCATION

MS in Computer Science, Bharathidasan University, India, 2003

BS in Computer Science, Nagarjuna University, India, 1999

CERTIFICATIONS

Informatica Certified Developer v 8.6

Microsoft Certified (MCP) in MS SQL Server 2008 Business Intelligence, 01/2014

SOFTWARE PROFICIENCY

ETL

Informatica Administration, PowerExchange CDC 9.5/9.6.1, Informatica Developer client 9.6, Informatica PowerCenter 9.x/8.x/7.x/6.x (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, and Repository Manager), Informatica PowerAnalyzer, Informatica PowerExchange 8.1.1/5.2.1/PowerConnect, SSIS (SQL Server Integration Services)

Reporting

Business Objects 5.x/6.x, Microstrategy, SSRS (SQL Server Reporting Services)

Data Modeling

Dimensional Data Modeling, Star Schema, Snowflake Schema, Fact and Dimension Table Modeling, Physical and Logical Data Modeling in ERwin 3.x

Databases

Oracle 10g/9i/8i/8.x, MS SQL Server 2008/2005/2000/7.0, Teradata V2R6/V2R5, Vertica, DB2, Netezza.

Other Utilities

Teradata utilities (FastLoad, MultiLoad, BTEQ, etc.), TOAD, CVS, VSS, SQL*Loader, SQL Navigator, MS Visio, MS Office

Scheduling Tools

Control M, CA7

Operating Systems

UNIX (Sun Solaris), Windows 2003/2000/XP/98/NT Server, Mainframes.

EXPERIENCE DETAILS

Customer : Genentech, Vacaville, CA

Project : CCP2 _Return-To-Service

Role : Lead Developer and Informatica Administrator (Aug 2014 to present)

Environment : Informatica PowerExchange 9.5, Informatica PowerCenter 9.5, Oracle, MS SQL Server.

Project Description:

The BHDS2 system will allow Vacaville CCP2 Manufacturing and Quality Assurance groups to perform batch review and disposition, enter comments to a batch, and generate batch and equipment use log reports. The database schema shall be called BHDS2 and the web-based GUI shall be called CCP2 Online Review Tools (OLRT2).

The CCP2 Manufacturing Control System (MCS) produces a Lot of product through execution of a recipe. During recipe execution, the MCS captures process parameters, operator actions, material usage, equipment usage, phase details, POMSnet exceptions, TPB batch history, Experion events (operator actions) and Experion alarms. The MCS exports this process data to the CCP2 Batch History Data System (BHDS2). Genentech Manufacturing and Quality Assurance groups utilize OLRT2, a web-based Graphical User Interface (GUI), to access BHDS2 for interactive batch review to determine a lot’s suitability for further GMP manufacturing. Following batch review, the OLRT2 provides the lot’s review status and batch report to the Batch Assay History Record System (BAHR).

Key Deliverables:

Defined data sources for both MS SQL Server and Oracle source systems.

Deployed the Informatica workflows to production.

Installed and configured the PowerExchange Oracle CDC server and client.

Created users for developers and granted permissions based on role.

Defined and activated the capture registrations for the source database.

Created extraction maps for the data source.

Configured parameters in the dbmover.cfg file for the PowerExchange Listener.

Configured parameters in the pwxccl.cfg file for the PowerExchange Logger (see the configuration sketch at the end of this section).

Developed mappings using Informatica PowerCenter.

Worked on Slowly Changing Dimensions (Type I & II) and data warehousing Change Data Capture (CDC).

Developed a heartbeat mapping for every CDC session to track the performance of CDC workflows and verify data movement from source to target.

Responsible for testing, modifying, debugging, documenting, and implementing Informatica mappings.

Developed test cases and test plans to complete unit testing. Supported system testing.

Troubleshot data issues, validated result sets, and recommended and implemented process improvements.

Responsible for performance tuning in Informatica PowerExchange and Informatica PowerCenter at the target, source, mapping, session, and system levels.

Environment: Informatica 9.6.1, IDQ 9.6, PowerExchange CDC 9.6.1, Tableau Desktop 8.x, Oracle 11g, PL/SQL, SQL Server, UNIX, UNIX shell scripts, SQL Developer, SQL Server Management Studio, and XML.
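
For the dbmover.cfg and pwxccl.cfg configuration mentioned above, a minimal illustrative sketch follows. Node names, hosts, ports, instance IDs, and paths are placeholders, and the exact statement set depends on the PowerExchange version and CDC setup as described in the PowerExchange documentation:

    /* dbmover.cfg -- PowerExchange Listener (illustrative entries) */
    LISTENER=(node1,TCPIP,2480)
    NODE=(local,TCPIP,pwx-host.example.com,2480)

    /* pwxccl.cfg -- PowerExchange Logger for Oracle CDC (illustrative entries) */
    DBID=ORCLPROD
    DB_TYPE=ORA
    EXT_CAPT_MASK=/pwx/logger/orclprod/cond

The Listener statements define how clients reach the PowerExchange node; the Logger parameters identify the Oracle instance and where captured change data is written.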

Customer : Genentech, San Francisco, CA

Project : gSonar

Role : Lead Developer (Oct 2014 to Jan 2016)

Environment : Informatica PowerExchange 9.5.1, Informatica PowerCenter 9.5.1, Oracle, Hadoop, Hive.

Project Description:

The gSonar project will facilitate the integration of multi-sourced data into a big data platform and expose the data to standard and ad hoc reporting, visualization, and analytical tools. This data will support the Marketing Science, Marketing Planning, Industry Analytics, Competitive Intelligence, FOIM, IM, and F&BA groups to derive insights across all Genentech brands and products. In the current process, there are multiple data sources that are not fully available to the various analytical groups at Genentech. Additionally, many analyses have different business rules applied on an ad hoc basis and lack consistency. This platform will integrate and process all existing Genentech product data so that different groups are able to fully access the data with clearly defined business rules in place. Integrated tables of the mentioned data sources will be available to users for custom analytics and reporting through this platform.

System processes will have consistent definitions defined by Genentech business teams to link, integrate, and process the incoming data. Although data sources from various vendors may be available at different times, the gSonar platform will create weekly/monthly refreshes of the data (with a pre-defined calendar). The gSonar project will have the capability to incorporate new data sources as and when they become available, and to extend exposure to other analytical/visualization tools when required. These new data sources and additional tools are not in the scope of the current vertical, but the platform should have the flexibility to extend when required.

Key Deliverables:

Deployed the Informatica workflows to production.

Developed incremental mappings using the Informatica Developer client and deployed the applications.

Created physical data objects for relational databases and flat files.

Created Hive tables and loaded data into them using Informatica (a Hive DDL sketch follows at the end of this section).

Worked on Informatica PowerExchange for Oracle CDC, HDFS, and Hive.

Created workflows and deployed them to the Data Integration Service for execution.

Created applications and deployed them to the Data Integration Service.

Installed and configured Informatica PowerExchange for mainframe DB2, Oracle CDC, Greenplum, HDFS, and Hive.

Responsible for testing, modifying, debugging, documenting, and implementing Informatica mappings.

Developed test cases and test plans to complete unit testing. Supported system testing.

Responsible for performance tuning in Informatica PowerCenter at the target, source, mapping, session, and system levels.
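
As a rough illustration of the Hive-side work above, the DDL involved might look like the following. Database, table, column, and path names are hypothetical:

    -- Hypothetical external Hive staging table targeted by the Informatica load
    CREATE EXTERNAL TABLE stg_orders (
        order_id     BIGINT,
        customer_id  BIGINT,
        order_ts     TIMESTAMP,
        amount       DECIMAL(12,2)
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
    STORED AS TEXTFILE
    LOCATION '/data/stg/orders';

An external table keeps the HDFS files under the load process's control, so a failed run can be re-staged without Hive dropping the data.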

Customer : INTUIT, Mountain View, CA

Project : Intuit 12 in 12 Big Data Project

Role : Lead Developer (Mar 2014 to July 2014)

Environment : Informatica Developer 9.6.1, Informatica PowerCenter 9.6.1, Netezza, Hadoop, SQL Server 2008/2005, MySQL, Linux, Python scripts.

Project Description:

We are working on 12 source systems, bringing the data into targets in both Netezza and Hive. The source systems are in different databases, such as Oracle and MySQL. For example, the QBO (QuickBooks Online) source system has more than 50 tables, of which 30 are full-refresh and 20 are incremental loads. For the incremental loads, we developed stage mappings in the PowerCenter client and incremental mappings in the Developer client, because the stage tables load the data into Hive.

Key Deliverables:

Developed incremental mappings using the Informatica Developer client and deployed the applications.

Created physical data objects for relational databases and flat files.

Created Hive tables and loaded data into them using Informatica.

Created workflows and deployed them to the Data Integration Service for execution.

Created applications and deployed them to the Data Integration Service.

Developed mappings using MAV (Mapping Architect for Visio).

Generated DDL for both Hive and Netezza using Java, based on the source structure (see the paired DDL sketch at the end of this section).

Responsible for testing, modifying, debugging, documenting, and implementing Informatica mappings.

Used shortcuts to reuse objects without creating multiple objects in the repository and to inherit changes made to the source automatically.

Developed test cases and test plans to complete unit testing. Supported system testing.

Troubleshot data issues, validated result sets, and recommended and implemented process improvements.
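
To illustrate the Java-based DDL generation mentioned above, here is a hypothetical pair of outputs for one source table. Schema, table, and column names and types are illustrative; the actual generator derived them from source metadata:

    -- Netezza target table (distribution key chosen per load/query pattern)
    CREATE TABLE qbo_stage.invoices (
        invoice_id   BIGINT,
        customer_id  BIGINT,
        created_at   TIMESTAMP,
        total_amt    NUMERIC(12,2)
    )
    DISTRIBUTE ON (invoice_id);

    -- Hive table generated for the same source structure
    CREATE TABLE qbo_stage.invoices (
        invoice_id   BIGINT,
        customer_id  BIGINT,
        created_at   TIMESTAMP,
        total_amt    DECIMAL(12,2)
    )
    STORED AS TEXTFILE;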

Customer : Core Logic, USA

Project : Vector Securities

Role : Team Lead Developer

Environment : Informatica 9.5, Vertica, Vsql, SQL Server 2005, Windows Server 2003

Project Description:

With Vector Securities, you have unparalleled access to the industry’s most comprehensive non-agency RMBS dataset, containing over 95% of outstanding pool balances with industry-leading loan modification, CoreLogic HPI and Forecast, and TrueLTV property record information. It provides scalable, real-time web access to this wide array of information, enabling investors to monitor their investments at loan level and perform advanced analysis and reporting. The underlying securities data is updated on a daily basis, providing current point-in-time analysis.

Key Deliverables:

Took care of daily loads and fixed production issues within a short time.

Supported weekend loads and performed validations after load completion.

Understood the complete business flow.

Designed mappings to load data into the tables using transformations.

Involved in data migration from one module to another.

Reviewed mappings developed by team members for functionality, performance, and standards.

Checked data quality and interacted with the business analysts.

Communicated with the offshore team and client on weekly status reports.

Performed testing as per unit test cases (UTCs).

Customer : Kaiser Permanente, USA

Project : XCELYS 7.1 upgrade

Role : Developer

Environment : Informatica 8.x, Oracle 10g, UNIX

Project Description:

Kaiser Permanente is implementing a National Claims Platform to reduce the cost of processing a claim, improve auto-adjudication rates, and deliver an extensible solution that can adapt to emerging and future business needs and regulatory compliance needs. The Xcelys application is the claims processing system at the center of the National Claims Platform.

Key Deliverables:

Understood the complete business flow.

Designed mappings to load data into the tables using transformations.

Reviewed mappings developed by team members for functionality, performance, and standards.

Communicated status reports with the onsite team.

Performed testing as per unit test cases (UTCs).

Customer : Dean Foods, Dallas, TX

Project : DEANFOODS_ESW

Role : Developer

Environment : Informatica PowerCenter 9.1, Teradata, SQL Server 2008/2005, PL/SQL, Windows XP.

Project Description:

Dean Foods grew rapidly during the last decade by acquiring and integrating independent regional dairy companies. These regional dairies have a variety of underlying transaction processing systems, including DMS, Ross RAMS, RMS, Destiny, SAP Heartland, and BPCS. The FDD and Morningstar business units also have legacy sales databases with significant differences in how sales reports are generated, including how metrics such as Net Revenue and Gross Margin are calculated.

The current scope of implementation involves bringing DMS, Ross/RAMS, RMS, Destiny, SAP Heartland and BPCS data into the Enterprise Sales Data Warehouse (ESW).

Key Deliverables:

Requirements gathering, analysis, and design of technical specifications for the data migration according to the business requirements.

Developed logical and physical dimensional data models using ERWIN.

Designed, developed, and improved complex ETL structures to extract, transform, and load data from multiple data sources into the data warehouse and other databases based on business requirements.

Extracted data from flat files, DB2, SQL Server, and Oracle to build an operational data store. Applied business logic to load the data into the global data warehouse.

Developed complex mappings and SCD Type I, Type II, and Type III mappings in Informatica to load data from various sources, using transformations such as Source Qualifier, Lookup, Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, Stored Procedure, XML, and SQL (see the SCD Type II SQL sketch at the end of this section).

Responsible for testing, modifying, debugging, documenting, and implementing Informatica mappings.

Used shortcuts to reuse objects without creating multiple objects in the repository and to inherit changes made to the source automatically.

Developed test cases and test plans to complete unit testing. Supported system testing.

Troubleshot data issues, validated result sets, and recommended and implemented process improvements.

Responsible for performance tuning in Informatica PowerCenter at the target, source, mapping, session, and system levels.

Extensively worked with PL/SQL and performance tuning of Oracle SQL queries and T-SQL statements. Created cursors, functions, stored procedures, packages, triggers, views, and materialized views using PL/SQL.

Extensively worked with incremental loading using parameter files, mapping variables, and mapping parameters.

Created data breakpoints and error breakpoints for debugging mappings using the Debugger.

Involved in database testing, writing complex SQL queries to verify the transactions and business logic, such as identifying duplicate rows.

Good experience with Informatica Administration 9.x.
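
As a compact sketch of the SCD Type II pattern and the duplicate-row checks described above (table, column, and sequence names are hypothetical):

    -- Close the current version of rows whose tracked attributes changed
    UPDATE dim_customer d
       SET eff_end_date = CURRENT_DATE, current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);

    -- Insert a new current version for changed and brand-new customers
    INSERT INTO dim_customer
        (customer_key, customer_id, address, eff_start_date, eff_end_date, current_flag)
    SELECT seq_customer.NEXTVAL, s.customer_id, s.address, CURRENT_DATE, NULL, 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

    -- Duplicate-row check of the kind used in database testing
    SELECT customer_id, COUNT(*)
      FROM dim_customer
     WHERE current_flag = 'Y'
     GROUP BY customer_id
    HAVING COUNT(*) > 1;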

Customer : Dean Foods, Dallas, TX

Project : DEANS_CASETRACKER

Role : Developer

Environment : Informatica Power Center 9.1, Teradata, SQL Server 2008/2005, PL/SQL, Windows XP.

Project Description:

The Case Tracker project is a short-term solution to track Dean’s assets such as Bosy’s, cases, etc. These asset numbers are called units of measure, in which the dairy products (cans, bottles, etc.) are delivered to the customer location. The basic idea is to track the number of assets delivered to the customer and the number of assets returned by the customer.

Key Deliverables:

Involved in systems study and analysis to understand the business needs and implement them in a functional database design.

Created source definitions from flat files and Oracle 9i, imported target definitions, and created reusable transformations in Informatica PowerCenter Designer 9.1.

Used Informatica PowerConnect to extract information from the Oracle ERP systems.

Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.

Used shell scripting to pull data from the source and cleanse it per business rules for the transformation (see the cleansing script sketch at the end of this section).

Experienced in developing Informatica mappings using transformations such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Expression, Router, Update Strategy, Rank, XML SQ/Parser/Generator, Normalizer, etc., to load data from sources such as Oracle, flat files, Excel spreadsheets, XML, and COBOL files to the target data warehouse.

Developed transformation logic in Informatica PowerCenter Designer 9.1 to cleanse the source data of inconsistencies before loading it into the staging area, which is the source for the stage load.

Worked on connected and unconnected Lookups, Router, Expression, Source Qualifier, Aggregator, Filter, and Sequence Generator transformations.

Used pmcmd commands for automating batches and sessions.

Identified and fixed bottlenecks and tuned the Informatica mappings for better performance.

Created test cases for unit testing and used TOAD 9.7 to validate the data with SQL queries.

Scheduled Informatica workflows using Tidal Scheduler.
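
A minimal sketch of the kind of pre-load cleansing shell script mentioned above; paths and rules are hypothetical:

    #!/bin/sh
    # Hypothetical pre-load cleanse: remove carriage returns, trim trailing
    # whitespace, and drop blank lines before the staging mapping reads the file.
    SRC=/data/inbound/case_tracker.dat
    OUT=/data/staging/case_tracker.dat

    tr -d '\r' < "$SRC" | sed -e 's/[[:space:]]*$//' -e '/^$/d' > "$OUT"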

Customer : AMERICAN EXPRESS, Phoenix, AZ

Project : Rewards & Recognition

Role : Developer

Environment : Informatica Power Center 8.6, SQL Server 2008/2005, PL/SQL, Windows XP, Flat Files.

Project Description:

The Global Reward and Recognition (R&R) Platform Project will implement the OC Tanner technology system for administering global reward and recognition programs at American Express. This project is being initiated to use ETL as the intermediate process between OC Tanner and Millennium. The ETL extracts the required fields from the input file sent by OC Tanner, performs validations on the input data, and passes the output file to the Compensation Team in the required Millie format.

The R&R Project will implement the system currently in use in the US, Canada, and UK markets.

Key Deliverables:

Responsible for documentation, version control of the schema, and version releases.

Analyzed specifications and identified source data that needed to be moved to the data warehouse; participated in design team and user requirement gathering meetings.

Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system.

Participated in the analysis of development environment of Extraction process and development architecture of ETL process.

Coordinated with customer in finding the sources and targets for data conversion.

Involved in preparing documentation for ETL standards, procedures, and naming conventions.

Created reusable transformations and mapplets for use in multiple mappings using Informatica PowerCenter Designer.

Created multiple universes and resolved loops by creating table aliases and contexts.

Used session partitions, dynamic cache memory, and index caches to improve Informatica server performance.

Good experience with Informatica Administration 8.x.

Optimized and tuned mappings to achieve faster response times.

Involved in the migration of existing ETL processes to Informatica PowerCenter.

Created effective Test data and developed thorough Unit test cases to ensure successful execution of the data loading processes.

Organized data in reports using filters, sorting, and ranking of data, with alerts.

Created reports using Business Objects functionality like queries, slice and dice, drill down, functions and formulas.

Customer : AMERICAN EXPRESS, Phoenix, AZ

Project : AMEX

Role : Developer

Environment : Informatica Power Center 8.6, SQL Server 2008/2005, Oracle10g, PL/SQL, Windows XP, Flat Files.

Project Description:

AMEX (formerly American Express Financial Advisors) is a diversified global financial services company. The company is best known for its credit card, charge card, and travelers cheque businesses. The Data Management Technology Division sponsored the Data Integration Center of Excellence as an ETL platform for enterprise data movement. Informatica 8.1 is used as the ETL tool.

Key Deliverables:

Understood the complete business flow.

Designed graphs and mappings to load data into the tables using transformations.

Involved in data migration from one module to another.

Reviewed mappings developed by team members for functionality, performance, and standards.

Managed the privileges of users and folders using Repository Manager and used the Informatica Administration Console for admin activities such as starting/stopping the server, upgrading the repository, etc.

Checked data quality and interacted with the business analysts.

Good experience in communicating with the offshore team.

Communicated with the client on weekly status reports.

Performed testing as per unit test cases (UTCs).

Tested and debugged code.

Customer : BLUE SHIELD OF CALIFORNIA, Phoenix, AZ

Project : BSC_INTERFACE

Role : Developer

Environment : Informatica PowerCenter 8.6, SQL Server 2008/2005, Oracle 10g, PL/SQL, UNIX, Facets

Project Description:

The provider system of record used by Blue Shield of California for storing Provider Identification Numbers (PINs) is the California Automated Provider System, or CAPS. The PIN certification or assignment process and system are maintained by the Provider Services department within the Network Management division. The purpose of CAPS is to establish and maintain a central source of provider information. The scope of the document is restricted to ETL Informatica and to the requirements addressed in the file-based interaction specification (SPC) document.

Key Deliverables:

Used a conversion process for VSAM to ASCII source files using Informatica PowerExchange.

Implemented a new project by changing the current DW, wrote the detailed design document, and turned it over to the developers.

Helped the team analyze the data to identify data quality issues.

Worked with Business Analysts and business users to clarify requirements and translate them into technical specifications.

As a lead member of the ETL team, responsible for analyzing, designing, and developing ETL strategies and processes, writing ETL specifications for developers, ETL and Informatica development, administration, and mentoring.

Worked closely with various levels of individuals to coordinate and prioritize multiple projects.

Estimated, scheduled, and tracked ETL projects throughout the SDLC.

Coordinated with users to gather new requirements and worked on existing issues.

Involved in business analysis and technical design sessions with business and technical staff to develop the requirements document and ETL specifications.

Developed standard and reusable mappings and mapplets using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Lookup, and Router.

Customer : HEALTH NET INC., Los Angeles, CA

Project : BSC_INTERFACE

Role : Developer

Environment : Informatica PowerCenter 8.6, SQL Server 2008/2005, Oracle 10g, PL/SQL, UNIX, Facets

Project Description:

HealthNet (ClaimSphere), Cognizant’s end-to-end managed care reporting solution, collates data from various information sources of a Payer/PBM and generates reports to meet both operational and analytical requirements. Thus, HealthNet (ClaimSphere) integrates data across LOBs and across applications to offer various analytics and drill-down reports in diverse areas such as Claims, Membership, Provider Management, Billing, Commissions, and Utilization Management.

Key Deliverables:

Created new mappings and updated old mappings according to changes in business logic.

Used ETL (Informatica) to load data from sources like Oracle database and Flat Files.

Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer - to define source and target definitions and coded the process of data flow from the source system to the data warehouse.

Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.

Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, Source Qualifier

Created various mapplets using the Informatica PowerCenter Mapplet Designer.

Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.

Involved in creating Desktop Intelligence and Web Intelligence reports.


