
Data Manager

Location:
Toronto, ON, M6K 3G2, Canada
Salary:
$50/HR
Posted:
March 27, 2012


Resume:

Yaswanth Potla

Mobile: 647-***-****

****.*******@*****.***

Summary

• Around six years of overall IT industry experience with data warehousing tools, using industry-accepted methodologies and procedures.

• Expert knowledge of Informatica Power Center 9.0/8.x/7.x (Designer, Repository Manager, Repository Server Administration Console, Server Manager, Workflow Manager, Workflow Monitor).

• Extensive experience in using various Informatica Designer Tools like Source Analyzer, Mapping Designer, Transformation Developer, Mapplet Designer, Schedulers and Warehouse Designer.

• Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, Normalizer, Union, and XML Source Qualifier.

• Highly experienced in designing, developing, reviewing, and documenting Informatica work products such as mappings, mapplets, reusable transformations, sessions, workflows, worklets, and schedulers; experienced in using mapping parameters, mapping variables, and session parameter files.

• Extensive experience in supporting Informatica applications and in extracting data from heterogeneous sources using Informatica Power Center.

• Extensive experience in error handling and problem fixing in Informatica.

• Designed complex mappings; expertise in performance tuning.

• Experienced in troubleshooting by tuning mappings and in identifying and resolving performance bottlenecks at the source, target, mapping, and session levels.

• Involved in implementing slowly changing dimensions, star schema modeling, snowflake modeling, fact tables, dimension tables, and denormalization.

• Proficient in using Informatica Workflow Manager, Workflow Monitor, and pmcmd (the Informatica command-line utility) to create, schedule, and control workflows, tasks, and sessions (see the command-line sketch at the end of this list).

• Knowledge of Informatica 7.x/8.x/9.0 installation and setup.

• Experience with Oracle 9i/10g/11g, SQL, Teradata, and DB2.

• Extensive experience in UNIX (AIX/Solaris/HP-UX 11.x) and Windows operating systems.

• Extensively used SQL and PL/SQL to write stored procedures, functions, and packages.

• Good knowledge of the pushdown optimization feature of Informatica 8.1.

• Excellent overall software development life cycle (SDLC) experience; conceptual and logical thinker; goal-oriented, self-motivated, and able to work independently and as a member of a team.

• Excellent communication, analytical and interpersonal skills.

• Quick learner, adaptive to new and challenging technological environments.
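
A minimal command-line sketch of the pmcmd usage referenced above; the service, domain, folder, and workflow names are hypothetical placeholders, and exact flags vary slightly across PowerCenter versions:

    #!/bin/ksh
    # Start a workflow and wait for it to complete (hypothetical names).
    pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u etl_user -p "$PM_PASS" \
        -f SALES_MART -wait wf_load_customer_dim

    # Verify that the Integration Service is up before scheduling further runs.
    pmcmd pingservice -sv INT_SVC -d DOMAIN_DEV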

Technical Skills

Data Warehousing Tools: Informatica Power Center 6.x/7.x/8.x, Power Exchange, Repository Server Administration Console.

Databases: Oracle 11g/10g/9i/8i, MS SQL Server 7.0/2000/2005 (T-SQL), Teradata.

Data Modeling: Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables

Languages: C, C++, Java, XML, CSS.

GUI Development Tools: Data Transformation Services (DTS), Oracle Designer

Database Utilities: TOAD 8.0/7.1, WinSQL, SQL*Plus, SQL Developer, Squirrel SQL.

Reporting Tools: MS Access Reports, Oracle Reports 6i.

Operating Systems: UNIX (Solaris, AIX, HP-UX), Linux, Windows 9x/2000/NT/XP/Vista

Other Tools: MS Visio, Legacy (COBOL 3.x/2.x), Control-M, AutoSys, Redwood.

Education

• Bachelor’s in Computer Science

Professional Experience

Aviva Insurance, Toronto, ON Jan ’11 – Present

Senior ETL/Informatica Engineer, MAP (Marketing Analytics Platform)

The Marketing Analytics Platform (MAP) delivers a business intelligence tool that enables sales and marketing to capture and leverage information to support and provide insights into the business. The solution involves extracting customer-level data and merging it with outside lists and demographics, and includes data from Aviva transactional systems, results of sales efforts from call-center and sales systems, and other vendor services. The project creates a common reporting platform with consistently defined measures and common business rules across sales and marketing for all regions, and establishes a data mart for sales and marketing based on standard definitions and business rules in order to report on current marketing program events, build history, and create foundational metrics against which marketing can track and measure results.

Responsibilities:

• Studied the existing system to identify issues in the design phase.

• The solution provides the ability to integrate external list purchases while maintaining the required data quality against Aviva’s information for use in the marketing process.

• Established the Sales & Marketing data mart, including the data model, business rules, extract/transform/load procedures, and schedules.

• Developed common definitions using the business rules for all business regions reporting on member, lead or prospect data.

• The solution provides the ability to receive, load, and integrate data from external third-party vendors (e.g., mail houses, marketing agencies).

• Forecasted pipeline activity, which defines when a lead becomes a member.

• Migrated Lead Management System (LMS) historical data that is critical for campaign management (primarily campaign-related measures based on campaign history needs) to the MAP environment.

• Installed and set up Informatica client tools on client machines.

• Wrote bulk-insert scripts to load Microsoft Excel extract data from source systems.

• Worked on Informatica transformations such as Expression, connected and unconnected Lookup, Source Qualifier, Filter, Aggregator, Sorter, Sequence Generator, Normalizer, Router, Joiner, and Stored Procedure.

• Worked on performance tuning of ETL procedures and processes.

• Created ETL mappings using Informatica Power Center to move data from multiple sources into staging tables, then into the common consolidated data area of the data warehouse, and finally into flat files as requested by the vendor.

• Wrote PL/SQL procedures and functions for Stored Procedure transformations (see the sketch at the end of this list); also created and used tasks such as Decision and Email tasks.

• The solution provides the ability for end users to create new reports without vendor intervention.

• Developed a system where the Vendor receives ad hoc requests to extract data from MAP and deliver it to Aviva or third parties as needed.

• Captured and tracked customer changes to status and key information (name, address, demographics, lifecycle events, product, payment status).

• Developed and supported all internal source extracts and the interface to the vendor who builds and hosts the MAP solution.

• Debugged using session log messages and server messages.

• Involved in development and review of Unit Testing and User Acceptance testing Specifications.
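
A minimal sketch of the kind of PL/SQL used behind a Stored Procedure transformation, wrapped in a SQL*Plus call from the shell; the connection string, table, and procedure names are hypothetical:

    #!/bin/ksh
    # Compile and run a hypothetical staging-cleanup procedure via SQL*Plus.
    sqlplus -s etl_user/"$DB_PASS"@MAPDB <<'EOF'
    CREATE OR REPLACE PROCEDURE purge_stg_customer (p_batch_id IN NUMBER) AS
    BEGIN
      -- Remove staging rows already consolidated into the warehouse.
      DELETE FROM stg_customer WHERE batch_id = p_batch_id;
      COMMIT;
    END;
    /
    EXEC purge_stg_customer(101)
    EXIT
    EOF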

Environment: Informatica Power Center 9.0, Windows XP, UNIX, Oracle 11g, SQL Server 2008, MS Excel (source), flat files (end targets), TOAD (query tool), PL/SQL.

CVS Caremark/Pharmacy, Woonsocket, RI Jul ’09 – Dec ‘10

ETL Informatica Application Developer, Product Information Management (PIM)

PIM is designed to maintain CVS’s supply and distribution of retail product information. Until this project, the company depended entirely on legacy mainframes, flat files, and similar sources to manage product information. PIM was part of the new MDM suite, designed to maintain and enhance existing procedures, supply chain management, product information, ad promotions, and more.

Responsibilities:

• Integrated the Product related information from Mainframes with the EDW (Enterprise Data Warehouse) using Informatica Power Center and Informatica Power Exchange.

• Worked with teams across the organization and with multiple vendors (who maintain and do business with the organization).

• Involved mainly in integration development, testing, implementation, and post-implementation (production support).

• Implemented the pushdown optimization strategy to balance the load between the Informatica Integration Service and the database engine.

• Worked extensively on Normalizer, Expression, Filter, Router, Update Strategy, Lookup, XML, Stored Procedure, Union, Rank, Joiner, and Sequence Generator Transformations.

• Worked with the architecture teams on Informatica and UNIX profiling.

• Worked extensively on Informatica Workflows, worklets, mapplets etc.

• Developed ETL procedures to ensure conformity, compliance with standards, and lack of redundancy, translating business rules and functionality requirements into ETL procedures using Informatica Power Center.

• Responsible for the definition, development, and testing of the processes/programs necessary to extract data from the client's operational databases, transform and cleanse the data, and load it into the MDM suite's main tables.

• Used advanced techniques of Informatica to load data from MS SQL Server, Flat files into the target.

• Designed the Informatica flows across multiple environments like Oracle, Mainframes, Flat files, SQL Server etc.

• Performed UAT testing as requested by the business support team; new patches were implemented after rollout to enhance business usage.

• Performed Data conversion, Quality and Verification activities.

• Performed the tuning of the database stored procedures, Informatica sessions etc.

• Identified and removed performance bottlenecks from time to time.

• Developed UNIX (Korn shell) scripts for FTPing files and for executing SQL scripts, stored procedures, etc. (see the sketch at the end of this list).

• Used SQL*Loader to load data from flat files into database tables.

• Implemented Oracle Wallet for secure credential storage.

• Performed code migration from Informatica 8.1 to Informatica 8.6.

• Scheduled UNIX jobs/batches comprising Informatica workflows, Oracle SQL scripts, and FTP processes using the Control-M job scheduler.
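
A minimal Korn shell sketch of the FTP-and-load pattern described in the items above; the host, dataset, file, and control-file names are hypothetical:

    #!/bin/ksh
    # Pull the day's product extract from the mainframe (hypothetical host/dataset).
    ftp -inv mainframe.host <<EOF
    user mvsuser $FTP_PASS
    cd PROD.EXTRACTS
    get PRODUCT.DAILY product_daily.dat
    bye
    EOF

    # product_daily.ctl maps the flat-file columns to the staging table.
    sqlldr userid=etl_user/$DB_PASS@PIMDB control=product_daily.ctl \
           log=product_daily.log bad=product_daily.bad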

Environment: Informatica Power Center 8.6/8.1, Informatica Power Exchange 8.x, Mainframes, UNIX (Sun Solaris), Oracle 10g, Windows XP Professional, flat files, SQL Server 2005, SQL*Loader, Serena Version Manager, Toad (query tool), Control-M scheduler, Korn shell scripting.

Bank of New York, NY July ’08 – Jun ’09

Informatica developer

The MIRA application is a management information system developed for Bank of New York. It takes input from three different bank systems and analyzes the status of service requests. Based on the status of the service requests, the bank gains an overview of how requests are allocated and how promptly they are served by bank employees.

Responsibilities:

• Analyzed and conceptualized/designed the database that serves the purpose of providing critical business metrics.

• Assisted in the design and Maintenance of the Metadata environment.

• Extended packages by writing components beyond those available within SSIS Designer to provide custom functionality in a package.

• Used SSIS to create, configure, and run packages programmatically from in-house (developer) applications.

• Implemented logical and physical data modeling with star schema techniques using Erwin in the data warehouse as well as in the data mart.

• Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and transformations.

• Worked extensively on Normalizer, Expression, Filter, Router, Update Strategy, Lookup, XML, Stored Procedure, and Sequence Generator transformations.

• Worked extensively with Workflows, worklets, different tasks.

• Developed ETL procedures to ensure conformity, compliance with standards, and lack of redundancy, translating business rules and functionality requirements into ETL procedures using Informatica Power Center.

• Responsible for the definition, development and testing of the processes/programs necessary to extract data from the client's operational databases, Transform and cleanse the data, and Load it into the data marts.

• Performed Data conversion, Quality and Verification activities.

• Created ETL mappings using Informatica Power Mart/Power Center to move information from multiple sources into a common target area such as data marts and the data warehouse.

• Wrote PL/SQL procedures and functions for Stored Procedure transformations.

• Worked on the Performance tuning of the programs, ETL Procedures and processes.

• Debugged using session log messages and server messages.

• Responsible for writing shell scripts for file manipulation (see the sketch below).
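
A minimal sketch of the file-manipulation scripting mentioned in the last item; the file names and layout are hypothetical:

    #!/bin/ksh
    # Drop the header row and trim trailing whitespace before the ETL load,
    # then archive the raw file with a date stamp (hypothetical names).
    IN=mira_requests.dat
    OUT=mira_requests.clean

    sed -e '1d' -e 's/[[:space:]]*$//' "$IN" > "$OUT"
    mv "$IN" archive/"$IN".$(date +%Y%m%d)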

Environment: Informatica Power Center 8.1, AIX UNIX 4.1, Windows 2000, Oracle 10g, SQL Server 2005, SSIS, T-SQL, Toad, Erwin 4.1, XML, UNIX shell scripting.

Hartford Life Insurance, CT Oct ’07 – Jun ’08

ETL Informatica Developer

The project includes designing, developing, and maintaining a data mart, named Claim_Tran, for the Claims department. It covers different kinds of claims, including life, property, commercial, and auto claims, and involves extracting data from different sources and loading it into the data mart.

Responsibilities:

• Business Analysis and Requirements Gathering.

• Design, development and maintenance of Insurance policies and claims transactions Data Mart using Informatica Power Center 7.1.

• Installed and set up Informatica client tools on client machines.

• Created and managed folders and users for ETL development using the Repository Manager module of Informatica Power Center 7.1.

• Used advanced techniques of Informatica to load data from MS SQL Server, Flat files, Focus files into the target.

• Extensively worked with transformations such as Lookup, Update Strategy, Expression, Filter, Stored Procedure, Router, and others.

• Performed Unit Testing and tuned the mappings for better performance.

• Documented the complete mappings and migrated them from Informatica 7.1 to Informatica 8.1.

Environment: Informatica Power Center 7.1, Mainframes (COBOL), Oracle 9i, Teradata, flat files, XML, PL/SQL, SQL, Windows NT, UNIX (HP-UX 11.x), UNIX shell scripting.

Ohio Department of Education, Columbus, OH Jun ’07 – Sep ‘07

ETL Informatica Developer, Decision Framework (DF)

The Decision Framework (DF) is a decision-making process designed to assist districts in making informed decisions, based on what their data tell them, about where to spend their time, energy, and resources to make significant and substantial improvements in student performance. Test results are captured at the subscale level for each subject.

Responsibilities:

• Studied the existing system to identify issues in the development phase.

• Involved mainly in QA and Production Support.

• Worked extensively on Source Qualifier, Sorter, Aggregator, Normalizer, Expression, Filter, Router, Update strategy, Lookup, XML, Stored Procedure, Sequence Generator, and Union Transformations.

• Moved ETL mappings from development to QA and from QA to production (see the pmrep sketch at the end of this list).

• Enhanced the existing system while working in QA and Production support.

• Designed complex ETL mappings and configured the sessions and workflows to populate the Decision Framework tables in the data warehouse.

• Interacted with the source data vendors while assisting in production support.

• Used the Debugger to remove bugs from the system.

• Involved in performance tuning to improve the performance of the mappings.

• Worked with the star schema model to design dimension and fact tables.
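
A minimal sketch of one way to promote a mapping between repositories with pmrep, as referenced in the migration item above; the repository, folder, object, and control-file names are hypothetical, and exact flags vary by PowerCenter version:

    #!/bin/ksh
    # Export a mapping from the DEV repository to XML, then import it into QA.
    pmrep connect -r REP_DEV -d DOMAIN_DEV -n etl_admin -x "$REP_PASS"
    pmrep objectexport -n m_load_subscale_scores -o mapping -f DF_DEV \
          -u m_load_subscale_scores.xml

    pmrep connect -r REP_QA -d DOMAIN_QA -n etl_admin -x "$REP_PASS"
    pmrep objectimport -i m_load_subscale_scores.xml -c import_control.xml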

Environment: Informatica Power Center 7.1, UNIX (HP-UX), Oracle 10g, Windows XP Professional, flat files, SQL Server 2005, SQL Developer (query tool).

Toyota Motor Sales (TMS), Torrance, CA Jun ’06 – May ‘07

Informatica Developer, CMS Link (Customer Marketing & Sales Link)

Toyota Motor Sales (TMS) has grown over the years, and with it the complexity of its customer data environment. Today, several sources across TMS feed roughly 2 terabytes of customer information, controlled by multiple sets of business rules and stored in multiple databases within the customer environment. As a result, TMS has experienced issues with data quality, sourcing, and maintenance costs. TMS needs timely, precise, and comprehensive customer information in order to improve its ability to launch and maintain a complete and successful CRM program. The CDB/CDW systems were developed with data consistency, timeliness, accuracy, and completeness to serve as the single, authoritative source for customer data within TMS.

Responsibilities:

• Studied the existing system to identify issues and problems facing business users.

• Worked on Informatica client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.

• Identified all the dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.

• Analyzed and Created Facts and Dimension Tables in the Star-Schema.

• Extracted data from flat files and the Oracle database, and applied business logic to load it into the Customer Data Warehouse (CDW).

• Developed the Informatica Mappings, Mapplets, Sessions and Workflows to populate the Customer Data Warehouse (CDW) tables.

• Created mappings using fact and dimension tables from the data model.

• Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.

• Uploaded flat files to and downloaded them from the UNIX server.

• Tuned the performance of Informatica sessions for large data files by implementing pipeline partitioning and increasing block size, data cache size, sequence buffer length, and the target-based commit interval, and resolved bottlenecks.

• Installed Informatica client tools on client machines, created the repository, and assigned permissions for user groups.

• Converted the old PL/SQL Procedures to Informatica mappings.

• Scheduled the jobs with Redwood job scheduler.

• Tuned SQL queries to resolve performance issues (see the EXPLAIN PLAN sketch at the end of this list).

• Scheduled jobs to fit within the available time window.

• Assisted the Reporting team in design and development of Reporting System using Cognos.
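
A minimal sketch of the SQL-tuning step referenced above, using Oracle’s EXPLAIN PLAN from SQL*Plus; the query, table, and column names are hypothetical:

    #!/bin/ksh
    # Inspect the optimizer's plan for a slow CDW lookup (hypothetical query).
    sqlplus -s etl_user/"$DB_PASS"@CDW <<'EOF'
    EXPLAIN PLAN FOR
      SELECT customer_id, last_name
      FROM   cdw_customer
      WHERE  postal_code = '90501';

    SELECT * FROM TABLE(dbms_xplan.display);
    EXIT
    EOF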

Environment: Informatica Power Center 7.1, Oracle 9i, UNIX, SQL Server 2000, flat files (sources), XML (source), TOAD (query tool), Erwin 4.1 (data model), SAS, PL/SQL, Redwood.


