
Manager Data

Location:
Kansas City, MO
Posted:
January 24, 2013



Ranjith Reddy

Phone: 816-***-**** Email: *******.****@*****.***

SUMMARY

- About 7+ years of overall experience in the Information Technology industry.
- Experience in data warehousing and business intelligence using Informatica, SQL Server SSIS and Business Objects.
- 4+ years of data modeling experience.
- Extensive knowledge of dimensional data modeling, star schema/snowflake schema, fact and dimension tables, and process mapping using the top-down and bottom-up approaches.
- Extensive experience with the ETL tool Informatica: designing Workflows, Worklets and Mappings, configuring the Informatica Server, and scheduling Workflows and sessions using Informatica Power Center 8.1.1/7.1/6.2/5.1.
- Vast experience in designing and developing complex mappings with varied transformation logic such as Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, etc.
- Experience in creating Reusable Tasks (Session, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
- Well experienced in error handling and troubleshooting using various log files.
- Extensively worked on developing and debugging Informatica mappings, mapplets, sessions and workflows; worked on performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings and sessions.
- Extensive work with PL/SQL and performance tuning of Oracle using TKPROF, SQL trace, SQL plans, SQL hints, Oracle partitioning, and various index and join types.
- Understanding and working knowledge of Informatica CDC (Change Data Capture); implementation experience of CDC using stored procedures, triggers and Informatica PowerExchange.
- Experienced with Informatica Data Explorer (IDE) / Informatica Data Quality (IDQ) tools for data analysis, data profiling and data governance.
- Experience in working with Mainframe files, COBOL files, XML, and Flat Files.
- Proficient in Unix/Linux environments, writing UNIX shell scripts, and working with Java and Sybase.
- Experienced in working with tools such as TOAD, SQL Server Management Studio and SQL*Plus for development and customization.
- Experience with Teradata as the target for data marts; worked with BTEQ, FastLoad and MultiLoad.
- Proficient in using pmcmd and pmrep commands.
- Strong skills in data analysis, data requirement analysis and data mapping for ETL processes.
- Ability to prioritize and execute tasks in a high-pressure environment.
- Experience in mentoring and providing knowledge transfer to team members, support teams and customers.

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 8.x/7.x/6.x, SSIS

BI Tools: Cognos 8 BI Suite, SSRS, Business Objects XI R2/6.5/5.x

Databases: MS SQL Server 2005/2008, Oracle 8i/9i/10g, Teradata, IBM DB2, Sybase

Languages/Scripting: SQL, T-SQL, PL/SQL, UNIX shell scripting, Java, HTML, XML, CSS, JavaScript, C, C++, VB 6.0

Web Servers: IIS v5.0/6.0, Apache Tomcat

OS: Windows 2000/NT, UNIX/Solaris, Red Hat Linux, AIX

Version Control: Visual Source Safe 6.0, CVS

Tools: Erwin 4.1, Toad, Rational Rose, MS Project, Test Director, MS Visio, Autosys

Educational Qualification:

Bachelor's degree in Computer Science, Kakatiya University, Warangal.

PROFESSIONAL EXPERIENCE

Warner Brothers, Los Angeles, CA Oct 2009 - Present

Sr. ETL Developer

This project implemented an Enterprise Data Warehouse (EDW) by integrating data from different feeder systems located across various sites into a central repository of business information. The EDW provides quick access to data, enabling a more informed decision-making process.

Responsibilities:

- Involved in all phases of the SDLC, from requirements, design, development, testing, training and rollout to field users, through production support.
- Worked with the Business Analysts and the QA team on validation and verification of the development.
- Involved in designing relational models for the ODS and data mart.
- Performed data quality analysis to determine cleansing requirements.
- Responsible for developing, supporting and maintaining the ETL (Extract, Transform and Load) processes using Informatica Power Center.
- Extracted data from flat files, Oracle, DB2 and MS SQL Server 2005/2008 and loaded the data into the target database.
- Extensively used Informatica to load data from fixed-width and delimited flat files.
- Worked with Repository Manager, Designer, Workflow Manager and Workflow Monitor; imported and created Source Definitions using Source Analyzer and Target Definitions using Warehouse Designer.
- Developed complex mappings using the corresponding sources, targets and transformations such as Update Strategy, Lookup, Stored Procedure, SQL, Sequence Generator, Joiner, Aggregator, Java and Expression transformations to extract data in compliance with the business logic; developed Informatica components in Java and Sybase environments.
- Extensively used Informatica Power Center and created mappings using transformations to flag records with Update Strategy for populating the desired slowly changing dimension tables.
- Implementation experience of CDC (Change Data Capture) using stored procedures, triggers and Informatica PowerExchange.
- Queried and analyzed multiple databases and handled errors per client specifications; created server-optimized database routing and mappings and focused on performance tuning.
- Handled large database queries and applied transformations to make the business solution applicable to the project; involved in performance tuning.
- Created transformations and mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
- Used shortcuts to reuse objects without creating multiple objects in the repository and to inherit changes made to the source automatically.
- Wrote stored procedures in PL/SQL and UNIX shell scripts for automated execution of jobs.
- Identified errors by analyzing the session logs.
- Worked with Mainframe files, COBOL files, XML, and Flat Files.
- Responsible for monitoring all sessions that were running, scheduled, completed or failed; debugged the mappings of failed sessions to check the progress of data loads.
- Involved in unit and system testing of ETL code (mappings and workflows).
- Scheduled workflows using an Autosys job plan.
- Implemented best practices suggested by Informatica to simplify the deployment process.
- Created test scripts and executed them in Test Director.
- Worked with pmcmd to interact with the Informatica Server from command mode and execute the shell scripts (see the sketch after this list).
- Wrote documentation describing program development, logic, coding, testing, changes and corrections.
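
A minimal sketch of the kind of shell wrapper used to launch a workflow through pmcmd and check its return code; the service, domain, user, folder and workflow names below are placeholders rather than actual project values, and the password is assumed to come from the environment.

#!/bin/ksh
# Hypothetical wrapper: start an Informatica workflow via pmcmd and fail fast on error.
# All names below are placeholders; INFA_PWD is expected to be set in the environment.

INFA_SVC="IS_EDW"               # integration service (placeholder)
INFA_DOMAIN="Domain_EDW"        # Informatica domain (placeholder)
INFA_USER="etl_user"            # repository user (placeholder)
FOLDER="EDW_LOAD"               # repository folder (placeholder)
WORKFLOW="wf_load_daily_feed"   # workflow name (placeholder)

# -wait blocks until the workflow finishes, so the pmcmd return code reflects
# the workflow result and can be checked by the calling scheduler.
pmcmd startworkflow -sv "$INFA_SVC" -d "$INFA_DOMAIN" \
    -u "$INFA_USER" -p "$INFA_PWD" \
    -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

if [ $rc -ne 0 ]; then
    echo "`date` ERROR: workflow $WORKFLOW failed (pmcmd rc=$rc)" >&2
    exit $rc
fi
echo "`date` INFO: workflow $WORKFLOW completed successfully"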

Environment: Informatica Power Center 8.6 (Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server Administration Console), Erwin, SQL, PL/SQL, Sybase, MS SQL Server 2005/2008, Java, Oracle 11g, UDB, DB2, Flat Files, Solaris 10

State Street Corporation, Los Angeles, CA Jan 2009 – Sep 2009

Sr. Informatica Developer

State Street Corporation is the world's leading provider of financial services to institutional investors. It provides core investment custody, fund or investment/securities accounting, fund administration, securities finance and transfer agent services to institutional clients. Its broad and integrated range of services spans the entire investment spectrum, including research, investment management, trading services and investment servicing. State Street also provides "middle office" services such as trading operations, and a California affiliate provides reconciliation services to investment banks with the help of Syntel. State Street Syntel Sourcing Private Limited (SSSPL) is a joint venture entity that executes reconciliation for investment banks worldwide.

Responsibilities:

- Involved in requirement gathering and analysis for the data marts, focusing on data analysis, data quality, and data mapping between the ODS, staging tables and data warehouses/data marts.
- Designed and developed processes to support data quality issues and the detection and resolution of error conditions.
- Worked with the Business Analysts and the QA team on validation and verification of the development.
- Extracted data from flat files, Oracle, DB2 and SQL Server 2008 and loaded the data into the target database.
- Analyzed session logs, bad files and error tables to troubleshoot mappings and sessions.
- Implemented various scenarios related to slowly growing targets and slowly changing dimensions (Type 1, Type 2, Type 3).
- Implemented various business rules for data transformations using Informatica transformations such as Normalizer, Source Qualifier, Update Strategy, Lookup (connected/unconnected, static/dynamic cached), Sequence Generator, Expression, Aggregator, XML (source and generator) and Stored Procedure.
- Worked on CDC (Change Data Capture) to implement SCDs (Slowly Changing Dimensions).
- Worked with newer Informatica transformations such as the Java transformation and Transaction Control.
- Used Teradata as a source and a target for several mappings; worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions, and used BTEQ, FastLoad and MultiLoad with Teradata as the target for the data marts.
- Provided administrative functions such as creating and backing up repositories, setting up users, assigning permissions and setting up folders in Repository Manager.
- Wrote shell script utilities to detect error conditions in production loads and take corrective actions (see the sketch after this list); wrote scripts to back up/restore repositories and log files.
- Heavily involved in performance tuning of the Oracle database: using the TKPROF utility, working with partitioned tables, implementing a layer of materialized views to speed up lookup queries, using bitmap indexes for dimension tables, updating statistics with the DBMS_STATS package, and using SQL hints.
- Scheduled workflows using an Autosys job plan.
- Performed QA of ETL processes; migrated Informatica objects from development to QA and production using deployment groups.
- Provided production support and was involved in root cause analysis, bug fixing and promptly updating business users on day-to-day production issues.
- Coordinated with offshore teams and mentored junior developers.
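
A simplified sketch of the sort of error-detection utility mentioned above: it scans a session log and the corresponding reject (.bad) file and alerts support when either shows a problem. The directory paths, session name and mail recipient are illustrative placeholders, not the production values.

#!/bin/ksh
# Hypothetical post-load check: flag a session as failed if its log contains
# errors or its reject file is non-empty. All paths/names are placeholders.

SESSION="s_m_load_positions"             # placeholder session name
LOG_DIR="/infa/infa_shared/SessLogs"     # placeholder session log directory
BAD_DIR="/infa/infa_shared/BadFiles"     # placeholder reject file directory
ALERT_TO="etl-support@example.com"       # placeholder distribution list

log_file="$LOG_DIR/$SESSION.log"
bad_file="$BAD_DIR/$SESSION.bad"

errors=`grep -c "ERROR" "$log_file" 2>/dev/null`
rejects=0
[ -s "$bad_file" ] && rejects=`wc -l < "$bad_file"`

if [ "${errors:-0}" -gt 0 -o "$rejects" -gt 0 ]; then
    # Notify support and return a non-zero code so the scheduler (e.g. Autosys)
    # marks the job as failed and corrective action can be taken.
    echo "$SESSION: $errors error line(s), $rejects rejected row(s)" \
        | mailx -s "ETL load check failed: $SESSION" "$ALERT_TO"
    exit 1
fi
exit 0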

Environment: Informatica Power Center 8.6, Oracle 10g/9i, Autosys, Erwin 4.5, CMS, MS PowerPoint, MS Visio, TOAD 9.0, PL/SQL, UNIX, SQL*Loader, MS SQL Server 2005/2008.

Travelers Insurance, Hartford, CT Nov 2007 – Dec 2008

ETL Informatica developer/Analyst

Travelers offers a wide variety of property and casualty insurance and surety products and services to businesses, organizations and individuals in the United States and in selected international markets. It provides commercial and personal property and casualty insurance products and services to businesses, government units, associations, and individuals.

Responsibilities:

- Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system.
- Assisted with data modeling using Erwin 4.x.
- Extensively used all the features of Informatica versions 6.x and 7.x, including Designer, Workflow Manager, Repository Manager and Workflow Monitor.
- Created workflows and tasks in Workflow Manager and linked the databases through server setup and various other connections; created users in Informatica and assigned them various permissions.
- Worked with mappings using Expression, Aggregator, Filter, Lookup, Update Strategy and Stored Procedure transformations.
- Created flexible mappings/sessions using parameters and variables, heavily using parameter files (see the sketch after this list).
- Involved in monitoring Informatica jobs/processes using Workflow Monitor.
- Developed and modified UNIX shell scripts to reset and run Informatica workflows using pmcmd in the Unix environment; conversant with the Informatica API calls.
- Partitioned sources to improve session performance.
- Backed up and restored repositories; upgraded repositories when moving to new versions of Informatica.
- Completed various data load simulations to stress-test the mappings.
- Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.
- Improved session run times by partitioning the sessions; also heavily involved in database fine tuning (creating indexes, stored procedures, etc.) and partitioning Oracle databases.
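
As an illustration of the parameter-file technique referenced above, a hedged sketch that generates a run-date parameter file and passes it to pmcmd; the folder, workflow and parameter names are placeholders rather than the project's actual objects, and the 8.x-style pmcmd options are shown.

#!/bin/ksh
# Hypothetical example: build a parameter file for today's run and launch the
# workflow with it. Folder, workflow and parameter names are placeholders.

RUN_DATE=`date +%Y-%m-%d`
PARAM_FILE="/infa/params/wf_policy_load_$RUN_DATE.prm"

# PowerCenter parameter file: a [Folder.WF:workflow] section followed by
# name=value pairs (the $ signs are escaped so the shell leaves them alone).
cat > "$PARAM_FILE" <<EOF
[TRV_DW.WF:wf_policy_load]
\$\$LOAD_DATE=$RUN_DATE
\$\$SRC_FILE_DIR=/data/incoming/policies
EOF

# 8.x-style pmcmd invocation; INFA_PWD is assumed to be set in the environment.
pmcmd startworkflow -sv IS_TRV -d Domain_TRV \
    -u etl_user -p "$INFA_PWD" \
    -f TRV_DW -paramfile "$PARAM_FILE" -wait wf_policy_load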

Environment: Informatica Power Center 7.1.x/8.1.1 SP4, Oracle 10g, MS SQL Server 2005, PL/SQL, SQL*Plus, SQL*Loader, Unix, Linux, Windows XP.

National Grid Corporation, Melville, NY Jan 2007 – Sep 2007

ETL Developer

National Grid is a provider of gas to parts of the Northeast US, including Long Island (NY), Boston, Rhode Island and Connecticut. The project involved adding Upstate New York locations to the gas lines already in operation, with an interface to and migration from the legacy Upstate NY transmission system (known as TSA) into National Grid's new transmission system (EBB).

Responsibilities:

- Analyzed the system architecture and business flow in order to define the requirements.
- Involved in requirement definition and analysis in support of the data warehousing effort.
- Designed and developed business rules to generate consolidated (fact/summary) data identified by dimensions using the Informatica ETL (Power Center) tool.
- Updated and added mappings in EBB to incorporate TSA data; flat files with the required information were sent back to TSA.
- Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
- Created and scheduled sessions and batch processes, based on demand, run on time, or run only once, using Informatica Server Manager (see the sketch after this list).
- Involved in performance tuning of mappings and sessions.
- Created Java scripts for automated scheduling of Informatica jobs.
- Developed training materials for deployment of the data marts and reports.
- Scheduled and monitored transformation processes using Informatica Server Manager.
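
A hedged sketch of how a run-once load can be guarded in a shell wrapper; the actual scheduling here was done through Informatica Server Manager, and the paths, workflow name and cron line below are illustrative only.

#!/bin/sh
# Hypothetical run-once guard for a scheduled load: if the marker file exists,
# the load has already run and the script exits quietly. All names are placeholders.

LOCK="/home/etl/locks/tsa_ebb_load.done"

# Illustrative cron entry for the timed variant of the job:
#   30 2 * * * /home/etl/bin/tsa_ebb_load.sh >> /home/etl/logs/tsa_ebb_load.log 2>&1

if [ -f "$LOCK" ]; then
    echo "`date` TSA-to-EBB load already executed; skipping" >&2
    exit 0
fi

# run_infa_workflow.sh is a hypothetical wrapper that invokes pmcmd as in the
# earlier sketch. The marker is written only on success, so a failed run can
# simply be re-triggered on demand.
/home/etl/bin/run_infa_workflow.sh wf_tsa_to_ebb
if [ $? -eq 0 ]; then
    touch "$LOCK"
    exit 0
fi
exit 1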

Environment: Informatica Power Center 7.1.4, Oracle 9i, PL/SQL, SQL*Plus, SQL*Loader, UNIX.

Well Point HealthCare, Mason, OH July 2006 – Dec 2006

ETL Designer / Sr. Informatica & Teradata Developer

Responsibilities:

- Involved in full life cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
- Prepared the required application design documents based on the functionality required.
- Installed and configured Informatica 7.1; migrated repositories and mappings from Informatica 6.2 to 7.1.
- Created and maintained users and user profiles.
- Designed the ETL processes using Informatica to load data from SAP R/3, Oracle, flat files (fixed width) and Excel files into the staging database, and from staging into the target Teradata warehouse database.
- Wrote the algorithm for the ETL (Extract, Transform and Load) team for data validation.
- Provided the BTEQ scripts to validate the Edge model and to generate the serial key.
- Created data marts and performed system testing of the application.
- Automated test scenarios using BTEQ scripts to validate source system data against the data mart views generated by Edge (see the sketch after this list).
- Wrote queries, procedures and functions used in different application modules.
- Optimized Teradata SQL and T-SQL queries for better performance.
- Extracted and loaded data using different Teradata tools such as MultiLoad, FastExport, FastLoad, OLE Load and BTEQ.
- Implemented best practices for the creation of mappings, sessions and workflows and for performance optimization.
- Created mappings using transformations such as Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy and Sequence Generator.
- Designed and developed the logic for handling slowly changing dimension table loads by flagging records with Update Strategy to populate the desired dimension tables.
- Involved in cleansing and extraction of data and defined the quality process for the warehouse.
- Involved in performance tuning and optimization of Informatica mappings and sessions, using features such as partitions and data/index caches to manage very large volumes of data.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions and validations based on design specifications for unit and system testing, including expected results, preparing and loading test data, error handling and analysis.
- Involved in migration of mappings and sessions from the development repository to the production repository.
- Involved in unit testing and user acceptance testing to check whether the data extracted from the different source systems was loaded into the target according to user requirements.
- Involved in production support, working on mitigation tickets created while users worked to retrieve data from the database.
- Created stored procedures to transform the data and worked extensively in PL/SQL for the various transformation needs while loading the data.
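
A hedged sketch of the BTEQ-style validation described above: it compares a row count in a staging table against the corresponding data mart view and exits non-zero on a mismatch. The TDPID, credentials, table and view names are placeholders, not the project's Edge objects; BTEQ's ACTIVITYCOUNT is the number of rows returned by the preceding SELECT, so any returned row triggers the .QUIT 8.

#!/bin/ksh
# Hypothetical BTEQ validation: compare a staging row count with the mart view.
# Logon values and object names are placeholders; TD_PWD comes from the environment.

bteq <<EOF
.LOGON tdprod/etl_user,$TD_PWD;

SELECT 'COUNT MISMATCH'
FROM   (SELECT COUNT(*) AS src_cnt FROM stg.claims_stage) s,
       (SELECT COUNT(*) AS tgt_cnt FROM edge_mart.v_claims) t
WHERE  s.src_cnt <> t.tgt_cnt;

.IF ACTIVITYCOUNT > 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

rc=$?
if [ $rc -ne 0 ]; then
    echo "Validation failed: staging vs. mart row counts differ (bteq rc=$rc)" >&2
    exit $rc
fi
echo "Validation passed: row counts match"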

Nissan North America, Gardena, CA Apr 2005 – Jun 2006

Data warehouse / ETL, Oracle, UNIX Developer

Nissan North America Inc. (NNA) was created to coordinate all of Nissan's various activities in North America and to enhance the design, development, manufacturing and marketing of Nissan vehicles; this project provided solutions for a regional data warehouse.

Responsibilities:

- Gathered requirements and analyzed the functional business processes and requirements provided by the Business Analyst.
- Involved in designing high-level technical documentation based on specifications provided by the Manager.
- Created mappings and workflows for Nissan Extended Services North America (NESNA) and Nissan Acceptance Holding Company (NAHC).
- Performed extraction, loading and unit testing using both SAP and non-SAP sources.
- Worked on different parallelism concepts in AbInitio.
- Handled ETL administration responsibilities for enterprise data using the Informatica tool.
- Wrote shell scripts to run the ETL jobs.
- Implemented PowerExchange Change Data Capture for data updates.
- Produced technical documentation for all mappings, including the business logic, for presentation.

Environment: Informatica Power Center 7.1, IBM DB2, AbInitio, Mainframes, UNIX

HSBC Bank, India Aug 2003 – Mar 2005

Database Developer

This system enables traders and senior managers to mitigate the credit/market risk associated with non-standard trades and to maintain regulatory compliance, by monitoring and managing activities and displaying transaction audit trails, thereby avoiding risks. The secondary goal is to reduce the work effort associated with constant changes in regulatory rules related to equity trading.

Responsibilities:

- Actively involved in gathering requirements from business users, converting them into system requirement specifications, and creating detailed use cases and design documents.
- Designed, developed and managed the workflow processes to reflect business requirements, with several adapters, exceptions and rules.
- Involved in data modeling; designed data flows using UML.
- Designed and developed User Group Management modules to implement complex business rules for permissions.
- Coordinated setting up the development, test, production and contingency environments.
- Designed, developed and managed the database star schema, with various hierarchical and lookup tables.
- Developed and maintained complex stored procedures.
- Involved in setting up the application server clustered environment.
- Underwent training in the Standard Software Process for implementing CMM Level 5 in an enterprise organization.

Environment: Oracle 8i, Shell Scripts, UML, Test Director, SunOS


