Summary of Qualifications
●Around * years of IT experience with expertise in Data Warehouse processes, involved in the analysis, design, and development of various business applications on different platforms using Informatica PowerCenter 9.5.1/8.6, PowerExchange 9.5/9.1, Oracle 11g/10g/9i, Teradata 12/13/14, Sybase, and DB2.
●Strong experience in developing Sessions/Tasks, Worklets, and Workflows using Workflow Manager tools: Task Developer, Workflow Designer, and Worklet Designer.
●Experience in using Informatica command-line utilities such as pmcmd to execute workflows in non-Windows environments.
●Expertise in optimizing session performance by eliminating bottlenecks at the target, source, mapping, transformation, session, and system levels, and through grid deployment.
●Expertise in tuning the throughput of sources and targets, including not only relational sources and targets but also flat files and XML files.
●Worked with Teradata 12 utilities such as BTEQ, FastLoad, MultiLoad, and Queryman; experienced in implementing business rules in BTEQ scripts.
●Extensive experience in fine-tuning sessions using pushdown optimization, increased DTM buffer memory, and error tracing.
●Involved in generating MLoad and TPump scripts to load data into Teradata tables.
●Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in the Teradata database.
●Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.
●Expertise in database design, entity-relationship modeling, dimensional data modeling, star schema and snowflake schema modeling with Ralph Kimball methodologies, and data normalization. Experience working with Rapid SQL, SQL Developer, and TOAD.
●Experienced in integrating various data sources/targets such as Oracle, Teradata, SQL Server, B2B, DB2, fixed-width and delimited flat files, and XML files.
●Experience on both UNIX and Windows platforms. Extensive experience in creating UNIX Shell Scripts.
●Experience in scheduling Informatica jobs using external job schedulers such as IBM Tivoli Workload Scheduler and Autosys.
●Experience in performance tuning of Informatica Sources, Targets, Mappings and Sessions.
●Highly proficient in processing tasks, scheduling sessions, importing/exporting repositories, and managing users, groups, associated privileges, and folders.
●Experienced in working with business users to analyze business process requirements, translate them into design documents, and roll out deliverables.
●Good understanding of Project Life Cycle and gathering requirements for ETL Development.
●Strong experience in coding SQL and PL/SQL procedures, functions, triggers, and packages. Extensive experience working with mainframe datasets.
●Experienced with coordinating cross-functional teams, project management and presenting technical ideas to diverse groups.
●Good Analytical, Strong Interpersonal and Excellent communication skills.
●Self-motivated, able to set effective priorities to achieve immediate and long-term goals and meet operational deadlines.
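The BTEQ-based business rules mentioned above can be sketched as follows; this is a minimal, hedged example with hypothetical table, column, and logon values, not code from any of the engagements below.

```sql
-- Hedged BTEQ sketch (hypothetical names): reject claims whose paid
-- amount exceeds the billed amount, then abort the batch with a nonzero
-- return code if any violations were found.
.LOGON tdpid/etl_user,etl_pass;

INSERT INTO stg.claim_errors (claim_id, err_desc)
SELECT claim_id, 'PAID_AMT exceeds BILLED_AMT'
FROM   stg.claims
WHERE  paid_amt > billed_amt;

/* ACTIVITYCOUNT holds the row count of the previous statement */
.IF ACTIVITYCOUNT > 0 THEN .QUIT 8;

.LOGOFF;
```

The `.QUIT 8` return code is what a wrapping Korn shell script or scheduler would check to fail the batch.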
TECHNICAL SKILLS:
ETL TOOLS
PowerCenter 9.5.1/8.6, PowerExchange 9.5/9.1, XML/XSD files, Teradata SQL Assistant, Teradata Manager
DATA MODELLING
Star-Schema Modeling, Snowflake Modeling, MS Visio
DATABASE EXPERIENCE
DB2, Sybase, Oracle 11g/10g/9i, Teradata 12.0/13.0/14.0, Microsoft SQL Server 2012/2008, MS Access
LANGUAGES
C, C++, PL/SQL, UNIX shell scripting
REPORTING TOOLS
Rapid SQL, TOAD, SQL Developer, SQL Server Management Studio, MS Visio
INDUSTRY
Healthcare
Finance
Retail
PROFESSIONAL EXPERIENCE – SUMMARY
CVS, Buffalo Grove, IL. Duration: Mar 2015 to Present
ETL Developer
Description: The Data Warehouse Acceleration project is a multi-release project intended to expedite loading data into EDW2 in ELDM format.
Responsibilities:-
Handled multiple projects under this client.
●Designed and documented the functional specifications and prepared the technical design.
●Implemented the Slowly Changing Dimension (SCD Type II) design for the Data Warehouse.
●Developed several complex mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer, using both Informatica PowerCenter and IDQ.
●Created scripts using FastLoad, Multi-Load to load data into the Teradata data warehouse.
●Wrote SQL scripts, macros, and stored procedures in Teradata to implement business rules.
●Updated numerous BTEQ/SQL scripts, made the appropriate DDL changes, and completed unit and system testing.
●Developed workflows with Worklets, Event Waits, Assignments, Conditional Flows, Email, and Command tasks using Workflow Manager.
●Migrated code between environments and maintained code backups.
●Tuned performance by identifying bottlenecks at sources, targets, PowerCenter transformations, and sessions, using techniques such as explain plans, Oracle hints, and mapping redesign. Collected session performance data and tuned further by adjusting Informatica session parameters.
●Created checklists for coding, testing, and release to ensure a smooth, error-free project flow.
●Created release documents for better readability of code/reports by end users.
●Handled User Acceptance Testing and System Integration Testing in addition to unit testing, using Quality Center as the bug-logging tool. Created and documented the Unit Test Plan (UTP) for the code.
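The SCD Type II design above can be sketched in two statements; this is a hedged, simplified example with hypothetical table and column names, not the project's actual DDL.

```sql
-- Hedged SCD Type II sketch (hypothetical names): expire the current
-- dimension row when a tracked attribute changes, then insert the new
-- version with an open-ended effective date range.
UPDATE dim_member
SET    eff_end_dt = CURRENT_DATE - 1,
       curr_flag  = 'N'
WHERE  member_id IN (SELECT member_id FROM stg_member_delta)
AND    curr_flag = 'Y';

INSERT INTO dim_member (member_id, member_name, eff_start_dt, eff_end_dt, curr_flag)
SELECT member_id, member_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_member_delta;
```

In the Informatica implementation the same expire/insert pair is typically produced by a Lookup plus Update Strategy transformation rather than hand-written SQL.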
Environment: Informatica PowerCenter 9.1/9.5, PowerExchange 9.5/9.1, DB2 UDB, Informatica Data Quality 9.1, Oracle 11g, TOAD, Teradata 12/13/14, Teradata SQL Assistant, SQL Developer, PuTTY, IBM Tivoli Workload Scheduler, Sun Solaris UNIX, and Windows 2000.
Walgreens, Deerfield, IL. Duration: Mar 2013 to Feb 2015
Teradata Developer
Description: Walgreens is America's online pharmacy, serving customer needs for prescriptions, health & wellness products, health information, and photo services. The project created a single enterprise data warehouse for Retail, Mail-order, and Specialty.
Responsibilities:-
●Performed data analysis and gathered columns metadata of source systems for understanding requirement feasibility analysis.
●Created Logical Data flow Model from the Source System study according to Business requirements on MS Visio.
●Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index considerations.
●Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
●Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
●Worked on optimizing and tuning the Teradata views and SQL’s to improve the performance of batch and response time of data for users.
●Worked closely with analysts to come up with detailed solution approach design documents.
●Provided initial capacity and growth forecast in terms of Space, CPU for the applications by gathering the details of volumes expected from Business.
●Prepared low level technical design document and participated in build/review of the BTEQ Scripts, FastExports, Multiloads and Fast Load scripts, Reviewed Unit Test Plans & System Test cases.
●Provided support during the system test, Product Integration Testing and UAT.
●Verified if implementation is done as expected i.e. check the code members are applied in the correct locations, schedules are built as expected, and dependencies are set as requested.
●Performed impact assessments (schedule changes, dependency impacts, code changes) for various change requests against existing Data Warehouse applications running in a production environment.
●Provided quick production fixes and proactively involved in fixing production support issues.
●Gained strong knowledge of Data Mover for importing and exporting data.
●Created and maintained source-to-target mapping documents for the ETL development team.
●Provided requirement specifications and guided the ETL team in developing ETL jobs with the Informatica ETL tool.
●Developed test cases and performed testing.
●Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.
●Developed technical design documents (HLD and LLD) based on the functional requirements.
●Coordinated with the configuration management team on code deployments.
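A FastLoad script of the kind reviewed above can be sketched as follows; this is a hedged example with hypothetical table names, file path, and logon values.

```sql
-- Hedged FastLoad sketch (hypothetical names): bulk-load a pipe-delimited
-- flat file into an empty staging table, with the two standard error tables.
LOGON tdpid/etl_user,etl_pass;
BEGIN LOADING stg.sales_txn
      ERRORFILES stg.sales_txn_e1, stg.sales_txn_e2;
SET RECORD VARTEXT '|';
DEFINE txn_id   (VARCHAR(18)),
       store_id (VARCHAR(10)),
       txn_amt  (VARCHAR(18))
FILE = /data/inbound/sales_txn.dat;
INSERT INTO stg.sales_txn (txn_id, store_id, txn_amt)
VALUES (:txn_id, :store_id, :txn_amt);
END LOADING;
LOGOFF;
```

FastLoad requires an empty target table, which is why it is paired here with a staging table; MultiLoad handles loads into populated tables.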
Environment: Teradata 12, Teradata 5500, SQL Assistant 12.0, Business Objects, Hyperion, BTEQ, FastLoad, FastExport, MultiLoad, Korn shell, MS Visio, Data Mover.
Federal Reserve Bank, St. Louis, MO. Duration: Jun 2012 to Mar 2013
ETL Analyst
Description: Production support for the Member Services application.
Responsibilities:-
●Monitored ETL jobs, generated tickets, and fixed bugs in HP ALM.
●Extensively used Tivoli Enterprise Scheduler for running the shell scripts, performing actions, event tasks, scheduling the workflows and dynamically updating the parameter file.
●Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.
●Extensively worked with XML targets and tuned them. Solved various performance issues while working with XML targets.
●Identified bottlenecks and tuned them accordingly.
●Extensively involved in validating the mappings/objects developed by the team
●Created and maintained the shell scripts and parameter files in UNIX for the proper execution of Informatica workflows in different environments.
●Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system.
●Extensively used Debugger to validate the mappings and gained troubleshooting information about data and error conditions.
●Prepared the error handling document to maintain the error handling process.
●Used PL/SQL procedures/functions in mappings to build business rules to load data.
●Used Pushdown Optimization to push the transformation logic on to the Database both on the Source and the Target sides where ever possible to improve performance of the mapping.
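The shell scripts and parameter files above can be sketched as follows; folder, workflow, service, and variable names are hypothetical, and the pmcmd call is shown commented out since it requires an Informatica client install.

```shell
#!/bin/sh
# Hedged sketch: build an Informatica parameter file at run time, then
# submit the workflow with pmcmd. All names below are hypothetical.
PARAM_FILE=/tmp/wf_member_load.param
RUN_DATE=$(date +%Y-%m-%d)

# Write the parameter file; \$ keeps the literal $$ mapping-parameter
# prefix while $RUN_DATE still expands.
cat > "$PARAM_FILE" <<EOF
[MEMBER_SVCS.WF:wf_member_load]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_DIR=/data/inbound/member
EOF

# Typical submission (needs an Informatica client), e.g.:
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -u "$INFA_USER" -p "$INFA_PASS" \
#       -f MEMBER_SVCS -paramfile "$PARAM_FILE" -wait wf_member_load
echo "wrote $PARAM_FILE"
```

Generating the parameter file per run is what lets the scheduler drive incremental dates without editing the workflow itself.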
Environment: Informatica PowerCenter 9.1/9.5, PowerExchange 9.5/9.1, DB2 UDB, Informatica Data Quality 9.1, Oracle 11g, TOAD, Teradata R12/R13, Teradata SQL Assistant, SQL Developer, UNIX, and Tivoli Job Scheduler.
Bank of New York Mellon, New York. Duration: Feb 2010 to Jun 2012
ETL Developer
Description: The project merged the historical data of the Mellon application with the historical data of Bank of New York.
Responsibilities:-
●Extensively used ETL to transfer data from different source systems and load it into the target database.
●Extracted data from flat files and other RDBMS databases into the staging area and populated the Data Warehouse.
●Developed a number of complex Informatica mappings, mapplets, and reusable transformations to implement the business logic and load the data incrementally.
●Extracted data from Oracle and SQL Server and loaded it into the target database.
●Handled Type 1 and Type 2 slowly changing dimensions to populate current and historical data in dimension and fact tables in the Data Warehouse.
●Developed Informatica mappings using Aggregators, SQL overrides in Lookups, and source filters in Source Qualifiers, and managed data flows into multiple targets using Router transformations.
●Extensively involved in performance tuning of the ETL process by determining bottlenecks at various points such as targets, sources, mappings, sessions, or systems, leading to better session performance.
●Involved in creating the runbook and migration document (from Development to Production).
●Documented the process for further maintenance and support.
●Worked on test cases and use cases.
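The incremental loading described above can be sketched as a Source Qualifier SQL override; this is a hedged example with hypothetical table and column names, where $$LAST_EXTRACT_TS stands for an Informatica mapping variable updated after each successful run.

```sql
-- Hedged sketch (hypothetical names): pull only rows changed since the
-- last successful run, tracked by the mapping variable $$LAST_EXTRACT_TS.
SELECT acct_id,
       balance,
       last_upd_ts
FROM   src_accounts
WHERE  last_upd_ts > TO_DATE('$$LAST_EXTRACT_TS', 'MM/DD/YYYY HH24:MI:SS');
```

Informatica substitutes the variable's persisted value into the override at session start, so each run extracts only the new delta.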
Environment: Informatica PowerCenter 9.1, PL/SQL, Oracle 11g, TOAD, flat files, C, C++, SQL Server, UNIX shell scripting.
Ameriprise Auto & Home Insurance, Ashwaubenon, WI. Duration: Feb 2009 to Jan 2010
ETL Developer
Description: Ameriprise Auto & Home Insurance is a division of Ameriprise Financial, a FORTUNE 500® company. Ameriprise Auto & Home Insurance has been providing affordable, high-quality insurance for more than two decades.
Responsibilities:
●Involved in the requirements study and understanding the functionality.
●Used heterogeneous data sources like flat files and Oracle.
●Created mappings using the transformations such as the Source qualifier, Aggregator, Expression, Router, Filter, Rank, Sequence Generator, and Update Strategy.
●Created and monitored Informatica sessions.
●Created complex mappings using Aggregator, Expression, Joiner transformations including complex lookups, Stored Procedures, Update Strategy and others.
●Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
●Checked and tuned the performance of Informatica mappings.
●Identified and created different source definitions to extract data from input sources and load into relational tables using Informatica Power Center.
●Extensively involved in performance tuning of the ETL process by determining bottlenecks at various points such as targets, sources, mappings, sessions, or systems, leading to better session performance.
●Involved in creating the runbook and migration document (from Development to Production).
●Document the process for further maintenance and support.
●Worked on test cases and use cases.
Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL Server 2005, flat files, Windows XP, SQL Developer.
General Information:
Educational Qualification
Bachelor of Engineering with Distinction from Andhra University
Consultant’s home Location
Chicago, US
Visa Status
Valid H1-B
References available on request.