SRINIVASA YERRAM
PROFESSIONAL SUMMARY
20+ Years of IT Experience in the Design, Development and Implementation of Data Warehouses & Data Marts with ETL & OLAP Tools using Informatica PowerCenter 10.5/9.6/9.5/9.0/8.6/8.1/7.1 and PowerMart 6.2, Data Explorer (Data Profiling), Informatica Data Quality (IDQ), PowerExchange, Oracle 12c/10g/9i/8i, SQL Server, DB2, Teradata, Business Objects 6.1/5.1.3, PL/SQL, WebI, WebiSDK, JSP, JAVA, HTML, SQL, Bitbucket, Visual Source Safe/Subversion on Windows & UNIX.
Vast experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Transaction Control, Stored Procedure, Normalizer, Union, Rank, Update Strategy, Java, PowerExchange, etc.
Created Profiling documents and kept the documents in Visual Source Safe/Subversion.
Created UNIX scripts to find the deltas between yesterday’s and today’s data files, perform file validations, and move files from one server to another.
Strong experience with Informatica Metadata and Repository Management.
Involved in and implemented fine-grained access control on HIPAA-related data at the database level (Patient System).
Worked extensively with Data migration, Data Cleansing, ETL Processes and Web reporting.
Experience in debugging and Performance tuning of targets, sources, mappings and sessions.
Directly responsible for the Extraction, Transformation and Loading of data from Multiple Sources to Data Warehouse.
Created Workflows and Worklets and scheduled them using both Workflow Manager and Unix Shell scripts.
Extensively worked on Developing, Monitoring and Scheduling Jobs using UNIX CRON & Shell Scripts.
Proficient in developing PL/SQL packages, Stored Procedures, Functions, Materialized views & Triggers.
Experience in developing Client/Server applications using Oracle, Forms and Reports.
Experienced with SQL*Loader, Developed various Bulk load and update procedures and processes.
Extensively worked on Business Intelligence (OLAP) Tools such as Business Objects - (Universe, Supervisor, BCA, and WebI).
Work done includes system analysis, data warehouse design and creation, Universe creation (a semantic layer between real world business objects and database structure) and creation of complex reports having 'Drill Down', 'Slice and Dice' and 'Alerts' facilities.
Strong ability to work individually and manage tasks based on business requirements. Expertise in Creating Reports using Web Intelligence.
Team player and self-starter with good communication skills and ability to work independently and as part of a team.
TECHNICAL SKILLS:
ETL Tools : Informatica PowerCenter 10.5/9.6/9.0/8.6/8.1/7.1, Informatica Data
Quality (IDQ), Informatica Profiler & Data Explorer, PowerExchange & TDM, Databricks
BI Tools : OBIEE, Business Objects 6.0/5.1, WebI & Cognos Impromptu 6.0
Operating Systems : Sun Solaris, UNIX, Windows & MS-DOS
Languages : WebiSDK, JSP, Java, EJB, CORBA, IDL, C++, Swing, Web Logic, C,
PASCAL, COBOL, HTML, JDK 1.2, SQL & PL/SQL
Data Modeling Tools : Erwin 3.5.2, DB Designer & Power Designer
Databases : Oracle 8i/9i/10g/12c, SQL Server, DB2, Sybase 7.0, Teradata & MS-Access
GUI/TOOLS : TOAD, SQL Developer, Developer 2000, JBuilder 6.0 & Visual Basic 6.0.
EDUCATION:
MS (Computer Science), Osmania University, Hyderabad, India.
PROFESSIONAL EXPERIENCE:
INTEGRATED ELIGIBILITY SYSTEM (IES), NY
Sr. ETL Developer 01/2020 – Present
Currently working on the WMS Data Preparation project to acquire data from legacy systems (Upstate, NYC, BICS & CBIC) into an Oracle database. WMS is a system for receiving, maintaining and processing information relating to persons who have applied for or been determined eligible for benefits in the OTDA-supervised programs of PA, SNAP and HEAP, as well as MA benefits for DOH and services programs for OCFS, for Local Social Service Districts other than the district comprising the city of New York. Upstate WMS supports efficient eligibility determinations by the SSDs, helps achieve compliance with federal laws and regulations, and provides unified data collection and reporting forms and procedures. It supports the state agencies of OTDA, OCFS and DOH with oversight in managing the programs, detecting fraudulent practices, helping identify policies or conditions that will reduce or deter fraud, and maximizing utilization of federal funds.
FHS Data Preparation project to acquire data from legacy systems (FHDMS, FHQP & FHIS) into an Oracle database. A Fair Hearing is a chance for you to tell an Administrative Law Judge from the New York State Office of Temporary and Disability Assistance, Office of Administrative Hearings, why you think a decision about your case made by a local social services agency is wrong. The Office of Temporary and Disability Assistance will then issue a written decision stating whether the local agency's decision was right or wrong. The written decision may order the local agency to correct your case.
Responsibilities:
Created mappings/workflows to load the Preparation environment (Oracle) from MF files using Informatica PowerCenter.
Created mappings/workflows to load the TO-BE environment (SQL Server) from the Preparation environment (Oracle) using Informatica PowerCenter.
Did performance tuning on Informatica mappings and sessions for FHIS and WMS projects.
Wrote UNIX shell scripts to capture the errors from FHIS external table log files.
Wrote a generic package and stored procedures using PL/SQL to handle Initial and Incremental (CDC) loads for FHDMS & FHQP.
Wrote generic packages and stored procedures using PL/SQL to profile (Pattern, Table and Column Analysis) the FHIS, FHDMS & FHQP tables.
Wrote generic test scripts to validate data from source to the data preparation environment (Intake & CDC).
Created Informatica PowerCenter Folders, Mappings, Workflows to Call the stored procedures in parallel to improve the performance for FHDMS & FHQP.
Prepared Data Lineage (source to target mapping) Documents from Intake to CDC tables for FHDMS & FHQP.
Used Parallel and Append HINTS in the source-to-Intake PL/SQL generic stored procedures for FHIS, FHDMS & FHQP to improve performance.
Used External Tables for FHIS Intake Main Frame data files.
Designed and developed test cases for Slowly Changing Dimensions (SCD Type 2), Flat File to Oracle, XML to Oracle and Oracle to Flat Files, etc., using Informatica PowerCenter.
Designed and developed test cases to call remote and local UNIX shell scripts.
Designed and developed Informatica Admin UNIX shell scripts for Informatica backup and recovery.
Developed an Informatica Admin UNIX shell script to call Informatica workflows from one folder to another.
Created Naming best practices documents for Informatica PowerCenter & Informatica Data Quality (IDQ).
Created subject-area-based Enterprise Discovery Profiles for FHQ, FHIS & FHDMIS using Informatica Data Quality (IDQ).
Used Informatica Data Quality (IDQ) to analyze (Natural Keys, Patterns, Nulls & Duplicates) the CDC & DTT tables.
Used an Informatica Data Quality (IDQ) multi-profile to analyze the primary keys from the source systems (FHDMS & FHQ).
Created templates for Change Data Capture (CDC) using Informatica PowerCenter.
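The FHIS error-capture step above can be sketched as a small shell function. The ORA-/KUP- prefixes are standard Oracle and external-table driver error markers, but the directory layout, file names and function name here are illustrative, not the actual FHIS paths:

```shell
#!/bin/sh
# Hypothetical sketch: collect ORA-/KUP- errors from external-table log
# files into one summary file. Succeeds only when no errors were found.

scan_ext_table_logs() {               # $1 = log directory, $2 = summary file
  log_dir=$1; out=$2
  : > "$out"                          # start with an empty summary
  for f in "$log_dir"/*.log; do
    [ -f "$f" ] || continue
    # ORA- / KUP- prefixes mark Oracle and external-table driver errors
    grep -E 'ORA-[0-9]+|KUP-[0-9]+' "$f" |
      sed "s|^|$(basename "$f"): |" >> "$out"
  done
  [ ! -s "$out" ]                     # nonzero exit when errors were captured
}
```

A nightly wrapper might then call `scan_ext_table_logs /path/to/logs /path/to/summary.txt` and alert operators when it returns nonzero.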
Environment: Informatica PowerCenter 10.5, Informatica Data Quality (IDQ), Web Services, PowerExchange for Kafka, ORACLE 12c, SQL Server, PL/SQL, External Tables, Flat Files, SQL Developer, UNIX & Windows 10.
DAVIS VISION, NY
Sr. ETL Developer/Lead Developer 02/2019 – 12/2019
Worked on the Medicaid Data Warehouse (BI 2.0) to provide a centralized, non-volatile data store for enterprise reporting for Davis Vision. The BI (SQL Server) acquires data from sources such as Compu Vision (Main Frames) and CVX (SQL Server) to report on current and historical trends, generate campaigns, audit billing, provide financial analysis, analyze customer Claims, Providers, Diagnosis and Procedure codes, and allow for an overall presentation of every facet of Davis Vision/Superior Vision.
Responsibilities:
Involved in requirement gathering, analysis and study of existing systems
Prepared requirement definitions and analysis document in support of Data Warehousing efforts.
Extensively worked on transformations like Lookup, Filter, Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence Generator, Rank, etc.
Prepared ETL specifications (source to target mapping) for Medicaid Data Warehouse.
Created Slowly Changing Dimensions (SCD) Type1, Type2 & Type3 and Facts from Staging Using Informatica PowerCenter Designer.
Used Informatica Data Quality (IDQ) to analyze the data (Natural Keys, Patterns, Nulls & Duplicates).
Analyzed, designed, developed, tested, implemented and maintained Informatica mappings/workflows to read DB2 and SQL Server databases as sources (used Provider matching criteria) and write into SQL Server as a target using Informatica PowerCenter.
Developed stored procedures using PL/SQL.
Analyzed, designed, developed, tested and implemented Informatica mappings/workflows for the COBOL copybooks to load the Main Frames VSAM files into SQL Server using Informatica PowerExchange and PowerCenter.
Analyzed, designed, developed, tested and implemented Informatica mappings/workflows for the COBOL copybooks to load the Main Frames ASCII format files into SQL Server using the Informatica PowerCenter Normalizer transformation.
Extensively worked on VSAM files to bring data from Main Frames to SQL Server using PowerExchange.
Created Reusable Transformations and Mapplets and used them in Mappings to develop business logic for transforming source data into target.
Involved in improving performance of the Server Sessions by identifying and eliminating the Performance Bottlenecks.
Moved Mappings/Workflows from Dev, Testing to Production Environment Using Informatica Repository Manager.
Used Parallel and Append HINTS in the PL/SQL generic procedures, functions, triggers and packages to improve performance.
Created mappings to read/write XML source/targets using Informatica PowerCenter.
Used the SKYBOT scheduling tool to schedule workflows and batch jobs.
Developed stored procedures, triggers and packages using PL/SQL Oracle and used Bulk collect option to improve performance.
Used Bitbucket to maintain the versions for Code & Documents.
Involved in Production support to monitor the daily scheduled ETL jobs.
Wrote generic UNIX Shell Scripts to Move Flat Files from SFTP Server to Informatica Server and Scheduled with Informatica PowerCenter.
Wrote generic UNIX Shell Script to archive the Flat Files from SFTP Server and Scheduled with Informatica.
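A minimal sketch of the generic flat-file archiving script described above, assuming a `.dat` naming convention and a per-date archive folder (both illustrative, not the actual Davis Vision layout):

```shell
#!/bin/sh
# Hypothetical sketch: move processed flat files from a landing directory
# into a dated archive folder and compress them.

archive_flat_files() {                # $1 = landing dir, $2 = archive root
  src=$1; root=$2
  day=$(date +%Y%m%d)                 # one archive folder per run date
  dest="$root/$day"
  mkdir -p "$dest"
  for f in "$src"/*.dat; do
    [ -f "$f" ] || continue
    # move first, then compress in place in the archive folder
    mv "$f" "$dest"/ && gzip -f "$dest/$(basename "$f")"
  done
}
```

Such a function would typically be scheduled from Informatica or cron after the load completes, so the landing directory is empty for the next day's files.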
Environment: Informatica PowerCenter, Informatica Data Quality (IDQ), PowerExchange 9.5, ORACLE, SQL Server, DB2, PL/SQL, SKYBOT, Flat Files, VSAM, XML, COBOL, TOAD, Visual Studio, ERStudio, Bitbucket/VSS, JIRA, BlueJeans, Sun Solaris & Windows 10.
OFFICE OF MENTAL HEALTH, NY
Sr. ETL Developer/Lead Developer 06/2008 –02/2019
Worked at OMH on multiple projects: EDW (NYESS), MDM, Kids Indicators, PCS, Caires hist, Netted Claims (Medicaid) & CMHSDM. Experienced in performance tuning on Informatica, Oracle & UNIX; for example, implemented an MD5 hash-value technique while populating temp tables and the star schema (SCD Type 2, fact & summary tables) for the NYESS project to improve performance. Applied Kimball methodologies for the NYESS & PROMISE projects to improve performance while loading star schemas. Created Data Profiling (Column & Table) documents to find data patterns, nulls, dups & counts using Informatica Data Quality (IDQ) for the NYESS & Kids Indicators projects. Created slowly changing dimensions and read and wrote “.TAR” files using Informatica. Provided Informatica training to the MDM team to convert SAS code to Informatica. Generated flat files (Claims, Provider & Drug) for PROMISE students (DOL data) using the MDW (Medicaid Data Warehouse). Worked on the Ticket to Work payment process using DOL & SSA data.
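The MD5 hash-value change-detection idea mentioned above can be sketched in shell. The production version was a PL/SQL function integrated with Informatica; the record layout and function names below are made up for illustration:

```shell
#!/bin/sh
# Hypothetical sketch: hash a concatenated record and compare it with the
# stored hash, as done for SCD Type 2 change detection.

row_hash() {                          # $1 = pipe-delimited record
  printf '%s' "$1" | md5sum | awk '{print $1}'
}

row_changed() {                       # $1 = stored hash, $2 = current record
  [ "$1" != "$(row_hash "$2")" ]      # true when any attribute changed
}
```

Comparing one 32-character hash instead of every attribute column is what makes the SCD Type 2 lookup cheap on wide dimension rows.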
Responsibilities:
Prepared ETL Data Lineage (source to target mapping) documents for NYESS & Kids Indicators Projects.
Created Profiling documents for NYESS/Kids Indicators using Informatica Profiler/IDQ and kept these documents in Visual Source Safe/Subversion.
Used Informatica Data Quality (IDQ) to identify Golden Records, Natural Keys, Patterns, Nulls, reference tables & Duplicates for NYESS tables.
Created Reusable mapplets using Informatica Data Quality (IDQ) for data cleansing, standardization, match and merge functionality for NYESS and PROMISE.
Worked on improving performance of the Sessions/Mappings by identifying and eliminating the Performance Bottlenecks.
Successfully implemented Change Data Capture (CDC) and Slowly Changing Dimensions (SCD Type 2) to keep history changes for EDW (NYESS) and Kids Indicators in OLAP & OLTP using Informatica PowerCenter.
Successfully implemented Slowly Changing Dimensions (SCD Type 3) to keep partial history changes for EDW and Kids Indicators in OLAP & OLTP using Informatica PowerCenter.
Created Mappings, Workflows, Worklets and Sessions to move Data from Flat Files to Oracle.
Acted as a key contributor for direction/development of an Enterprise Data Warehouse (EDW) and Informatica implementation.
Created mappings to read/write XML source/targets using Informatica PowerCenter.
Extensively worked on transformations like Lookup, Filter, Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence Generator, Rank, etc.
Worked on complex mappings (Used regular expressions, Normalizer transformations) and Populated star schema tables.
Populated large tables (the biggest fact table holds 95 million records) and worked on partitioned tables.
Created Audit process (Daily load counts for Source/Targets) using Informatica Metadata.
Wrote PL/SQL Stored Procedures/Functions for MD5 string generation and integrated them with Informatica.
Used External Tables for ASCII format data files for the project CMHDM.
Wrote PL/SQL Stored Procedures to extract security information from LDAP and integrated them with Informatica.
Used Parallel and Append HINTS in the PL/SQL generic procedures, functions, triggers and packages to improve performance.
Analyzed, designed, developed, tested and implemented generic packages/procedures/functions/triggers/materialized views to read data from LDAP, truncate/load, enable/disable constraints and indexes, and analyze tables using PL/SQL & relational databases, and integrated the PL/SQL procedures into Informatica.
Wrote generic UNIX shell scripts to call stored procedures to create indexes and truncate tables.
Wrote UNIX shell scripts to extract the “.TAR” files and move them from the OMH SFTP server to the OMH PROD server.
Wrote UNIX shell scripts with Change Data Capture (CDC) functionality to find the deltas between yesterday’s and today’s data files, with file validations.
Created Reusable Transformations and Mapplets and used them in Mappings to develop business logic for transforming source data into target.
Used Visual Source Safe/Subversion to maintain the versions for Code & Documents.
Worked closely with OBIEE Developers and tested reports.
Involved in Production support to monitor daily scheduled ETL jobs for NYESS and Kid Indicators.
Involved in and implemented fine-grained access control on HIPAA-related data at the database level (Patient System).
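The yesterday-vs-today delta scripts described above can be sketched with `comm` on sorted copies of the two extracts. The file names and output prefix are illustrative, not the actual OMH paths:

```shell
#!/bin/sh
# Hypothetical sketch: file-based Change Data Capture. Records present only
# in today's extract are "new"; records only in yesterday's are "deleted".

find_deltas() {              # $1 = yesterday's file, $2 = today's, $3 = prefix
  yday=$1; today=$2; pfx=$3
  # comm requires both inputs sorted the same way
  sort "$yday"  > "$yday.srt"
  sort "$today" > "$today.srt"
  comm -13 "$yday.srt" "$today.srt" > "${pfx}_new.txt"      # only in today
  comm -23 "$yday.srt" "$today.srt" > "${pfx}_deleted.txt"  # only in yesterday
  rm -f "$yday.srt" "$today.srt"
}
```

Records appearing in both delta files under a shared key would then be treated as updates by the downstream load.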
Environment: Informatica PowerCenter 9.6/9.5/9.0/8.6/8.1, Informatica Data Quality 9.6.1 (IDQ), ORACLE 9i/10g/12c, PL/SQL, OBIEE11g, SQL*Loader, External Tables, Flat Files, XML, ERStudio, Visual Source Safe/Subversion, TFS, TOAD, SQL Developer, HP-UNIX & Windows.
BLUE CROSS BLUE SHIELD, MI
Sr. ETL Developer/Lead Developer 06/2007 – 06/2008
Worked on the Benchmarks Return Data and BHI Stage II Data Warehouse projects, processing data from flat files to Oracle 10g. Six different categories of Benchmark Summaries are created by the BHI Benchmark Summary Process. Within each category, a number of distinct summaries are generated to describe the data at various levels of aggregation. Each category and level of aggregation has been defined to meet BHI analytical requirements. Prepared Data Profiling for Medco and MedImpact using Informatica Data Explorer.
Responsibilities:
Prepared ETL High Level Design Documents and Detail Design Documents for BHI Stage II.
Worked on Data Profiling for Medco and MedImpact using Informatica Data Explorer.
Created Slowly Changing Dimensions (SCD) and Facts from BI to BHI Stage Oracle Using Informatica PowerCenter.
Created Workflows, Worklets and Sessions to move Data from Flat Files to Oracle (Bhiret) for Benchmarks Data.
Worked on complex mappings and populated very large tables (400 million records per table).
Wrote PL/SQL Stored Procedures/Functions and integrated them with Informatica to create the mapping templates; used the MapGen utility in Informatica to generate generic mappings.
Wrote PL/SQL bulk-load routines to populate the target tables and integrated them with Informatica.
Wrote shell scripts to call stored procedures and create indexes and tables.
Worked on creation, backup, restore and performance tuning of the Informatica Repository.
Created Reusable Transformations and Mapplets and used them in Mappings to develop business logic for transforming source data into target.
Worked on VSAM files to bring data from Main Frames to ORACLE database using PowerExchange.
Involved in improving performance of the Server Sessions by identifying and eliminating the Performance Bottlenecks.
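The shell-calls-stored-procedure pattern used above can be sketched as follows. The connect string and procedure name are placeholders, and a DRY_RUN switch prints the generated SQL instead of invoking a real sqlplus session:

```shell
#!/bin/sh
# Hypothetical sketch: invoke a generic PL/SQL stored procedure from shell.
# Set DRY_RUN=1 to print the SQL instead of running sqlplus.

run_procedure() {                     # $1 = connect string, $2 = procedure call
  conn=$1; proc=$2
  sql="BEGIN $proc; END;
/"
  if [ "${DRY_RUN:-0}" = "1" ]; then
    printf '%s\n' "$sql"              # show what would be executed
    return 0
  fi
  # WHENEVER SQLERROR makes sqlplus exit nonzero when the procedure fails
  printf 'WHENEVER SQLERROR EXIT FAILURE\n%s\n' "$sql" | sqlplus -s "$conn"
}
```

A scheduler can then chain calls such as index rebuilds and truncates and stop the batch on the first nonzero exit code.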
Environment: Informatica PowerCenter 8.1/7.1, PowerExchange, ORACLE 9i/10g, Data Explorer, DB2, PL/SQL, SQL*Loader, Flat Files, TOAD, SQL Developer, Sun Solaris 7.0, AIX & Win 10.
SNAP-ON BUSINESS SOLUTIONS, OH
Sr. ETL Developer 11/2006 – 05/2007
Worked on the QLINK, GMNA, GME and SAAB VIN and VIN-decoding data warehousing projects.
Responsibilities:
Prepared ETL specifications, wrote PL/SQL Stored Procedures/Functions in Oracle and integrated them with GMCP.
Worked on complex mappings, populated large tables (1 million records per table) and worked on partitioned tables.
Extensively worked on transformations like Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence Generator, Rank, Lookup, Filter etc.
Experience in creation, backup, restoring and performance tuning of Informatica Repository.
Created Reusable Transformations and Mapplets and used them in Mappings to develop business logic for transforming source data into target.
Involved in improving performance of the Server Sessions by identifying and eliminating the Performance Bottlenecks.
Moved Mappings/Workflows from Dev, Testing to Production Environment Using Informatica Repository Server.
Wrote UNIX shell scripts to call stored procedures and create tables and indexes.
Used Mload/Bteq to Populate Teradata Fact and Dimension Tables.
Wrote PL/SQL bulk-load routines to populate the target tables and integrated them with Informatica.
Used Parallel and Append HINTS in the PL/SQL generic procedures, functions, triggers and packages to improve performance.
Environment: Informatica PowerCenter 8.1, ORACLE 9i/10g, Teradata, SQL*Loader, PL/SQL, Flat Files, TOAD, Sun Solaris (UNIX) 7.0 & Win XP.
NATIONAL CITY CORP, STRONGSVILLE, OH
Sr. ETL Developer 05/2006 – 11/2006
The source file created by the Extreme application will be a delimited file containing summary information at statement granularity. The file will contain unpacked data and will be produced on business days only, whenever a statement is generated. Further, it is proposed that the file contain the combined data for all the banks in one file, rather than individual files transmitted for each bank. From there, the CBS process creates multiple output files for CDE and then produces the Statements output file.
Responsibilities:
Prepared ETL High Level Design Documents and Detail Design Documents for CBS-IDH.
Created and restored Informatica Repository from Dev, Testing to Production Environment.
Analyzed, designed, developed, tested, implemented and maintained Change Data Capture (CDC) and Slowly Changing Dimensions (SCD) Type 1, Type 2 & Type 3, reusable transformations, mapplets and facts using Informatica PowerCenter & Informatica Data Quality, from the Analysis Server to Teradata.
Wrote PL/SQL Stored Procedures/Functions and Triggers and Integrated with Informatica.
Worked in Star Schema Using ERWIN Data Modeling Tool and Power Designer.
Created Reusable Transformations and Mapplets and used them in Mappings to develop business logic for transforming source data into target.
Extensively worked on transformations like Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence Generator, Rank, Lookup, Filter etc.
Involved in improving performance of the Server Sessions by identifying and eliminating the Performance Bottlenecks.
Analyzed, designed, developed, tested, implemented and maintained Informatica mappings/workflows for the COBOL copybooks to load the Main Frames VSAM files into ORACLE using Informatica PowerExchange and PowerCenter.
Wrote PL/SQL bulk-load routines to populate the target tables and integrated them with Informatica PowerCenter.
Wrote install scripts to start/stop the Business Objects servers using UNIX Korn shell.
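A start/stop install script of the kind described above can be sketched as a simple dispatcher. The actual Business Objects daemon commands are site-specific, so they are injected here via the hypothetical START_CMD/STOP_CMD variables:

```shell
#!/bin/sh
# Hypothetical sketch of a Korn-shell-style start/stop wrapper. The real
# daemon commands are supplied by the caller via START_CMD and STOP_CMD.

bo_ctl() {                            # $1 = start|stop|restart
  case $1 in
    start)   ${START_CMD:?START_CMD not set} ;;   # word-splitting intended
    stop)    ${STOP_CMD:?STOP_CMD not set} ;;
    restart) bo_ctl stop && bo_ctl start ;;
    *)       echo "usage: bo_ctl {start|stop|restart}" >&2; return 2 ;;
  esac
}
```

The same dispatcher shape works for any service pair, which is what makes the install script "generic" across environments.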
Environment: Informatica PowerCenter 8.1/7.1, PowerExchange, ORACLE 9i/10g, TeraData, ERWIN, PL/SQL, Flat Files, TOAD, Sun Solaris 7.0 & Win2000 Adv Ser.
COX COMMUNICATIONS, ATLANTA, GA
ETL Developer 12/2005 – 05/2006
On a daily basis we attempted to test the two-way communication between our headends and our customers’ digital set-top boxes. Specifically, the headend broadcasts an impulse, or ping, out to the HFC network. Set-tops that do not acknowledge the impulse with a return communication are deemed non-responders. Set-tops with no response for 14 consecutive days are measured as Chronic Non-Responders. The primary goals of these measures were to understand plant health and to proactively address issues with maintenance and justifications for capital investment.
The Enterprise Data Warehouse is an Oracle-based system consisting of a Stage database and an EDW database. There are several schemas in each database according to security needs. This project uses the EDWSTG and EDW schemas. The databases reside on catl0x21.
Responsibilities:
Created and Restored Informatica Repository from Dev, Testing to Production Environment.
Created Slowly Changing Dimensions and Facts from EDWSTG to EDW (Enterprise Data warehouse) Using Informatica Designer.
Worked in Star Schema Using ERWIN Data Modeling Tool.
Created Reusable Transformations and Mapplets and used them in Mappings to develop business logic for transforming source data into target.
Extensively worked on transformations like Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence Generator, Rank, Lookup, Filter etc.
Involved in improving performance of the Server Sessions by identifying and eliminating the Performance Bottlenecks.
Created reports using Business Objects full client and Web Intelligence.
Developed stored procedures, triggers and packages using PL/SQL Oracle.
Created Universes and resolved Loops by creating table aliases and contexts.
Created universes using Designer and maintained the repository using Supervisor.
Wrote UNIX Shell Scripts and Scheduled with Informatica to Move Flat Files from PSTAGE to EDWSTG.
Environment: Informatica PowerCenter 7.1, ORACLE 9i/10g, Teradata, ERWIN, PL/SQL, Business Objects 6.5, WebI, BCA, WebiSDK, JSP, JAVA, Flat Files, Query Man, TOAD, Sun Solaris 7.0 & Win2000 Adv Ser.
CINGULAR WIRELESS, ATLANTA, GA
ETL Developer 06/2005 – 12/2005
Worked on The Business Intelligence Data warehouse (BID) to have a centralized non-volatile data store for enterprise reporting for Cingular customers. The BID (TERADATA) will acquire data from sources such as Care (AS/400) and Telegence (Oracle) to report on current and historical trends, generate campaigns, audit billing, provide financial analysis, analyze customer interactions and allow for an overall presentation of every facet of the Cingular Wireless.
Responsibilities:
Extensively worked on transformations like Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence Generator, Rank, Lookup, Filter etc.
Experience in creation, backup, restoring and performance tuning of Informatica Repository.
Created Reusable Transformations and Mapplets and used them in Mappings to develop business logic for transforming source data into target.
Involved in improving performance of the Server Sessions by identifying and eliminating the Performance Bottlenecks.
Moved Mappings/Workflows from Dev, Testing to Production Environment Using Informatica Repository Server.
Created Slowly Changing Dimensions and Facts from Staging (Oracle) to BID (Teradata) Using Informatica Designer.
Worked on VSAM files to bring data from Main Frames to ORACLE database using PowerExchange.
Created reports using Business Objects full client and Web Intelligence.
Wrote insert and create-table scripts using Teradata in BID.
Used Mload/Bteq to Populate Teradata Fact and Dimension Tables.
Developed stored procedures, triggers and packages using PL/SQL Oracle and used Bulk collect option to improve performance.
Wrote UNIX Shell Scripts and Scheduled with Informatica to Move Flat Files from Pstage to EDW Informatica Server.
Environment: Informatica PowerCenter 7.1, ORACLE 10g/9i, PowerExchange, Teradata, ERWIN, PL/SQL, Business Objects 6.5, WebI, BCA, WebiSDK, JSP, JAVA, Flat Files, Query Man, TOAD, Sun Solaris 7.0 & Win XP.
BELLSOUTH, ATLANTA, GA
ETL Developer 12/2004 – 06/2005
Worked on the Service Level Agreement Management (SLAM) system, which provides service level reports for NetVPNCoS customers. SLAM acquires data from sources such as InfoVista, MCDB98 and Remedy, stores the data, determines service level compliance and produces a service level report with credits for noncompliance.
Responsibilities:
Installed and Configured Informatica Power Center on Windows.
Created and Restored Informatica Repository from Dev, Testing to Production Environment.
Created Slowly Changing Dimensions and Facts from Analysis Server to Teradata Using Informatica Designer.
Worked in Star Schema Using ERWIN Data Modeling Tool.
Populated very large tables (one terabyte in size).
Designed and developed Complex mappings.
Created Reusable Transformations and Mapplets and used them in Mappings to develop the business logic for transforming source data into target.
Extensively worked on transformations like Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence Generator, Rank, Lookup, Filter etc.
Involved in improving performance of the Server Sessions by identifying and eliminating the Performance Bottlenecks.
Created reports using Business Objects full client and Web Intelligence.
Wrote insert and create-table scripts using Teradata in BID.
Developed stored procedures, triggers and packages using PL/SQL Oracle.
Wrote Shell Scripts and Scheduled with Informatica to Move Flat Files from InfoVista to Acquisition Server.
Created Universes and resolved Loops by creating table aliases and contexts.
Created universes using Designer and maintained the repository using Supervisor.
Environment: Informatica PowerCenter 7.1, Business Objects 6.5, WebI, BCA, WebiSDK, JSP, JAVA, JDBC, PL/SQL, ORACLE 8i/9i/10g, Erwin, TERADATA, Flat Files, Query Man, TOAD, Sun Solaris 7.0 & Win2000 Adv Ser.
CINGULAR WIRELESS, ATLANTA, GA
ETL Developer 06/2003-12/2004
Worked on the IRL, RL and WMS projects, which involved data migration from Oracle and flat files to Oracle: migrating the data from Blue Oracle (AT&T), Orange WMS and WMS flat files into Orange Oracle (Cingular), and creating and publishing IRL and WMS reports to the web for Cingular to see the missing cell phones and their billing information.
Responsibilities:
Extensively worked on transformations like Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence Generator, Rank, Lookup, and Filter for PO Recon Report, Open Sales Order Report etc.
Created Reusable Transformations and Mapplets and used them in Complex Mappings to develop the business logic for transforming source data into target.
Involved in improving performance of the Server Sessions by identifying and eliminating the Performance Bottlenecks.
Populated Fact and Dimension tables from multiple sources.
Installed and Configured Informatica Power Center on Windows.
Created and Restored Informatica Repository from Dev to Testing and Production Environment.
Developed stored procedures, triggers and packages using PL/SQL.
Also involved in moving the mappings from the Test Repository to Production after duly testing all transformations.
Developed stored procedures and packages using PL/SQL to create Balancing Reports to check differences between Source and Target.
Created WMS and IRL reports and published them in WebI, using Business Objects, so Cingular could see the missed cell phones and billing information.
Provided standard reports and query functionality accessible from the Tools menu (View Snapshot Report, View Trending Report) using WebiSDK.
Created Universes and resolved Loops by creating table aliases and contexts.
Created universes using Designer and maintained the repository using Supervisor.
Wrote install scripts for Oracle stored procedures, triggers and packages using Korn shell.
Created various geographical and time dimension reports of Fiscal Year, Quarter, Month, week and daily reports for all Domains and published through Web Intelligence.
Environment: Informatica PowerCenter 7.1, Business Objects 5.1.3, WebI, BCA, WebiSDK, JSP, JAVA, JDBC, PL/SQL, ORACLE 8i/9i, Sybase, TOAD, ISQLW, Sun Solaris 7.0 & Win2000 Adv Ser.
THOMSON