Ravindra Babu Kakarla
302-***-**** ********.****@*****.*** Work Authorization: US Citizen
LinkedIn Profile: https://www.linkedin.com/in/ravindrababu-kakarla-10166514/
SUMMARY
With around 22 years of experience in Information Technology, I have served as a Tech Data Management Manager and as an ETL Architect, and have extensive experience in the analysis, design, and development of Enterprise Data Warehouses (EDW), Data Marts, Data Migrations, Data Integrations, B2B, and Informatica migration applications.
Extensive experience in all phases of the Data Warehouse and Software Development Life Cycle (Agile-Scrum and Waterfall methodologies) in the Finance, Banking, Communications, Mortgage, Insurance, Sales, and Health Care industries.
Expert in ERP (SAP) data migrations using all Informatica connection types.
Strong working experience using Informatica PowerCenter, PowerMart, and PowerExchange; directly responsible for the extraction, transformation, and loading of relational, XML, MQ Series, COBOL, and flat-file source systems.
Created ETL Specification documents and Mapping Documentation to implement the business logic in Informatica and ETL processes.
Oversaw all reporting needs and accommodations from a data-movement point of view.
Extensive experience in SQL Server 2000/2005/2008/2012, Teradata, and Oracle 8i/9i/10g/11g database programming and performance tuning, using SQL and PL/SQL to write procedures, functions, triggers, and packages on multiple platforms such as Windows 2000, IBM AIX 5.x, and HP-UX.
Conversant in data modeling involving ER and star-schema models, and in the logical and physical design of databases using ERwin.
Expert in migration activities and hypercare.
Shared expertise in building error-handling and metadata-capture strategies.
Experience as a Technical Lead, Senior ETL Developer, Informatica Administrator, and Oracle Applications DBA, with requirements gathering and strong mentoring, decision-making, and team-playing skills.
Led performance testing initiatives to evaluate the responsiveness, throughput, and resource utilization of database systems, optimizing query performance and database tuning parameters as needed.
Extensively worked with Stored Procedures, Triggers, Cursors, Indexes, Views, Functions and Packages.
Expertise in performance tuning of source, mapping, target and sessions.
Designed database schemas and structures to accommodate health data models based on HL7 standards, enabling interoperability and data exchange across healthcare organizations.
Involved in conducting the different phases of testing like unit testing, system testing, UAT and Deployment activities.
Experience with relational and dimensional data modeling using Erwin; understanding of the Ralph Kimball and Bill Inmon methodologies for dimensional modeling and of star and snowflake schemas.
EDUCATION
Bachelor of Science degree in Engineering from Dr. Babasaheb Ambedkar Marathwada University, India
SKILLS
ETL Tools
ICS, IICS, PowerCenter 10.0/9.6/9.1/9.0.1/8.6.1/8.1.3/7.1.3/6.2, IDE/IDQ, B2B, Informatica Profile, Informatica Validator, Informatica Analyst, PowerExchange Navigator/Listener, Informatica Developer 9 and MDM, Pentaho 4.x (open-source system), Ab Initio GDE 1.15/3.02, SSIS & IBM DataStage
Databases
Oracle 12c/11g/10g/9.x/8.x/7.x, Teradata, Sybase 12.0.0.7, SQL Server 2000/2005/2008/2012, IBM DB2 (UDB), MS Access, AWS Redshift, Snowflake, and SAP
Reporting tools
Cognos 7.x / Report Net, OLAP Manager/Analysis Server
Scripting Languages
Unix Shell Scripts, Perl Scripts, MS-DOS Batch Scripts, JavaScript and VBScript, crontab
Design Tools
Erwin 4.1/3.5 (Entity Relationship / Dimensional Modeling)
Languages & Tools
SQL, PL/SQL, C, C++, COBOL, XML, MS Visio 2000, Toad and MS Project 2002
EXPERIENCE:
Accenture Technologies, Philadelphia, PA May ’15 – Present
Role: Tech Data Integration Manager
Clients and roles at Accenture Technologies:
1. BMS - Enterprise Informatica Upgrade
2. BMS - SAP Data Migrations
3. United Health Group - Data Modeling & Data Architect for Claims Modernization
4. Fitzpatrick Nuclear Plant - IT Conversion
5. Sonepar USA - ODW (Operational Data Warehouse)
6. State of Texas - DFPS (Department of Family and Protective Services)
7. Shire Pharma - RDPS (Rare Disease Patient Services)
1. Client: BMS
Role: ETL Lead for Enterprise Informatica Upgrade
Worked as ETL Lead for the Enterprise Informatica Upgrade. Installed the software on a local machine, checked all the Unix folders, and ran some of the workflows to make sure everything was up to date; then, with the help of the admins, installed the upgrade and moved the updated software to the Citrix application.
Once the software was up, coordinated with application owners to move to the updated version.
2. Client: BMS
Role: ETL Lead for SAP Data Migrations
BMS sold one of its Italy units, which needed to set up its own SAP environment. Based on the company code and unit, we had to supply data for all the SAP tables (400+, with parent-child relations), and we supplied all of it. This was the first project at BMS to make extensive use of IICS.
3. Client: United Health Group
Role: Data Modeling & Data Architect for Claims Modernization
UnitedHealth Group Inc. is an American for-profit managed health care company based in Minnetonka, Minnesota. It is sixth in the United States on the Fortune 500. UnitedHealth Group offers health care products and insurance services.
Involved in building the modernized claims processing system. The UHG claims processing system was built on a legacy platform with a VSAM file process; the project modernizes a COBOL application using VSAM and DB2, developed in the '80s, to Java and MySQL. The database may change to SQL Server or Oracle.
4. Client: Fitzpatrick Nuclear Plant
Role: IT & Data Conversion
Exelon took over the Fitzpatrick Nuclear Plant; per the agreement and transaction, all Fitzpatrick activities would shut down on March 31st and Exelon activities would start on April 1st. All migrations were to be held that weekend; involved in the IT & data integration plan using IICS.
Worked on the six objects below:
1. Generic Routing Object
2. Financial Transaction Object
3. User Log & Setup
4. Accounts & Accounts Payable
5. Procurement Engineering
6. Operation Log
As technical owner for these six objects, worked with the functional team to identify the existing Fitzpatrick business processes while comparing them against the data definitions at Exelon. With the data definitions and business logic used by Exelon, we identified the conversion-logic definition between the two systems. Upon business approval of the conversion logic, we built it as reusable objects and implemented it in the migration process.
Exelon uses the Passport technology while Fitzpatrick used Asset Suite.
Worked with all the technical and functional team members, making them aware of the purpose and usage of the reusable objects in the migration process.
It was a 48-hour migration; worked with all functional and technical teams and provided support for a week after go-live.
Generic Routing: holds the information for each entry and serves as a tracker for each individual object (for example, all work orders, material requests, etc.).
Financial Transaction: holds all the transaction information.
User Log & Information: holds the information of users who have access to Fitzpatrick.
Accounts & Accounts Payable: holds multi-directional accounts and payment information.
Procurement Engineering: holds facility information.
Operation Log: holds Fitzpatrick operations information.
Environment: PowerCenter 9.6, IICS, SQL Server 2012, Toad Data Modeler, PL/SQL Developer, VSS, HP Mercury Quality Center
5. Client: Sonepar USA
Role: ETL & Data Architect for ODW
Sonepar USA is an independent, family-owned company with global market leadership in the business-to-business distribution of electrical, industrial, and safety products and related solutions. We are a proud member of the Sonepar Group, the world's largest privately held electrical distributor. In the USA, we are represented by over 16 locally managed electrical and industrial distributors and have over 700 locations, with coverage in all 50 states.
Involved in building the ODW (Operational Data Warehouse) for day-to-day reporting and analysis. The ODW is built entirely on a star-schema model with the hierarchy of Product, Vendor, Customer, Account, and Sales. On a daily basis we receive data from the different distributors. The load strategy is to load the data from all distributors to stage, and from stage to the ODW. The source data is provided by Eclipse systems.
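A minimal star-schema sketch of the kind described above (table and column names are illustrative assumptions, not the actual ODW model):

    -- Illustrative star schema: a sales fact keyed to conformed dimensions.
    CREATE TABLE dim_product (
        product_key   INT IDENTITY PRIMARY KEY,   -- surrogate key
        product_id    VARCHAR(20),                -- natural key from the source feed
        product_desc  VARCHAR(100)
    );
    CREATE TABLE dim_customer (
        customer_key  INT IDENTITY PRIMARY KEY,
        customer_id   VARCHAR(20),
        customer_name VARCHAR(100)
    );
    CREATE TABLE fact_sales (
        sale_date     DATE,
        product_key   INT REFERENCES dim_product (product_key),
        customer_key  INT REFERENCES dim_customer (customer_key),
        qty           INT,
        amount        DECIMAL(18,2)
    );

Each distributor feed resolves its natural keys to the surrogate keys during the stage-to-ODW load.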
Involved in the architecture of the ETL, the data design, and the data modeling.
Built the model with Toad Data Modeler.
Designed the ETL process to load the data to stage and ODW, converted it into a detailed design document, and sent it to the offshore team.
Involved in explaining the process to the dev and test teams.
Performed code reviews with offshore on a daily basis, covering the ETL process and discussions of the ETL build process.
Oversaw the design and development of the Informatica ETL process to load the data into facts and then into aggregates.
Scheduled jobs with the Informatica scheduler.
Designed the flow of the error-handling process to accommodate multiple source systems; a sketch follows this list.
Reviewed the process with the QA team and the production team, and reviewed the implementation process.
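A simplified sketch of such an error-handling design (all names are illustrative assumptions): rejects from every distributor feed land in one shared table tagged with the source system, so a single process serves all feeds.

    -- Shared reject table for all source systems (illustrative).
    CREATE TABLE etl_load_errors (
        error_id      INT IDENTITY PRIMARY KEY,
        source_system VARCHAR(30),    -- which distributor feed produced the row
        target_table  VARCHAR(50),    -- table the row was bound for
        error_msg     VARCHAR(500),   -- validation or database error text
        row_image     VARCHAR(4000),  -- offending row, serialized for replay
        load_ts       DATETIME DEFAULT GETDATE()
    );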
Environment: PowerCenter 9.6, SQL Server 2012, Toad Data Modeler, PL/SQL Developer, VSS, HP Mercury Quality Center
6. Client: State of Texas
Role: Sr. ETL Consultant for DFPS
DFPS is a state-operated program to protect the families of the state of Texas. We created the RDM data warehouse. The sources for the data warehouse are worker (state) information, person information, address information, case information, intake information, person charge information, foster care information, and case denial information. It is built on a snowflake schema.
RDM (Review Division Management) is built from information on public complaints, case statuses, case details, officers' intake information, and denial information. Based on reports from this data, the state department arranges programs for the public and makes decisions.
Involved in designing and developing ETL mappings for case study, intake, and resource facilities.
Built the initial ETL design and the detailed design document and handed them to offshore.
Designed and developed the Informatica ETL process to load the data into facts and then into aggregates.
Scheduled jobs using the Control-M scheduling tool.
Implemented Informatica mappings to orchestrate the flow of data from various source systems to the database, ensuring data integrity and consistency.
Reviewed the process with the QA team and handed the code over to the production team (DFPS).
Involved in the knowledge transfer to the production support team.
Environment: PowerCenter 9.6, Oracle 11g, PL/SQL Developer, SQL*Loader, VSS, UNIX, Erwin 9.5, Control-M Enterprise Scheduler, HP Mercury Quality Center
7. Client: Shire Pharmaceuticals
Role: Data Integration Consultant for RDPS
RDPS is an old system that Shire maintained separately. Shire wanted to bring that information into its centralized data warehouse. Starting from the existing ETL mappings, we tweaked the code and integrated the RDPS data into the existing data warehouse.
Involved in producing the design documents and data migration documents.
Created one-time joins to bring the data from RDPS into the existing warehouse.
Designed and developed a small number of Informatica ETL processes to load the data into facts and then into aggregates.
Reviewed the process with the QA team.
Involved in the knowledge transfer to the production support team.
Environment: PowerCenter 9.6, IICS, Oracle 11g, PL/SQL Developer, SQL*Loader, UNIX, HP Mercury Quality Center
Client: Campbell’s Soup Company, Camden, NJ Sep ’14 – April ‘15
Role: Sr. Data warehouse Consultant for Immediate Consumption
Campbell’s Soup Company is a global manufacturer and marketer of high-quality foods and simple meals, including soups, baked snacks, vegetable-based beverages, and prime chocolate products.
The goal was to deploy a reporting and analytics solution that allows the Immediate Consumption (IC) leadership team to monitor operational performance metrics and identify opportunities for increased sales and market-presence growth. The project scope includes development and delivery of IC performance dashboards and scorecards, as well as delivery of IC operational reports.
For this project we built a data mart from the new VIP source and from a file system produced by business users and internal systems. We created the data mart based on a star schema, building the new dimensions, facts, and aggregates that serve the reporting purpose. The loading methodology is to load into the staging area (ODS) and then into the EDW.
Involved initially in all modeling and design meetings to create a data mart from the new systems and from the existing data warehouse data.
Built the model and the initial structure of the data mart using Erwin.
Designed and developed the ETL process for the file system, using SQL*Loader to load into USER_ODSSTAGE (the Campbell’s standard for loading outside vendors' file-system data into the warehouse); a control-file sketch follows this list.
Designed and developed the Informatica ETL process to load data from internal systems to ODS, and created the Informatica ETL process from USER_ODSSTAGE to ODS.
Designed and developed the Informatica ETL process to load data into the warehouse (as a separate data mart for Immediate Consumption) with a dimensional model.
Designed and developed the Informatica ETL process to load data into facts and then into aggregates.
Scheduled jobs using the Tivoli scheduling tool.
Reviewed the end-to-end data-flow and load process with the QA and production teams.
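A minimal SQL*Loader control-file sketch for a vendor flat-file load of this kind (the file name and columns are assumptions, not the actual USER_ODSSTAGE layout):

    -- load_vip_sales.ctl: append vendor file rows into a staging table
    LOAD DATA
    INFILE 'vip_sales.dat'
    APPEND
    INTO TABLE vip_sales_stg
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (store_id,
     product_id,
     sale_date   DATE 'YYYY-MM-DD',
     qty,
     amount)

It would be invoked along the lines of sqlldr control=load_vip_sales.ctl log=load_vip_sales.log, with rejected rows reviewed in the log and bad file.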
Environment: PowerCenter 9.6, Oracle 11g, PL/SQL Developer, SQL*Loader, VSS, UNIX, Erwin 9.5, Tivoli Enterprise Scheduler, HP Mercury Quality Center
Client: JPM Chase Bank (CCB), Wilmington, DE Oct ’12 – Aug ‘14
Role: ETL Lead for Integrated Consumer Data warehouse (ICDW)
Chase has a CDW (Card Data Warehouse) built with Ab Initio and Oracle, and wanted to build a new enterprise data warehouse, called ICDW, based on the IBM BDW model, using the latest Informatica and Teradata features and utilities. The BDW model follows the basic methods of big-data implementation. Worked as Architect, successfully implemented two business areas, and worked on a third: successfully delivered LISTODS (marketing database) and PMON (posted monitor transactions), and implemented Tokenization.
Worked as an Architect; involved in all the initial modeling and design meetings on how to convert the data warehouse, exploring the options of IBM BDW.
Involved in the architecture of design standards for ICDW for the Card LOB.
The BDW model follows Pre-Stage (for a few processes), Stage, SA2, W2, PROD, and Semantic (reporting layer) layers. All layers are designed around the Teradata utilities and Informatica PDO utilities.
Took ownership of the tech design documents and followed up with the ICDW teams and the Chase ARB (Architecture Review Board).
Took ownership of creating the process and made sure the different vendors understood the design and conformed to the ICDW standards.
Reviewed the code on a regular basis with the onsite team and the offshore vendor teams.
Reviewed the process with the QA team and the internal and external project teams, making sure all teams had a clear understanding of the flow, process, and standards.
Involved in the design and development of the Informatica ETL process from the Stage to the Semantic layers.
Allocated development work among the different developers.
Reviewed the code with the PSG (Production Support Group), owned the run book for the process flow, and made sure all developers registered end-to-end process incidents for failures.
Performed data comparisons between the old and new warehouses and made sure we met the SLA for BAU; a comparison-query sketch follows this list.
Worked with business users on scheduling and was involved in the preproduction UAT process.
Coordinated with MDES (the file-movement vendor) to schedule the file-transfer process for the new environments.
Ensured creation of SVN folders and owned the migration process to the upper environments.
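A sketch of the kind of reconciliation query used for those comparisons (database and table names are illustrative assumptions): rows present in the legacy warehouse but missing from ICDW surface as differences.

    -- Rows in the old CDW result set that are absent from ICDW (illustrative).
    SELECT acct_id, post_dt, txn_amt
    FROM   cdw_db.posted_txn
    MINUS
    SELECT acct_id, post_dt, txn_amt
    FROM   icdw_db.posted_txn;

Running the same query with the two tables swapped catches rows present only in ICDW; per-day row counts on each side give a quick SLA health check.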
Environment: PowerCenter 9.1, Teradata 14, Teradata SQL Assistant, SVN, UNIX, Business Objects, Control-M Enterprise Scheduler, HP Mercury Quality Center 9.0
Client: Comcast Cable, Philadelphia, PA Apr ‘12 – Sep’12
Role: ETL Lead
Comcast Corporation is one of the leading media, entertainment, and communications companies. Comcast is principally involved in the operation of cable systems through Comcast Cable and in the development, production, and distribution of entertainment, news, sports, and other content to a global audience. The assignment involved providing data for reporting purposes: the number of active devices and the corresponding active accounts by reporting area, user logs by reporting day and reporting week for each reporting area, billing information, and ticketing information. Technically led the design of a central data mart where we load the data from several different feeds and databases: SPLUNK (user log information), AMDOCS (biller information), TTS (ticket information reported by customers), CARE (device shipping information), and DSERVE (device, account, and activation information). The design loads the data into a staging area named XBI_STG, then XBI_PRD, and then XBI_PRES (presentation layer), archiving files after the prod load. The model was built as a dimensional model. The design includes aggregated tables, designed to load after the PROD load has completed.
Reviewed the LDM, held discussions with the analysts and business users, and filled the gaps between the dev team and the business requirements.
Led the team in designing the stage layer using Pentaho to load data into XBI_STG.
Implemented data quality monitoring and alerting mechanisms within Informatica Data Quality to proactively identify and remediate data quality issues in real-time.
Designed the mapping and session flow using Informatica and loaded data into XBI_PRES (used both Informatica and Pentaho on this project due to the limitations of Comcast policies).
Identified where we needed to build additional dimensions and facts, based on the bottlenecks identified.
Reviewed the team's build process and monitored the implementation.
Worked with business users in UAT process.
Involved in the migration from BXBII (Dev) to BXBIQ (QA) and from BXBIQ (QA) to BXBIP (PROD).
Environment: PowerCenter 9.0.1 & Pentaho (open-source system), Oracle 10g/11g, Toad 8.5, PL/SQL Developer, SVN, UNIX, cron, Mercury Quality Center 9.0, AutoSys
Client: AstraZeneca, Wilmington, DE Oct ‘11 – March ’12
Role: Solution Architect
AstraZeneca is a global, innovation-driven, integrated biopharmaceutical company. We discover, develop, manufacture, and market prescription medicines for six important areas of healthcare, which include some of the world's most serious illnesses: cancer, cardiovascular, gastrointestinal, infection, neuroscience, and respiratory and inflammation. Worked as a solution architect for PUEBlo (Prescriber Universe Exclusion and Blocking), the central repository database from which all channels consume data and which blocks prescribers for all kinds of promotional blocking as well as legal and specialty blocking. The data flows from different sources (IMS, AMA, IMS_SCRUB, ASA, and MDSLN) to the landing area, from the landing area to the base object areas, then through the MDM match/merge process via Siperian, and finally into HCM (Health Care Master). Milestones were created two months ahead for T&P (Target and Planning) of promotions to the prescribers. Based on the daily data, we created a centralized database (BCR) where we update specialty, AZID, and promotion information. From this BCR database all channels consume the data and block promotions such as SNA and DNS and block the products.
Reviewed the BRD and gave direction on how to build the flow.
As part of the Agile-Scrum implementation process, worked closely with business analysts and the data modeling and QA teams in requirements gathering and in supporting the documentation of the Lending & Risk Modeling BRD.
Identified where we needed to build the dimensions and facts.
Implemented data governance processes and procedures to ensure the accuracy, completeness, and timeliness of health data captured and managed within database systems.
Reviewed the build process and advised on the design and development of the Informatica ETL process based on incremental loads.
Developed Rhapsody routes for acquiring and managing millions of records of data, including XML and HL7 message formats, to support data integration requirements.
Implemented data solutions compliant with HIPAA regulations, ensuring the confidentiality, integrity, and availability of protected health information (PHI) within database architectures.
Extensive experience in applying warehousing concepts to design and develop complex mappings within Informatica, ensuring the effective extraction, transformation, and loading (ETL) of data.
Developed data integration processes to handle HL7 message formats, facilitating the exchange of healthcare data between disparate systems and applications in compliance with industry standards.
Designed and implemented PL/SQL stored procedures and functions to handle data-processing logic within the database, including promotional blocking and specialty blocking; a simplified sketch follows this list.
Ensured compliance with regulatory requirements and industry standards, such as HIPAA for healthcare data, by implementing appropriate data handling practices.
Worked with business users and guided the team on scheduling and supporting the production UAT process.
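A much-simplified sketch of a blocking procedure of this type (the table, column, and flag names are invented for illustration): prescribers on an exclusion list get their promotion-block flag set so downstream channels suppress promotions.

    -- Illustrative only: apply promotional blocks from an exclusion list.
    CREATE OR REPLACE PROCEDURE apply_promo_blocks IS
    BEGIN
        UPDATE bcr_prescriber p
        SET    p.promo_block_flag = 'Y',
               p.block_reason     = 'DNS'       -- e.g., a do-not-solicit style block
        WHERE  EXISTS (SELECT 1
                       FROM   exclusion_list e
                       WHERE  e.az_id      = p.az_id
                       AND    e.block_type = 'PROMO');
        COMMIT;
    END apply_promo_blocks;
    /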
Environment: PowerCenter 9.1/9.0.1, Oracle 9i/10g, Toad 8.5, PL/SQL Developer, MS VSS & StarTeam, UNIX, AutoSys, Mercury Quality Center 9.0
Client: Shire Pharmaceuticals, Chester Brook, PA June ’11 – Oct ’11
Role: Solution Architect
I was involved in building the Product Master at Shire Pharmaceuticals. It holds product information: the brands and co-brands of Shire and of its competitors. The information is provided by the source system ZINC. We loaded the data into the landing area (PROD, IDENT, PMATX, REL) and from there processed the data through Siperian for match-and-merge purposes. Once that completed, we delivered the data produced by the Siperian process to the warehouse in XML format.
Created the process to build the pre-landing tables for PROD, IDENT, PMATX, and REL.
Identified the match-and-merge process, which calls the existing stored procedures through Informatica.
Involved in the design and development of the Informatica ETL process to deliver the data in XML format to the warehouse.
Designed and developed complex SQL queries and PL/SQL code to support the match-and-merge process using Siperian, ensuring accurate data deduplication and consolidation; a simplified sketch follows this list.
Worked with the warehouse team on scheduling and supporting the production UAT process.
Worked with the dev team on the creation of UNIX shell scripts and the automation of the ETL processes.
Used UC4 for scheduling.
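A heavily simplified sketch of the match step (the real matching ran inside Siperian via the existing stored procedures; these names are assumptions): landing records are paired with existing masters on a normalized product name so that the merge step can consolidate them.

    -- Illustrative match step feeding a merge/consolidation process.
    CREATE OR REPLACE PROCEDURE match_products IS
    BEGIN
        INSERT INTO match_pairs (landing_id, master_id)
        SELECT l.landing_id, m.master_id
        FROM   landing_prod l
        JOIN   master_prod  m
               ON UPPER(TRIM(l.prod_name)) = UPPER(TRIM(m.prod_name));
        COMMIT;
    END match_products;
    /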
Environment: PowerCenter 9.0.1/6.2, Oracle 11g/10g, Toad 8.5, PL/SQL Developer, UC4
Client: ING DIRECT, Wilmington, DE June ’06 - June ’11
Worked on different projects; they are listed below.
1. Project: Mobile App and Mobile Site Data to Warehouse
Role: Sr. Data Warehouse Consultant Mar ’11 - June ’11
We brought the mobile app and mobile site information into the warehouse. Our vendor Tealeaf provides information on which customers logged in through the mobile app and which through the mobile site. With this we load all mobile customers' information along with their transactional information, and we tag the customers using mobile as active or non-active.
As part of the Agile-Scrum implementation process, worked closely with business analysts and the data modeling and QA teams in requirements gathering and in supporting the documentation of the Lending & Risk Modeling BRD.
Identified the Dimensions and Facts.
Designed and developed the Informatica ETL process based on incremental loads.
Worked with business users on scheduling and supporting the production UAT process.
Wrote Unix shell scripts to create dynamic parameter files.
Created UNIX shell scripts and automated the ETL processes.
Used PeopleSoft Tidal for scheduling.
Environment: PowerCenter 9.0.1, Oracle 9i/10g, Toad 8.5, PL/SQL Developer, MS-VSS, AS/400, AIX 5.3, Tidal Enterprise Scheduler, Mercury Quality Center 9.0
2. Project: Data Feed to FDIC Regulatory Report
Role: Sr. Data Warehouse Consultant Dec ’10 – Feb ’11
For FDIC regulatory reporting, a customer can have multiple accounts but carries $250,000 of insurance in total. Daily we create a report whose data was populated by a stored procedure over the warehouse data; we replaced the stored procedure and implemented the logic through Informatica.
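The core of that calculation can be sketched in SQL (illustrative table and column names; the production logic ran as Informatica mappings): sum each customer's balances across all accounts, then cap the insured portion at $250,000.

    -- Illustrative FDIC coverage query: insurance is capped per customer,
    -- not per account.
    SELECT customer_id,
           SUM(balance)                       AS total_balance,
           LEAST(SUM(balance), 250000)        AS insured_balance,
           GREATEST(SUM(balance) - 250000, 0) AS uninsured_balance
    FROM   account_balance
    GROUP  BY customer_id;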
As part of the Agile-Scrum implementation process, worked closely with business analysts and the data modeling and QA teams in requirements gathering and in supporting the documentation of the Lending & Risk Modeling BRD.
Analyzed the existing process and converted the logic to Informatica.
Designed and developed the Informatica ETL process based on incremental loads.
Implemented test automation frameworks and scripts to streamline repetitive testing tasks, improve test coverage, and accelerate the testing lifecycle for database projects.
Worked with business users on scheduling and supporting the production UAT process.
Converted the stored procedures into Informatica mappings and developed Unix shell scripts to create dynamic parameter files.
Created UNIX shell scripts and automated the ETL processes.
Used PeopleSoft Tidal for scheduling.
Environment: PowerCenter 8.6.1/8.1/6.2, Oracle 9i/10g, Toad 8.5, PL/SQL Developer, MS-VSS, AS/400, AIX 5.3, Tidal Enterprise Scheduler, Mercury Quality Center 9.0
3. Project: Data Feed to Epiphany Process from ODW
Role: Sr. Data Warehouse Consultant Jan ’10 – Sept ’10
Epiphany is the process ING Direct uses for marketing products and promotions to existing customers based on certain criteria of customer transactional and balance behavior. Epiphany uses all customers' MTD, YTD, and LTD transactional and balance information, together with the products each customer holds. A SAS process supplied the data from the old warehouse; to produce the data, the existing SAS process would run 36 to 48 hours against the warehouse. We converted the SAS processes to Informatica one after the other and brought the run time down to 6 hours by using incremental loads through Informatica.
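The incremental-load pattern behind that speedup can be sketched with a MERGE (all names are illustrative assumptions): each run reads only the transactions added since the last run and upserts the aggregates, instead of re-scanning the full history.

    -- Illustrative incremental upsert: process only rows newer than the last run.
    MERGE INTO cust_balance_agg t
    USING (SELECT customer_id, SUM(txn_amt) AS delta_amt
           FROM   txn_detail
           WHERE  txn_dt > (SELECT last_run_dt
                            FROM   etl_control
                            WHERE  job_name = 'EPIPHANY_FEED')
           GROUP  BY customer_id) s
    ON (t.customer_id = s.customer_id)
    WHEN MATCHED THEN
        UPDATE SET t.ltd_amt = t.ltd_amt + s.delta_amt
    WHEN NOT MATCHED THEN
        INSERT (customer_id, ltd_amt)
        VALUES (s.customer_id, s.delta_amt);

After a successful run, etl_control.last_run_dt is advanced to the high-water mark of the processed transactions.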
As part of the Agile-Scrum implementation process, worked closely with business analysts and the data modeling and QA teams in requirements gathering and in supporting the documentation of the Lending & Risk Modeling BRD.
Analyzed the existing process and converted the logic to Informatica.
Designed and developed the Informatica ETL process based on incremental loads.
Worked with business users on scheduling and supporting the production UAT process.
Converted the stored procedures that Epiphany uses into Informatica mappings and developed Unix shell scripts to create dynamic parameter files.
Used PeopleSoft Tidal for scheduling.
Environment: PowerCenter 8.6.1, Oracle 9i/10g, Toad 8.5, PL/SQL Developer, MS-VSS, AS/400, AIX 5.3, Tidal Enterprise Scheduler, Mercury Quality Center 9.0
4. Project: Orange Data Warehouse (ODW) Phase II
Role: Sr. Data Warehouse Consultant Aug ’09 - Dec ’09
As part of Data Services, ODW Phase II provides information for various reporting needs, including lending, credit risk, and deposit services.
Customer Status Definition: head-office status reporting based on customer transactions/activities, with sources involving Oracle and SQL Server; we brought that information into the warehouse. We have two other systems, the ACCESS2NET and AWD SQL Server databases: ACCESS2NET holds information on how many customers access the website and how frequently, and AWD holds all the loan document information. We brought that information into the data warehouse as well.
We started the Agile methodology; as part of the Agile-Scrum implementation process, worked closely with business analysts and the data modeling and QA teams in requirements gathering and in supporting the documentation of the Lending & Risk Modeling BRD.
Performed data analysis and profiling of different loan-processing (PROFILE, CALYX) and loan-underwriting (LNI/TRANSACT) source systems.
Designed and developed the Informatica ETL process for change-data-capture/delta staging loads, loading of dimensions, facts, and their data marts, alerting the production support team on load errors/rejects, alerting business users on any new product/source codes, etc.
Documented test results, defects, and observations from system testing and user acceptance testing phases, communicating findings to development teams and stakeholders for resolution and decision-making.
Worked with business users on scheduling and supporting the preproduction UAT process.
Developed Unix shell scripts to create dynamic parameter files, append dates, archive files, cleanse incoming source files, etc.
Created UNIX shell scripts and automated the ETL processes.
Used PeopleSoft Tidal for scheduling.
Environment: PowerCenter 8.6.1/8.1, Oracle 9i/10g, Toad 8.5, PL/SQL Developer, MS-VSS, SQL Server 2005, AS/400, AIX 5.3, ReportNet, Tidal Enterprise Scheduler